Anthony and Ben talk about SQL Server 2025 – Episode 2: Running Local LLMs with Ollama

In this episode, Ben and Anthony demonstrate how to integrate Ollama, a local LLM runtime, directly into your SQL Server workflows.

What’s Inside:

  • Full setup of Ollama with Docker and local REST access
  • Calling the Ollama API from SQL Server with sp_invoke_external_rest_endpoint
  • Real-world use cases: text generation, summarization, classification
  • Security and latency benefits of keeping everything on-prem
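As a rough sketch of the pattern the episode demonstrates — the exact model, prompt, and configuration shown in the video may differ — a call from SQL Server 2025 to a local Ollama instance could look like this. It assumes Ollama is running on its default port 11434 (e.g. via `docker run -d -p 11434:11434 ollama/ollama`) with a model such as `llama3` already pulled:

```sql
-- Sketch only: assumes a local Ollama instance on the default port
-- and that outbound REST calls are enabled on this SQL Server instance.

-- Request body for Ollama's /api/generate endpoint
DECLARE @payload nvarchar(max) = N'{
    "model": "llama3",
    "prompt": "Summarize in one sentence: SQL Server 2025 can call local LLMs over REST.",
    "stream": false
}';
DECLARE @response nvarchar(max);

-- Stored procedure documented for Azure SQL / SQL Server 2025
EXEC sp_invoke_external_rest_endpoint
    @url      = N'http://localhost:11434/api/generate',
    @method   = N'POST',
    @payload  = @payload,
    @response = @response OUTPUT;

-- Ollama returns the generated text in the "response" field of its JSON body,
-- which the procedure wraps under "result"
SELECT JSON_VALUE(@response, '$.result.response') AS generated_text;
```

Because the call never leaves the machine, no prompt text or data is sent to a third-party API — which is the security and latency argument the episode makes for keeping everything on-prem.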

🔗 Watch on YouTube
