Configure ollama to use external host instead of docker service

- Update all ollama clients to use configurable OLLAMA_URL environment variable
- Remove ollama service from docker-compose.yml to use external ollama instance
- Configure docker-compose to connect to host ollama via 172.17.0.1:11434 (Linux) or host.docker.internal (macOS/Windows)
- Add cross-platform compatibility with an extra_hosts mapping (see the compose sketch after this list)
- Update embedding function fallback URL for consistency
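
For reference, a minimal docker-compose sketch of the wiring described above. The service name `app` is hypothetical, and `host-gateway` assumes Docker 20.10+ with the default bridge network (where the gateway is typically 172.17.0.1):

```yaml
services:
  app:
    environment:
      # Inside a container, localhost is the container itself, so the
      # client must be pointed at the host's ollama instance instead.
      - OLLAMA_URL=http://host.docker.internal:11434
    extra_hosts:
      # Linux does not define host.docker.internal automatically;
      # host-gateway maps it to the Docker bridge gateway.
      - "host.docker.internal:host-gateway"
```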

🤖 Generated with [Claude Code](https://claude.ai/code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-02 20:29:48 -04:00
parent a640ae5fed
commit 03b033e9a4
5 changed files with 18 additions and 18 deletions


@@ -1,12 +1,16 @@
 import json
+import os
 from typing import Literal
 import datetime
-from ollama import chat, ChatResponse
+from ollama import chat, ChatResponse, Client
 from openai import OpenAI
 from pydantic import BaseModel, Field
+
+# Configure ollama client with URL from environment or default to localhost
+ollama_client = Client(host=os.getenv("OLLAMA_URL", "http://localhost:11434"))
 # This uses inferred filters — which means using LLM to create the metadata filters
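
The commit message also mentions updating the embedding function's fallback URL for consistency. That change is not shown in this hunk, but presumably it follows the same pattern; a hypothetical sketch (the helper name and embedding model are illustrative, not taken from the diff):

```python
import os

from ollama import Client


def embed_text(text: str) -> list[float]:
    # Same fallback as the chat client above, so both clients resolve
    # to the same host whether running inside or outside Docker.
    client = Client(host=os.getenv("OLLAMA_URL", "http://localhost:11434"))
    response = client.embed(model="nomic-embed-text", input=text)
    return list(response.embeddings[0])
```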
@@ -109,7 +113,7 @@ class QueryGenerator:
         print(response)
         query = json.loads(response.output_parsed.extracted_metadata_fields)
-        # response: ChatResponse = chat(
+        # response: ChatResponse = ollama_client.chat(
         #     model="gemma3n:e4b",
         #     messages=[
         #         {"role": "system", "content": PROMPT},