Replace Ollama with llama-server (OpenAI-compatible API)
- Update llm.py to use the OpenAI client with a custom base_url for llama-server (sketched below)
- Update agents.py to use ChatOpenAI instead of ChatOllama (see the second sketch, after the diff)
- Remove unused ollama imports from main.py, chunker.py, and query.py
- Add LLAMA_SERVER_URL and LLAMA_MODEL_NAME env vars
- Remove the ollama and langchain-ollama dependencies

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
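As a rough illustration of the llm.py change: the OpenAI client can be pointed at llama-server through its base_url. This is a minimal sketch, not the actual file contents; the localhost fallback (llama-server's default port), the "local-model" fallback, and the placeholder API key are assumptions for illustration.

```python
import os

from openai import OpenAI

# llama-server exposes an OpenAI-compatible API under /v1;
# the client requires an api_key, but llama-server ignores it.
client = OpenAI(
    base_url=os.getenv("LLAMA_SERVER_URL", "http://localhost:8080/v1"),
    api_key="not-needed",  # placeholder; not validated by llama-server
)

response = client.chat.completions.create(
    model=os.getenv("LLAMA_MODEL_NAME", "local-model"),  # hypothetical fallback
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```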
pyproject.toml

@@ -9,7 +9,6 @@ dependencies = [
     "python-dotenv>=1.0.0",
     "flask>=3.1.2",
     "httpx>=0.28.1",
-    "ollama>=0.6.0",
     "openai>=2.0.1",
     "pydantic>=2.11.9",
     "pillow>=10.0.0",
@@ -34,7 +33,6 @@ dependencies = [
     "langchain-chroma>=1.0.0",
     "langchain-community>=0.4.1",
     "jq>=1.10.0",
-    "langchain-ollama>=1.0.1",
     "tavily-python>=0.7.17",
 ]
 
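The agents.py swap follows the same pattern: ChatOpenAI reaches llama-server through the same OpenAI-compatible endpoint. A minimal sketch, assuming langchain-openai is already among the project's dependencies (it is not added in this diff) and reusing the same hypothetical env-var defaults:

```python
import os

from langchain_openai import ChatOpenAI

# ChatOpenAI replaces ChatOllama, targeting llama-server via base_url.
llm = ChatOpenAI(
    model=os.getenv("LLAMA_MODEL_NAME", "local-model"),  # hypothetical fallback
    base_url=os.getenv("LLAMA_SERVER_URL", "http://localhost:8080/v1"),
    api_key="not-needed",  # placeholder; not validated by llama-server
)

print(llm.invoke("ping").content)
```

Because both clients speak the same OpenAI-compatible protocol, the two env vars become the only switch point between a hosted endpoint and the local server.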