Compare commits
34 Commits
user-suppo... 1a026f76a1
| Author | SHA1 | Date |
|---|---|---|
| | 1a026f76a1 | |
| | da3a464897 | |
| | 913875188a | |
| | f5e2d68cd2 | |
| | 70799ffb7d | |
| | 7f1d4fbdda | |
| | 5ebdd60ea0 | |
| | 289045e7d0 | |
| | ceea83cb54 | |
| | 1b60aab97c | |
| | 210bfc1476 | |
| | 454fb1b52c | |
| | c3f2501585 | |
| | 1da21fabee | |
| | dd5690ee53 | |
| | 5e7ac28b6f | |
| | 29f8894e4a | |
| | 19d1df2f68 | |
| | e577cb335b | |
| | 591788dfa4 | |
| | 561b5bddce | |
| | ddd455a4c6 | |
| | 07424e77e0 | |
| | a56f752917 | |
| | e8264e80ce | |
| | 04350045d3 | |
| | f16e13fccc | |
| | 245db92524 | |
| | 29ac724d50 | |
| | 7161c09a4e | |
| | 68d73b62e8 | |
| | 6b616137d3 | |
| | 841b6ebd4f | |
| | 45a5e92aee | |
.env.example — new file (44 lines)
@@ -0,0 +1,44 @@
```ini
# Database Configuration
# PostgreSQL is recommended (required for OIDC features)
DATABASE_URL=postgres://raggr:changeme@postgres:5432/raggr

# PostgreSQL credentials (if using docker-compose postgres service)
POSTGRES_USER=raggr
POSTGRES_PASSWORD=changeme
POSTGRES_DB=raggr

# JWT Configuration
JWT_SECRET_KEY=your-secret-key-here

# Paperless Configuration
PAPERLESS_TOKEN=your-paperless-token
BASE_URL=192.168.1.5:8000

# Ollama Configuration
OLLAMA_URL=http://192.168.1.14:11434
OLLAMA_HOST=http://192.168.1.14:11434

# ChromaDB Configuration
CHROMADB_PATH=/path/to/chromadb

# OpenAI Configuration
OPENAI_API_KEY=your-openai-api-key

# Immich Configuration
IMMICH_URL=http://192.168.1.5:2283
IMMICH_API_KEY=your-immich-api-key
SEARCH_QUERY=simba cat
DOWNLOAD_DIR=./simba_photos

# OIDC Configuration (Authelia)
OIDC_ISSUER=https://auth.example.com
OIDC_CLIENT_ID=simbarag
OIDC_CLIENT_SECRET=your-client-secret-here
OIDC_REDIRECT_URI=http://localhost:8080/
OIDC_USE_DISCOVERY=true

# Optional: Manual OIDC endpoints (if discovery is disabled)
# OIDC_AUTHORIZATION_ENDPOINT=https://auth.example.com/api/oidc/authorization
# OIDC_TOKEN_ENDPOINT=https://auth.example.com/api/oidc/token
# OIDC_USERINFO_ENDPOINT=https://auth.example.com/api/oidc/userinfo
# OIDC_JWKS_URI=https://auth.example.com/api/oidc/jwks
```
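The `.env.example` file above is a template of `KEY=value` pairs; the repo loads it with `python-dotenv` (listed in its dependencies). As a rough sketch of what that loading amounts to — a minimal, assumption-laden stand-in for the real library, not its implementation — a stdlib-only parser might look like:

```python
import os


def load_env_lines(text: str) -> dict[str, str]:
    """Parse KEY=value lines, skipping blanks and '#' comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


# Hypothetical snippet mirroring the template's format.
sample = """
# Database Configuration
DATABASE_URL=postgres://raggr:changeme@postgres:5432/raggr
POSTGRES_USER=raggr
"""
parsed = load_env_lines(sample)
os.environ.update(parsed)  # make the values visible to os.getenv()
print(parsed["POSTGRES_USER"])
```

In the real project `load_dotenv()` does this (plus quoting and interpolation rules this sketch omits), which is why scripts like `add_user.py` can read `DATABASE_URL` via `os.getenv`.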
.gitignore (vendored) — 7 changed lines
```diff
@@ -9,5 +9,10 @@ wheels/
 # Virtual environments
 .venv
 
+# Environment files
 .env
 
+# Database files
+chromadb/
+database/
+*.db
```
DEV-README.md — new file (110 lines)
@@ -0,0 +1,110 @@
````markdown
# Development Environment Setup

This guide explains how to run the application in development mode with hot reload enabled.

## Quick Start

### Development Mode (Hot Reload)

```bash
# Start all services in development mode
docker-compose -f docker-compose.dev.yml up --build

# Or run in detached mode
docker-compose -f docker-compose.dev.yml up -d --build
```

### Production Mode

```bash
# Start production services
docker-compose up --build
```

## What's Different in Dev Mode?

### Backend (Quart/Flask)
- **Hot Reload**: Python code changes are automatically detected and the server restarts
- **Source Mounted**: Your local `services/raggr` directory is mounted as a volume
- **Debug Mode**: Flask runs with `debug=True` for better error messages
- **Environment**: `FLASK_ENV=development` and `PYTHONUNBUFFERED=1` for immediate log output

### Frontend (React + rsbuild)
- **Auto Rebuild**: Frontend automatically rebuilds when files change
- **Watch Mode**: rsbuild runs in watch mode, rebuilding to `dist/` on save
- **Source Mounted**: Your local `services/raggr/raggr-frontend` directory is mounted as a volume
- **Served by Backend**: Built files are served by the backend, no separate dev server

## Ports

- **Application**: 8080 (accessible at `http://localhost:8080` or `http://YOUR_IP:8080`)

The backend serves both the API and the auto-rebuilt frontend, making it accessible from other machines on your network.

## Useful Commands

```bash
# View logs
docker-compose -f docker-compose.dev.yml logs -f

# View logs for specific service
docker-compose -f docker-compose.dev.yml logs -f raggr-backend
docker-compose -f docker-compose.dev.yml logs -f raggr-frontend

# Rebuild after dependency changes
docker-compose -f docker-compose.dev.yml up --build

# Stop all services
docker-compose -f docker-compose.dev.yml down

# Stop and remove volumes (fresh start)
docker-compose -f docker-compose.dev.yml down -v
```

## Making Changes

### Backend Changes
1. Edit any Python file in `services/raggr/`
2. Save the file
3. The Quart server will automatically restart
4. Check logs to confirm reload

### Frontend Changes
1. Edit any file in `services/raggr/raggr-frontend/src/`
2. Save the file
3. The browser will automatically refresh (Hot Module Replacement)
4. No need to rebuild

### Dependency Changes

**Backend** (pyproject.toml):
```bash
# Rebuild the backend service
docker-compose -f docker-compose.dev.yml up --build raggr-backend
```

**Frontend** (package.json):
```bash
# Rebuild the frontend service
docker-compose -f docker-compose.dev.yml up --build raggr-frontend
```

## Troubleshooting

### Port Already in Use
If you see port binding errors, make sure no other services are running on ports 8080 or 3000.

### Changes Not Reflected
1. Check if the file is properly mounted (check docker-compose.dev.yml volumes)
2. Verify the file isn't in an excluded directory (node_modules, __pycache__)
3. Check container logs for errors

### Frontend Not Connecting to Backend
Make sure your frontend API calls point to the correct backend URL. If accessing from the same machine, use `http://localhost:8080`. If accessing from another device on the network, use `http://YOUR_IP:8080`.

## Notes

- Both services bind to `0.0.0.0` and expose ports, making them accessible on your network
- Node modules and Python cache are excluded from volume mounts to use container versions
- Database and ChromaDB data persist in Docker volumes across restarts
- Access the app from any device on your network using your host machine's IP address
````
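The DEV-README above describes hot reload as "changes are automatically detected and the server restarts". As a toy illustration of the detection half of that loop — a hypothetical mtime-polling sketch, not the actual mechanism Quart or Compose `watch` uses — the core idea is just comparing file-modification snapshots:

```python
import os
import tempfile
import time


def snapshot(paths):
    """Map each file path to its last-modified time (nanoseconds)."""
    return {p: os.stat(p).st_mtime_ns for p in paths}


def changed(before, after):
    """Return files whose mtime moved between two snapshots."""
    return [p for p in after if after[p] != before.get(p)]


# Demo: touch a file and detect the change.
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "app.py")
    with open(src, "w") as f:
        f.write("print('v1')\n")
    before = snapshot([src])
    time.sleep(0.1)  # let the filesystem timestamp advance
    with open(src, "w") as f:
        f.write("print('v2')\n")
    detected = changed(before, snapshot([src]))
print(detected)
```

A real reloader adds directory walking, ignore lists (like the `__pycache__` and `node_modules` exclusions noted above), and a restart step once `changed()` is non-empty.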
app.py — deleted (44 lines)
@@ -1,44 +0,0 @@
```python
import os

from flask import Flask, request, jsonify, render_template, send_from_directory

from main import consult_simba_oracle

app = Flask(
    __name__,
    static_folder="raggr-frontend/dist/static",
    template_folder="raggr-frontend/dist",
)


# Serve React static files
@app.route("/static/<path:filename>")
def static_files(filename):
    return send_from_directory(app.static_folder, filename)


# Serve the React app for all routes (catch-all)
@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def serve_react_app(path):
    if path and os.path.exists(os.path.join(app.template_folder, path)):
        return send_from_directory(app.template_folder, path)
    return render_template("index.html")


@app.route("/api/query", methods=["POST"])
def query():
    data = request.get_json()
    query = data.get("query")
    return jsonify({"response": consult_simba_oracle(query)})


@app.route("/api/ingest", methods=["POST"])
def webhook():
    data = request.get_json()
    print(data)
    return jsonify({"status": "received"})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080, debug=True)
```
docker-compose.dev.yml — new file (102 lines)
@@ -0,0 +1,102 @@
```yaml
services:
  postgres:
    image: postgres:16-alpine
    environment:
      - POSTGRES_USER=raggr
      - POSTGRES_PASSWORD=raggr_dev_password
      - POSTGRES_DB=raggr
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U raggr"]
      interval: 5s
      timeout: 5s
      retries: 5

  raggr-backend:
    build:
      context: ./services/raggr
      dockerfile: Dockerfile.dev
    image: torrtle/simbarag:dev
    ports:
      - "8080:8080"
    env_file:
      - .env
    environment:
      - PAPERLESS_TOKEN=${PAPERLESS_TOKEN}
      - BASE_URL=${BASE_URL}
      - OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}
      - CHROMADB_PATH=/app/chromadb
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - JWT_SECRET_KEY=${JWT_SECRET_KEY}
      - OIDC_ISSUER=${OIDC_ISSUER}
      - OIDC_CLIENT_ID=${OIDC_CLIENT_ID}
      - OIDC_CLIENT_SECRET=${OIDC_CLIENT_SECRET}
      - OIDC_REDIRECT_URI=${OIDC_REDIRECT_URI}
      - OIDC_USE_DISCOVERY=${OIDC_USE_DISCOVERY:-true}
      - DATABASE_URL=postgres://raggr:raggr_dev_password@postgres:5432/raggr
      - FLASK_ENV=development
      - PYTHONUNBUFFERED=1
    depends_on:
      postgres:
        condition: service_healthy
    volumes:
      # Persist data only
      - chromadb_data:/app/chromadb
      # Share frontend dist with frontend container
      - frontend_dist:/app/raggr-frontend/dist
    develop:
      watch:
        # Sync Python source files
        - action: sync
          path: ./services/raggr
          target: /app
          ignore:
            - raggr-frontend/
            - __pycache__/
            - "*.pyc"
            - "*.pyo"
            - "*.pyd"
            - .git/
            - chromadb/
        # Sync+restart on frontend dist changes
        - action: sync+restart
          path: ./services/raggr/raggr-frontend/dist
          target: /app/raggr-frontend/dist
        # Restart on dependency changes
        - action: rebuild
          path: ./services/raggr/pyproject.toml
        - action: rebuild
          path: ./services/raggr/uv.lock

  raggr-frontend:
    build:
      context: ./services/raggr/raggr-frontend
      dockerfile: Dockerfile.dev
    environment:
      - NODE_ENV=development
    volumes:
      # Share dist folder with backend
      - frontend_dist:/app/dist
    develop:
      watch:
        # Sync frontend source files
        - action: sync
          path: ./services/raggr/raggr-frontend
          target: /app
          ignore:
            - node_modules/
            - dist/
            - .git/
        # Rebuild on dependency changes
        - action: rebuild
          path: ./services/raggr/raggr-frontend/package.json
        - action: rebuild
          path: ./services/raggr/raggr-frontend/yarn.lock

volumes:
  chromadb_data:
  postgres_data:
  frontend_dist:
```
```diff
@@ -1,7 +1,25 @@
 version: "3.8"
 
 services:
+  postgres:
+    image: postgres:16-alpine
+    environment:
+      - POSTGRES_USER=${POSTGRES_USER:-raggr}
+      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-changeme}
+      - POSTGRES_DB=${POSTGRES_DB:-raggr}
+    volumes:
+      - postgres_data:/var/lib/postgresql/data
+    healthcheck:
+      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-raggr}"]
+      interval: 10s
+      timeout: 5s
+      retries: 5
+    restart: unless-stopped
+
   raggr:
+    build:
+      context: ./services/raggr
+      dockerfile: Dockerfile
     image: torrtle/simbarag:latest
     network_mode: host
     environment:
@@ -10,8 +28,20 @@ services:
       - OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}
       - CHROMADB_PATH=/app/chromadb
       - OPENAI_API_KEY=${OPENAI_API_KEY}
+      - JWT_SECRET_KEY=${JWT_SECRET_KEY}
+      - OIDC_ISSUER=${OIDC_ISSUER}
+      - OIDC_CLIENT_ID=${OIDC_CLIENT_ID}
+      - OIDC_CLIENT_SECRET=${OIDC_CLIENT_SECRET}
+      - OIDC_REDIRECT_URI=${OIDC_REDIRECT_URI}
+      - OIDC_USE_DISCOVERY=${OIDC_USE_DISCOVERY:-true}
+      - DATABASE_URL=${DATABASE_URL:-postgres://raggr:changeme@postgres:5432/raggr}
+    depends_on:
+      postgres:
+        condition: service_healthy
     volumes:
       - chromadb_data:/app/chromadb
+    restart: unless-stopped
 
 volumes:
   chromadb_data:
+  postgres_data:
```
llm.py — deleted (64 lines)
@@ -1,64 +0,0 @@
```python
import os

from ollama import Client
from openai import OpenAI

import logging

logging.basicConfig(level=logging.INFO)


class LLMClient:
    def __init__(self):
        try:
            self.ollama_client = Client(
                host=os.getenv("OLLAMA_URL", "http://localhost:11434"), timeout=10.0
            )
            self.ollama_client.chat(
                model="gemma3:4b", messages=[{"role": "system", "content": "test"}]
            )
            self.PROVIDER = "ollama"
            logging.info("Using Ollama as LLM backend")
        except Exception as e:
            print(e)
            self.openai_client = OpenAI()
            self.PROVIDER = "openai"
            logging.info("Using OpenAI as LLM backend")

    def chat(
        self,
        prompt: str,
        system_prompt: str,
    ):
        if self.PROVIDER == "ollama":
            response = self.ollama_client.chat(
                model="gemma3:4b",
                messages=[
                    {
                        "role": "system",
                        "content": system_prompt,
                    },
                    {"role": "user", "content": prompt},
                ],
            )
            print(response)
            output = response.message.content
        elif self.PROVIDER == "openai":
            response = self.openai_client.responses.create(
                model="gpt-4o-mini",
                input=[
                    {
                        "role": "system",
                        "content": system_prompt,
                    },
                    {"role": "user", "content": prompt},
                ],
            )
            output = response.output_text

        return output


if __name__ == "__main__":
    client = Client()
    client.chat(model="gemma3:4b", messages=[{"role": "system", "promp": "hack"}])
```
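The deleted `llm.py` above pins its provider by probing Ollama once at startup and falling back to OpenAI on any failure. Abstracted away from those SDKs, the pattern can be sketched with stand-in callables (`broken_primary` and `working_fallback` are hypothetical names for illustration, not part of the repo):

```python
import logging

logging.basicConfig(level=logging.INFO)


class FallbackClient:
    """Probe a primary backend once at init; on failure, pin the fallback.

    `primary` and `fallback` are plain callables standing in for SDK clients.
    """

    def __init__(self, primary, fallback):
        try:
            primary("ping")  # health probe, mirroring llm.py's test chat()
            self.provider, self._send = "ollama", primary
        except Exception as exc:
            logging.info("primary unavailable (%s); using fallback", exc)
            self.provider, self._send = "openai", fallback

    def chat(self, prompt: str) -> str:
        return self._send(prompt)


def broken_primary(prompt: str) -> str:
    raise ConnectionError("connection refused")


def working_fallback(prompt: str) -> str:
    return f"fallback says: {prompt}"


client = FallbackClient(broken_primary, working_fallback)
print(client.provider, client.chat("hello"))
```

One caveat of probing only at `__init__` (in the sketch and in `llm.py` alike): if the primary comes back up later, the client stays pinned to the fallback until it is recreated.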
@@ -1,19 +0,0 @@
```toml
[project]
name = "raggr"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
    "chromadb>=1.1.0",
    "python-dotenv>=1.0.0",
    "flask>=3.1.2",
    "httpx>=0.28.1",
    "ollama>=0.6.0",
    "openai>=2.0.1",
    "pydantic>=2.11.9",
    "pillow>=10.0.0",
    "pymupdf>=1.24.0",
    "black>=25.9.0",
    "pillow-heif>=1.1.1",
]
```
Binary file not shown.
@@ -1,155 +0,0 @@
```tsx
import { useState } from "react";
import axios from "axios";
import ReactMarkdown from "react-markdown";

import "./App.css";

type QuestionAnswer = {
  question: string;
  answer: string;
};

type QuestionBubbleProps = {
  text: string;
};

type AnswerBubbleProps = {
  text: string;
  loading: string;
};

type QuestionAnswerPairProps = {
  question: string;
  answer: string;
  loading: boolean;
};

const QuestionBubble = ({ text }: QuestionBubbleProps) => {
  return <div className="rounded-md bg-stone-200 p-3">🤦: {text}</div>;
};

const AnswerBubble = ({ text, loading }: AnswerBubbleProps) => {
  return (
    <div className="rounded-md bg-orange-100 p-3">
      {loading ? (
        <div className="flex flex-col w-full animate-pulse gap-2">
          <div className="flex flex-row gap-2 w-full">
            <div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
            <div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
          </div>
          <div className="flex flex-row gap-2 w-full">
            <div className="bg-gray-400 w-1/3 p-3 rounded-lg" />
            <div className="bg-gray-400 w-2/3 p-3 rounded-lg" />
          </div>
        </div>
      ) : (
        <div className="flex flex-col">
          <ReactMarkdown>{"🐈: " + text}</ReactMarkdown>
        </div>
      )}
    </div>
  );
};

const QuestionAnswerPair = ({
  question,
  answer,
  loading,
}: QuestionAnswerPairProps) => {
  return (
    <div className="flex flex-col gap-4">
      <QuestionBubble text={question} />
      <AnswerBubble text={answer} loading={loading} />
    </div>
  );
};

const App = () => {
  const [query, setQuery] = useState<string>("");
  const [answer, setAnswer] = useState<string>("");
  const [simbaMode, setSimbaMode] = useState<boolean>(false);
  const [questionsAnswers, setQuestionsAnswers] = useState<QuestionAnswer[]>(
    []
  );

  const simbaAnswers = ["meow.", "hiss...", "purrrrrr", "yowOWROWWowowr"];

  const handleQuestionSubmit = () => {
    if (simbaMode) {
      console.log("simba mode activated");
      const randomIndex = Math.floor(Math.random() * simbaAnswers.length);
      const randomElement = simbaAnswers[randomIndex];
      setAnswer(randomElement);
      setQuestionsAnswers(
        questionsAnswers.concat([
          {
            question: query,
            answer: randomElement,
          },
        ])
      );
      return;
    }
    const payload = { query: query };
    axios
      .post("/api/query", payload)
      .then((result) =>
        setQuestionsAnswers(
          questionsAnswers.concat([
            { question: query, answer: result.data.response },
          ])
        )
      );
  };
  const handleQueryChange = (event) => {
    setQuery(event.target.value);
  };
  return (
    <div className="h-screen bg-opacity-20">
      <div className="bg-white/85 h-screen">
        <div className="flex flex-row justify-center py-4">
          <div className="flex flex-col gap-4 min-w-xl max-w-xl">
            <div className="flex flex-row justify-center gap-2 grow">
              <h1 className="text-3xl">ask simba!</h1>
            </div>
            {questionsAnswers.map((qa) => (
              <QuestionAnswerPair
                question={qa.question}
                answer={qa.answer}
              />
            ))}
            <footer className="flex flex-col gap-2 sticky bottom-0">
              <div className="flex flex-row justify-between gap-2 grow">
                <textarea
                  type="text"
                  className="p-4 border border-blue-200 rounded-md grow bg-white"
                  onChange={handleQueryChange}
                />
              </div>
              <div className="flex flex-row justify-between gap-2 grow">
                <button
                  className="p-4 border border-blue-400 bg-blue-200 hover:bg-blue-400 cursor-pointer rounded-md flex-grow"
                  onClick={() => handleQuestionSubmit()}
                  type="submit"
                >
                  Submit
                </button>
              </div>
              <div className="flex flex-row justify-center gap-2 grow">
                <input
                  type="checkbox"
                  onChange={(event) =>
                    setSimbaMode(event.target.checked)
                  }
                />
                <p>simba mode?</p>
              </div>
            </footer>
          </div>
        </div>
      </div>
    </div>
  );
};

export default App;
```
File diff suppressed because it is too large
```diff
@@ -23,6 +23,8 @@ RUN uv pip install --system -e .
 
 # Copy application code
 COPY *.py ./
+COPY blueprints ./blueprints
+COPY migrations ./migrations
 COPY startup.sh ./
 RUN chmod +x startup.sh
 
@@ -32,8 +34,8 @@ WORKDIR /app/raggr-frontend
 RUN yarn install && yarn build
 WORKDIR /app
 
-# Create ChromaDB directory
-RUN mkdir -p /app/chromadb
+# Create ChromaDB and database directories
+RUN mkdir -p /app/chromadb /app/database
 
 # Expose port
 EXPOSE 8080
```
services/raggr/Dockerfile.dev — new file (39 lines)
@@ -0,0 +1,39 @@
```dockerfile
FROM python:3.13-slim

WORKDIR /app

# Install system dependencies and uv
RUN apt-get update && apt-get install -y \
    build-essential \
    curl \
    && rm -rf /var/lib/apt/lists/* \
    && curl -LsSf https://astral.sh/uv/install.sh | sh

# Add uv to PATH
ENV PATH="/root/.local/bin:$PATH"

# Copy dependency files
COPY pyproject.toml ./

# Install Python dependencies using uv
RUN uv pip install --system -e .

# Create ChromaDB and database directories
RUN mkdir -p /app/chromadb /app/database

# Expose port
EXPOSE 8080

# Copy application source code
COPY . .

# Make startup script executable
RUN chmod +x /app/startup-dev.sh

# Set environment variables
ENV PYTHONPATH=/app
ENV CHROMADB_PATH=/app/chromadb
ENV PYTHONUNBUFFERED=1

# Default command
CMD ["/app/startup-dev.sh"]
```
services/raggr/MIGRATIONS.md — new file (54 lines)
@@ -0,0 +1,54 @@
````markdown
# Database Migrations with Aerich

## Initial Setup (Run Once)

1. Install dependencies:
```bash
uv pip install -e .
```

2. Initialize Aerich:
```bash
aerich init-db
```

This will:
- Create a `migrations/` directory
- Generate the initial migration based on your models
- Create all tables in the database

## When You Add/Change Models

1. Generate a new migration:
```bash
aerich migrate --name "describe_your_changes"
```

Example:
```bash
aerich migrate --name "add_user_profile_model"
```

2. Apply the migration:
```bash
aerich upgrade
```

## Common Commands

- `aerich init-db` - Initialize database (first time only)
- `aerich migrate --name "description"` - Generate new migration
- `aerich upgrade` - Apply pending migrations
- `aerich downgrade` - Rollback last migration
- `aerich history` - Show migration history
- `aerich heads` - Show current migration heads

## Docker Setup

In Docker, migrations run automatically on container startup via the startup script.

## Notes

- Migration files are stored in `migrations/models/`
- Always commit migration files to version control
- Don't modify migration files manually after they're created
````
services/raggr/add_user.py — new file (146 lines)
@@ -0,0 +1,146 @@
```python
# GENERATED BY CLAUDE

import os
import sys
import uuid
import asyncio
from tortoise import Tortoise
from blueprints.users.models import User

from dotenv import load_dotenv

load_dotenv()

# Database configuration with environment variable support
DATABASE_PATH = os.getenv("DATABASE_PATH", "database/raggr.db")
DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite://{DATABASE_PATH}")

print(DATABASE_URL)


async def add_user(username: str, email: str, password: str):
    """Add a new user to the database"""
    await Tortoise.init(
        db_url=DATABASE_URL,
        modules={
            "models": [
                "blueprints.users.models",
                "blueprints.conversation.models",
            ]
        },
    )

    try:
        # Check if user already exists
        existing_user = await User.filter(email=email).first()
        if existing_user:
            print(f"Error: User with email '{email}' already exists!")
            return False

        existing_username = await User.filter(username=username).first()
        if existing_username:
            print(f"Error: Username '{username}' is already taken!")
            return False

        # Create new user
        user = User(
            id=uuid.uuid4(),
            username=username,
            email=email,
        )
        user.set_password(password)
        await user.save()

        print("✓ User created successfully!")
        print(f"  Username: {username}")
        print(f"  Email: {email}")
        print(f"  ID: {user.id}")
        return True

    except Exception as e:
        print(f"Error creating user: {e}")
        return False
    finally:
        await Tortoise.close_connections()


async def list_users():
    """List all users in the database"""
    await Tortoise.init(
        db_url=DATABASE_URL,
        modules={
            "models": [
                "blueprints.users.models",
                "blueprints.conversation.models",
            ]
        },
    )

    try:
        users = await User.all()
        if not users:
            print("No users found in database.")
            return

        print(f"\nFound {len(users)} user(s):")
        print("-" * 60)
        for user in users:
            print(f"Username: {user.username}")
            print(f"Email: {user.email}")
            print(f"ID: {user.id}")
            print(f"Created: {user.created_at}")
            print("-" * 60)

    except Exception as e:
        print(f"Error listing users: {e}")
    finally:
        await Tortoise.close_connections()


def print_usage():
    """Print usage instructions"""
    print("Usage:")
    print("  python add_user.py add <username> <email> <password>")
    print("  python add_user.py list")
    print("\nExamples:")
    print("  python add_user.py add ryan ryan@example.com mypassword123")
    print("  python add_user.py list")
    print("\nEnvironment Variables:")
    print("  DATABASE_PATH - Path to database file (default: database/raggr.db)")
    print("  DATABASE_URL - Full database URL (overrides DATABASE_PATH)")
    print("\n  Example with custom database:")
    print("  DATABASE_PATH=dev.db python add_user.py list")


async def main():
    if len(sys.argv) < 2:
        print_usage()
        sys.exit(1)

    command = sys.argv[1].lower()

    if command == "add":
        if len(sys.argv) != 5:
            print("Error: Missing arguments for 'add' command")
            print_usage()
            sys.exit(1)

        username = sys.argv[2]
        email = sys.argv[3]
        password = sys.argv[4]

        success = await add_user(username, email, password)
        sys.exit(0 if success else 1)

    elif command == "list":
        await list_users()
        sys.exit(0)

    else:
        print(f"Error: Unknown command '{command}'")
        print_usage()
        sys.exit(1)


if __name__ == "__main__":
    asyncio.run(main())
```
20 services/raggr/aerich_config.py Normal file
@@ -0,0 +1,20 @@
import os

# Database configuration with environment variable support
# Use DATABASE_PATH for relative paths or DATABASE_URL for full connection strings
DATABASE_PATH = os.getenv("DATABASE_PATH", "database/raggr.db")
DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite://{DATABASE_PATH}")

TORTOISE_ORM = {
    "connections": {"default": DATABASE_URL},
    "apps": {
        "models": {
            "models": [
                "blueprints.conversation.models",
                "blueprints.users.models",
                "aerich.models",
            ],
            "default_connection": "default",
        },
    },
}
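The two environment variables in aerich_config.py interact by simple precedence: an explicit DATABASE_URL wins, otherwise a sqlite URL is built from DATABASE_PATH. A minimal sketch of the same resolution logic (the function name is illustrative, not part of the repo):

```python
def resolve_database_url(env: dict) -> str:
    """Mirror the precedence in aerich_config.py: DATABASE_URL wins,
    otherwise a sqlite URL is built from DATABASE_PATH."""
    path = env.get("DATABASE_PATH", "database/raggr.db")
    return env.get("DATABASE_URL", f"sqlite://{path}")


print(resolve_database_url({}))  # sqlite://database/raggr.db
print(resolve_database_url({"DATABASE_PATH": "dev.db"}))  # sqlite://dev.db
print(resolve_database_url({"DATABASE_URL": "postgres://u:p@host:5432/db"}))
```

Note that because `os.getenv("DATABASE_URL", ...)` evaluates its default eagerly, an empty-but-set DATABASE_URL would be returned as-is; the usual convention is to leave it unset when the sqlite fallback is wanted.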
144 services/raggr/app.py Normal file
@@ -0,0 +1,144 @@
import os

from quart import Quart, jsonify, render_template, request, send_from_directory
from quart_jwt_extended import JWTManager, get_jwt_identity, jwt_refresh_token_required
from tortoise.contrib.quart import register_tortoise

import blueprints.conversation
import blueprints.conversation.logic
import blueprints.users
import blueprints.users.models
from main import consult_simba_oracle

app = Quart(
    __name__,
    static_folder="raggr-frontend/dist/static",
    template_folder="raggr-frontend/dist",
)

app.config["JWT_SECRET_KEY"] = os.getenv("JWT_SECRET_KEY", "SECRET_KEY")
jwt = JWTManager(app)

# Register blueprints
app.register_blueprint(blueprints.users.user_blueprint)
app.register_blueprint(blueprints.conversation.conversation_blueprint)


# Database configuration with environment variable support
DATABASE_URL = os.getenv(
    "DATABASE_URL", "postgres://raggr:raggr_dev_password@localhost:5432/raggr"
)

TORTOISE_CONFIG = {
    "connections": {"default": DATABASE_URL},
    "apps": {
        "models": {
            "models": [
                "blueprints.conversation.models",
                "blueprints.users.models",
                "aerich.models",
            ]
        },
    },
}

# Initialize Tortoise ORM
register_tortoise(
    app,
    config=TORTOISE_CONFIG,
    generate_schemas=False,  # Disabled - using Aerich for migrations
)


# Serve React static files
@app.route("/static/<path:filename>")
async def static_files(filename):
    return await send_from_directory(app.static_folder, filename)


# Serve the React app for all routes (catch-all)
@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
async def serve_react_app(path):
    if path and os.path.exists(os.path.join(app.template_folder, path)):
        return await send_from_directory(app.template_folder, path)
    return await render_template("index.html")


@app.route("/api/query", methods=["POST"])
@jwt_refresh_token_required
async def query():
    current_user_uuid = get_jwt_identity()
    user = await blueprints.users.models.User.get(id=current_user_uuid)
    data = await request.get_json()
    query = data.get("query")
    conversation_id = data.get("conversation_id")
    conversation = await blueprints.conversation.logic.get_conversation_by_id(
        conversation_id
    )
    await conversation.fetch_related("messages")
    await blueprints.conversation.logic.add_message_to_conversation(
        conversation=conversation,
        message=query,
        speaker="user",
        user=user,
    )

    transcript = await blueprints.conversation.logic.get_conversation_transcript(
        user=user, conversation=conversation
    )

    response = consult_simba_oracle(input=query, transcript=transcript)
    await blueprints.conversation.logic.add_message_to_conversation(
        conversation=conversation,
        message=response,
        speaker="simba",
        user=user,
    )
    return jsonify({"response": response})


@app.route("/api/messages", methods=["GET"])
@jwt_refresh_token_required
async def get_messages():
    current_user_uuid = get_jwt_identity()
    user = await blueprints.users.models.User.get(id=current_user_uuid)

    conversation = await blueprints.conversation.logic.get_conversation_for_user(
        user=user
    )
    # Prefetch related messages
    await conversation.fetch_related("messages")

    # Manually serialize the conversation with messages
    messages = []
    for msg in conversation.messages:
        messages.append(
            {
                "id": str(msg.id),
                "text": msg.text,
                "speaker": msg.speaker.value,
                "created_at": msg.created_at.isoformat(),
            }
        )

    name = conversation.name
    if len(messages) > 8:
        name = await blueprints.conversation.logic.rename_conversation(
            user=user,
            conversation=conversation,
        )

    return jsonify(
        {
            "id": str(conversation.id),
            "name": name,
            "messages": messages,
            "created_at": conversation.created_at.isoformat(),
            "updated_at": conversation.updated_at.isoformat(),
        }
    )


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080, debug=True)
1 services/raggr/blueprints/__init__.py Normal file
@@ -0,0 +1 @@
# Blueprints package
83 services/raggr/blueprints/conversation/__init__.py Normal file
@@ -0,0 +1,83 @@
import datetime

from quart import Blueprint, jsonify
from quart_jwt_extended import (
    get_jwt_identity,
    jwt_refresh_token_required,
)

import blueprints.users.models

from .logic import rename_conversation
from .models import (
    Conversation,
    PydConversation,
    PydListConversation,
)

conversation_blueprint = Blueprint(
    "conversation_api", __name__, url_prefix="/api/conversation"
)


@conversation_blueprint.route("/<conversation_id>")
@jwt_refresh_token_required
async def get_conversation(conversation_id: str):
    conversation = await Conversation.get(id=conversation_id)
    current_user_uuid = get_jwt_identity()
    user = await blueprints.users.models.User.get(id=current_user_uuid)
    await conversation.fetch_related("messages")

    # Manually serialize the conversation with messages
    messages = []
    for msg in conversation.messages:
        messages.append(
            {
                "id": str(msg.id),
                "text": msg.text,
                "speaker": msg.speaker.value,
                "created_at": msg.created_at.isoformat(),
            }
        )
    name = conversation.name
    if len(messages) > 8 and "datetime" in name.lower():
        name = await rename_conversation(
            user=user,
            conversation=conversation,
        )
    print(name)

    return jsonify(
        {
            "id": str(conversation.id),
            "name": name,
            "messages": messages,
            "created_at": conversation.created_at.isoformat(),
            "updated_at": conversation.updated_at.isoformat(),
        }
    )


@conversation_blueprint.post("/")
@jwt_refresh_token_required
async def create_conversation():
    user_uuid = get_jwt_identity()
    user = await blueprints.users.models.User.get(id=user_uuid)
    conversation = await Conversation.create(
        name=f"{user.username} {datetime.datetime.now().timestamp()}",
        user=user,
    )

    serialized_conversation = await PydConversation.from_tortoise_orm(conversation)
    return jsonify(serialized_conversation.model_dump())


@conversation_blueprint.get("/")
@jwt_refresh_token_required
async def get_all_conversations():
    user_uuid = get_jwt_identity()
    user = await blueprints.users.models.User.get(id=user_uuid)
    conversations = Conversation.filter(user=user)
    serialized_conversations = await PydListConversation.from_queryset(conversations)

    return jsonify(serialized_conversations.model_dump())
80 services/raggr/blueprints/conversation/logic.py Normal file
@@ -0,0 +1,80 @@
import tortoise.exceptions
from langchain_openai import ChatOpenAI

import blueprints.users.models

from .models import Conversation, ConversationMessage, RenameConversationOutputSchema


async def create_conversation(name: str = "") -> Conversation:
    conversation = await Conversation.create(name=name)
    return conversation


async def add_message_to_conversation(
    conversation: Conversation,
    message: str,
    speaker: str,
    user: blueprints.users.models.User,
) -> ConversationMessage:
    print(conversation, message, speaker)
    message = await ConversationMessage.create(
        text=message,
        speaker=speaker,
        conversation=conversation,
    )

    return message


async def get_the_only_conversation() -> Conversation:
    try:
        conversation = await Conversation.all().first()
        if conversation is None:
            conversation = await Conversation.create(name="simba_chat")
    except Exception as _e:
        conversation = await Conversation.create(name="simba_chat")

    return conversation


async def get_conversation_for_user(user: blueprints.users.models.User) -> Conversation:
    try:
        return await Conversation.get(user=user)
    except tortoise.exceptions.DoesNotExist:
        await Conversation.get_or_create(name=f"{user.username}'s chat", user=user)

    return await Conversation.get(user=user)


async def get_conversation_by_id(id: str) -> Conversation:
    return await Conversation.get(id=id)


async def get_conversation_transcript(
    user: blueprints.users.models.User, conversation: Conversation
) -> str:
    messages = []
    for message in conversation.messages:
        messages.append(f"{message.speaker} at {message.created_at}: {message.text}")

    return "\n".join(messages)


async def rename_conversation(
    user: blueprints.users.models.User,
    conversation: Conversation,
) -> str:
    messages: str = await get_conversation_transcript(
        user=user, conversation=conversation
    )

    llm = ChatOpenAI(model="gpt-4o-mini")
    structured_llm = llm.with_structured_output(RenameConversationOutputSchema)

    prompt = f"Summarize the following conversation into a sassy one-liner title:\n\n{messages}"
    response = structured_llm.invoke(prompt)
    new_name: str = response.get("title")
    conversation.name = new_name
    await conversation.save()
    return new_name
61 services/raggr/blueprints/conversation/models.py Normal file
@@ -0,0 +1,61 @@
import enum
from dataclasses import dataclass

from tortoise import fields
from tortoise.contrib.pydantic import (
    pydantic_model_creator,
    pydantic_queryset_creator,
)
from tortoise.models import Model


@dataclass
class RenameConversationOutputSchema:
    title: str
    justification: str


class Speaker(enum.Enum):
    USER = "user"
    SIMBA = "simba"


class Conversation(Model):
    id = fields.UUIDField(primary_key=True)
    name = fields.CharField(max_length=255)
    created_at = fields.DatetimeField(auto_now_add=True)
    updated_at = fields.DatetimeField(auto_now=True)
    user: fields.ForeignKeyRelation = fields.ForeignKeyField(
        "models.User", related_name="conversations", null=True
    )

    class Meta:
        table = "conversations"


class ConversationMessage(Model):
    id = fields.UUIDField(primary_key=True)
    text = fields.TextField()
    conversation = fields.ForeignKeyField(
        "models.Conversation", related_name="messages"
    )
    created_at = fields.DatetimeField(auto_now_add=True)
    speaker = fields.CharEnumField(enum_type=Speaker, max_length=10)

    class Meta:
        table = "conversation_messages"


PydConversationMessage = pydantic_model_creator(ConversationMessage)
PydConversation = pydantic_model_creator(
    Conversation, name="Conversation", allow_cycles=True, exclude=("user",)
)
PydConversationWithMessages = pydantic_model_creator(
    Conversation,
    name="ConversationWithMessages",
    allow_cycles=True,
    exclude=("user",),
    include=("messages",),
)
PydListConversation = pydantic_queryset_creator(Conversation)
PydListConversationMessage = pydantic_queryset_creator(ConversationMessage)
180 services/raggr/blueprints/users/__init__.py Normal file
@@ -0,0 +1,180 @@
import base64
import hashlib
import secrets
from urllib.parse import urlencode

import httpx
from quart import Blueprint, jsonify, request
from quart_jwt_extended import (
    create_access_token,
    create_refresh_token,
    jwt_refresh_token_required,
    get_jwt_identity,
)

from oidc_config import oidc_config

from .models import User
from .oidc_service import OIDCUserService


user_blueprint = Blueprint("user_api", __name__, url_prefix="/api/user")

# In-memory storage for OIDC state/PKCE (production: use Redis or database)
# Format: {state: {"pkce_verifier": str, "redirect_after_login": str}}
_oidc_sessions = {}


@user_blueprint.route("/oidc/login", methods=["GET"])
async def oidc_login():
    """
    Initiate OIDC login flow

    Generates PKCE parameters and redirects to Authelia
    """
    if not oidc_config.validate_config():
        return jsonify({"error": "OIDC not configured"}), 500

    try:
        # Generate PKCE parameters
        code_verifier = secrets.token_urlsafe(64)

        # For PKCE, we need code_challenge = BASE64URL(SHA256(code_verifier))
        code_challenge = (
            base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
            .decode()
            .rstrip("=")
        )

        # Generate state for CSRF protection
        state = secrets.token_urlsafe(32)

        # Store PKCE verifier and state for callback validation
        _oidc_sessions[state] = {
            "pkce_verifier": code_verifier,
            "redirect_after_login": request.args.get("redirect", "/"),
        }

        # Get authorization endpoint from discovery
        discovery = await oidc_config.get_discovery_document()
        auth_endpoint = discovery.get("authorization_endpoint")

        # Build authorization URL
        params = {
            "client_id": oidc_config.client_id,
            "response_type": "code",
            "redirect_uri": oidc_config.redirect_uri,
            "scope": "openid email profile",
            "state": state,
            "code_challenge": code_challenge,
            "code_challenge_method": "S256",
        }

        auth_url = f"{auth_endpoint}?{urlencode(params)}"

        return jsonify({"auth_url": auth_url})
    except Exception as e:
        return jsonify({"error": f"OIDC login failed: {str(e)}"}), 500


@user_blueprint.route("/oidc/callback", methods=["GET"])
async def oidc_callback():
    """
    Handle OIDC callback from Authelia

    Exchanges authorization code for tokens, verifies ID token, and creates/updates user
    """
    # Get authorization code and state from callback
    code = request.args.get("code")
    state = request.args.get("state")
    error = request.args.get("error")

    if error:
        return jsonify({"error": f"OIDC error: {error}"}), 400

    if not code or not state:
        return jsonify({"error": "Missing code or state"}), 400

    # Validate state and retrieve PKCE verifier
    session = _oidc_sessions.pop(state, None)
    if not session:
        return jsonify({"error": "Invalid or expired state"}), 400

    pkce_verifier = session["pkce_verifier"]

    # Exchange authorization code for tokens
    discovery = await oidc_config.get_discovery_document()
    token_endpoint = discovery.get("token_endpoint")

    token_data = {
        "grant_type": "authorization_code",
        "code": code,
        "redirect_uri": oidc_config.redirect_uri,
        "client_id": oidc_config.client_id,
        "client_secret": oidc_config.client_secret,
        "code_verifier": pkce_verifier,
    }

    # Use client_secret_post method (credentials in POST body)
    async with httpx.AsyncClient() as client:
        token_response = await client.post(token_endpoint, data=token_data)

    if token_response.status_code != 200:
        return jsonify(
            {"error": f"Failed to exchange code for token: {token_response.text}"}
        ), 400

    tokens = token_response.json()

    id_token = tokens.get("id_token")
    if not id_token:
        return jsonify({"error": "No ID token received"}), 400

    # Verify ID token
    try:
        claims = await oidc_config.verify_id_token(id_token)
    except Exception as e:
        return jsonify({"error": f"ID token verification failed: {str(e)}"}), 400

    # Get or create user from OIDC claims
    user = await OIDCUserService.get_or_create_user_from_oidc(claims)

    # Issue backend JWT tokens
    access_token = create_access_token(identity=str(user.id))
    refresh_token = create_refresh_token(identity=str(user.id))

    # Return tokens to frontend
    # Frontend will handle storing these and redirecting
    return jsonify(
        access_token=access_token,
        refresh_token=refresh_token,
        user={"id": str(user.id), "username": user.username, "email": user.email},
    )


@user_blueprint.route("/refresh", methods=["POST"])
@jwt_refresh_token_required
async def refresh():
    """Refresh access token (unchanged from original)"""
    user_id = get_jwt_identity()
    new_token = create_access_token(identity=user_id)
    return jsonify(access_token=new_token)


# Legacy username/password login - kept for backward compatibility during migration
@user_blueprint.route("/login", methods=["POST"])
async def login():
    """
    Legacy username/password login

    This can be removed after full OIDC migration is complete
    """
    data = await request.get_json()
    username = data.get("username")
    password = data.get("password")

    user = await User.filter(username=username).first()

    if not user or not user.verify_password(password):
        return jsonify({"msg": "Invalid credentials"}), 401

    access_token = create_access_token(identity=str(user.id))
    refresh_token = create_refresh_token(identity=str(user.id))

    return jsonify(
        access_token=access_token,
        refresh_token=refresh_token,
        user={"id": str(user.id), "username": user.username},
    )
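The PKCE exchange in oidc_login/oidc_callback can be sketched in isolation. This minimal standard-library example (function names are illustrative) shows how the S256 code_challenge sent in the authorization request is derived from the verifier that is later posted to the token endpoint, and how a provider would check it:

```python
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    """Return (code_verifier, code_challenge) per RFC 7636 S256."""
    verifier = secrets.token_urlsafe(64)
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    # Base64url-encode without padding, as required for code_challenge
    challenge = base64.urlsafe_b64encode(digest).decode("ascii").rstrip("=")
    return verifier, challenge


def verify_pkce(verifier: str, challenge: str) -> bool:
    """What the provider does at the token endpoint: recompute and compare."""
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    expected = base64.urlsafe_b64encode(digest).decode("ascii").rstrip("=")
    return secrets.compare_digest(expected, challenge)


verifier, challenge = make_pkce_pair()
print(verify_pkce(verifier, challenge))  # True
```

Since only the challenge travels in the (potentially logged) authorization URL, an attacker who intercepts the code still cannot redeem it without the verifier held in `_oidc_sessions`.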
33 services/raggr/blueprints/users/models.py Normal file
@@ -0,0 +1,33 @@
from tortoise.models import Model
from tortoise import fields

import bcrypt


class User(Model):
    id = fields.UUIDField(primary_key=True)
    username = fields.CharField(max_length=255)
    password = fields.BinaryField(null=True)  # Hashed - nullable for OIDC users
    email = fields.CharField(max_length=100, unique=True)

    # OIDC fields
    oidc_subject = fields.CharField(
        max_length=255, unique=True, null=True, index=True
    )  # "sub" claim from OIDC
    auth_provider = fields.CharField(max_length=50, default="local")  # "local" or "oidc"

    created_at = fields.DatetimeField(auto_now_add=True)
    updated_at = fields.DatetimeField(auto_now=True)

    class Meta:
        table = "users"

    def set_password(self, plain_password: str):
        self.password = bcrypt.hashpw(
            plain_password.encode("utf-8"),
            bcrypt.gensalt(),
        )

    def verify_password(self, plain_password: str):
        if not self.password:
            return False
        return bcrypt.checkpw(plain_password.encode("utf-8"), self.password)
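The set_password/verify_password pair above relies on bcrypt, which bundles the salt into the stored value. The same salted-hash-then-compare pattern can be sketched with only the standard library; PBKDF2 stands in for bcrypt here purely as an illustration, not a drop-in replacement:

```python
import hashlib
import hmac
import os


def hash_password(plain: str) -> bytes:
    """Salted PBKDF2 hash; the 16-byte salt is stored alongside the digest."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", plain.encode("utf-8"), salt, 100_000)
    return salt + digest


def check_password(plain: str, stored: bytes) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.pbkdf2_hmac("sha256", plain.encode("utf-8"), salt, 100_000)
    return hmac.compare_digest(candidate, digest)


stored = hash_password("s3cret")
print(check_password("s3cret", stored))  # True
print(check_password("nope", stored))  # False
```

As in the model, a user without a stored hash (the OIDC case, `password=None`) should short-circuit to False before any comparison.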
76 services/raggr/blueprints/users/oidc_service.py Normal file
@@ -0,0 +1,76 @@
"""
OIDC User Management Service
"""
from typing import Dict, Any, Optional
from uuid import uuid4
from .models import User


class OIDCUserService:
    """Service for managing OIDC user authentication and provisioning"""

    @staticmethod
    async def get_or_create_user_from_oidc(claims: Dict[str, Any]) -> User:
        """
        Get existing user by OIDC subject, or create new user from OIDC claims

        Args:
            claims: Decoded OIDC ID token claims

        Returns:
            User object (existing or newly created)
        """
        oidc_subject = claims.get("sub")
        if not oidc_subject:
            raise ValueError("No 'sub' claim in ID token")

        # Try to find existing user by OIDC subject
        user = await User.filter(oidc_subject=oidc_subject).first()

        if user:
            # Update user info from latest claims (optional)
            user.email = claims.get("email", user.email)
            user.username = (
                claims.get("preferred_username")
                or claims.get("name")
                or user.username
            )
            await user.save()
            return user

        # Check if user exists by email (migration case)
        email = claims.get("email")
        if email:
            user = await User.filter(email=email, auth_provider="local").first()
            if user:
                # Migrate existing local user to OIDC
                user.oidc_subject = oidc_subject
                user.auth_provider = "oidc"
                user.password = None  # Clear password
                await user.save()
                return user

        # Create new user from OIDC claims
        username = (
            claims.get("preferred_username")
            or claims.get("name")
            or claims.get("email", "").split("@")[0]
            or f"user_{oidc_subject[:8]}"
        )

        user = await User.create(
            id=uuid4(),
            username=username,
            email=email or f"{oidc_subject}@oidc.local",  # Fallback if no email claim
            oidc_subject=oidc_subject,
            auth_provider="oidc",
            password=None,
        )

        return user

    @staticmethod
    async def find_user_by_oidc_subject(oidc_subject: str) -> Optional[User]:
        """Find user by OIDC subject ID"""
        return await User.filter(oidc_subject=oidc_subject).first()
@@ -14,7 +14,7 @@ from llm import LLMClient
 load_dotenv()
 
 ollama_client = Client(
-    host=os.getenv("OLLAMA_HOST", "http://localhost:11434"), timeout=10.0
+    host=os.getenv("OLLAMA_HOST", "http://localhost:11434"), timeout=1.0
 )
 
 
@@ -27,7 +27,7 @@ headers = {"x-api-key": API_KEY, "Content-Type": "application/json"}
 VISITED = {}
 
 if __name__ == "__main__":
-    conn = sqlite3.connect("./visited.db")
+    conn = sqlite3.connect("./database/visited.db")
     c = conn.cursor()
     c.execute("select immich_id from visited")
     rows = c.fetchall()
73 services/raggr/llm.py Normal file
@@ -0,0 +1,73 @@
import os

from ollama import Client
from openai import OpenAI

import logging
from dotenv import load_dotenv

load_dotenv()

logging.basicConfig(level=logging.INFO)

TRY_OLLAMA = os.getenv("TRY_OLLAMA", False)


class LLMClient:
    def __init__(self):
        try:
            self.ollama_client = Client(
                host=os.getenv("OLLAMA_URL", "http://localhost:11434"), timeout=1.0
            )
            self.ollama_client.chat(
                model="gemma3:4b", messages=[{"role": "system", "content": "test"}]
            )
            self.PROVIDER = "ollama"
            logging.info("Using Ollama as LLM backend")
        except Exception as e:
            print(e)
            self.openai_client = OpenAI()
            self.PROVIDER = "openai"
            logging.info("Using OpenAI as LLM backend")

    def chat(
        self,
        prompt: str,
        system_prompt: str,
    ):
        # Instituting a fallback if my gaming PC is not on
        if self.PROVIDER == "ollama":
            try:
                response = self.ollama_client.chat(
                    model="gemma3:4b",
                    messages=[
                        {
                            "role": "system",
                            "content": system_prompt,
                        },
                        {"role": "user", "content": prompt},
                    ],
                )
                output = response.message.content
                return output
            except Exception as e:
                logging.error(f"Could not connect to OLLAMA: {str(e)}")

        # Fall back to OpenAI; create the client lazily if Ollama was chosen at init
        if not hasattr(self, "openai_client"):
            self.openai_client = OpenAI()

        response = self.openai_client.responses.create(
            model="gpt-4o-mini",
            input=[
                {
                    "role": "system",
                    "content": system_prompt,
                },
                {"role": "user", "content": prompt},
            ],
        )
        output = response.output_text

        return output


if __name__ == "__main__":
    client = Client()
    client.chat(model="gemma3:4b", messages=[{"role": "system", "content": "hack"}])
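The try-primary-then-fall-back selection in LLMClient can be isolated as a small reusable pattern. This sketch uses plain callables in place of the Ollama and OpenAI clients (all names here are hypothetical, for illustration only):

```python
from typing import Callable


class FallbackChat:
    """Try each backend in order; the first one that answers wins."""

    def __init__(self, backends: list[tuple[str, Callable[[str], str]]]):
        self.backends = backends

    def chat(self, prompt: str) -> str:
        last_error: Exception | None = None
        for name, send in self.backends:
            try:
                return send(prompt)
            except Exception as e:  # any backend failure triggers the next one
                last_error = e
        raise RuntimeError(f"All backends failed: {last_error}")


def flaky(prompt: str) -> str:
    # Stands in for the Ollama client when the gaming PC is off
    raise ConnectionError("host unreachable")


def reliable(prompt: str) -> str:
    # Stands in for the OpenAI client
    return f"echo: {prompt}"


client = FallbackChat([("ollama", flaky), ("openai", reliable)])
print(client.chat("hi"))  # echo: hi
```

Checking reachability per call, as here, avoids the pitfall of probing only once in `__init__`: a backend that was up at startup can still go down later.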
@@ -7,6 +7,8 @@ import argparse
 import chromadb
 import ollama
 
+import time
+
 from request import PaperlessNGXService
 from chunker import Chunker
@@ -36,6 +38,7 @@ parser.add_argument("query", type=str, help="questions about simba's health")
 parser.add_argument(
     "--reindex", action="store_true", help="re-index the simba documents"
 )
+parser.add_argument("--classify", action="store_true", help="test classification")
 parser.add_argument("--index", help="index a file")
 
 ppngx = PaperlessNGXService()
@@ -77,7 +80,7 @@ def chunk_data(docs, collection, doctypes):
 
     logging.info(f"chunking {len(docs)} documents")
     texts: list[str] = [doc["content"] for doc in docs]
-    with sqlite3.connect("visited.db") as conn:
+    with sqlite3.connect("database/visited.db") as conn:
         to_insert = []
         c = conn.cursor()
         for index, text in enumerate(texts):
@@ -113,9 +116,22 @@ def chunk_text(texts: list[str], collection):
 )
 
 
-def consult_oracle(input: str, collection):
-    import time
+def classify_query(query: str, transcript: str) -> bool:
+    logging.info("Starting query generation")
+    qg_start = time.time()
+    qg = QueryGenerator()
+    query_type = qg.get_query_type(input=query, transcript=transcript)
+    logging.info(query_type)
+    qg_end = time.time()
+    logging.info(f"Query generation took {qg_end - qg_start:.2f} seconds")
+    return query_type == "Simba"
+
+
+def consult_oracle(
+    input: str,
+    collection,
+    transcript: str = "",
+):
     chunker = Chunker(collection)
 
     start_time = time.time()
@@ -153,7 +169,10 @@ def consult_oracle(input: str, collection):
     logging.info("Starting LLM generation")
     llm_start = time.time()
     system_prompt = "You are a helpful assistant that understands veterinary terms."
-    prompt = f"Using the following data, help answer the user's query by providing as many details as possible. Using this data: {results}. Respond to this prompt: {input}"
+    transcript_prompt = f"Here is the message transcript thus far {transcript}."
+    prompt = f"""Using the following data, help answer the user's query by providing as many details as possible.
+    Using this data: {results}. {transcript_prompt if len(transcript) > 0 else ""}
+    Respond to this prompt: {input}"""
     output = llm_client.chat(prompt=prompt, system_prompt=system_prompt)
     llm_end = time.time()
     logging.info(f"LLM generation took {llm_end - llm_start:.2f} seconds")
@@ -164,6 +183,16 @@ def consult_oracle(input: str, collection):
     return output
 
 
+def llm_chat(input: str, transcript: str = "") -> str:
+    system_prompt = "You are a helpful assistant that understands veterinary terms."
+    transcript_prompt = f"Here is the message transcript thus far {transcript}."
+    prompt = f"""Answer the user as if you were a cat named Simba. Don't act too catlike. Be assertive.
+    {transcript_prompt if len(transcript) > 0 else ""}
+    Respond to this prompt: {input}"""
+    output = llm_client.chat(prompt=prompt, system_prompt=system_prompt)
+    return output
+
+
 def paperless_workflow(input):
     # Step 1: Get the text
     ppngx = PaperlessNGXService()
@@ -173,15 +202,24 @@ def paperless_workflow(input):
     consult_oracle(input, simba_docs)
 
 
-def consult_simba_oracle(input: str):
+def consult_simba_oracle(input: str, transcript: str = ""):
+    is_simba_related = classify_query(query=input, transcript=transcript)
+
if is_simba_related:
|
||||||
|
logging.info("Query is related to simba")
|
||||||
return consult_oracle(
|
return consult_oracle(
|
||||||
input=input,
|
input=input,
|
||||||
collection=simba_docs,
|
collection=simba_docs,
|
||||||
|
transcript=transcript,
|
||||||
)
|
)
|
||||||
|
|
||||||
|
logging.info("Query is NOT related to simba")
|
||||||
|
|
||||||
|
return llm_chat(input=input, transcript=transcript)
|
||||||
|
|
||||||
|
|
||||||
def filter_indexed_files(docs):
|
def filter_indexed_files(docs):
|
||||||
with sqlite3.connect("visited.db") as conn:
|
with sqlite3.connect("database/visited.db") as conn:
|
||||||
c = conn.cursor()
|
c = conn.cursor()
|
||||||
c.execute(
|
c.execute(
|
||||||
"CREATE TABLE IF NOT EXISTS indexed_documents (id INTEGER PRIMARY KEY AUTOINCREMENT, paperless_id INTEGER)"
|
"CREATE TABLE IF NOT EXISTS indexed_documents (id INTEGER PRIMARY KEY AUTOINCREMENT, paperless_id INTEGER)"
|
||||||
@@ -194,12 +232,16 @@ def filter_indexed_files(docs):
|
|||||||
return [doc for doc in docs if doc["id"] not in visited]
|
return [doc for doc in docs if doc["id"] not in visited]
|
||||||
|
|
||||||
|
|
||||||
if __name__ == "__main__":
|
def reindex():
|
||||||
args = parser.parse_args()
|
with sqlite3.connect("database/visited.db") as conn:
|
||||||
if args.reindex:
|
|
||||||
with sqlite3.connect("./visited.db") as conn:
|
|
||||||
c = conn.cursor()
|
c = conn.cursor()
|
||||||
c.execute("DELETE FROM indexed_documents")
|
c.execute("DELETE FROM indexed_documents")
|
||||||
|
conn.commit()
|
||||||
|
|
||||||
|
# Delete all documents from the collection
|
||||||
|
all_docs = simba_docs.get()
|
||||||
|
if all_docs["ids"]:
|
||||||
|
simba_docs.delete(ids=all_docs["ids"])
|
||||||
|
|
||||||
logging.info("Fetching documents from Paperless-NGX")
|
logging.info("Fetching documents from Paperless-NGX")
|
||||||
ppngx = PaperlessNGXService()
|
ppngx = PaperlessNGXService()
|
||||||
@@ -215,21 +257,20 @@ if __name__ == "__main__":
|
|||||||
|
|
||||||
# Chunk documents
|
# Chunk documents
|
||||||
logging.info("Chunking documents now ...")
|
logging.info("Chunking documents now ...")
|
||||||
tag_lookup = ppngx.get_tags()
|
|
||||||
doctype_lookup = ppngx.get_doctypes()
|
doctype_lookup = ppngx.get_doctypes()
|
||||||
chunk_data(docs, collection=simba_docs, doctypes=doctype_lookup)
|
chunk_data(docs, collection=simba_docs, doctypes=doctype_lookup)
|
||||||
logging.info("Done chunking documents")
|
logging.info("Done chunking documents")
|
||||||
|
|
||||||
# if args.index:
|
|
||||||
# with open(args.index) as file:
|
if __name__ == "__main__":
|
||||||
# extension = args.index.split(".")[-1]
|
args = parser.parse_args()
|
||||||
# if extension == "pdf":
|
if args.reindex:
|
||||||
# pdf_path = ppngx.download_pdf_from_id(id=document_id)
|
reindex()
|
||||||
# image_paths = pdf_to_image(filepath=pdf_path)
|
|
||||||
# print(f"summarizing {file}")
|
if args.classify:
|
||||||
# generated_summary = summarize_pdf_image(filepaths=image_paths)
|
consult_simba_oracle(input="yohohoho testing")
|
||||||
# elif extension in [".md", ".txt"]:
|
consult_simba_oracle(input="write an email")
|
||||||
# chunk_text(texts=[file.readall()], collection=simba_docs)
|
consult_simba_oracle(input="how much does simba weigh")
|
||||||
|
|
||||||
if args.query:
|
if args.query:
|
||||||
logging.info("Consulting oracle ...")
|
logging.info("Consulting oracle ...")
|
||||||
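The new routing in consult_simba_oracle can be sketched offline. In this minimal sketch the keyword check is a hypothetical stand-in for the LLM-backed classify_query, and the string return values stand in for the consult_oracle / llm_chat calls:

```python
def classify_query(query: str, transcript: str = "") -> bool:
    # Stand-in for the LLM classifier: flag queries that mention
    # the cat by name as "Simba" queries.
    return "simba" in f"{transcript} {query}".lower()


def consult_simba_oracle(query: str, transcript: str = "") -> str:
    # Simba-related queries go through RAG (consult_oracle against the
    # collection); everything else falls back to the plain llm_chat persona.
    if classify_query(query, transcript):
        return "rag"
    return "chat"
```

This mirrors the control flow above: one classification call, then exactly one of the two generation paths.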
71  services/raggr/migrations/models/0_20251225052005_init.py  Normal file
@@ -0,0 +1,71 @@
+from tortoise import BaseDBAsyncClient
+
+RUN_IN_TRANSACTION = True
+
+
+async def upgrade(db: BaseDBAsyncClient) -> str:
+    return """
+        CREATE TABLE IF NOT EXISTS "users" (
+            "id" UUID NOT NULL PRIMARY KEY,
+            "username" VARCHAR(255) NOT NULL,
+            "password" BYTEA,
+            "email" VARCHAR(100) NOT NULL UNIQUE,
+            "oidc_subject" VARCHAR(255) UNIQUE,
+            "auth_provider" VARCHAR(50) NOT NULL DEFAULT 'local',
+            "created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
+            "updated_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP
+        );
+        CREATE INDEX IF NOT EXISTS "idx_users_oidc_su_5aec5a" ON "users" ("oidc_subject");
+        CREATE TABLE IF NOT EXISTS "conversations" (
+            "id" UUID NOT NULL PRIMARY KEY,
+            "name" VARCHAR(255) NOT NULL,
+            "created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
+            "updated_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
+            "user_id" UUID REFERENCES "users" ("id") ON DELETE CASCADE
+        );
+        CREATE TABLE IF NOT EXISTS "conversation_messages" (
+            "id" UUID NOT NULL PRIMARY KEY,
+            "text" TEXT NOT NULL,
+            "created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
+            "speaker" VARCHAR(10) NOT NULL,
+            "conversation_id" UUID NOT NULL REFERENCES "conversations" ("id") ON DELETE CASCADE
+        );
+        COMMENT ON COLUMN "conversation_messages"."speaker" IS 'USER: user\nSIMBA: simba';
+        CREATE TABLE IF NOT EXISTS "aerich" (
+            "id" SERIAL NOT NULL PRIMARY KEY,
+            "version" VARCHAR(255) NOT NULL,
+            "app" VARCHAR(100) NOT NULL,
+            "content" JSONB NOT NULL
+        );"""
+
+
+async def downgrade(db: BaseDBAsyncClient) -> str:
+    return """
+    """
+
+
+MODELS_STATE = (
+    "eJztmmtP4zgUhv9KlE+MxCLoUGaEViulpex0Z9qO2nR3LjuK3MRtvSROJnYGKsR/X9u5J0"
+    "56AUqL+gXosU9sPz7OeY/Lveq4FrTJSdvFv6BPAEUuVi+VexUDB7I/pO3Higo8L23lBgom"
+    "tnAwMz1FC5gQ6gOTssYpsAlkJgsS00deNBgObJsbXZN1RHiWmgKMfgbQoO4M0jn0WcP3H8"
+    "yMsAXvIIk/ejfGFEHbys0bWXxsYTfowhO28bh7dS168uEmhunagYPT3t6Czl2cdA8CZJ1w"
+    "H942gxj6gEIrsww+y2jZsSmcMTNQP4DJVK3UYMEpCGwOQ/19GmCTM1DESPzH+R/qGngYao"
+    "4WYcpZ3D+Eq0rXLKwqH6r9QRsevb14I1bpEjrzRaMgoj4IR0BB6Cq4piDF7xLK9hz4cpRx"
+    "/wJMNtFNMMaGlGMaQzHIGNBm1FQH3Bk2xDM6Zx8bzWYNxr+1oSDJegmULovrMOr7UVMjbO"
+    "NIU4SmD/mSDUDLIK9YC0UOlMPMexaQWpHrSfzHjgJma7AG2F5Eh6CGr97tdUa61vvMV+IQ"
+    "8tMWiDS9w1sawrooWI8uCluRPET5p6t/UPhH5dug3ynGftJP/6byOYGAugZ2bw1gZc5rbI"
+    "3B5DY28KwNNzbvedjYF93YaPKZfSXQN9bLIBmXR6SRaG5b3MTNkwZPvdMbac7gMMrwrl0f"
+    "ohn+CBcCYZfNA2BTliwi0TGOHrOr0FJrOgsf3CZqJBsUbHVsTZCG2VMbtbWrjioYToB5cw"
+    "t8y6iA6UBCwAySMtBW5Hn9cQjtRJrJWWYFXC984m6+VarYClZuw80wytErNzkNp2gBmK3b"
+    "isbmI9XQWaKCMxBXE8NGdiMPonivRTGFd5KUrzOrHGXcf19EcV0q73zRc1k8lr5HPe3Lm1"
+    "wm/zTo/xl3z0jl9qdB66CQX6OQKitk4kFwIxMDvIDs4MApSYHc7mbcX/joqONRZ3ip8Iz+"
+    "Lx51ey3tUiHImQB1tS3OVZlnpysUmWenlTUmbyocoGyiWe81L3F9ynf+nkpYs3Dh9UgpW7"
+    "w/21mKSzWtJFzW1bbPqeREzSCRbnEtUa3V+NE+aLP912Z8H9e9tMz67ItG28LFpQcIuXV9"
+    "SWS2EAb+Qg4z61WAOVnQsP7Z1ZJeBq/F9WpWbjFkrW5fG36VS964fzZuW1/1jlagCx2A7H"
+    "WiNHF4mhBdfuKfMkDPTlcTPXWqpyR7XGSZBgkm/0FTUjlUkyz6bQS0GKTb5fksB55p+bnh"
+    "+e4vZFWJdjnQkuP23qKq7ZrAfkQaynNtrhKmzeoobZa1+aG4fZ3F7eHrn1exscntcqlIWX"
+    "Y1X/pfh6e5n99lgbTde3kN+sicq5J6Lmo5rqvoQNpnZ0q6Lq64IpZWdBxzIRiinX9RYSe+"
+    "HfmtcXb+7vz924vz96yLmElieVfzMuj29SUVHD8I0muXav2RcTnUb6mcY0djHREXdt9PgM"
+    "9SX7ARKcSS9P7XaNCvvE6NXQogx5gt8LuFTHqs2IjQH7uJtYYiX3X9dz/Fr3kKuZk/oCW7"
+    "eN3mZeHD/9BpOYI="
+)
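The ON DELETE CASCADE clauses in the migration above mean that deleting a user removes their conversations (and, transitively, their messages). The behavior can be illustrated with an in-memory SQLite database standing in for Postgres (types simplified, and SQLite requires foreign keys to be switched on per connection):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite disables FK enforcement by default
conn.execute('CREATE TABLE users ("id" TEXT PRIMARY KEY)')
conn.execute(
    'CREATE TABLE conversations ('
    '"id" TEXT PRIMARY KEY, '
    '"user_id" TEXT REFERENCES users ("id") ON DELETE CASCADE)'
)
conn.execute("INSERT INTO users VALUES ('u1')")
conn.execute("INSERT INTO conversations VALUES ('c1', 'u1')")

# Deleting the user cascades to the conversations row.
conn.execute("DELETE FROM users WHERE id = 'u1'")
remaining = conn.execute("SELECT COUNT(*) FROM conversations").fetchone()[0]
```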
113  services/raggr/oidc_config.py  Normal file
@@ -0,0 +1,113 @@
+"""
+OIDC Configuration for Authelia Integration
+"""
+import os
+from typing import Dict, Any
+from authlib.jose import jwt
+from authlib.jose.errors import JoseError
+import httpx
+
+
+class OIDCConfig:
+    """OIDC Configuration Manager"""
+
+    def __init__(self):
+        # Load from environment variables
+        self.issuer = os.getenv("OIDC_ISSUER")  # e.g., https://auth.example.com
+        self.client_id = os.getenv("OIDC_CLIENT_ID")
+        self.client_secret = os.getenv("OIDC_CLIENT_SECRET")
+        self.redirect_uri = os.getenv(
+            "OIDC_REDIRECT_URI", "http://localhost:8080/api/user/oidc/callback"
+        )
+
+        # OIDC endpoints (can use discovery or manual config)
+        self.use_discovery = os.getenv("OIDC_USE_DISCOVERY", "true").lower() == "true"
+
+        # Manual endpoint configuration (fallback if discovery fails)
+        self.authorization_endpoint = os.getenv("OIDC_AUTHORIZATION_ENDPOINT")
+        self.token_endpoint = os.getenv("OIDC_TOKEN_ENDPOINT")
+        self.userinfo_endpoint = os.getenv("OIDC_USERINFO_ENDPOINT")
+        self.jwks_uri = os.getenv("OIDC_JWKS_URI")
+
+        # Cached discovery document and JWKS
+        self._discovery_doc: Dict[str, Any] | None = None
+        self._jwks: Dict[str, Any] | None = None
+
+    def validate_config(self) -> bool:
+        """Validate that required configuration is present"""
+        if not self.issuer or not self.client_id or not self.client_secret:
+            return False
+        return True
+
+    async def get_discovery_document(self) -> Dict[str, Any]:
+        """Fetch OIDC discovery document from .well-known endpoint"""
+        if self._discovery_doc:
+            return self._discovery_doc
+
+        if not self.use_discovery:
+            # Return manual configuration
+            return {
+                "issuer": self.issuer,
+                "authorization_endpoint": self.authorization_endpoint,
+                "token_endpoint": self.token_endpoint,
+                "userinfo_endpoint": self.userinfo_endpoint,
+                "jwks_uri": self.jwks_uri,
+            }
+
+        discovery_url = f"{self.issuer.rstrip('/')}/.well-known/openid-configuration"
+
+        async with httpx.AsyncClient() as client:
+            response = await client.get(discovery_url)
+            response.raise_for_status()
+            self._discovery_doc = response.json()
+            return self._discovery_doc
+
+    async def get_jwks(self) -> Dict[str, Any]:
+        """Fetch JSON Web Key Set for token verification"""
+        if self._jwks:
+            return self._jwks
+
+        discovery = await self.get_discovery_document()
+        jwks_uri = discovery.get("jwks_uri")
+
+        if not jwks_uri:
+            raise ValueError("No jwks_uri found in discovery document")
+
+        async with httpx.AsyncClient() as client:
+            response = await client.get(jwks_uri)
+            response.raise_for_status()
+            self._jwks = response.json()
+            return self._jwks
+
+    async def verify_id_token(self, id_token: str) -> Dict[str, Any]:
+        """
+        Verify and decode ID token from OIDC provider
+
+        Returns the decoded claims if valid
+        Raises exception if invalid
+        """
+        jwks = await self.get_jwks()
+
+        try:
+            # Verify token signature and claims
+            claims = jwt.decode(
+                id_token,
+                jwks,
+                claims_options={
+                    "iss": {"essential": True, "value": self.issuer},
+                    "aud": {"essential": True, "value": self.client_id},
+                    "exp": {"essential": True},
+                },
+            )
+
+            # Additional validation
+            claims.validate()
+
+            return claims
+
+        except JoseError as e:
+            raise ValueError(f"Invalid ID token: {str(e)}")
+
+
+# Global instance
+oidc_config = OIDCConfig()
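The discovery URL in get_discovery_document is derived from the issuer by appending the standard .well-known path; a trailing slash on the issuer is tolerated via rstrip. The derivation in isolation:

```python
def discovery_url(issuer: str) -> str:
    # Same expression as in OIDCConfig.get_discovery_document:
    # strip any trailing slash, then append the well-known path.
    return f"{issuer.rstrip('/')}/.well-known/openid-configuration"
```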
39  services/raggr/pyproject.toml  Normal file
@@ -0,0 +1,39 @@
+[project]
+name = "raggr"
+version = "0.1.0"
+description = "Add your description here"
+readme = "README.md"
+requires-python = ">=3.13"
+dependencies = [
+    "chromadb>=1.1.0",
+    "python-dotenv>=1.0.0",
+    "flask>=3.1.2",
+    "httpx>=0.28.1",
+    "ollama>=0.6.0",
+    "openai>=2.0.1",
+    "pydantic>=2.11.9",
+    "pillow>=10.0.0",
+    "pymupdf>=1.24.0",
+    "black>=25.9.0",
+    "pillow-heif>=1.1.1",
+    "flask-jwt-extended>=4.7.1",
+    "bcrypt>=5.0.0",
+    "pony>=0.7.19",
+    "flask-login>=0.6.3",
+    "quart>=0.20.0",
+    "tortoise-orm>=0.25.1",
+    "quart-jwt-extended>=0.1.0",
+    "pre-commit>=4.3.0",
+    "tortoise-orm-stubs>=1.0.2",
+    "aerich>=0.8.0",
+    "tomlkit>=0.13.3",
+    "authlib>=1.3.0",
+    "asyncpg>=0.30.0",
+    "langchain-openai>=1.1.6",
+    "langchain>=1.2.0",
+]
+
+[tool.aerich]
+tortoise_orm = "app.TORTOISE_CONFIG"
+location = "./migrations"
+src_folder = "./."
@@ -49,11 +49,20 @@ DOCTYPE_OPTIONS = [
     "Letter",
 ]

+QUERY_TYPE_OPTIONS = [
+    "Simba",
+    "Other",
+]
+
+
 class DocumentType(BaseModel):
     type: list[str] = Field(description="type of document", enum=DOCTYPE_OPTIONS)


+class QueryType(BaseModel):
+    type: str = Field(description="type of query", enum=QUERY_TYPE_OPTIONS)
+
+
 PROMPT = """
 You are an information specialist that processes user queries. The current year is 2025. The user queries are all about
 a cat, Simba, and its records. The types of records are listed below. Using the query, extract the
@@ -111,6 +120,27 @@ Query: "Who does Simba know?"
 Tags: ["Letter", "Documentation"]
 """

+QUERY_TYPE_PROMPT = f"""You are an information specialist that processes user queries.
+A query can have one tag attached from the following options. Based on the query and the transcript which is listed below, determine
+which of the following options is most appropriate: {",".join(QUERY_TYPE_OPTIONS)}
+
+### Example 1
+Query: "Who is Simba's current vet?"
+Tags: ["Simba"]
+
+
+### Example 2
+Query: "What is the capital of Tokyo?"
+Tags: ["Other"]
+
+
+### Example 3
+Query: "Can you help me write an email?"
+Tags: ["Other"]
+
+TRANSCRIPT:
+"""
+
+
 class QueryGenerator:
     def __init__(self) -> None:
@@ -154,6 +184,33 @@ class QueryGenerator:
         metadata_query = {"document_type": {"$in": type_data["type"]}}
         return metadata_query

+    def get_query_type(self, input: str, transcript: str):
+        client = OpenAI()
+        response = client.chat.completions.create(
+            messages=[
+                {
+                    "role": "system",
+                    "content": "You are an information specialist that is really good at deciding what tags a query should have",
+                },
+                {
+                    "role": "user",
+                    "content": f"{QUERY_TYPE_PROMPT}\nTRANSCRIPT:\n{transcript}\nQUERY:{input}",
+                },
+            ],
+            model="gpt-4o",
+            response_format={
+                "type": "json_schema",
+                "json_schema": {
+                    "name": "query_type",
+                    "schema": QueryType.model_json_schema(),
+                },
+            },
+        )
+
+        response_json_str = response.choices[0].message.content
+        type_data = json.loads(response_json_str)
+        return type_data["type"]
+
     def get_query(self, input: str):
         client = OpenAI()
         response = client.responses.parse(
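get_query_type trusts that the json_schema response format constrained the model's output to one of QUERY_TYPE_OPTIONS. A small defensive-parsing sketch (parse_query_type is a hypothetical helper, not part of the diff) shows the shape of the payload being decoded:

```python
import json

QUERY_TYPE_OPTIONS = ["Simba", "Other"]


def parse_query_type(response_json_str: str) -> str:
    # The structured-output schema should already constrain "type",
    # but validate before trusting the value downstream.
    type_data = json.loads(response_json_str)
    query_type = type_data["type"]
    if query_type not in QUERY_TYPE_OPTIONS:
        raise ValueError(f"unexpected query type: {query_type}")
    return query_type
```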
9  services/raggr/raggr-frontend/.dockerignore  Normal file
@@ -0,0 +1,9 @@
+.git
+.gitignore
+README.md
+.DS_Store
+node_modules
+dist
+.cache
+coverage
+*.log
@@ -6,6 +6,7 @@
 # Dist
 node_modules
 dist/
+.yarn

 # Profile
 .rspack-profile-*/
1  services/raggr/raggr-frontend/.yarnrc.yml  Normal file
@@ -0,0 +1 @@
+nodeLinker: node-modules
18  services/raggr/raggr-frontend/Dockerfile.dev  Normal file
@@ -0,0 +1,18 @@
+FROM node:20-slim
+
+WORKDIR /app
+
+# Copy package files
+COPY package.json yarn.lock* ./
+
+# Install dependencies
+RUN yarn install
+
+# Copy application source code
+COPY . .
+
+# Expose rsbuild dev server port (default 3000)
+EXPOSE 3000
+
+# Default command
+CMD ["sh", "-c", "yarn build && yarn watch:build"]
@@ -0,0 +1,63 @@
+# Token Refresh Implementation
+
+## Overview
+
+The API services now automatically handle token refresh when access tokens expire. This provides a seamless user experience without requiring manual re-authentication.
+
+## How It Works
+
+### 1. **userService.ts**
+
+The `userService` now includes:
+
+- **`refreshToken()`**: Automatically gets the refresh token from localStorage, calls the `/api/user/refresh` endpoint, and updates the access token
+- **`fetchWithAuth()`**: A wrapper around `fetch()` that:
+  1. Automatically adds the Authorization header with the access token
+  2. Detects 401 (Unauthorized) responses
+  3. Automatically refreshes the token using the refresh token
+  4. Retries the original request with the new access token
+  5. Throws an error if refresh fails (e.g., refresh token expired)
+
+### 2. **conversationService.ts**
+
+Now uses `userService.fetchWithAuth()` for all API calls:
+- `sendQuery()` - No longer needs token parameter
+- `getMessages()` - No longer needs token parameter
+
+### 3. **Components Updated**
+
+**ChatScreen.tsx**:
+- Removed manual token handling
+- Now simply calls `conversationService.sendQuery(query)` and `conversationService.getMessages()`
+
+## Benefits
+
+✅ **Automatic token refresh** - Users stay logged in longer
+✅ **Transparent retry logic** - Failed requests due to expired tokens are automatically retried
+✅ **Cleaner code** - Components don't need to manage tokens
+✅ **Better UX** - No interruptions when access token expires
+✅ **Centralized auth logic** - All auth handling in one place
+
+## Error Handling
+
+- If refresh token is missing or invalid, the error is thrown
+- Components can catch these errors and redirect to login
+- LocalStorage is automatically cleared when refresh fails
+
+## Usage Example
+
+```typescript
+// Old way (manual token management)
+const token = localStorage.getItem("access_token");
+const result = await conversationService.sendQuery(query, token);
+
+// New way (automatic token refresh)
+const result = await conversationService.sendQuery(query);
+```
+
+## Token Storage
+
+- **Access Token**: `localStorage.getItem("access_token")`
+- **Refresh Token**: `localStorage.getItem("refresh_token")`
+
+Both are automatically managed by the services.
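The fetchWithAuth flow described above (attach token, detect 401, refresh, retry once) is language-agnostic. A minimal sketch in Python, with a dict standing in for localStorage and plain callables standing in for fetch and the refresh endpoint:

```python
def fetch_with_auth(request, refresh, tokens):
    # request(access_token) -> (status, body); refresh(refresh_token) -> new access token.
    status, body = request(tokens["access_token"])
    if status == 401:
        # Access token expired: refresh it, then retry the original request once.
        # If refresh itself fails, the exception propagates to the caller.
        tokens["access_token"] = refresh(tokens["refresh_token"])
        status, body = request(tokens["access_token"])
    return status, body


# Fake backend for illustration: only the token "new" is accepted.
def fake_request(token):
    return (200, "ok") if token == "new" else (401, "expired")


def fake_refresh(refresh_token):
    return "new"


tokens = {"access_token": "old", "refresh_token": "r1"}
status, body = fetch_with_auth(fake_request, fake_refresh, tokens)
```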
2677  services/raggr/raggr-frontend/package-lock.json  generated  Normal file
File diff suppressed because it is too large
@@ -6,21 +6,37 @@
   "scripts": {
     "build": "rsbuild build",
     "dev": "rsbuild dev --open",
-    "preview": "rsbuild preview"
+    "preview": "rsbuild preview",
+    "watch": "npm-watch build",
+    "watch:build": "rsbuild build --watch"
   },
   "dependencies": {
     "axios": "^1.12.2",
     "marked": "^16.3.0",
+    "npm-watch": "^0.13.0",
     "react": "^19.1.1",
     "react-dom": "^19.1.1",
-    "react-markdown": "^10.1.0"
+    "react-markdown": "^10.1.0",
+    "watch": "^1.0.2"
   },
   "devDependencies": {
+    "@biomejs/biome": "2.3.10",
     "@rsbuild/core": "^1.5.6",
     "@rsbuild/plugin-react": "^1.4.0",
     "@tailwindcss/postcss": "^4.0.0",
     "@types/react": "^19.1.13",
     "@types/react-dom": "^19.1.9",
     "typescript": "^5.9.2"
+  },
+  "watch": {
+    "build": {
+      "patterns": [
+        "src"
+      ],
+      "extensions": "ts,tsx,css,js,jsx",
+      "delay": 1000,
+      "quiet": false,
+      "inherit": true
+    }
   }
 }
@@ -3,4 +3,8 @@ import { pluginReact } from '@rsbuild/plugin-react';

 export default defineConfig({
   plugins: [pluginReact()],
+  html: {
+    title: 'Raggr',
+    favicon: './src/assets/favicon.svg',
+  },
 });
@@ -3,4 +3,5 @@
 body {
   margin: 0;
   font-family: Inter, Avenir, Helvetica, Arial, sans-serif;
+  background-color: #F9F5EB;
 }
72  services/raggr/raggr-frontend/src/App.tsx  Normal file
@@ -0,0 +1,72 @@
+import { useState, useEffect } from "react";
+
+import "./App.css";
+import { AuthProvider } from "./contexts/AuthContext";
+import { ChatScreen } from "./components/ChatScreen";
+import { LoginScreen } from "./components/LoginScreen";
+import { conversationService } from "./api/conversationService";
+
+const AppContainer = () => {
+  const [isAuthenticated, setAuthenticated] = useState<boolean>(false);
+  const [isChecking, setIsChecking] = useState<boolean>(true);
+
+  useEffect(() => {
+    const checkAuth = async () => {
+      const accessToken = localStorage.getItem("access_token");
+      const refreshToken = localStorage.getItem("refresh_token");
+
+      // No tokens at all, not authenticated
+      if (!accessToken && !refreshToken) {
+        setIsChecking(false);
+        setAuthenticated(false);
+        return;
+      }
+
+      // Try to verify token by making a request
+      try {
+        await conversationService.getAllConversations();
+        // If successful, user is authenticated
+        setAuthenticated(true);
+      } catch (error) {
+        // Token is invalid or expired
+        console.error("Authentication check failed:", error);
+        localStorage.removeItem("access_token");
+        localStorage.removeItem("refresh_token");
+        setAuthenticated(false);
+      } finally {
+        setIsChecking(false);
+      }
+    };
+
+    checkAuth();
+  }, []);
+
+  // Show loading state while checking authentication
+  if (isChecking) {
+    return (
+      <div className="h-screen flex items-center justify-center bg-white/85">
+        <div className="text-xl">Loading...</div>
+      </div>
+    );
+  }
+
+  return (
+    <>
+      {isAuthenticated ? (
+        <ChatScreen setAuthenticated={setAuthenticated} />
+      ) : (
+        <LoginScreen setAuthenticated={setAuthenticated} />
+      )}
+    </>
+  );
+};
+
+const App = () => {
+  return (
+    <AuthProvider>
+      <AppContainer />
+    </AuthProvider>
+  );
+};
+
+export default App;
115  services/raggr/raggr-frontend/src/api/conversationService.ts  Normal file
@@ -0,0 +1,115 @@
+import { userService } from "./userService";
+
+interface Message {
+  id: string;
+  text: string;
+  speaker: "user" | "simba";
+  created_at: string;
+}
+
+interface Conversation {
+  id: string;
+  name: string;
+  messages?: Message[];
+  created_at: string;
+  updated_at: string;
+  user_id?: string;
+}
+
+interface QueryRequest {
+  query: string;
+}
+
+interface QueryResponse {
+  response: string;
+}
+
+interface CreateConversationRequest {
+  user_id: string;
+}
+
+class ConversationService {
+  private baseUrl = "/api";
+  private conversationBaseUrl = "/api/conversation";
+
+  async sendQuery(
+    query: string,
+    conversation_id: string,
+  ): Promise<QueryResponse> {
+    const response = await userService.fetchWithRefreshToken(
+      `${this.baseUrl}/query`,
+      {
+        method: "POST",
+        body: JSON.stringify({ query, conversation_id }),
+      },
+    );
+
+    if (!response.ok) {
+      throw new Error("Failed to send query");
+    }
+
+    return await response.json();
+  }
+
+  async getMessages(): Promise<Conversation> {
+    const response = await userService.fetchWithRefreshToken(
+      `${this.baseUrl}/messages`,
+      {
+        method: "GET",
+      },
+    );
+
+    if (!response.ok) {
+      throw new Error("Failed to fetch messages");
+    }
+
+    return await response.json();
+  }
+
+  async getConversation(conversationId: string): Promise<Conversation> {
+    const response = await userService.fetchWithRefreshToken(
+      `${this.conversationBaseUrl}/${conversationId}`,
+      {
+        method: "GET",
+      },
+    );
+
+    if (!response.ok) {
+      throw new Error("Failed to fetch conversation");
+    }
+
+    return await response.json();
+  }
+
+  async createConversation(): Promise<Conversation> {
+    const response = await userService.fetchWithRefreshToken(
+      `${this.conversationBaseUrl}/`,
+      {
+        method: "POST",
+      },
+    );
+
+    if (!response.ok) {
+      throw new Error("Failed to create conversation");
+    }
+
+    return await response.json();
+  }
+
+  async getAllConversations(): Promise<Conversation[]> {
+    const response = await userService.fetchWithRefreshToken(
+      `${this.conversationBaseUrl}/`,
+      {
+        method: "GET",
+      },
+    );
+
+    if (!response.ok) {
+      throw new Error("Failed to fetch conversations");
+    }
+
+    return await response.json();
+  }
+}
+
+export const conversationService = new ConversationService();
94
services/raggr/raggr-frontend/src/api/oidcService.ts
Normal file
94
services/raggr/raggr-frontend/src/api/oidcService.ts
Normal file
@@ -0,0 +1,94 @@
/**
 * OIDC Authentication Service
 * Handles OAuth 2.0 Authorization Code flow with PKCE
 */

interface OIDCLoginResponse {
  auth_url: string;
}

interface OIDCCallbackResponse {
  access_token: string;
  refresh_token: string;
  user: {
    id: string;
    username: string;
    email: string;
  };
}

class OIDCService {
  private baseUrl = "/api/user/oidc";

  /**
   * Initiate OIDC login flow
   * Returns authorization URL to redirect user to
   */
  async initiateLogin(redirectAfterLogin: string = "/"): Promise<string> {
    const response = await fetch(
      `${this.baseUrl}/login?redirect=${encodeURIComponent(redirectAfterLogin)}`,
      {
        method: "GET",
        headers: { "Content-Type": "application/json" },
      },
    );

    if (!response.ok) {
      throw new Error("Failed to initiate OIDC login");
    }

    const data: OIDCLoginResponse = await response.json();
    return data.auth_url;
  }

  /**
   * Handle OIDC callback
   * Exchanges authorization code for tokens
   */
  async handleCallback(
    code: string,
    state: string,
  ): Promise<OIDCCallbackResponse> {
    const response = await fetch(
      `${this.baseUrl}/callback?code=${encodeURIComponent(code)}&state=${encodeURIComponent(state)}`,
      {
        method: "GET",
        headers: { "Content-Type": "application/json" },
      },
    );

    if (!response.ok) {
      throw new Error("OIDC callback failed");
    }

    return await response.json();
  }

  /**
   * Extract OIDC callback parameters from URL
   */
  getCallbackParamsFromURL(): { code: string; state: string } | null {
    const params = new URLSearchParams(window.location.search);
    const code = params.get("code");
    const state = params.get("state");

    if (code && state) {
      return { code, state };
    }

    return null;
  }

  /**
   * Clear callback parameters from URL without reload
   */
  clearCallbackParams(): void {
    const url = new URL(window.location.href);
    url.searchParams.delete("code");
    url.searchParams.delete("state");
    url.searchParams.delete("error");
    window.history.replaceState({}, "", url.toString());
  }
}

export const oidcService = new OIDCService();
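`getCallbackParamsFromURL` above reads `code` and `state` from `window.location`. The same parsing can be sketched as a pure function over a URL string (the function name below is illustrative, not part of the service), which makes the logic easy to exercise outside a browser:

```typescript
// Illustrative standalone sketch of the callback-parameter parsing in
// getCallbackParamsFromURL, taking a URL string instead of reading
// window.location so it can run outside the browser.
function parseCallbackParams(
  href: string,
): { code: string; state: string } | null {
  const params = new URL(href).searchParams;
  const code = params.get("code");
  const state = params.get("state");
  // Both parameters must be present for a valid OIDC callback.
  return code && state ? { code, state } : null;
}

console.log(parseCallbackParams("https://app.example/cb?code=abc&state=xyz"));
```

A URL carrying both parameters yields them; a URL missing either yields `null`, which the login screen treats as "not a callback".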
139  services/raggr/raggr-frontend/src/api/userService.ts  Normal file
@@ -0,0 +1,139 @@
interface LoginResponse {
  access_token: string;
  refresh_token: string;
  user: {
    id: string;
    username: string;
    email?: string;
  };
}

interface RefreshResponse {
  access_token: string;
}

class UserService {
  private baseUrl = "/api/user";

  async login(username: string, password: string): Promise<LoginResponse> {
    const response = await fetch(`${this.baseUrl}/login`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ username, password }),
    });

    if (!response.ok) {
      throw new Error("Invalid credentials");
    }

    return await response.json();
  }

  async refreshToken(): Promise<string> {
    const refreshToken = localStorage.getItem("refresh_token");

    if (!refreshToken) {
      throw new Error("No refresh token available");
    }

    const response = await fetch(`${this.baseUrl}/refresh`, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${refreshToken}`,
      },
    });

    if (!response.ok) {
      // Refresh token is invalid or expired, clear storage
      localStorage.removeItem("access_token");
      localStorage.removeItem("refresh_token");
      throw new Error("Failed to refresh token");
    }

    const data: RefreshResponse = await response.json();
    localStorage.setItem("access_token", data.access_token);
    return data.access_token;
  }

  async validateToken(): Promise<boolean> {
    const refreshToken = localStorage.getItem("refresh_token");

    if (!refreshToken) {
      return false;
    }

    try {
      await this.refreshToken();
      return true;
    } catch (error) {
      return false;
    }
  }

  async fetchWithAuth(
    url: string,
    options: RequestInit = {},
  ): Promise<Response> {
    const accessToken = localStorage.getItem("access_token");

    // Add authorization header
    const headers = {
      "Content-Type": "application/json",
      ...(options.headers || {}),
      ...(accessToken && { Authorization: `Bearer ${accessToken}` }),
    };

    let response = await fetch(url, { ...options, headers });

    // If unauthorized, try refreshing the token
    if (response.status === 401) {
      try {
        const newAccessToken = await this.refreshToken();

        // Retry the request with new token
        headers.Authorization = `Bearer ${newAccessToken}`;
        response = await fetch(url, { ...options, headers });
      } catch (error) {
        // Refresh failed, redirect to login or throw error
        throw new Error("Session expired. Please log in again.");
      }
    }

    return response;
  }

  async fetchWithRefreshToken(
    url: string,
    options: RequestInit = {},
  ): Promise<Response> {
    const refreshToken = localStorage.getItem("refresh_token");

    // Add authorization header
    const headers = {
      "Content-Type": "application/json",
      ...(options.headers || {}),
      ...(refreshToken && { Authorization: `Bearer ${refreshToken}` }),
    };

    let response = await fetch(url, { ...options, headers });

    // If unauthorized, try refreshing the token
    if (response.status === 401) {
      try {
        const newAccessToken = await this.refreshToken();

        // Retry the request with new token
        headers.Authorization = `Bearer ${newAccessToken}`;
        response = await fetch(url, { ...options, headers });
      } catch (error) {
        // Refresh failed, redirect to login or throw error
        throw new Error("Session expired. Please log in again.");
      }
    }

    return response;
  }
}

export const userService = new UserService();
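The two fetch helpers above share one retry-on-401 pattern: attempt the request, refresh the token once, retry once. As a hedged sketch of that control flow (with the fetch and refresh steps injected as parameters purely for illustration; the real methods close over `fetch` and `localStorage`):

```typescript
// Minimal sketch of the retry-on-401 flow used by fetchWithAuth and
// fetchWithRefreshToken. doFetch and refresh are injected here so the
// flow can be exercised without a browser or server; these parameter
// names are illustrative, not part of the actual service.
type MiniResponse = { status: number };

async function fetchWithRetry(
  doFetch: (token: string) => Promise<MiniResponse>,
  refresh: () => Promise<string>,
  token: string,
): Promise<MiniResponse> {
  let response = await doFetch(token);
  if (response.status === 401) {
    // One refresh attempt, then a single retry with the new token.
    const newToken = await refresh();
    response = await doFetch(newToken);
  }
  return response;
}
```

If the refresh itself throws, the error propagates to the caller, which is how the real helpers surface "Session expired. Please log in again."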
BIN  services/raggr/raggr-frontend/src/assets/cat.png  Normal file
Binary file not shown.
After Width: | Height: | Size: 5.8 KiB

3  services/raggr/raggr-frontend/src/assets/favicon.svg  Normal file
@@ -0,0 +1,3 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
  <text y="80" font-size="80" font-family="system-ui, -apple-system, sans-serif">🐱</text>
</svg>
After Width: | Height: | Size: 163 B
@@ -0,0 +1,31 @@
import ReactMarkdown from "react-markdown";

type AnswerBubbleProps = {
  text: string;
  loading?: boolean;
};

export const AnswerBubble = ({ text, loading }: AnswerBubbleProps) => {
  return (
    <div className="rounded-md bg-orange-100 p-3 sm:p-4 w-2/3">
      {loading ? (
        <div className="flex flex-col w-full animate-pulse gap-2">
          <div className="flex flex-row gap-2 w-full">
            <div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
            <div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
          </div>
          <div className="flex flex-row gap-2 w-full">
            <div className="bg-gray-400 w-1/3 p-3 rounded-lg" />
            <div className="bg-gray-400 w-2/3 p-3 rounded-lg" />
          </div>
        </div>
      ) : (
        <div className="flex flex-col break-words overflow-wrap-anywhere text-sm sm:text-base [&>*]:break-words">
          <ReactMarkdown>{"🐈: " + text}</ReactMarkdown>
        </div>
      )}
    </div>
  );
};
303  services/raggr/raggr-frontend/src/components/ChatScreen.tsx  Normal file
@@ -0,0 +1,303 @@
import { useEffect, useState, useRef } from "react";
import { conversationService } from "../api/conversationService";
import { QuestionBubble } from "./QuestionBubble";
import { AnswerBubble } from "./AnswerBubble";
import { MessageInput } from "./MessageInput";
import { ConversationList } from "./ConversationList";
import catIcon from "../assets/cat.png";

type Message = {
  text: string;
  speaker: "simba" | "user";
};

type QuestionAnswer = {
  question: string;
  answer: string;
};

type Conversation = {
  title: string;
  id: string;
};

type ChatScreenProps = {
  setAuthenticated: (isAuth: boolean) => void;
};

export const ChatScreen = ({ setAuthenticated }: ChatScreenProps) => {
  const [query, setQuery] = useState<string>("");
  const [answer, setAnswer] = useState<string>("");
  const [simbaMode, setSimbaMode] = useState<boolean>(false);
  const [questionsAnswers, setQuestionsAnswers] = useState<QuestionAnswer[]>(
    [],
  );
  const [messages, setMessages] = useState<Message[]>([]);
  const [conversations, setConversations] = useState<Conversation[]>([
    { title: "simba meow meow", id: "uuid" },
  ]);
  const [showConversations, setShowConversations] = useState<boolean>(false);
  const [selectedConversation, setSelectedConversation] =
    useState<Conversation | null>(null);
  const [sidebarCollapsed, setSidebarCollapsed] = useState<boolean>(false);

  const messagesEndRef = useRef<HTMLDivElement>(null);
  const simbaAnswers = ["meow.", "hiss...", "purrrrrr", "yowOWROWWowowr"];

  const scrollToBottom = () => {
    messagesEndRef.current?.scrollIntoView({ behavior: "smooth" });
  };

  const handleSelectConversation = (conversation: Conversation) => {
    setShowConversations(false);
    setSelectedConversation(conversation);
    const loadMessages = async () => {
      try {
        const fetchedConversation = await conversationService.getConversation(
          conversation.id,
        );
        setMessages(
          fetchedConversation.messages.map((message) => ({
            text: message.text,
            speaker: message.speaker,
          })),
        );
      } catch (error) {
        console.error("Failed to load messages:", error);
      }
    };
    loadMessages();
  };

  const loadConversations = async () => {
    try {
      const fetchedConversations =
        await conversationService.getAllConversations();
      const parsedConversations = fetchedConversations.map((conversation) => ({
        id: conversation.id,
        title: conversation.name,
      }));
      setConversations(parsedConversations);
      setSelectedConversation(parsedConversations[0]);
    } catch (error) {
      console.error("Failed to load conversations:", error);
    }
  };

  const handleCreateNewConversation = async () => {
    const newConversation = await conversationService.createConversation();
    await loadConversations();
    setSelectedConversation({
      title: newConversation.name,
      id: newConversation.id,
    });
  };

  useEffect(() => {
    loadConversations();
  }, []);

  useEffect(() => {
    scrollToBottom();
  }, [messages]);

  useEffect(() => {
    const loadMessages = async () => {
      if (selectedConversation == null) return;
      try {
        const conversation = await conversationService.getConversation(
          selectedConversation.id,
        );
        // Update the conversation title in case it changed
        setSelectedConversation({
          id: conversation.id,
          title: conversation.name,
        });
        setMessages(
          conversation.messages.map((message) => ({
            text: message.text,
            speaker: message.speaker,
          })),
        );
      } catch (error) {
        console.error("Failed to load messages:", error);
      }
    };
    loadMessages();
  }, [selectedConversation?.id]);

  const handleQuestionSubmit = async () => {
    if (!query.trim()) return; // Don't submit empty messages

    const currMessages = messages.concat([{ text: query, speaker: "user" }]);
    setMessages(currMessages);
    setQuery(""); // Clear input immediately after submission

    if (simbaMode) {
      const randomIndex = Math.floor(Math.random() * simbaAnswers.length);
      const randomElement = simbaAnswers[randomIndex];
      setAnswer(randomElement);
      setQuestionsAnswers(
        questionsAnswers.concat([
          {
            question: query,
            answer: randomElement,
          },
        ]),
      );
      return;
    }

    if (!selectedConversation) return; // No conversation selected yet

    try {
      const result = await conversationService.sendQuery(
        query,
        selectedConversation.id,
      );
      setQuestionsAnswers(
        questionsAnswers.concat([{ question: query, answer: result.response }]),
      );
      setMessages(
        currMessages.concat([{ text: result.response, speaker: "simba" }]),
      );
    } catch (error) {
      console.error("Failed to send query:", error);
      // If session expired, redirect to login
      if (error instanceof Error && error.message.includes("Session expired")) {
        setAuthenticated(false);
      }
    }
  };

  const handleQueryChange = (event: React.ChangeEvent<HTMLTextAreaElement>) => {
    setQuery(event.target.value);
  };

  const handleKeyDown = (event: React.KeyboardEvent<HTMLTextAreaElement>) => {
    // Submit on Enter, but allow Shift+Enter for new line
    if (event.key === "Enter" && !event.shiftKey) {
      event.preventDefault();
      handleQuestionSubmit();
    }
  };

  return (
    <div className="h-screen flex flex-row bg-[#F9F5EB]">
      {/* Sidebar - Expanded */}
      <aside
        className={`hidden md:flex md:flex-col bg-[#F9F5EB] border-r border-gray-200 p-4 overflow-y-auto transition-all duration-300 ${sidebarCollapsed ? "w-20" : "w-64"}`}
      >
        {!sidebarCollapsed ? (
          <div className="bg-[#F9F5EB]">
            <div className="flex flex-row items-center gap-2 mb-6">
              <img
                src={catIcon}
                alt="Simba"
                className="cursor-pointer hover:opacity-80"
                onClick={() => setSidebarCollapsed(true)}
              />
              <h2 className="text-3xl bg-[#F9F5EB] font-semibold">asksimba!</h2>
            </div>
            <ConversationList
              conversations={conversations}
              onCreateNewConversation={handleCreateNewConversation}
              onSelectConversation={handleSelectConversation}
            />
            <div className="mt-auto pt-4">
              <button
                className="w-full p-2 border border-red-400 bg-red-200 hover:bg-red-400 cursor-pointer rounded-md text-sm"
                onClick={() => setAuthenticated(false)}
              >
                logout
              </button>
            </div>
          </div>
        ) : (
          <div className="flex flex-col items-center gap-4">
            <img
              src={catIcon}
              alt="Simba"
              className="cursor-pointer hover:opacity-80"
              onClick={() => setSidebarCollapsed(false)}
            />
          </div>
        )}
      </aside>

      {/* Main chat area */}
      <div className="flex-1 flex flex-col h-screen overflow-hidden">
        {/* Mobile header */}
        <header className="md:hidden flex flex-row justify-between items-center gap-3 p-4 border-b border-gray-200 bg-white">
          <div className="flex flex-row items-center gap-2">
            <img src={catIcon} alt="Simba" className="w-10 h-10" />
            <h1 className="text-xl">asksimba!</h1>
          </div>
          <div className="flex flex-row gap-2">
            <button
              className="p-2 border border-green-400 bg-green-200 hover:bg-green-400 cursor-pointer rounded-md text-sm"
              onClick={() => setShowConversations(!showConversations)}
            >
              {showConversations ? "hide" : "show"}
            </button>
            <button
              className="p-2 border border-red-400 bg-red-200 hover:bg-red-400 cursor-pointer rounded-md text-sm"
              onClick={() => setAuthenticated(false)}
            >
              logout
            </button>
          </div>
        </header>

        {/* Messages area */}
        {selectedConversation && (
          <div className="sticky top-0 mx-auto w-full">
            <div className="bg-[#F9F5EB] text-black px-6 w-full py-3">
              <h2 className="text-lg font-semibold">
                {selectedConversation.title || "Untitled Conversation"}
              </h2>
            </div>
          </div>
        )}
        <div className="flex-1 overflow-y-auto relative px-4 py-6">
          {/* Floating conversation name */}

          <div className="max-w-2xl mx-auto flex flex-col gap-4">
            {showConversations && (
              <div className="md:hidden">
                <ConversationList
                  conversations={conversations}
                  onCreateNewConversation={handleCreateNewConversation}
                  onSelectConversation={handleSelectConversation}
                />
              </div>
            )}
            {messages.map((msg, index) => {
              if (msg.speaker === "simba") {
                return <AnswerBubble key={index} text={msg.text} />;
              }
              return <QuestionBubble key={index} text={msg.text} />;
            })}
            <div ref={messagesEndRef} />
          </div>
        </div>

        {/* Input area */}
        <footer className="p-4 bg-[#F9F5EB]">
          <div className="max-w-2xl mx-auto">
            <MessageInput
              query={query}
              handleQueryChange={handleQueryChange}
              handleKeyDown={handleKeyDown}
              handleQuestionSubmit={handleQuestionSubmit}
              setSimbaMode={setSimbaMode}
            />
          </div>
        </footer>
      </div>
    </div>
  );
};
@@ -0,0 +1,69 @@
import { useState, useEffect } from "react";

import { conversationService } from "../api/conversationService";

type Conversation = {
  title: string;
  id: string;
};

type ConversationProps = {
  conversations: Conversation[];
  onSelectConversation: (conversation: Conversation) => void;
  onCreateNewConversation: () => void;
};

export const ConversationList = ({
  conversations,
  onSelectConversation,
  onCreateNewConversation,
}: ConversationProps) => {
  const [conversationList, setConversationList] = useState(conversations);

  useEffect(() => {
    const loadConversations = async () => {
      try {
        let fetchedConversations =
          await conversationService.getAllConversations();

        // If the user has no conversations yet, create one so the list
        // is never empty
        if (fetchedConversations.length === 0) {
          await conversationService.createConversation();
          fetchedConversations =
            await conversationService.getAllConversations();
        }
        setConversationList(
          fetchedConversations.map((conversation) => ({
            id: conversation.id,
            title: conversation.name,
          })),
        );
      } catch (error) {
        console.error("Failed to load conversations:", error);
      }
    };
    loadConversations();
  }, []);

  return (
    <div className="bg-indigo-300 rounded-md p-3 sm:p-4 flex flex-col gap-1">
      {conversationList.map((conversation) => {
        return (
          <div
            key={conversation.id}
            className="border-blue-400 bg-indigo-300 hover:bg-indigo-200 cursor-pointer rounded-md p-3 min-h-[44px] flex items-center"
            onClick={() => onSelectConversation(conversation)}
          >
            <p className="text-sm sm:text-base truncate w-full">
              {conversation.title}
            </p>
          </div>
        );
      })}
      <div
        className="border-blue-400 bg-indigo-300 hover:bg-indigo-200 cursor-pointer rounded-md p-3 min-h-[44px] flex items-center"
        onClick={() => onCreateNewConversation()}
      >
        <p className="text-sm sm:text-base"> + Start a new thread</p>
      </div>
    </div>
  );
};
@@ -0,0 +1,24 @@
type Conversation = {
  title: string;
  id: string;
};

type ConversationMenuProps = {
  conversations: Conversation[];
};

export const ConversationMenu = ({ conversations }: ConversationMenuProps) => {
  return (
    <div className="absolute bg-white w-md rounded-md shadow-xl m-4 p-4">
      <p className="py-2 px-4 rounded-md w-full text-xl font-bold">askSimba!</p>
      {conversations.map((conversation) => (
        <p
          key={conversation.id}
          className="py-2 px-4 rounded-md hover:bg-stone-200 w-full text-xl font-bold cursor-pointer"
        >
          {conversation.title}
        </p>
      ))}
    </div>
  );
};
130  services/raggr/raggr-frontend/src/components/LoginScreen.tsx  Normal file
@@ -0,0 +1,130 @@
import { useState, useEffect } from "react";
import { userService } from "../api/userService";
import { oidcService } from "../api/oidcService";

type LoginScreenProps = {
  setAuthenticated: (isAuth: boolean) => void;
};

export const LoginScreen = ({ setAuthenticated }: LoginScreenProps) => {
  const [error, setError] = useState<string>("");
  const [isChecking, setIsChecking] = useState<boolean>(true);
  const [isLoggingIn, setIsLoggingIn] = useState<boolean>(false);

  useEffect(() => {
    const initAuth = async () => {
      // First, check for OIDC callback parameters
      const callbackParams = oidcService.getCallbackParamsFromURL();

      if (callbackParams) {
        // Handle OIDC callback
        try {
          setIsLoggingIn(true);
          const result = await oidcService.handleCallback(
            callbackParams.code,
            callbackParams.state,
          );

          // Store tokens
          localStorage.setItem("access_token", result.access_token);
          localStorage.setItem("refresh_token", result.refresh_token);

          // Clear URL parameters
          oidcService.clearCallbackParams();

          setAuthenticated(true);
          setIsChecking(false);
          return;
        } catch (err) {
          console.error("OIDC callback error:", err);
          setError("Login failed. Please try again.");
          oidcService.clearCallbackParams();
          setIsLoggingIn(false);
          setIsChecking(false);
          return;
        }
      }

      // Check if user is already authenticated
      const isValid = await userService.validateToken();
      if (isValid) {
        setAuthenticated(true);
      }
      setIsChecking(false);
    };

    initAuth();
  }, [setAuthenticated]);

  const handleOIDCLogin = async () => {
    try {
      setIsLoggingIn(true);
      setError("");

      // Get authorization URL from backend
      const authUrl = await oidcService.initiateLogin();

      // Redirect to Authelia
      window.location.href = authUrl;
    } catch (err) {
      setError("Failed to initiate login. Please try again.");
      console.error("OIDC login error:", err);
      setIsLoggingIn(false);
    }
  };

  // Show loading state while checking authentication or processing callback
  if (isChecking || isLoggingIn) {
    return (
      <div className="h-screen bg-opacity-20">
        <div className="bg-white/85 h-screen flex items-center justify-center">
          <div className="text-center">
            <p className="text-lg sm:text-xl">
              {isLoggingIn ? "Logging in..." : "Checking authentication..."}
            </p>
          </div>
        </div>
      </div>
    );
  }

  return (
    <div className="h-screen bg-opacity-20">
      <div className="bg-white/85 h-screen">
        <div className="flex flex-row justify-center py-4">
          <div className="flex flex-col gap-4 w-full px-4 sm:w-11/12 sm:max-w-2xl lg:max-w-4xl sm:px-0">
            <div className="flex flex-col gap-4">
              <div className="flex flex-grow justify-center w-full bg-amber-400 p-2">
                <h1 className="text-base sm:text-xl font-bold text-center">
                  I AM LOOKING FOR A DESIGNER. THIS APP WILL REMAIN UGLY UNTIL A
                  DESIGNER COMES.
                </h1>
              </div>
              <header className="flex flex-row justify-center gap-2 grow sticky top-0 z-10 bg-white">
                <h1 className="text-2xl sm:text-3xl">ask simba!</h1>
              </header>

              {error && (
                <div className="text-red-600 font-semibold text-sm sm:text-base bg-red-50 p-3 rounded-md">
                  {error}
                </div>
              )}

              <div className="text-center text-sm sm:text-base text-gray-600 py-2">
                Click below to login with Authelia
              </div>
            </div>

            <button
              className="p-3 sm:p-4 min-h-[44px] border border-blue-400 bg-blue-200 hover:bg-blue-400 cursor-pointer rounded-md flex-grow text-sm sm:text-base font-semibold"
              onClick={handleOIDCLogin}
              disabled={isLoggingIn}
            >
              {isLoggingIn ? "Redirecting..." : "Login with Authelia"}
            </button>
          </div>
        </div>
      </div>
    </div>
  );
};
@@ -0,0 +1,43 @@
type MessageInputProps = {
  handleQueryChange: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
  handleKeyDown: (event: React.KeyboardEvent<HTMLTextAreaElement>) => void;
  handleQuestionSubmit: () => void;
  setSimbaMode: (enabled: boolean) => void;
  query: string;
};

export const MessageInput = ({
  query,
  handleKeyDown,
  handleQueryChange,
  handleQuestionSubmit,
  setSimbaMode,
}: MessageInputProps) => {
  return (
    <div className="flex flex-col gap-4 sticky bottom-0 bg-[#3D763A] p-6 rounded-xl">
      <div className="flex flex-row justify-between grow">
        <textarea
          className="p-3 sm:p-4 border border-blue-200 rounded-md grow bg-[#F9F5EB] min-h-[44px] resize-y"
          onChange={handleQueryChange}
          onKeyDown={handleKeyDown}
          value={query}
          rows={2}
          placeholder="Type your message... (Press Enter to send, Shift+Enter for new line)"
        />
      </div>
      <div className="flex flex-row justify-between gap-2 grow">
        <button
          className="p-3 sm:p-4 min-h-[44px] border border-blue-400 bg-[#EDA541] hover:bg-blue-400 cursor-pointer rounded-md flex-grow text-sm sm:text-base"
          onClick={() => handleQuestionSubmit()}
          type="submit"
        >
          Submit
        </button>
      </div>
      <div className="flex flex-row justify-center gap-2 grow items-center">
        <input
          type="checkbox"
          onChange={(event) => setSimbaMode(event.target.checked)}
          className="w-5 h-5 cursor-pointer"
        />
        <p className="text-sm sm:text-base">simba mode?</p>
      </div>
    </div>
  );
};
@@ -0,0 +1,11 @@
type QuestionBubbleProps = {
  text: string;
};

export const QuestionBubble = ({ text }: QuestionBubbleProps) => {
  return (
    <div className="w-2/3 rounded-md bg-stone-200 p-3 sm:p-4 break-words overflow-wrap-anywhere text-sm sm:text-base ml-auto">
      🤦: {text}
    </div>
  );
};
56  services/raggr/raggr-frontend/src/contexts/AuthContext.tsx  Normal file
@@ -0,0 +1,56 @@
import { createContext, useContext, useState, ReactNode } from "react";
import { userService } from "../api/userService";

interface AuthContextType {
  token: string | null;
  login: (username: string, password: string) => Promise<any>;
  logout: () => void;
  isAuthenticated: () => boolean;
}

const AuthContext = createContext<AuthContextType | undefined>(undefined);

interface AuthProviderProps {
  children: ReactNode;
}

export const AuthProvider = ({ children }: AuthProviderProps) => {
  const [token, setToken] = useState(localStorage.getItem("access_token"));

  const login = async (username: string, password: string) => {
    try {
      const data = await userService.login(username, password);
      setToken(data.access_token);
      localStorage.setItem("access_token", data.access_token);
      localStorage.setItem("refresh_token", data.refresh_token);
      return data;
    } catch (error) {
      console.error("Login failed:", error);
      throw error;
    }
  };

  const logout = () => {
    setToken(null);
    localStorage.removeItem("access_token");
    localStorage.removeItem("refresh_token");
  };

  const isAuthenticated = () => {
    return token !== null && token !== undefined && token !== "";
  };

  return (
    <AuthContext.Provider value={{ token, login, logout, isAuthenticated }}>
      {children}
    </AuthContext.Provider>
  );
};

export const useAuth = () => {
  const context = useContext(AuthContext);
  if (context === undefined) {
    throw new Error("useAuth must be used within an AuthProvider");
  }
  return context;
};
|
||||||
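The `isAuthenticated` check in AuthContext.tsx treats `null`, `undefined`, and the empty string as logged-out. A minimal standalone sketch of that check (the `hasValidToken` name is hypothetical, not part of this diff):

```typescript
// Hypothetical standalone version of the isAuthenticated token check:
// only a non-empty string counts as a usable session token.
const hasValidToken = (token: string | null | undefined): boolean =>
  token !== null && token !== undefined && token !== "";

console.log(hasValidToken("some-jwt")); // true: a stored access token passes
console.log(hasValidToken(""));         // false: empty string means logged out
console.log(hasValidToken(null));       // false: nothing in localStorage
```

Note the provider initializes `token` straight from `localStorage.getItem("access_token")`, which returns `null` when no token is stored, so this check covers the fresh-browser case as well.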
Two binary image files changed (sizes unchanged: 3.4 MiB and 2.1 MiB).
2877 services/raggr/raggr-frontend/yarn.lock Normal file
File diff suppressed because it is too large.
29 services/raggr/startup-dev.sh Executable file
@@ -0,0 +1,29 @@
#!/bin/bash
set -e

echo "Initializing directories..."
mkdir -p /app/chromadb

echo "Waiting for frontend to build..."
while [ ! -f /app/raggr-frontend/dist/index.html ]; do
  sleep 1
done
echo "Frontend built successfully!"

echo "Setting up database..."
# Give PostgreSQL a moment to be ready (healthcheck in docker-compose handles this)
sleep 3

if ls migrations/models/0_*.py 1> /dev/null 2>&1; then
  echo "Running database migrations..."
  aerich upgrade
else
  echo "No migrations found, initializing database..."
  aerich init-db
fi

echo "Starting reindex process..."
python main.py "" --reindex || echo "Reindex failed, continuing anyway..."

echo "Starting Flask application in debug mode..."
python app.py
||||||
@@ -1,5 +1,8 @@
|
|||||||
#!/bin/bash
|
#!/bin/bash
|
||||||
|
|
||||||
|
echo "Running database migrations..."
|
||||||
|
aerich upgrade
|
||||||
|
|
||||||
echo "Starting reindex process..."
|
echo "Starting reindex process..."
|
||||||
python main.py "" --reindex
|
python main.py "" --reindex
|
||||||
|
|
||||||
934 uv.lock → services/raggr/uv.lock generated
File diff suppressed because it is too large.