32 Commits

Author SHA1 Message Date
Ryan Chen
913875188a oidc 2025-12-25 07:36:26 -08:00
Ryan Chen
f5e2d68cd2 Making UI changes 2025-12-24 17:12:56 -08:00
Ryan Chen
70799ffb7d refactor 2025-11-10 15:51:13 -05:00
Ryan Chen
7f1d4fbdda asdf 2025-10-29 22:17:45 -04:00
Ryan Chen
5ebdd60ea0 Making better 2025-10-29 21:28:23 -04:00
ryan
289045e7d0 Merge pull request 'mobile-responsive-layout' (#9) from mobile-responsive-layout into main
Reviewed-on: #9
2025-10-29 21:15:14 -04:00
ryan
ceea83cb54 Merge branch 'main' into mobile-responsive-layout 2025-10-29 21:15:10 -04:00
Ryan Chen
1b60aab97c sdf 2025-10-29 21:14:52 -04:00
ryan
210bfc1476 Merge pull request 'query classification' (#8) from async-reindexing into main
Reviewed-on: #8
2025-10-29 21:13:42 -04:00
Ryan Chen
454fb1b52c Add authentication validation on login screen load
- Add validateToken() method to userService to check if refresh token is valid
- Automatically redirect to chat if user already has valid session
- Show 'Checking authentication...' loading state during validation
- Prevents unnecessary login if user is already authenticated
- Improves UX by skipping login screen when not needed

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 12:24:10 -04:00
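A minimal sketch of the kind of startup check this commit describes — the hook shape, the `userService` module path, and the callback are assumptions for illustration, not the repository's actual code:

```typescript
import { useEffect, useState } from "react";
// Hypothetical wrapper around the backend auth endpoints; the real
// userService.validateToken() implementation is not shown in this log.
import { userService } from "./services/userService";

export const useExistingSession = (onAuthenticated: () => void) => {
  const [checking, setChecking] = useState(true); // drives the "Checking authentication..." state

  useEffect(() => {
    userService
      .validateToken()                      // is the stored refresh token still valid?
      .then((valid) => {
        if (valid) onAuthenticated();       // skip the login form and go straight to chat
      })
      .finally(() => setChecking(false));
  }, [onAuthenticated]);

  return checking;
};
```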
Ryan Chen
c3f2501585 Clear text input immediately upon message submission
- Clear input field right after user sends message (before API call)
- Add validation to prevent submitting empty/whitespace-only messages
- Improve UX by allowing user to type next message while waiting for response
- Works for both simba mode and normal mode

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 12:22:32 -04:00
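A fragment-level sketch of the submit-handler change described above, reusing the state names visible in the App.tsx code further down (`query`, `setQuery`, `setMessages`, `/api/query`); the exact wiring in the repository may differ:

```typescript
const handleQuestionSubmit = () => {
  const text = query.trim();
  if (!text) return;             // reject empty / whitespace-only messages

  setQuery("");                  // clear the input before the API call returns
  // Note: the textarea must be controlled (value={query}) for the clear to take effect.
  setMessages((prev) => [...prev, { text, speaker: "user" }]);

  axios.post("/api/query", { query: text }).then((result) => {
    setMessages((prev) => [
      ...prev,
      { text: result.data.response, speaker: "simba" },
    ]);
  });
};
```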
Ryan Chen
1da21fabee Add auto-scroll to bottom for new messages
- Automatically scroll to latest message when new messages arrive
- Uses smooth scrolling behavior for better UX
- Triggers on message array changes
- Improves chat experience by keeping conversation in view

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 12:12:05 -04:00
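The auto-scroll behaviour described here is typically a ref on a sentinel element plus an effect keyed on the message list; a hedged sketch (hook name and usage are illustrative, not the committed code):

```typescript
import { useEffect, useRef } from "react";

export const useAutoScroll = (messages: unknown[]) => {
  const bottomRef = useRef<HTMLDivElement | null>(null);

  useEffect(() => {
    // Smooth-scroll the sentinel into view whenever the message array changes.
    bottomRef.current?.scrollIntoView({ behavior: "smooth" });
  }, [messages]);

  return bottomRef; // render <div ref={bottomRef} /> after the last message bubble
};
```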
Ryan Chen
dd5690ee53 Add submit on Enter for chat textarea
- Press Enter to submit message
- Press Shift+Enter to insert new line
- Add helpful placeholder text explaining keyboard shortcuts
- Improve chat UX with standard messaging behavior

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 12:07:47 -04:00
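A sketch of the keyboard handling this commit describes (Enter submits, Shift+Enter inserts a newline); `handleQuestionSubmit` is the existing submit handler and the placeholder wording is taken from the commit message:

```typescript
const handleKeyDown = (event: React.KeyboardEvent<HTMLTextAreaElement>) => {
  if (event.key === "Enter" && !event.shiftKey) {
    event.preventDefault();      // keep the newline out of the textarea
    handleQuestionSubmit();
  }
};

// <textarea
//   onKeyDown={handleKeyDown}
//   placeholder="Press Enter to send, Shift+Enter for a new line"
// />
```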
Ryan Chen
5e7ac28b6f Update add_user.py to use configurable database path
- Use DATABASE_PATH and DATABASE_URL environment variables
- Consistent with app.py and aerich_config.py configuration
- Add environment variable documentation to help text
- Default remains database/raggr.db for backward compatibility

Usage:
  DATABASE_PATH=dev.db python add_user.py list

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 12:02:56 -04:00
Ryan Chen
29f8894e4a Add configurable database path via environment variables
- Add DATABASE_PATH environment variable support in app.py and aerich_config.py
- DATABASE_PATH: For simple relative/absolute paths (default: database/raggr.db)
- DATABASE_URL: For full connection strings (overrides DATABASE_PATH if set)
- Create .env.example with all configuration options documented
- Maintains backward compatibility with default database location

Usage:
  # Use default path
  python app.py

  # Use custom path for development
  DATABASE_PATH=dev.db python app.py

  # Use full connection string
  DATABASE_URL=sqlite://custom/path.db python app.py

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 12:01:16 -04:00
Ryan Chen
19d1df2f68 Improve mobile responsiveness with Tailwind breakpoints
- Replace fixed-width containers (min-w-xl max-w-xl) with responsive classes
- Mobile: full width with padding, Tablet: 90% max 768px, Desktop: max 1024px
- Make ChatScreen header stack vertically on mobile, horizontal on desktop
- Add touch-friendly button sizes (min 44x44px tap targets)
- Optimize textarea and form inputs for mobile keyboards
- Add text wrapping (break-words) to message bubbles to prevent overflow
- Apply responsive text sizing (text-sm sm:text-base) throughout
- Improve ConversationList with touch-friendly hit areas
- Add responsive padding/spacing across all components

All components now use standard Tailwind breakpoints:
- sm: 640px+ (tablet)
- md: 768px+ (larger tablet)
- lg: 1024px+ (desktop)
- xl: 1280px+ (large desktop)

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-10-27 11:57:54 -04:00
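Illustrative only — roughly the breakpoint-aware classes the commit describes (full width with padding on mobile, 90% capped at 768px on tablet, 1024px on desktop, 44px tap targets, responsive text sizing, word wrapping); the real components are not reproduced here:

```tsx
import type { ReactNode } from "react";

const ChatContainer = ({ children }: { children: ReactNode }) => (
  <div className="w-full px-4 sm:w-[90%] sm:max-w-[768px] lg:max-w-[1024px] mx-auto">
    <p className="break-words text-sm sm:text-base">{children}</p>
    <button className="min-h-[44px] min-w-[44px] text-sm sm:text-base" type="submit">
      Submit
    </button>
  </div>
);

export default ChatContainer;
```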
Ryan Chen
e577cb335b query classification 2025-10-26 17:29:00 -04:00
Ryan Chen
591788dfa4 reindex pls 2025-10-26 11:06:32 -04:00
Ryan Chen
561b5bddce reindex pls 2025-10-26 11:04:33 -04:00
Ryan Chen
ddd455a4c6 reindex pls 2025-10-26 11:02:51 -04:00
ryan
07424e77e0 Merge pull request 'favicon' (#7) from update-favicon-and-title into main
Reviewed-on: #7
2025-10-26 10:49:27 -04:00
Ryan Chen
a56f752917 favicon 2025-10-26 10:48:59 -04:00
Ryan Chen
e8264e80ce Changing DB thing 2025-10-26 09:36:33 -04:00
ryan
04350045d3 Merge pull request 'Adding support for conversations and multiple threads' (#6) from conversation-uplift into main
Reviewed-on: #6
2025-10-26 09:25:52 -04:00
Ryan Chen
f16e13fccc big uplift 2025-10-26 09:25:17 -04:00
ryan
245db92524 Merge pull request 'enabling login btw users' (#5) from quart-login into main
Reviewed-on: #5
2025-10-25 09:34:08 -04:00
Ryan Chen
29ac724d50 enabling login btw users 2025-10-25 09:30:54 -04:00
Ryan Chen
7161c09a4e do not fully delete lol 2025-10-24 08:47:59 -04:00
Ryan Chen
68d73b62e8 Instituting LLM fallback to OpenAI if gaming PC is not on 2025-10-24 08:44:08 -04:00
Ryan Chen
6b616137d3 oops 2025-10-23 22:45:14 -04:00
Ryan Chen
841b6ebd4f i commit 2025-10-23 22:39:26 -04:00
Ryan Chen
45a5e92aee Added conversation history (#4)
Reviewed-on: #4
Co-authored-by: Ryan Chen <ryan@torrtle.co>
Co-committed-by: Ryan Chen <ryan@torrtle.co>
2025-10-23 22:29:12 -04:00
73 changed files with 8430 additions and 1892 deletions

44
.env.example Normal file
View File

@@ -0,0 +1,44 @@
# Database Configuration
# PostgreSQL is recommended (required for OIDC features)
DATABASE_URL=postgres://raggr:changeme@postgres:5432/raggr
# PostgreSQL credentials (if using docker-compose postgres service)
POSTGRES_USER=raggr
POSTGRES_PASSWORD=changeme
POSTGRES_DB=raggr
# JWT Configuration
JWT_SECRET_KEY=your-secret-key-here
# Paperless Configuration
PAPERLESS_TOKEN=your-paperless-token
BASE_URL=192.168.1.5:8000
# Ollama Configuration
OLLAMA_URL=http://192.168.1.14:11434
OLLAMA_HOST=http://192.168.1.14:11434
# ChromaDB Configuration
CHROMADB_PATH=/path/to/chromadb
# OpenAI Configuration
OPENAI_API_KEY=your-openai-api-key
# Immich Configuration
IMMICH_URL=http://192.168.1.5:2283
IMMICH_API_KEY=your-immich-api-key
SEARCH_QUERY=simba cat
DOWNLOAD_DIR=./simba_photos
# OIDC Configuration (Authelia)
OIDC_ISSUER=https://auth.example.com
OIDC_CLIENT_ID=simbarag
OIDC_CLIENT_SECRET=your-client-secret-here
OIDC_REDIRECT_URI=http://localhost:8080/
OIDC_USE_DISCOVERY=true
# Optional: Manual OIDC endpoints (if discovery is disabled)
# OIDC_AUTHORIZATION_ENDPOINT=https://auth.example.com/api/oidc/authorization
# OIDC_TOKEN_ENDPOINT=https://auth.example.com/api/oidc/token
# OIDC_USERINFO_ENDPOINT=https://auth.example.com/api/oidc/userinfo
# OIDC_JWKS_URI=https://auth.example.com/api/oidc/jwks

7
.gitignore vendored
View File

@@ -9,5 +9,10 @@ wheels/
# Virtual environments
.venv
# Environment files
.env
# Database files
chromadb/
database/
*.db

110
DEV-README.md Normal file
View File

@@ -0,0 +1,110 @@
# Development Environment Setup
This guide explains how to run the application in development mode with hot reload enabled.
## Quick Start
### Development Mode (Hot Reload)
```bash
# Start all services in development mode
docker-compose -f docker-compose.dev.yml up --build
# Or run in detached mode
docker-compose -f docker-compose.dev.yml up -d --build
```
### Production Mode
```bash
# Start production services
docker-compose up --build
```
## What's Different in Dev Mode?
### Backend (Quart/Flask)
- **Hot Reload**: Python code changes are automatically detected and the server restarts
- **Source Mounted**: Your local `services/raggr` directory is mounted as a volume
- **Debug Mode**: Flask runs with `debug=True` for better error messages
- **Environment**: `FLASK_ENV=development` and `PYTHONUNBUFFERED=1` for immediate log output
### Frontend (React + rsbuild)
- **Auto Rebuild**: Frontend automatically rebuilds when files change
- **Watch Mode**: rsbuild runs in watch mode, rebuilding to `dist/` on save
- **Source Mounted**: Your local `services/raggr/raggr-frontend` directory is mounted as a volume
- **Served by Backend**: Built files are served by the backend, no separate dev server
## Ports
- **Application**: 8080 (accessible at `http://localhost:8080` or `http://YOUR_IP:8080`)
The backend serves both the API and the auto-rebuilt frontend, making it accessible from other machines on your network.
## Useful Commands
```bash
# View logs
docker-compose -f docker-compose.dev.yml logs -f
# View logs for specific service
docker-compose -f docker-compose.dev.yml logs -f raggr-backend
docker-compose -f docker-compose.dev.yml logs -f raggr-frontend
# Rebuild after dependency changes
docker-compose -f docker-compose.dev.yml up --build
# Stop all services
docker-compose -f docker-compose.dev.yml down
# Stop and remove volumes (fresh start)
docker-compose -f docker-compose.dev.yml down -v
```
## Making Changes
### Backend Changes
1. Edit any Python file in `services/raggr/`
2. Save the file
3. The Quart server will automatically restart
4. Check logs to confirm reload
### Frontend Changes
1. Edit any file in `services/raggr/raggr-frontend/src/`
2. Save the file
3. rsbuild rebuilds `dist/` in watch mode and the backend serves the updated bundle
4. Refresh the browser to pick up the changes; no manual rebuild is needed
### Dependency Changes
**Backend** (pyproject.toml):
```bash
# Rebuild the backend service
docker-compose -f docker-compose.dev.yml up --build raggr-backend
```
**Frontend** (package.json):
```bash
# Rebuild the frontend service
docker-compose -f docker-compose.dev.yml up --build raggr-frontend
```
## Troubleshooting
### Port Already in Use
If you see port binding errors, make sure no other services are running on ports 8080 or 3000.
### Changes Not Reflected
1. Check if the file is properly mounted (check docker-compose.dev.yml volumes)
2. Verify the file isn't in an excluded directory (node_modules, __pycache__)
3. Check container logs for errors
### Frontend Not Connecting to Backend
Make sure your frontend API calls point to the correct backend URL. If accessing from the same machine, use `http://localhost:8080`. If accessing from another device on the network, use `http://YOUR_IP:8080`.
## Notes
- Both services bind to `0.0.0.0` and expose ports, making them accessible on your network
- Node modules and Python cache are excluded from volume mounts to use container versions
- Database and ChromaDB data persist in Docker volumes across restarts
- Access the app from any device on your network using your host machine's IP address

102
app.py
View File

@@ -1,102 +0,0 @@
import os
from quart import Quart, request, jsonify, render_template, send_from_directory
from tortoise.contrib.quart import register_tortoise
from quart_jwt_extended import JWTManager
from main import consult_simba_oracle
from blueprints.conversation.logic import (
get_the_only_conversation,
add_message_to_conversation,
)
app = Quart(
__name__,
static_folder="raggr-frontend/dist/static",
template_folder="raggr-frontend/dist",
)
app.config["JWT_SECRET_KEY"] = os.getenv("JWT_SECRET_KEY", "SECRET_KEY")
jwt = JWTManager(app)
# Initialize Tortoise ORM
register_tortoise(
app,
db_url=os.getenv("DATABASE_URL", "sqlite://raggr.db"),
modules={"models": ["blueprints.conversation.models"]},
generate_schemas=True,
)
# Serve React static files
@app.route("/static/<path:filename>")
async def static_files(filename):
return await send_from_directory(app.static_folder, filename)
# Serve the React app for all routes (catch-all)
@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
async def serve_react_app(path):
if path and os.path.exists(os.path.join(app.template_folder, path)):
return await send_from_directory(app.template_folder, path)
return await render_template("index.html")
@app.route("/api/query", methods=["POST"])
async def query():
data = await request.get_json()
query = data.get("query")
# add message to database
conversation = await get_the_only_conversation()
print(conversation)
await add_message_to_conversation(
conversation=conversation, message=query, speaker="user"
)
response = consult_simba_oracle(query)
await add_message_to_conversation(
conversation=conversation, message=response, speaker="simba"
)
return jsonify({"response": response})
@app.route("/api/messages", methods=["GET"])
async def get_messages():
conversation = await get_the_only_conversation()
# Prefetch related messages
await conversation.fetch_related("messages")
# Manually serialize the conversation with messages
messages = []
for msg in conversation.messages:
messages.append(
{
"id": str(msg.id),
"text": msg.text,
"speaker": msg.speaker.value,
"created_at": msg.created_at.isoformat(),
}
)
return jsonify(
{
"id": str(conversation.id),
"name": conversation.name,
"messages": messages,
"created_at": conversation.created_at.isoformat(),
"updated_at": conversation.updated_at.isoformat(),
}
)
# @app.route("/api/ingest", methods=["POST"])
# def webhook():
# data = request.get_json()
# print(data)
# return jsonify({"status": "received"})
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8080, debug=True)

View File

@@ -1,17 +0,0 @@
from quart import Blueprint, jsonify
from .models import (
Conversation,
PydConversation,
)
conversation_blueprint = Blueprint(
"conversation_api", __name__, url_prefix="/api/conversation"
)
@conversation_blueprint.route("/<conversation_id>")
async def get_conversation(conversation_id: str):
conversation = await Conversation.get(id=conversation_id)
serialized_conversation = await PydConversation.from_tortoise_orm(conversation)
return jsonify(serialized_conversation.model_dump_json())

72
docker-compose.dev.yml Normal file
View File

@@ -0,0 +1,72 @@
version: "3.8"
services:
postgres:
image: postgres:16-alpine
environment:
- POSTGRES_USER=raggr
- POSTGRES_PASSWORD=raggr_dev_password
- POSTGRES_DB=raggr
ports:
- "5432:5432"
volumes:
- postgres_data:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U raggr"]
interval: 5s
timeout: 5s
retries: 5
raggr-backend:
build:
context: ./services/raggr
dockerfile: Dockerfile.dev
image: torrtle/simbarag:dev
ports:
- "8080:8080"
env_file:
- .env
environment:
- PAPERLESS_TOKEN=${PAPERLESS_TOKEN}
- BASE_URL=${BASE_URL}
- OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}
- CHROMADB_PATH=/app/chromadb
- OPENAI_API_KEY=${OPENAI_API_KEY}
- JWT_SECRET_KEY=${JWT_SECRET_KEY}
- OIDC_ISSUER=${OIDC_ISSUER}
- OIDC_CLIENT_ID=${OIDC_CLIENT_ID}
- OIDC_CLIENT_SECRET=${OIDC_CLIENT_SECRET}
- OIDC_REDIRECT_URI=${OIDC_REDIRECT_URI}
- OIDC_USE_DISCOVERY=${OIDC_USE_DISCOVERY:-true}
- DATABASE_URL=postgres://raggr:raggr_dev_password@postgres:5432/raggr
- FLASK_ENV=development
- PYTHONUNBUFFERED=1
depends_on:
postgres:
condition: service_healthy
volumes:
# Mount source code for hot reload
- ./services/raggr:/app
# Exclude node_modules and Python cache
- /app/raggr-frontend/node_modules
- /app/__pycache__
# Persist data
- chromadb_data:/app/chromadb
command: sh -c "chmod +x /app/startup-dev.sh && /app/startup-dev.sh"
raggr-frontend:
build:
context: ./services/raggr/raggr-frontend
dockerfile: Dockerfile.dev
environment:
- NODE_ENV=development
volumes:
# Mount source code for hot reload
- ./services/raggr/raggr-frontend:/app
# Exclude node_modules to use container's version
- /app/node_modules
command: sh -c "yarn build && yarn watch:build"
volumes:
chromadb_data:
postgres_data:

View File

@@ -1,7 +1,25 @@
version: "3.8"
services:
postgres:
image: postgres:16-alpine
environment:
- POSTGRES_USER=${POSTGRES_USER:-raggr}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-changeme}
- POSTGRES_DB=${POSTGRES_DB:-raggr}
volumes:
- postgres_data:/var/lib/postgresql/data
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-raggr}"]
interval: 10s
timeout: 5s
retries: 5
restart: unless-stopped
raggr:
build:
context: ./services/raggr
dockerfile: Dockerfile
image: torrtle/simbarag:latest
network_mode: host
environment:
@@ -10,8 +28,20 @@ services:
- OLLAMA_URL=${OLLAMA_URL:-http://localhost:11434}
- CHROMADB_PATH=/app/chromadb
- OPENAI_API_KEY=${OPENAI_API_KEY}
- JWT_SECRET_KEY=${JWT_SECRET_KEY}
- OIDC_ISSUER=${OIDC_ISSUER}
- OIDC_CLIENT_ID=${OIDC_CLIENT_ID}
- OIDC_CLIENT_SECRET=${OIDC_CLIENT_SECRET}
- OIDC_REDIRECT_URI=${OIDC_REDIRECT_URI}
- OIDC_USE_DISCOVERY=${OIDC_USE_DISCOVERY:-true}
- DATABASE_URL=${DATABASE_URL:-postgres://raggr:changeme@postgres:5432/raggr}
depends_on:
postgres:
condition: service_healthy
volumes:
- chromadb_data:/app/chromadb
restart: unless-stopped
volumes:
chromadb_data:
postgres_data:

64
llm.py
View File

@@ -1,64 +0,0 @@
import os
from ollama import Client
from openai import OpenAI
import logging
logging.basicConfig(level=logging.INFO)
class LLMClient:
def __init__(self):
try:
self.ollama_client = Client(
host=os.getenv("OLLAMA_URL", "http://localhost:11434"), timeout=10.0
)
self.ollama_client.chat(
model="gemma3:4b", messages=[{"role": "system", "content": "test"}]
)
self.PROVIDER = "ollama"
logging.info("Using Ollama as LLM backend")
except Exception as e:
print(e)
self.openai_client = OpenAI()
self.PROVIDER = "openai"
logging.info("Using OpenAI as LLM backend")
def chat(
self,
prompt: str,
system_prompt: str,
):
if self.PROVIDER == "ollama":
response = self.ollama_client.chat(
model="gemma3:4b",
messages=[
{
"role": "system",
"content": system_prompt,
},
{"role": "user", "content": prompt},
],
)
print(response)
output = response.message.content
elif self.PROVIDER == "openai":
response = self.openai_client.responses.create(
model="gpt-4o-mini",
input=[
{
"role": "system",
"content": system_prompt,
},
{"role": "user", "content": prompt},
],
)
output = response.output_text
return output
if __name__ == "__main__":
client = Client()
client.chat(model="gemma3:4b", messages=[{"role": "system", "promp": "hack"}])

View File

@@ -1,27 +0,0 @@
[project]
name = "raggr"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
"chromadb>=1.1.0",
"python-dotenv>=1.0.0",
"flask>=3.1.2",
"httpx>=0.28.1",
"ollama>=0.6.0",
"openai>=2.0.1",
"pydantic>=2.11.9",
"pillow>=10.0.0",
"pymupdf>=1.24.0",
"black>=25.9.0",
"pillow-heif>=1.1.1",
"flask-jwt-extended>=4.7.1",
"bcrypt>=5.0.0",
"pony>=0.7.19",
"flask-login>=0.6.3",
"quart>=0.20.0",
"tortoise-orm>=0.25.1",
"quart-jwt-extended>=0.1.0",
"pre-commit>=4.3.0",
]

Binary file not shown.

View File

@@ -1,204 +0,0 @@
import { useEffect, useState } from "react";
import axios from "axios";
import ReactMarkdown from "react-markdown";
import "./App.css";
type QuestionAnswer = {
question: string;
answer: string;
};
type QuestionBubbleProps = {
text: string;
};
type AnswerBubbleProps = {
text: string;
loading: string;
};
type QuestionAnswerPairProps = {
question: string;
answer: string;
loading: boolean;
};
type Conversation = {
title: string;
id: string;
};
type Message = {
text: string;
speaker: "simba" | "user";
};
type ConversationMenuProps = {
conversations: Conversation[];
};
const ConversationMenu = ({ conversations }: ConversationMenuProps) => {
return (
<div className="absolute bg-white w-md rounded-md shadow-xl m-4 p-4">
<p className="py-2 px-4 rounded-md w-full text-xl font-bold">askSimba!</p>
{conversations.map((conversation) => (
<p className="py-2 px-4 rounded-md hover:bg-stone-200 w-full text-xl font-bold cursor-pointer">
{conversation.title}
</p>
))}
</div>
);
};
const QuestionBubble = ({ text }: QuestionBubbleProps) => {
return <div className="rounded-md bg-stone-200 p-3">🤦: {text}</div>;
};
const AnswerBubble = ({ text, loading }: AnswerBubbleProps) => {
return (
<div className="rounded-md bg-orange-100 p-3">
{loading ? (
<div className="flex flex-col w-full animate-pulse gap-2">
<div className="flex flex-row gap-2 w-full">
<div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
<div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
</div>
<div className="flex flex-row gap-2 w-full">
<div className="bg-gray-400 w-1/3 p-3 rounded-lg" />
<div className="bg-gray-400 w-2/3 p-3 rounded-lg" />
</div>
</div>
) : (
<div className="flex flex-col">
<ReactMarkdown>{"🐈: " + text}</ReactMarkdown>
</div>
)}
</div>
);
};
const QuestionAnswerPair = ({
question,
answer,
loading,
}: QuestionAnswerPairProps) => {
return (
<div className="flex flex-col gap-4">
<QuestionBubble text={question} />
<AnswerBubble text={answer} loading={loading} />
</div>
);
};
const App = () => {
const [query, setQuery] = useState<string>("");
const [answer, setAnswer] = useState<string>("");
const [simbaMode, setSimbaMode] = useState<boolean>(false);
const [questionsAnswers, setQuestionsAnswers] = useState<QuestionAnswer[]>(
[],
);
const [messages, setMessages] = useState<Message[]>([]);
const [conversations, setConversations] = useState<Conversation[]>([
{ title: "simba meow meow", id: "uuid" },
]);
const simbaAnswers = ["meow.", "hiss...", "purrrrrr", "yowOWROWWowowr"];
useEffect(() => {
axios.get("/api/messages").then((result) => {
setMessages(
result.data.messages.map((message) => {
return {
text: message.text,
speaker: message.speaker,
};
}),
);
});
}, []);
const handleQuestionSubmit = () => {
let currMessages = messages.concat([{ text: query, speaker: "user" }]);
setMessages(currMessages);
if (simbaMode) {
console.log("simba mode activated");
const randomIndex = Math.floor(Math.random() * simbaAnswers.length);
const randomElement = simbaAnswers[randomIndex];
setAnswer(randomElement);
setQuestionsAnswers(
questionsAnswers.concat([
{
question: query,
answer: randomElement,
},
]),
);
return;
}
const payload = { query: query };
axios.post("/api/query", payload).then((result) => {
setQuestionsAnswers(
questionsAnswers.concat([
{ question: query, answer: result.data.response },
]),
);
setMessages(
currMessages.concat([{ text: result.data.response, speaker: "simba" }]),
);
});
};
const handleQueryChange = (event) => {
setQuery(event.target.value);
};
return (
<div className="h-screen bg-opacity-20">
<div className="bg-white/85 h-screen">
<div className="flex flex-row justify-center py-4">
<div className="flex flex-col gap-4 min-w-xl max-w-xl">
<header className="flex flex-row justify-center gap-2 grow sticky top-0 z-10 bg-white">
<h1 className="text-3xl">ask simba!</h1>
</header>
{/*{questionsAnswers.map((qa) => (
<QuestionAnswerPair question={qa.question} answer={qa.answer} />
))}*/}
{messages.map((msg) => {
if (msg.speaker == "simba") {
return <AnswerBubble text={msg.text} loading="" />;
}
return <QuestionBubble text={msg.text} />;
})}
<footer className="flex flex-col gap-2 sticky bottom-0">
<div className="flex flex-row justify-between gap-2 grow">
<textarea
type="text"
className="p-4 border border-blue-200 rounded-md grow bg-white"
onChange={handleQueryChange}
/>
</div>
<div className="flex flex-row justify-between gap-2 grow">
<button
className="p-4 border border-blue-400 bg-blue-200 hover:bg-blue-400 cursor-pointer rounded-md flex-grow"
onClick={() => handleQuestionSubmit()}
type="submit"
>
Submit
</button>
</div>
<div className="flex flex-row justify-center gap-2 grow">
<input
type="checkbox"
onChange={(event) => setSimbaMode(event.target.checked)}
/>
<p>simba mode?</p>
</div>
</footer>
</div>
</div>
</div>
</div>
);
};
export default App;

File diff suppressed because it is too large.

View File

@@ -23,6 +23,8 @@ RUN uv pip install --system -e .
# Copy application code
COPY *.py ./
COPY blueprints ./blueprints
COPY migrations ./migrations
COPY startup.sh ./
RUN chmod +x startup.sh
@@ -32,8 +34,8 @@ WORKDIR /app/raggr-frontend
RUN yarn install && yarn build
WORKDIR /app
# Create ChromaDB directory
RUN mkdir -p /app/chromadb
# Create ChromaDB and database directories
RUN mkdir -p /app/chromadb /app/database
# Expose port
EXPOSE 8080
@@ -43,4 +45,4 @@ ENV PYTHONPATH=/app
ENV CHROMADB_PATH=/app/chromadb
# Run the startup script
CMD ["./startup.sh"]
CMD ["./startup.sh"]

View File

@@ -0,0 +1,33 @@
FROM python:3.13-slim
WORKDIR /app
# Install system dependencies and uv
RUN apt-get update && apt-get install -y \
build-essential \
curl \
&& rm -rf /var/lib/apt/lists/* \
&& curl -LsSf https://astral.sh/uv/install.sh | sh
# Add uv to PATH
ENV PATH="/root/.local/bin:$PATH"
# Copy dependency files
COPY pyproject.toml ./
# Install Python dependencies using uv
RUN uv pip install --system -e .
# Create ChromaDB and database directories
RUN mkdir -p /app/chromadb /app/database
# Expose port
EXPOSE 8080
# Set environment variables
ENV PYTHONPATH=/app
ENV CHROMADB_PATH=/app/chromadb
ENV PYTHONUNBUFFERED=1
# The actual source code will be mounted as a volume
# No CMD here - will be specified in docker-compose

View File

@@ -0,0 +1,54 @@
# Database Migrations with Aerich
## Initial Setup (Run Once)
1. Install dependencies:
```bash
uv pip install -e .
```
2. Initialize Aerich:
```bash
aerich init-db
```
This will:
- Create a `migrations/` directory
- Generate the initial migration based on your models
- Create all tables in the database
## When You Add/Change Models
1. Generate a new migration:
```bash
aerich migrate --name "describe_your_changes"
```
Example:
```bash
aerich migrate --name "add_user_profile_model"
```
2. Apply the migration:
```bash
aerich upgrade
```
## Common Commands
- `aerich init-db` - Initialize database (first time only)
- `aerich migrate --name "description"` - Generate new migration
- `aerich upgrade` - Apply pending migrations
- `aerich downgrade` - Rollback last migration
- `aerich history` - Show migration history
- `aerich heads` - Show current migration heads
## Docker Setup
In Docker, migrations run automatically on container startup via the startup script.
## Notes
- Migration files are stored in `migrations/models/`
- Always commit migration files to version control
- Don't modify migration files manually after they're created

146
services/raggr/add_user.py Normal file
View File

@@ -0,0 +1,146 @@
# GENERATED BY CLAUDE
import os
import sys
import uuid
import asyncio
from tortoise import Tortoise
from blueprints.users.models import User
from dotenv import load_dotenv
load_dotenv()
# Database configuration with environment variable support
DATABASE_PATH = os.getenv("DATABASE_PATH", "database/raggr.db")
DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite://{DATABASE_PATH}")
print(DATABASE_URL)
async def add_user(username: str, email: str, password: str):
"""Add a new user to the database"""
await Tortoise.init(
db_url=DATABASE_URL,
modules={
"models": [
"blueprints.users.models",
"blueprints.conversation.models",
]
},
)
try:
# Check if user already exists
existing_user = await User.filter(email=email).first()
if existing_user:
print(f"Error: User with email '{email}' already exists!")
return False
existing_username = await User.filter(username=username).first()
if existing_username:
print(f"Error: Username '{username}' is already taken!")
return False
# Create new user
user = User(
id=uuid.uuid4(),
username=username,
email=email,
)
user.set_password(password)
await user.save()
print("✓ User created successfully!")
print(f" Username: {username}")
print(f" Email: {email}")
print(f" ID: {user.id}")
return True
except Exception as e:
print(f"Error creating user: {e}")
return False
finally:
await Tortoise.close_connections()
async def list_users():
"""List all users in the database"""
await Tortoise.init(
db_url=DATABASE_URL,
modules={
"models": [
"blueprints.users.models",
"blueprints.conversation.models",
]
},
)
try:
users = await User.all()
if not users:
print("No users found in database.")
return
print(f"\nFound {len(users)} user(s):")
print("-" * 60)
for user in users:
print(f"Username: {user.username}")
print(f"Email: {user.email}")
print(f"ID: {user.id}")
print(f"Created: {user.created_at}")
print("-" * 60)
except Exception as e:
print(f"Error listing users: {e}")
finally:
await Tortoise.close_connections()
def print_usage():
"""Print usage instructions"""
print("Usage:")
print(" python add_user.py add <username> <email> <password>")
print(" python add_user.py list")
print("\nExamples:")
print(" python add_user.py add ryan ryan@example.com mypassword123")
print(" python add_user.py list")
print("\nEnvironment Variables:")
print(" DATABASE_PATH - Path to database file (default: database/raggr.db)")
print(" DATABASE_URL - Full database URL (overrides DATABASE_PATH)")
print("\n Example with custom database:")
print(" DATABASE_PATH=dev.db python add_user.py list")
async def main():
if len(sys.argv) < 2:
print_usage()
sys.exit(1)
command = sys.argv[1].lower()
if command == "add":
if len(sys.argv) != 5:
print("Error: Missing arguments for 'add' command")
print_usage()
sys.exit(1)
username = sys.argv[2]
email = sys.argv[3]
password = sys.argv[4]
success = await add_user(username, email, password)
sys.exit(0 if success else 1)
elif command == "list":
await list_users()
sys.exit(0)
else:
print(f"Error: Unknown command '{command}'")
print_usage()
sys.exit(1)
if __name__ == "__main__":
asyncio.run(main())

View File

@@ -0,0 +1,20 @@
import os
# Database configuration with environment variable support
# Use DATABASE_PATH for relative paths or DATABASE_URL for full connection strings
DATABASE_PATH = os.getenv("DATABASE_PATH", "database/raggr.db")
DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite://{DATABASE_PATH}")
TORTOISE_ORM = {
"connections": {"default": DATABASE_URL},
"apps": {
"models": {
"models": [
"blueprints.conversation.models",
"blueprints.users.models",
"aerich.models",
],
"default_connection": "default",
},
},
}

138
services/raggr/app.py Normal file
View File

@@ -0,0 +1,138 @@
import os
from quart import Quart, jsonify, render_template, request, send_from_directory
from quart_jwt_extended import JWTManager, get_jwt_identity, jwt_refresh_token_required
from tortoise.contrib.quart import register_tortoise
import blueprints.conversation
import blueprints.conversation.logic
import blueprints.users
import blueprints.users.models
from main import consult_simba_oracle
app = Quart(
__name__,
static_folder="raggr-frontend/dist/static",
template_folder="raggr-frontend/dist",
)
app.config["JWT_SECRET_KEY"] = os.getenv("JWT_SECRET_KEY", "SECRET_KEY")
jwt = JWTManager(app)
# Register blueprints
app.register_blueprint(blueprints.users.user_blueprint)
app.register_blueprint(blueprints.conversation.conversation_blueprint)
# Database configuration with environment variable support
DATABASE_URL = os.getenv(
"DATABASE_URL",
"postgres://raggr:raggr_dev_password@localhost:5432/raggr"
)
TORTOISE_CONFIG = {
"connections": {"default": DATABASE_URL},
"apps": {
"models": {
"models": [
"blueprints.conversation.models",
"blueprints.users.models",
"aerich.models",
]
},
},
}
# Initialize Tortoise ORM
register_tortoise(
app,
config=TORTOISE_CONFIG,
generate_schemas=False, # Disabled - using Aerich for migrations
)
# Serve React static files
@app.route("/static/<path:filename>")
async def static_files(filename):
return await send_from_directory(app.static_folder, filename)
# Serve the React app for all routes (catch-all)
@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
async def serve_react_app(path):
if path and os.path.exists(os.path.join(app.template_folder, path)):
return await send_from_directory(app.template_folder, path)
return await render_template("index.html")
@app.route("/api/query", methods=["POST"])
@jwt_refresh_token_required
async def query():
current_user_uuid = get_jwt_identity()
user = await blueprints.users.models.User.get(id=current_user_uuid)
data = await request.get_json()
query = data.get("query")
conversation_id = data.get("conversation_id")
conversation = await blueprints.conversation.logic.get_conversation_by_id(
conversation_id
)
await conversation.fetch_related("messages")
await blueprints.conversation.logic.add_message_to_conversation(
conversation=conversation,
message=query,
speaker="user",
user=user,
)
transcript = await blueprints.conversation.logic.get_conversation_transcript(
user=user, conversation=conversation
)
response = consult_simba_oracle(input=query, transcript=transcript)
await blueprints.conversation.logic.add_message_to_conversation(
conversation=conversation,
message=response,
speaker="simba",
user=user,
)
return jsonify({"response": response})
@app.route("/api/messages", methods=["GET"])
@jwt_refresh_token_required
async def get_messages():
current_user_uuid = get_jwt_identity()
user = await blueprints.users.models.User.get(id=current_user_uuid)
conversation = await blueprints.conversation.logic.get_conversation_for_user(
user=user
)
# Prefetch related messages
await conversation.fetch_related("messages")
# Manually serialize the conversation with messages
messages = []
for msg in conversation.messages:
messages.append(
{
"id": str(msg.id),
"text": msg.text,
"speaker": msg.speaker.value,
"created_at": msg.created_at.isoformat(),
}
)
return jsonify(
{
"id": str(conversation.id),
"name": conversation.name,
"messages": messages,
"created_at": conversation.created_at.isoformat(),
"updated_at": conversation.updated_at.isoformat(),
}
)
if __name__ == "__main__":
app.run(host="0.0.0.0", port=8080, debug=True)

View File

@@ -0,0 +1 @@
# Blueprints package

View File

@@ -0,0 +1,72 @@
import datetime
from quart_jwt_extended import (
jwt_refresh_token_required,
get_jwt_identity,
)
from quart import Blueprint, jsonify
from .models import (
Conversation,
PydConversation,
PydListConversation,
)
import blueprints.users.models
conversation_blueprint = Blueprint(
"conversation_api", __name__, url_prefix="/api/conversation"
)
@conversation_blueprint.route("/<conversation_id>")
async def get_conversation(conversation_id: str):
conversation = await Conversation.get(id=conversation_id)
await conversation.fetch_related("messages")
# Manually serialize the conversation with messages
messages = []
for msg in conversation.messages:
messages.append(
{
"id": str(msg.id),
"text": msg.text,
"speaker": msg.speaker.value,
"created_at": msg.created_at.isoformat(),
}
)
return jsonify(
{
"id": str(conversation.id),
"name": conversation.name,
"messages": messages,
"created_at": conversation.created_at.isoformat(),
"updated_at": conversation.updated_at.isoformat(),
}
)
@conversation_blueprint.post("/")
@jwt_refresh_token_required
async def create_conversation():
user_uuid = get_jwt_identity()
user = await blueprints.users.models.User.get(id=user_uuid)
conversation = await Conversation.create(
name=f"{user.username} {datetime.datetime.now().timestamp}",
user=user,
)
serialized_conversation = await PydConversation.from_tortoise_orm(conversation)
return jsonify(serialized_conversation.model_dump())
@conversation_blueprint.get("/")
@jwt_refresh_token_required
async def get_all_conversations():
user_uuid = get_jwt_identity()
user = await blueprints.users.models.User.get(id=user_uuid)
conversations = Conversation.filter(user=user)
serialized_conversations = await PydListConversation.from_queryset(conversations)
return jsonify(serialized_conversations.model_dump())

View File

@@ -1,5 +1,9 @@
import tortoise.exceptions
from .models import Conversation, ConversationMessage
import blueprints.users.models
async def create_conversation(name: str = "") -> Conversation:
conversation = await Conversation.create(name=name)
@@ -10,6 +14,7 @@ async def add_message_to_conversation(
conversation: Conversation,
message: str,
speaker: str,
user: blueprints.users.models.User,
) -> ConversationMessage:
print(conversation, message, speaker)
message = await ConversationMessage.create(
@@ -30,3 +35,26 @@ async def get_the_only_conversation() -> Conversation:
conversation = await Conversation.create(name="simba_chat")
return conversation
async def get_conversation_for_user(user: blueprints.users.models.User) -> Conversation:
try:
return await Conversation.get(user=user)
except tortoise.exceptions.DoesNotExist:
await Conversation.get_or_create(name=f"{user.username}'s chat", user=user)
return await Conversation.get(user=user)
async def get_conversation_by_id(id: str) -> Conversation:
return await Conversation.get(id=id)
async def get_conversation_transcript(
user: blueprints.users.models.User, conversation: Conversation
) -> str:
messages = []
for message in conversation.messages:
messages.append(f"{message.speaker} at {message.created_at}: {message.text}")
return "\n".join(messages)

View File

@@ -18,6 +18,9 @@ class Conversation(Model):
name = fields.CharField(max_length=255)
created_at = fields.DatetimeField(auto_now_add=True)
updated_at = fields.DatetimeField(auto_now=True)
user: fields.ForeignKeyRelation = fields.ForeignKeyField(
"models.User", related_name="conversations", null=True
)
class Meta:
table = "conversations"
@@ -37,5 +40,15 @@ class ConversationMessage(Model):
PydConversationMessage = pydantic_model_creator(ConversationMessage)
PydConversation = pydantic_model_creator(Conversation, name="Conversation")
PydConversation = pydantic_model_creator(
Conversation, name="Conversation", allow_cycles=True, exclude=("user",)
)
PydConversationWithMessages = pydantic_model_creator(
Conversation,
name="ConversationWithMessages",
allow_cycles=True,
exclude=("user",),
include=("messages",),
)
PydListConversation = pydantic_queryset_creator(Conversation)
PydListConversationMessage = pydantic_queryset_creator(ConversationMessage)

View File

@@ -0,0 +1,180 @@
from quart import Blueprint, jsonify, request
from quart_jwt_extended import (
create_access_token,
create_refresh_token,
jwt_refresh_token_required,
get_jwt_identity,
)
from .models import User
from .oidc_service import OIDCUserService
from oidc_config import oidc_config
import secrets
import httpx
from urllib.parse import urlencode
import hashlib
import base64
user_blueprint = Blueprint("user_api", __name__, url_prefix="/api/user")
# In-memory storage for OIDC state/PKCE (production: use Redis or database)
# Format: {state: {"pkce_verifier": str, "redirect_after_login": str}}
_oidc_sessions = {}
@user_blueprint.route("/oidc/login", methods=["GET"])
async def oidc_login():
"""
Initiate OIDC login flow
Generates PKCE parameters and redirects to Authelia
"""
if not oidc_config.validate_config():
return jsonify({"error": "OIDC not configured"}), 500
try:
# Generate PKCE parameters
code_verifier = secrets.token_urlsafe(64)
# For PKCE, we need code_challenge = BASE64URL(SHA256(code_verifier))
code_challenge = (
base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode()).digest())
.decode()
.rstrip("=")
)
# Generate state for CSRF protection
state = secrets.token_urlsafe(32)
# Store PKCE verifier and state for callback validation
_oidc_sessions[state] = {
"pkce_verifier": code_verifier,
"redirect_after_login": request.args.get("redirect", "/"),
}
# Get authorization endpoint from discovery
discovery = await oidc_config.get_discovery_document()
auth_endpoint = discovery.get("authorization_endpoint")
# Build authorization URL
params = {
"client_id": oidc_config.client_id,
"response_type": "code",
"redirect_uri": oidc_config.redirect_uri,
"scope": "openid email profile",
"state": state,
"code_challenge": code_challenge,
"code_challenge_method": "S256",
}
auth_url = f"{auth_endpoint}?{urlencode(params)}"
return jsonify({"auth_url": auth_url})
except Exception as e:
return jsonify({"error": f"OIDC login failed: {str(e)}"}), 500
@user_blueprint.route("/oidc/callback", methods=["GET"])
async def oidc_callback():
"""
Handle OIDC callback from Authelia
Exchanges authorization code for tokens, verifies ID token, and creates/updates user
"""
# Get authorization code and state from callback
code = request.args.get("code")
state = request.args.get("state")
error = request.args.get("error")
if error:
return jsonify({"error": f"OIDC error: {error}"}), 400
if not code or not state:
return jsonify({"error": "Missing code or state"}), 400
# Validate state and retrieve PKCE verifier
session = _oidc_sessions.pop(state, None)
if not session:
return jsonify({"error": "Invalid or expired state"}), 400
pkce_verifier = session["pkce_verifier"]
# Exchange authorization code for tokens
discovery = await oidc_config.get_discovery_document()
token_endpoint = discovery.get("token_endpoint")
token_data = {
"grant_type": "authorization_code",
"code": code,
"redirect_uri": oidc_config.redirect_uri,
"client_id": oidc_config.client_id,
"client_secret": oidc_config.client_secret,
"code_verifier": pkce_verifier,
}
# Use client_secret_post method (credentials in POST body)
async with httpx.AsyncClient() as client:
token_response = await client.post(token_endpoint, data=token_data)
if token_response.status_code != 200:
return jsonify({"error": f"Failed to exchange code for token: {token_response.text}"}), 400
tokens = token_response.json()
id_token = tokens.get("id_token")
if not id_token:
return jsonify({"error": "No ID token received"}), 400
# Verify ID token
try:
claims = await oidc_config.verify_id_token(id_token)
except Exception as e:
return jsonify({"error": f"ID token verification failed: {str(e)}"}), 400
# Get or create user from OIDC claims
user = await OIDCUserService.get_or_create_user_from_oidc(claims)
# Issue backend JWT tokens
access_token = create_access_token(identity=str(user.id))
refresh_token = create_refresh_token(identity=str(user.id))
# Return tokens to frontend
# Frontend will handle storing these and redirecting
return jsonify(
access_token=access_token,
refresh_token=refresh_token,
user={"id": str(user.id), "username": user.username, "email": user.email},
)
@user_blueprint.route("/refresh", methods=["POST"])
@jwt_refresh_token_required
async def refresh():
"""Refresh access token (unchanged from original)"""
user_id = get_jwt_identity()
new_token = create_access_token(identity=user_id)
return jsonify(access_token=new_token)
# Legacy username/password login - kept for backward compatibility during migration
@user_blueprint.route("/login", methods=["POST"])
async def login():
"""
Legacy username/password login
This can be removed after full OIDC migration is complete
"""
data = await request.get_json()
username = data.get("username")
password = data.get("password")
user = await User.filter(username=username).first()
if not user or not user.verify_password(password):
return jsonify({"msg": "Invalid credentials"}), 401
access_token = create_access_token(identity=str(user.id))
refresh_token = create_refresh_token(identity=str(user.id))
return jsonify(
access_token=access_token,
refresh_token=refresh_token,
user={"id": str(user.id), "username": user.username},
)
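On the client side, a typical way to kick off this flow is to fetch the `auth_url` returned by `GET /api/user/oidc/login` and send the browser there; a minimal sketch under that assumption (the endpoint and response shape are as defined above, the surrounding frontend code is not part of this diff):

```typescript
import axios from "axios";

// Start the OIDC flow: the backend builds the Authelia authorization URL
// (state + PKCE challenge included) and returns it as { auth_url }.
export const startOidcLogin = async (redirectAfterLogin = "/") => {
  const { data } = await axios.get("/api/user/oidc/login", {
    params: { redirect: redirectAfterLogin },
  });
  window.location.href = data.auth_url; // hand the browser off to Authelia
};
```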

View File

@@ -0,0 +1,33 @@
from tortoise.models import Model
from tortoise import fields
import bcrypt
class User(Model):
id = fields.UUIDField(primary_key=True)
username = fields.CharField(max_length=255)
password = fields.BinaryField(null=True) # Hashed - nullable for OIDC users
email = fields.CharField(max_length=100, unique=True)
# OIDC fields
oidc_subject = fields.CharField(max_length=255, unique=True, null=True, index=True) # "sub" claim from OIDC
auth_provider = fields.CharField(max_length=50, default="local") # "local" or "oidc"
created_at = fields.DatetimeField(auto_now_add=True)
updated_at = fields.DatetimeField(auto_now=True)
class Meta:
table = "users"
def set_password(self, plain_password: str):
self.password = bcrypt.hashpw(
plain_password.encode("utf-8"),
bcrypt.gensalt(),
)
def verify_password(self, plain_password: str):
if not self.password:
return False
return bcrypt.checkpw(plain_password.encode("utf-8"), self.password)

View File

@@ -0,0 +1,76 @@
"""
OIDC User Management Service
"""
from typing import Dict, Any, Optional
from uuid import uuid4
from .models import User
class OIDCUserService:
"""Service for managing OIDC user authentication and provisioning"""
@staticmethod
async def get_or_create_user_from_oidc(claims: Dict[str, Any]) -> User:
"""
Get existing user by OIDC subject, or create new user from OIDC claims
Args:
claims: Decoded OIDC ID token claims
Returns:
User object (existing or newly created)
"""
oidc_subject = claims.get("sub")
if not oidc_subject:
raise ValueError("No 'sub' claim in ID token")
# Try to find existing user by OIDC subject
user = await User.filter(oidc_subject=oidc_subject).first()
if user:
# Update user info from latest claims (optional)
user.email = claims.get("email", user.email)
user.username = (
claims.get("preferred_username")
or claims.get("name")
or user.username
)
await user.save()
return user
# Check if user exists by email (migration case)
email = claims.get("email")
if email:
user = await User.filter(email=email, auth_provider="local").first()
if user:
# Migrate existing local user to OIDC
user.oidc_subject = oidc_subject
user.auth_provider = "oidc"
user.password = None # Clear password
await user.save()
return user
# Create new user from OIDC claims
username = (
claims.get("preferred_username")
or claims.get("name")
or claims.get("email", "").split("@")[0]
or f"user_{oidc_subject[:8]}"
)
user = await User.create(
id=uuid4(),
username=username,
email=email
or f"{oidc_subject}@oidc.local", # Fallback if no email claim
oidc_subject=oidc_subject,
auth_provider="oidc",
password=None,
)
return user
@staticmethod
async def find_user_by_oidc_subject(oidc_subject: str) -> Optional[User]:
"""Find user by OIDC subject ID"""
return await User.filter(oidc_subject=oidc_subject).first()

View File

@@ -14,7 +14,7 @@ from llm import LLMClient
load_dotenv()
ollama_client = Client(
host=os.getenv("OLLAMA_HOST", "http://localhost:11434"), timeout=10.0
host=os.getenv("OLLAMA_HOST", "http://localhost:11434"), timeout=1.0
)

View File

@@ -27,7 +27,7 @@ headers = {"x-api-key": API_KEY, "Content-Type": "application/json"}
VISITED = {}
if __name__ == "__main__":
conn = sqlite3.connect("./visited.db")
conn = sqlite3.connect("./database/visited.db")
c = conn.cursor()
c.execute("select immich_id from visited")
rows = c.fetchall()

73
services/raggr/llm.py Normal file
View File

@@ -0,0 +1,73 @@
import os
from ollama import Client
from openai import OpenAI
import logging
from dotenv import load_dotenv
load_dotenv()
logging.basicConfig(level=logging.INFO)
TRY_OLLAMA = os.getenv("TRY_OLLAMA", False)
class LLMClient:
def __init__(self):
try:
self.ollama_client = Client(
host=os.getenv("OLLAMA_URL", "http://localhost:11434"), timeout=1.0
)
self.ollama_client.chat(
model="gemma3:4b", messages=[{"role": "system", "content": "test"}]
)
self.PROVIDER = "ollama"
logging.info("Using Ollama as LLM backend")
except Exception as e:
print(e)
self.openai_client = OpenAI()
self.PROVIDER = "openai"
logging.info("Using OpenAI as LLM backend")
def chat(
self,
prompt: str,
system_prompt: str,
):
# Instituting a fallback if my gaming PC is not on
if self.PROVIDER == "ollama":
try:
response = self.ollama_client.chat(
model="gemma3:4b",
messages=[
{
"role": "system",
"content": system_prompt,
},
{"role": "user", "content": prompt},
],
)
output = response.message.content
return output
except Exception as e:
logging.error(f"Could not connect to OLLAMA: {str(e)}")
response = self.openai_client.responses.create(
model="gpt-4o-mini",
input=[
{
"role": "system",
"content": system_prompt,
},
{"role": "user", "content": prompt},
],
)
output = response.output_text
return output
if __name__ == "__main__":
client = Client()
client.chat(model="gemma3:4b", messages=[{"role": "system", "content": "hack"}])

View File

@@ -7,6 +7,8 @@ import argparse
import chromadb
import ollama
import time
from request import PaperlessNGXService
from chunker import Chunker
@@ -36,6 +38,7 @@ parser.add_argument("query", type=str, help="questions about simba's health")
parser.add_argument(
"--reindex", action="store_true", help="re-index the simba documents"
)
parser.add_argument("--classify", action="store_true", help="test classification")
parser.add_argument("--index", help="index a file")
ppngx = PaperlessNGXService()
@@ -77,7 +80,7 @@ def chunk_data(docs, collection, doctypes):
logging.info(f"chunking {len(docs)} documents")
texts: list[str] = [doc["content"] for doc in docs]
with sqlite3.connect("visited.db") as conn:
with sqlite3.connect("database/visited.db") as conn:
to_insert = []
c = conn.cursor()
for index, text in enumerate(texts):
@@ -113,9 +116,22 @@ def chunk_text(texts: list[str], collection):
)
def consult_oracle(input: str, collection):
import time
def classify_query(query: str, transcript: str) -> bool:
logging.info("Starting query generation")
qg_start = time.time()
qg = QueryGenerator()
query_type = qg.get_query_type(input=query, transcript=transcript)
logging.info(query_type)
qg_end = time.time()
logging.info(f"Query generation took {qg_end - qg_start:.2f} seconds")
return query_type == "Simba"
def consult_oracle(
input: str,
collection,
transcript: str = "",
):
chunker = Chunker(collection)
start_time = time.time()
@@ -153,7 +169,10 @@ def consult_oracle(input: str, collection):
logging.info("Starting LLM generation")
llm_start = time.time()
system_prompt = "You are a helpful assistant that understands veterinary terms."
prompt = f"Using the following data, help answer the user's query by providing as many details as possible. Using this data: {results}. Respond to this prompt: {input}"
transcript_prompt = f"Here is the message transcript thus far {transcript}."
prompt = f"""Using the following data, help answer the user's query by providing as many details as possible.
Using this data: {results}. {transcript_prompt if len(transcript) > 0 else ""}
Respond to this prompt: {input}"""
output = llm_client.chat(prompt=prompt, system_prompt=system_prompt)
llm_end = time.time()
logging.info(f"LLM generation took {llm_end - llm_start:.2f} seconds")
@@ -164,6 +183,16 @@ def consult_oracle(input: str, collection):
return output
def llm_chat(input: str, transcript: str = "") -> str:
system_prompt = "You are a helpful assistant that understands veterinary terms."
transcript_prompt = f"Here is the message transcript thus far {transcript}."
prompt = f"""Answer the user in as if you were a cat named Simba. Don't act too catlike. Be assertive.
{transcript_prompt if len(transcript) > 0 else ""}
Respond to this prompt: {input}"""
output = llm_client.chat(prompt=prompt, system_prompt=system_prompt)
return output
def paperless_workflow(input):
# Step 1: Get the text
ppngx = PaperlessNGXService()
@@ -173,15 +202,24 @@ def paperless_workflow(input):
consult_oracle(input, simba_docs)
def consult_simba_oracle(input: str):
return consult_oracle(
input=input,
collection=simba_docs,
)
def consult_simba_oracle(input: str, transcript: str = ""):
is_simba_related = classify_query(query=input, transcript=transcript)
if is_simba_related:
logging.info("Query is related to simba")
return consult_oracle(
input=input,
collection=simba_docs,
transcript=transcript,
)
logging.info("Query is NOT related to simba")
return llm_chat(input=input, transcript=transcript)
def filter_indexed_files(docs):
with sqlite3.connect("visited.db") as conn:
with sqlite3.connect("database/visited.db") as conn:
c = conn.cursor()
c.execute(
"CREATE TABLE IF NOT EXISTS indexed_documents (id INTEGER PRIMARY KEY AUTOINCREMENT, paperless_id INTEGER)"
@@ -194,42 +232,45 @@ def filter_indexed_files(docs):
return [doc for doc in docs if doc["id"] not in visited]
def reindex():
with sqlite3.connect("database/visited.db") as conn:
c = conn.cursor()
c.execute("DELETE FROM indexed_documents")
conn.commit()
# Delete all documents from the collection
all_docs = simba_docs.get()
if all_docs["ids"]:
simba_docs.delete(ids=all_docs["ids"])
logging.info("Fetching documents from Paperless-NGX")
ppngx = PaperlessNGXService()
docs = ppngx.get_data()
docs = filter_indexed_files(docs)
logging.info(f"Fetched {len(docs)} documents")
# Delete all chromadb data
ids = simba_docs.get(ids=None, limit=None, offset=0)
all_ids = ids["ids"]
if len(all_ids) > 0:
simba_docs.delete(ids=all_ids)
# Chunk documents
logging.info("Chunking documents now ...")
doctype_lookup = ppngx.get_doctypes()
chunk_data(docs, collection=simba_docs, doctypes=doctype_lookup)
logging.info("Done chunking documents")
if __name__ == "__main__":
args = parser.parse_args()
if args.reindex:
with sqlite3.connect("./visited.db") as conn:
c = conn.cursor()
c.execute("DELETE FROM indexed_documents")
reindex()
logging.info("Fetching documents from Paperless-NGX")
ppngx = PaperlessNGXService()
docs = ppngx.get_data()
docs = filter_indexed_files(docs)
logging.info(f"Fetched {len(docs)} documents")
# Delete all chromadb data
ids = simba_docs.get(ids=None, limit=None, offset=0)
all_ids = ids["ids"]
if len(all_ids) > 0:
simba_docs.delete(ids=all_ids)
# Chunk documents
logging.info("Chunking documents now ...")
tag_lookup = ppngx.get_tags()
doctype_lookup = ppngx.get_doctypes()
chunk_data(docs, collection=simba_docs, doctypes=doctype_lookup)
logging.info("Done chunking documents")
# if args.index:
# with open(args.index) as file:
# extension = args.index.split(".")[-1]
# if extension == "pdf":
# pdf_path = ppngx.download_pdf_from_id(id=document_id)
# image_paths = pdf_to_image(filepath=pdf_path)
# print(f"summarizing {file}")
# generated_summary = summarize_pdf_image(filepaths=image_paths)
# elif extension in [".md", ".txt"]:
# chunk_text(texts=[file.readall()], collection=simba_docs)
if args.classify:
consult_simba_oracle(input="yohohoho testing")
consult_simba_oracle(input="write an email")
consult_simba_oracle(input="how much does simba weigh")
if args.query:
logging.info("Consulting oracle ...")

View File

@@ -0,0 +1,71 @@
from tortoise import BaseDBAsyncClient
RUN_IN_TRANSACTION = True
async def upgrade(db: BaseDBAsyncClient) -> str:
return """
CREATE TABLE IF NOT EXISTS "users" (
"id" UUID NOT NULL PRIMARY KEY,
"username" VARCHAR(255) NOT NULL,
"password" BYTEA,
"email" VARCHAR(100) NOT NULL UNIQUE,
"oidc_subject" VARCHAR(255) UNIQUE,
"auth_provider" VARCHAR(50) NOT NULL DEFAULT 'local',
"created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP
);
CREATE INDEX IF NOT EXISTS "idx_users_oidc_su_5aec5a" ON "users" ("oidc_subject");
CREATE TABLE IF NOT EXISTS "conversations" (
"id" UUID NOT NULL PRIMARY KEY,
"name" VARCHAR(255) NOT NULL,
"created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updated_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
"user_id" UUID REFERENCES "users" ("id") ON DELETE CASCADE
);
CREATE TABLE IF NOT EXISTS "conversation_messages" (
"id" UUID NOT NULL PRIMARY KEY,
"text" TEXT NOT NULL,
"created_at" TIMESTAMPTZ NOT NULL DEFAULT CURRENT_TIMESTAMP,
"speaker" VARCHAR(10) NOT NULL,
"conversation_id" UUID NOT NULL REFERENCES "conversations" ("id") ON DELETE CASCADE
);
COMMENT ON COLUMN "conversation_messages"."speaker" IS 'USER: user\nSIMBA: simba';
CREATE TABLE IF NOT EXISTS "aerich" (
"id" SERIAL NOT NULL PRIMARY KEY,
"version" VARCHAR(255) NOT NULL,
"app" VARCHAR(100) NOT NULL,
"content" JSONB NOT NULL
);"""
async def downgrade(db: BaseDBAsyncClient) -> str:
return """
"""
MODELS_STATE = (
"eJztmmtP4zgUhv9KlE+MxCLoUGaEViulpex0Z9qO2nR3LjuK3MRtvSROJnYGKsR/X9u5J0"
"56AUqL+gXosU9sPz7OeY/Lveq4FrTJSdvFv6BPAEUuVi+VexUDB7I/pO3Higo8L23lBgom"
"tnAwMz1FC5gQ6gOTssYpsAlkJgsS00deNBgObJsbXZN1RHiWmgKMfgbQoO4M0jn0WcP3H8"
"yMsAXvIIk/ejfGFEHbys0bWXxsYTfowhO28bh7dS168uEmhunagYPT3t6Czl2cdA8CZJ1w"
"H942gxj6gEIrsww+y2jZsSmcMTNQP4DJVK3UYMEpCGwOQ/19GmCTM1DESPzH+R/qGngYao"
"4WYcpZ3D+Eq0rXLKwqH6r9QRsevb14I1bpEjrzRaMgoj4IR0BB6Cq4piDF7xLK9hz4cpRx"
"/wJMNtFNMMaGlGMaQzHIGNBm1FQH3Bk2xDM6Zx8bzWYNxr+1oSDJegmULovrMOr7UVMjbO"
"NIU4SmD/mSDUDLIK9YC0UOlMPMexaQWpHrSfzHjgJma7AG2F5Eh6CGr97tdUa61vvMV+IQ"
"8tMWiDS9w1sawrooWI8uCluRPET5p6t/UPhH5dug3ynGftJP/6byOYGAugZ2bw1gZc5rbI"
"3B5DY28KwNNzbvedjYF93YaPKZfSXQN9bLIBmXR6SRaG5b3MTNkwZPvdMbac7gMMrwrl0f"
"ohn+CBcCYZfNA2BTliwi0TGOHrOr0FJrOgsf3CZqJBsUbHVsTZCG2VMbtbWrjioYToB5cw"
"t8y6iA6UBCwAySMtBW5Hn9cQjtRJrJWWYFXC984m6+VarYClZuw80wytErNzkNp2gBmK3b"
"isbmI9XQWaKCMxBXE8NGdiMPonivRTGFd5KUrzOrHGXcf19EcV0q73zRc1k8lr5HPe3Lm1"
"wm/zTo/xl3z0jl9qdB66CQX6OQKitk4kFwIxMDvIDs4MApSYHc7mbcX/joqONRZ3ip8Iz+"
"Lx51ey3tUiHImQB1tS3OVZlnpysUmWenlTUmbyocoGyiWe81L3F9ynf+nkpYs3Dh9UgpW7"
"w/21mKSzWtJFzW1bbPqeREzSCRbnEtUa3V+NE+aLP912Z8H9e9tMz67ItG28LFpQcIuXV9"
"SWS2EAb+Qg4z61WAOVnQsP7Z1ZJeBq/F9WpWbjFkrW5fG36VS964fzZuW1/1jlagCx2A7H"
"WiNHF4mhBdfuKfMkDPTlcTPXWqpyR7XGSZBgkm/0FTUjlUkyz6bQS0GKTb5fksB55p+bnh"
"+e4vZFWJdjnQkuP23qKq7ZrAfkQaynNtrhKmzeoobZa1+aG4fZ3F7eHrn1exscntcqlIWX"
"Y1X/pfh6e5n99lgbTde3kN+sicq5J6Lmo5rqvoQNpnZ0q6Lq64IpZWdBxzIRiinX9RYSe+"
"HfmtcXb+7vz924vz96yLmElieVfzMuj29SUVHD8I0muXav2RcTnUb6mcY0djHREXdt9PgM"
"9SX7ARKcSS9P7XaNCvvE6NXQogx5gt8LuFTHqs2IjQH7uJtYYiX3X9dz/Fr3kKuZk/oCW7"
"eN3mZeHD/9BpOYI="
)
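Note that `downgrade()` in this migration returns an empty string, so rolling back this revision would leave the schema in place. Purely as a hedged sketch (not part of the generated migration), a symmetric rollback would drop the new tables in reverse dependency order:

```python
# Sketch only: what a hand-written downgrade for the schema above could return.
async def downgrade(db: BaseDBAsyncClient) -> str:
    return """
        DROP TABLE IF EXISTS "conversation_messages";
        DROP TABLE IF EXISTS "conversations";
        DROP INDEX IF EXISTS "idx_users_oidc_su_5aec5a";
        DROP TABLE IF EXISTS "users";"""
```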

View File

@@ -0,0 +1,113 @@
"""
OIDC Configuration for Authelia Integration
"""
import os
from typing import Dict, Any
from authlib.jose import jwt
from authlib.jose.errors import JoseError
import httpx
class OIDCConfig:
"""OIDC Configuration Manager"""
def __init__(self):
# Load from environment variables
self.issuer = os.getenv("OIDC_ISSUER") # e.g., https://auth.example.com
self.client_id = os.getenv("OIDC_CLIENT_ID")
self.client_secret = os.getenv("OIDC_CLIENT_SECRET")
self.redirect_uri = os.getenv(
"OIDC_REDIRECT_URI", "http://localhost:8080/api/user/oidc/callback"
)
# OIDC endpoints (can use discovery or manual config)
self.use_discovery = os.getenv("OIDC_USE_DISCOVERY", "true").lower() == "true"
# Manual endpoint configuration (fallback if discovery fails)
self.authorization_endpoint = os.getenv("OIDC_AUTHORIZATION_ENDPOINT")
self.token_endpoint = os.getenv("OIDC_TOKEN_ENDPOINT")
self.userinfo_endpoint = os.getenv("OIDC_USERINFO_ENDPOINT")
self.jwks_uri = os.getenv("OIDC_JWKS_URI")
# Cached discovery document and JWKS
self._discovery_doc: Dict[str, Any] | None = None
self._jwks: Dict[str, Any] | None = None
def validate_config(self) -> bool:
"""Validate that required configuration is present"""
if not self.issuer or not self.client_id or not self.client_secret:
return False
return True
async def get_discovery_document(self) -> Dict[str, Any]:
"""Fetch OIDC discovery document from .well-known endpoint"""
if self._discovery_doc:
return self._discovery_doc
if not self.use_discovery:
# Return manual configuration
return {
"issuer": self.issuer,
"authorization_endpoint": self.authorization_endpoint,
"token_endpoint": self.token_endpoint,
"userinfo_endpoint": self.userinfo_endpoint,
"jwks_uri": self.jwks_uri,
}
discovery_url = f"{self.issuer.rstrip('/')}/.well-known/openid-configuration"
async with httpx.AsyncClient() as client:
response = await client.get(discovery_url)
response.raise_for_status()
self._discovery_doc = response.json()
return self._discovery_doc
async def get_jwks(self) -> Dict[str, Any]:
"""Fetch JSON Web Key Set for token verification"""
if self._jwks:
return self._jwks
discovery = await self.get_discovery_document()
jwks_uri = discovery.get("jwks_uri")
if not jwks_uri:
raise ValueError("No jwks_uri found in discovery document")
async with httpx.AsyncClient() as client:
response = await client.get(jwks_uri)
response.raise_for_status()
self._jwks = response.json()
return self._jwks
async def verify_id_token(self, id_token: str) -> Dict[str, Any]:
"""
Verify and decode ID token from OIDC provider
Returns the decoded claims if valid
Raises exception if invalid
"""
jwks = await self.get_jwks()
try:
# Verify token signature and claims
claims = jwt.decode(
id_token,
jwks,
claims_options={
"iss": {"essential": True, "value": self.issuer},
"aud": {"essential": True, "value": self.client_id},
"exp": {"essential": True},
},
)
# Additional validation
claims.validate()
return claims
except JoseError as e:
raise ValueError(f"Invalid ID token: {str(e)}")
# Global instance
oidc_config = OIDCConfig()
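A hedged usage sketch for the module above (the module path, issuer values, and the commented token exchange are placeholder assumptions; only methods defined above are called):

```python
# Sketch: exercising OIDCConfig as defined above. Assumes the module is importable
# as `oidc_config` and that OIDC_ISSUER / OIDC_CLIENT_ID / OIDC_CLIENT_SECRET are set.
import asyncio

from oidc_config import oidc_config


async def main() -> None:
    if not oidc_config.validate_config():
        raise SystemExit("OIDC_ISSUER, OIDC_CLIENT_ID and OIDC_CLIENT_SECRET must be set")

    discovery = await oidc_config.get_discovery_document()
    print("authorize at:", discovery.get("authorization_endpoint"))

    # After the callback exchanges the code for tokens, the ID token can be verified:
    # claims = await oidc_config.verify_id_token(id_token)
    # print(claims.get("sub"), claims.get("email"))


if __name__ == "__main__":
    asyncio.run(main())
```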

View File

@@ -0,0 +1,12 @@
[project]
name = "raggr"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.13"
dependencies = ["chromadb>=1.1.0", "python-dotenv>=1.0.0", "flask>=3.1.2", "httpx>=0.28.1", "ollama>=0.6.0", "openai>=2.0.1", "pydantic>=2.11.9", "pillow>=10.0.0", "pymupdf>=1.24.0", "black>=25.9.0", "pillow-heif>=1.1.1", "flask-jwt-extended>=4.7.1", "bcrypt>=5.0.0", "pony>=0.7.19", "flask-login>=0.6.3", "quart>=0.20.0", "tortoise-orm>=0.25.1", "quart-jwt-extended>=0.1.0", "pre-commit>=4.3.0", "tortoise-orm-stubs>=1.0.2", "aerich>=0.8.0", "tomlkit>=0.13.3", "authlib>=1.3.0", "asyncpg>=0.30.0"]
[tool.aerich]
tortoise_orm = "app.TORTOISE_CONFIG"
location = "./migrations"
src_folder = "./."

View File

@@ -49,11 +49,20 @@ DOCTYPE_OPTIONS = [
"Letter",
]
QUERY_TYPE_OPTIONS = [
"Simba",
"Other",
]
class DocumentType(BaseModel):
type: list[str] = Field(description="type of document", enum=DOCTYPE_OPTIONS)
class QueryType(BaseModel):
type: str = Field(description="type of query", enum=QUERY_TYPE_OPTIONS)
PROMPT = """
You are an information specialist that processes user queries. The current year is 2025. The user queries are all about
a cat, Simba, and its records. The types of records are listed below. Using the query, extract the
@@ -111,6 +120,27 @@ Query: "Who does Simba know?"
Tags: ["Letter", "Documentation"]
"""
QUERY_TYPE_PROMPT = f"""You are an information specialist that processes user queries.
A query is assigned exactly one tag. Based on the query and the transcript listed below, determine
which of the following options is most appropriate: {",".join(QUERY_TYPE_OPTIONS)}
### Example 1
Query: "Who is Simba's current vet?"
Tags: ["Simba"]
### Example 2
Query: "What is the capital of Japan?"
Tags: ["Other"]
### Example 3
Query: "Can you help me write an email?"
Tags: ["Other"]
TRANSCRIPT:
"""
class QueryGenerator:
def __init__(self) -> None:
@@ -154,6 +184,33 @@ class QueryGenerator:
metadata_query = {"document_type": {"$in": type_data["type"]}}
return metadata_query
def get_query_type(self, input: str, transcript: str):
client = OpenAI()
response = client.chat.completions.create(
messages=[
{
"role": "system",
"content": "You are an information specialist that is really good at deciding what tags a query should have",
},
{
"role": "user",
"content": f"{QUERY_TYPE_PROMPT}\nTRANSCRIPT:\n{transcript}\nQUERY:{input}",
},
],
model="gpt-4o",
response_format={
"type": "json_schema",
"json_schema": {
"name": "query_type",
"schema": QueryType.model_json_schema(),
},
},
)
response_json_str = response.choices[0].message.content
type_data = json.loads(response_json_str)
return type_data["type"]
def get_query(self, input: str):
client = OpenAI()
response = client.responses.parse(
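The `get_query_type` method above backs the `--classify` branch seen earlier: the model is constrained to the `QueryType` JSON schema, so the returned tag is always one of `QUERY_TYPE_OPTIONS`. A hedged sketch of how it might be called (the routing shown is illustrative, not taken from the diff):

```python
# Sketch: calling the classifier defined above. Requires OPENAI_API_KEY in the environment.
qg = QueryGenerator()

tag = qg.get_query_type(
    input="how much does simba weigh",
    transcript="USER: hello\nSIMBA: meow",
)

if tag == "Simba":
    pass  # run the retrieval pipeline over Simba's records
else:
    pass  # answer as a general-purpose assistant (or decline)
```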

View File

@@ -0,0 +1,9 @@
.git
.gitignore
README.md
.DS_Store
node_modules
dist
.cache
coverage
*.log

View File

@@ -6,6 +6,7 @@
# Dist
node_modules
dist/
.yarn
# Profile
.rspack-profile-*/

View File

@@ -0,0 +1 @@
nodeLinker: node-modules

View File

@@ -0,0 +1,15 @@
FROM node:20-slim
WORKDIR /app
# Copy package files
COPY package.json yarn.lock* ./
# Install dependencies
RUN yarn install
# Expose rsbuild dev server port (default 3000)
EXPOSE 3000
# The actual source code will be mounted as a volume
# CMD will be specified in docker-compose

View File

@@ -0,0 +1,63 @@
# Token Refresh Implementation
## Overview
The API services now automatically handle token refresh when access tokens expire. This provides a seamless user experience without requiring manual re-authentication.
## How It Works
### 1. **userService.ts**
The `userService` now includes:
- **`refreshToken()`**: Automatically gets the refresh token from localStorage, calls the `/api/user/refresh` endpoint, and updates the access token
- **`fetchWithAuth()`**: A wrapper around `fetch()` that:
1. Automatically adds the Authorization header with the access token
2. Detects 401 (Unauthorized) responses
3. Automatically refreshes the token using the refresh token
4. Retries the original request with the new access token
5. Throws an error if refresh fails (e.g., refresh token expired)
### 2. **conversationService.ts**
Now uses `userService.fetchWithAuth()` for all API calls:
- `sendQuery()` - No longer needs token parameter
- `getMessages()` - No longer needs token parameter
### 3. **Components Updated**
**ChatScreen.tsx**:
- Removed manual token handling
- Now simply calls `conversationService.sendQuery(query)` and `conversationService.getMessages()`
## Benefits
- **Automatic token refresh** - Users stay logged in longer
- **Transparent retry logic** - Failed requests due to expired tokens are automatically retried
- **Cleaner code** - Components don't need to manage tokens
- **Better UX** - No interruptions when access token expires
- **Centralized auth logic** - All auth handling in one place
## Error Handling
- If refresh token is missing or invalid, the error is thrown
- Components can catch these errors and redirect to login
- LocalStorage is automatically cleared when refresh fails
## Usage Example
```typescript
// Old way (manual token management)
const token = localStorage.getItem("access_token");
const result = await conversationService.sendQuery(query, token);
// New way (automatic token refresh)
const result = await conversationService.sendQuery(query);
```
## Token Storage
- **Access Token**: `localStorage.getItem("access_token")`
- **Refresh Token**: `localStorage.getItem("refresh_token")`
Both are automatically managed by the services.

File diff suppressed because it is too large.

View File

@@ -6,21 +6,37 @@
"scripts": {
"build": "rsbuild build",
"dev": "rsbuild dev --open",
"preview": "rsbuild preview"
"preview": "rsbuild preview",
"watch": "npm-watch build",
"watch:build": "rsbuild build --watch"
},
"dependencies": {
"axios": "^1.12.2",
"marked": "^16.3.0",
"npm-watch": "^0.13.0",
"react": "^19.1.1",
"react-dom": "^19.1.1",
"react-markdown": "^10.1.0"
"react-markdown": "^10.1.0",
"watch": "^1.0.2"
},
"devDependencies": {
"@biomejs/biome": "2.3.10",
"@rsbuild/core": "^1.5.6",
"@rsbuild/plugin-react": "^1.4.0",
"@tailwindcss/postcss": "^4.0.0",
"@types/react": "^19.1.13",
"@types/react-dom": "^19.1.9",
"typescript": "^5.9.2"
},
"watch": {
"build": {
"patterns": [
"src"
],
"extensions": "ts,tsx,css,js,jsx",
"delay": 1000,
"quiet": false,
"inherit": true
}
}
}

View File

@@ -3,4 +3,8 @@ import { pluginReact } from '@rsbuild/plugin-react';
export default defineConfig({
plugins: [pluginReact()],
html: {
title: 'Raggr',
favicon: './src/assets/favicon.svg',
},
});

View File

@@ -3,4 +3,5 @@
body {
margin: 0;
font-family: Inter, Avenir, Helvetica, Arial, sans-serif;
background-color: #F9F5EB;
}

View File

@@ -0,0 +1,72 @@
import { useState, useEffect } from "react";
import "./App.css";
import { AuthProvider } from "./contexts/AuthContext";
import { ChatScreen } from "./components/ChatScreen";
import { LoginScreen } from "./components/LoginScreen";
import { conversationService } from "./api/conversationService";
const AppContainer = () => {
const [isAuthenticated, setAuthenticated] = useState<boolean>(false);
const [isChecking, setIsChecking] = useState<boolean>(true);
useEffect(() => {
const checkAuth = async () => {
const accessToken = localStorage.getItem("access_token");
const refreshToken = localStorage.getItem("refresh_token");
// No tokens at all, not authenticated
if (!accessToken && !refreshToken) {
setIsChecking(false);
setAuthenticated(false);
return;
}
// Try to verify token by making a request
try {
await conversationService.getAllConversations();
// If successful, user is authenticated
setAuthenticated(true);
} catch (error) {
// Token is invalid or expired
console.error("Authentication check failed:", error);
localStorage.removeItem("access_token");
localStorage.removeItem("refresh_token");
setAuthenticated(false);
} finally {
setIsChecking(false);
}
};
checkAuth();
}, []);
// Show loading state while checking authentication
if (isChecking) {
return (
<div className="h-screen flex items-center justify-center bg-white/85">
<div className="text-xl">Loading...</div>
</div>
);
}
return (
<>
{isAuthenticated ? (
<ChatScreen setAuthenticated={setAuthenticated} />
) : (
<LoginScreen setAuthenticated={setAuthenticated} />
)}
</>
);
};
const App = () => {
return (
<AuthProvider>
<AppContainer />
</AuthProvider>
);
};
export default App;

View File

@@ -0,0 +1,115 @@
import { userService } from "./userService";
interface Message {
id: string;
text: string;
speaker: "user" | "simba";
created_at: string;
}
interface Conversation {
id: string;
name: string;
messages?: Message[];
created_at: string;
updated_at: string;
user_id?: string;
}
interface QueryRequest {
query: string;
}
interface QueryResponse {
response: string;
}
interface CreateConversationRequest {
user_id: string;
}
class ConversationService {
private baseUrl = "/api";
private conversationBaseUrl = "/api/conversation";
async sendQuery(
query: string,
conversation_id: string,
): Promise<QueryResponse> {
const response = await userService.fetchWithRefreshToken(
`${this.baseUrl}/query`,
{
method: "POST",
body: JSON.stringify({ query, conversation_id }),
},
);
if (!response.ok) {
throw new Error("Failed to send query");
}
return await response.json();
}
async getMessages(): Promise<Conversation> {
const response = await userService.fetchWithRefreshToken(
`${this.baseUrl}/messages`,
{
method: "GET",
},
);
if (!response.ok) {
throw new Error("Failed to fetch messages");
}
return await response.json();
}
async getConversation(conversationId: string): Promise<Conversation> {
const response = await userService.fetchWithRefreshToken(
`${this.conversationBaseUrl}/${conversationId}`,
{
method: "GET",
},
);
if (!response.ok) {
throw new Error("Failed to fetch conversation");
}
return await response.json();
}
async createConversation(): Promise<Conversation> {
const response = await userService.fetchWithRefreshToken(
`${this.conversationBaseUrl}/`,
{
method: "POST",
},
);
if (!response.ok) {
throw new Error("Failed to create conversation");
}
return await response.json();
}
async getAllConversations(): Promise<Conversation[]> {
const response = await userService.fetchWithRefreshToken(
`${this.conversationBaseUrl}/`,
{
method: "GET",
},
);
if (!response.ok) {
throw new Error("Failed to fetch conversations");
}
return await response.json();
}
}
export const conversationService = new ConversationService();

View File

@@ -0,0 +1,94 @@
/**
* OIDC Authentication Service
* Handles OAuth 2.0 Authorization Code flow with PKCE
*/
interface OIDCLoginResponse {
auth_url: string;
}
interface OIDCCallbackResponse {
access_token: string;
refresh_token: string;
user: {
id: string;
username: string;
email: string;
};
}
class OIDCService {
private baseUrl = "/api/user/oidc";
/**
* Initiate OIDC login flow
* Returns authorization URL to redirect user to
*/
async initiateLogin(redirectAfterLogin: string = "/"): Promise<string> {
const response = await fetch(
`${this.baseUrl}/login?redirect=${encodeURIComponent(redirectAfterLogin)}`,
{
method: "GET",
headers: { "Content-Type": "application/json" },
}
);
if (!response.ok) {
throw new Error("Failed to initiate OIDC login");
}
const data: OIDCLoginResponse = await response.json();
return data.auth_url;
}
/**
* Handle OIDC callback
* Exchanges authorization code for tokens
*/
async handleCallback(
code: string,
state: string
): Promise<OIDCCallbackResponse> {
const response = await fetch(
`${this.baseUrl}/callback?code=${encodeURIComponent(code)}&state=${encodeURIComponent(state)}`,
{
method: "GET",
headers: { "Content-Type": "application/json" },
}
);
if (!response.ok) {
throw new Error("OIDC callback failed");
}
return await response.json();
}
/**
* Extract OIDC callback parameters from URL
*/
getCallbackParamsFromURL(): { code: string; state: string } | null {
const params = new URLSearchParams(window.location.search);
const code = params.get("code");
const state = params.get("state");
if (code && state) {
return { code, state };
}
return null;
}
/**
* Clear callback parameters from URL without reload
*/
clearCallbackParams(): void {
const url = new URL(window.location.href);
url.searchParams.delete("code");
url.searchParams.delete("state");
url.searchParams.delete("error");
window.history.replaceState({}, "", url.toString());
}
}
export const oidcService = new OIDCService();

View File

@@ -0,0 +1,139 @@
interface LoginResponse {
access_token: string;
refresh_token: string;
user: {
id: string;
username: string;
email?: string;
};
}
interface RefreshResponse {
access_token: string;
}
class UserService {
private baseUrl = "/api/user";
async login(username: string, password: string): Promise<LoginResponse> {
const response = await fetch(`${this.baseUrl}/login`, {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({ username, password }),
});
if (!response.ok) {
throw new Error("Invalid credentials");
}
return await response.json();
}
async refreshToken(): Promise<string> {
const refreshToken = localStorage.getItem("refresh_token");
if (!refreshToken) {
throw new Error("No refresh token available");
}
const response = await fetch(`${this.baseUrl}/refresh`, {
method: "POST",
headers: {
"Content-Type": "application/json",
Authorization: `Bearer ${refreshToken}`,
},
});
if (!response.ok) {
// Refresh token is invalid or expired, clear storage
localStorage.removeItem("access_token");
localStorage.removeItem("refresh_token");
throw new Error("Failed to refresh token");
}
const data: RefreshResponse = await response.json();
localStorage.setItem("access_token", data.access_token);
return data.access_token;
}
async validateToken(): Promise<boolean> {
const refreshToken = localStorage.getItem("refresh_token");
if (!refreshToken) {
return false;
}
try {
await this.refreshToken();
return true;
} catch (error) {
return false;
}
}
async fetchWithAuth(
url: string,
options: RequestInit = {},
): Promise<Response> {
const accessToken = localStorage.getItem("access_token");
// Add authorization header
const headers = {
"Content-Type": "application/json",
...(options.headers || {}),
...(accessToken && { Authorization: `Bearer ${accessToken}` }),
};
let response = await fetch(url, { ...options, headers });
// If unauthorized, try refreshing the token
if (response.status === 401) {
try {
const newAccessToken = await this.refreshToken();
// Retry the request with new token
headers.Authorization = `Bearer ${newAccessToken}`;
response = await fetch(url, { ...options, headers });
} catch (error) {
// Refresh failed, redirect to login or throw error
throw new Error("Session expired. Please log in again.");
}
}
return response;
}
async fetchWithRefreshToken(
url: string,
options: RequestInit = {},
): Promise<Response> {
const refreshToken = localStorage.getItem("refresh_token");
// Add authorization header
const headers = {
"Content-Type": "application/json",
...(options.headers || {}),
...(refreshToken && { Authorization: `Bearer ${refreshToken}` }),
};
let response = await fetch(url, { ...options, headers });
// If unauthorized, try refreshing the token
if (response.status === 401) {
try {
const newAccessToken = await this.refreshToken();
// Retry the request with new token
headers.Authorization = `Bearer ${newAccessToken}`;
response = await fetch(url, { ...options, headers });
} catch (error) {
// Refresh failed, redirect to login or throw error
throw new Error("Session expired. Please log in again.");
}
}
return response;
}
}
export const userService = new UserService();

Binary file not shown (new image, 5.8 KiB).

View File

@@ -0,0 +1,3 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100">
<text y="80" font-size="80" font-family="system-ui, -apple-system, sans-serif">🐱</text>
</svg>


View File

@@ -0,0 +1,31 @@
import ReactMarkdown from "react-markdown";
type AnswerBubbleProps = {
text: string;
loading?: boolean;
};
export const AnswerBubble = ({ text, loading }: AnswerBubbleProps) => {
return (
<div className="rounded-md bg-orange-100 p-3 sm:p-4 w-2/3">
{loading ? (
<div className="flex flex-col w-full animate-pulse gap-2">
<div className="flex flex-row gap-2 w-full">
<div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
<div className="bg-gray-400 w-1/2 p-3 rounded-lg" />
</div>
<div className="flex flex-row gap-2 w-full">
<div className="bg-gray-400 w-1/3 p-3 rounded-lg" />
<div className="bg-gray-400 w-2/3 p-3 rounded-lg" />
</div>
</div>
) : (
<div className=" flex flex-col break-words overflow-wrap-anywhere text-sm sm:text-base [&>*]:break-words">
<ReactMarkdown>
{"🐈: " + text}
</ReactMarkdown>
</div>
)}
</div>
);
};

View File

@@ -0,0 +1,282 @@
import { useEffect, useState, useRef } from "react";
import { conversationService } from "../api/conversationService";
import { QuestionBubble } from "./QuestionBubble";
import { AnswerBubble } from "./AnswerBubble";
import { MessageInput } from "./MessageInput";
import { ConversationList } from "./ConversationList";
import catIcon from "../assets/cat.png";
type Message = {
text: string;
speaker: "simba" | "user";
};
type QuestionAnswer = {
question: string;
answer: string;
};
type Conversation = {
title: string;
id: string;
};
type ChatScreenProps = {
setAuthenticated: (isAuth: boolean) => void;
};
export const ChatScreen = ({ setAuthenticated }: ChatScreenProps) => {
const [query, setQuery] = useState<string>("");
const [answer, setAnswer] = useState<string>("");
const [simbaMode, setSimbaMode] = useState<boolean>(false);
const [questionsAnswers, setQuestionsAnswers] = useState<QuestionAnswer[]>(
[],
);
const [messages, setMessages] = useState<Message[]>([]);
const [conversations, setConversations] = useState<Conversation[]>([
{ title: "simba meow meow", id: "uuid" },
]);
const [showConversations, setShowConversations] = useState<boolean>(false);
const [selectedConversation, setSelectedConversation] =
useState<Conversation | null>(null);
const [sidebarCollapsed, setSidebarCollapsed] = useState<boolean>(false);
const messagesEndRef = useRef<HTMLDivElement>(null);
const simbaAnswers = ["meow.", "hiss...", "purrrrrr", "yowOWROWWowowr"];
const scrollToBottom = () => {
messagesEndRef.current?.scrollIntoView({ behavior: "smooth" });
};
const handleSelectConversation = (conversation: Conversation) => {
setShowConversations(false);
setSelectedConversation(conversation);
const loadMessages = async () => {
try {
const fetchedConversation = await conversationService.getConversation(
conversation.id,
);
setMessages(
(fetchedConversation.messages ?? []).map((message) => ({
text: message.text,
speaker: message.speaker,
})),
);
} catch (error) {
console.error("Failed to load messages:", error);
}
};
loadMessages();
};
const loadConversations = async () => {
try {
const fetchedConversations =
await conversationService.getAllConversations();
const parsedConversations = fetchedConversations.map((conversation) => ({
id: conversation.id,
title: conversation.name,
}));
setConversations(parsedConversations);
setSelectedConversation(parsedConversations[0]);
console.log(parsedConversations);
} catch (error) {
console.error("Failed to load messages:", error);
}
};
const handleCreateNewConversation = async () => {
const newConversation = await conversationService.createConversation();
await loadConversations();
setSelectedConversation({
title: newConversation.name,
id: newConversation.id,
});
};
useEffect(() => {
loadConversations();
}, []);
useEffect(() => {
scrollToBottom();
}, [messages]);
useEffect(() => {
const loadMessages = async () => {
if (selectedConversation == null) return;
try {
const conversation = await conversationService.getConversation(
selectedConversation.id,
);
setMessages(
(conversation.messages ?? []).map((message) => ({
text: message.text,
speaker: message.speaker,
})),
);
} catch (error) {
console.error("Failed to load messages:", error);
}
};
loadMessages();
}, [selectedConversation]);
const handleQuestionSubmit = async () => {
if (!query.trim()) return; // Don't submit empty messages
const currMessages = messages.concat([{ text: query, speaker: "user" }]);
setMessages(currMessages);
setQuery(""); // Clear input immediately after submission
if (simbaMode) {
console.log("simba mode activated");
const randomIndex = Math.floor(Math.random() * simbaAnswers.length);
const randomElement = simbaAnswers[randomIndex];
setAnswer(randomElement);
setQuestionsAnswers(
questionsAnswers.concat([
{
question: query,
answer: randomElement,
},
]),
);
return;
}
if (!selectedConversation) return; // no conversation loaded yet
try {
const result = await conversationService.sendQuery(
query,
selectedConversation.id,
);
setQuestionsAnswers(
questionsAnswers.concat([{ question: query, answer: result.response }]),
);
setMessages(
currMessages.concat([{ text: result.response, speaker: "simba" }]),
);
} catch (error) {
console.error("Failed to send query:", error);
// If session expired, redirect to login
if (error instanceof Error && error.message.includes("Session expired")) {
setAuthenticated(false);
}
}
};
const handleQueryChange = (event: React.ChangeEvent<HTMLTextAreaElement>) => {
setQuery(event.target.value);
};
const handleKeyDown = (event: React.KeyboardEvent<HTMLTextAreaElement>) => {
// Submit on Enter, but allow Shift+Enter for new line
if (event.key === "Enter" && !event.shiftKey) {
event.preventDefault();
handleQuestionSubmit();
}
};
return (
<div className="h-screen flex flex-row bg-[#F9F5EB]">
{/* Sidebar - Expanded */}
<aside className={`hidden md:flex md:flex-col bg-white border-r border-gray-200 p-4 overflow-y-auto transition-all duration-300 ${sidebarCollapsed ? 'w-20' : 'w-64'}`}>
{!sidebarCollapsed ? (
<>
<div className="flex flex-row items-center gap-2 mb-6">
<img
src={catIcon}
alt="Simba"
className="cursor-pointer hover:opacity-80"
onClick={() => setSidebarCollapsed(true)}
/>
<h2 className="text-3xl bg-[#F9F5EB] font-semibold">asksimba!</h2>
</div>
<ConversationList
conversations={conversations}
onCreateNewConversation={handleCreateNewConversation}
onSelectConversation={handleSelectConversation}
/>
<div className="mt-auto pt-4">
<button
className="w-full p-2 border border-red-400 bg-red-200 hover:bg-red-400 cursor-pointer rounded-md text-sm"
onClick={() => setAuthenticated(false)}
>
logout
</button>
</div>
</>
) : (
<div className="flex flex-col items-center gap-4">
<img
src={catIcon}
alt="Simba"
className="cursor-pointer hover:opacity-80"
onClick={() => setSidebarCollapsed(false)}
/>
</div>
)}
</aside>
{/* Main chat area */}
<div className="flex-1 flex flex-col h-screen overflow-hidden">
{/* Mobile header */}
<header className="md:hidden flex flex-row justify-between items-center gap-3 p-4 border-b border-gray-200 bg-white">
<div className="flex flex-row items-center gap-2">
<img src={catIcon} alt="Simba" className="w-10 h-10" />
<h1 className="text-xl">asksimba!</h1>
</div>
<div className="flex flex-row gap-2">
<button
className="p-2 border border-green-400 bg-green-200 hover:bg-green-400 cursor-pointer rounded-md text-sm"
onClick={() => setShowConversations(!showConversations)}
>
{showConversations ? "hide" : "show"}
</button>
<button
className="p-2 border border-red-400 bg-red-200 hover:bg-red-400 cursor-pointer rounded-md text-sm"
onClick={() => setAuthenticated(false)}
>
logout
</button>
</div>
</header>
{/* Messages area */}
<div className="flex-1 overflow-y-auto px-4 py-4">
<div className="max-w-2xl mx-auto flex flex-col gap-4">
{showConversations && (
<div className="md:hidden">
<ConversationList
conversations={conversations}
onCreateNewConversation={handleCreateNewConversation}
onSelectConversation={handleSelectConversation}
/>
</div>
)}
{messages.map((msg, index) => {
if (msg.speaker === "simba") {
return <AnswerBubble key={index} text={msg.text} />;
}
return <QuestionBubble key={index} text={msg.text} />;
})}
<div ref={messagesEndRef} />
</div>
</div>
{/* Input area */}
<footer className="p-4 bg-[#F9F5EB]">
<div className="max-w-2xl mx-auto">
<MessageInput
query={query}
handleQueryChange={handleQueryChange}
handleKeyDown={handleKeyDown}
handleQuestionSubmit={handleQuestionSubmit}
setSimbaMode={setSimbaMode}
/>
</div>
</footer>
</div>
</div>
);
};

View File

@@ -0,0 +1,69 @@
import { useState, useEffect } from "react";
import { conversationService } from "../api/conversationService";
type Conversation = {
title: string;
id: string;
};
type ConversationProps = {
conversations: Conversation[];
onSelectConversation: (conversation: Conversation) => void;
onCreateNewConversation: () => void;
};
export const ConversationList = ({
conversations,
onSelectConversation,
onCreateNewConversation,
}: ConversationProps) => {
const [conversationList, setConversationList] = useState(conversations);
useEffect(() => {
const loadConversations = async () => {
try {
let fetchedConversations =
await conversationService.getAllConversations();
if (conversations.length == 0) {
await conversationService.createConversation();
fetchedConversations =
await conversationService.getAllConversations();
}
setConversationList(
fetchedConversations.map((conversation) => ({
id: conversation.id,
title: conversation.name,
})),
);
} catch (error) {
console.error("Failed to load messages:", error);
}
};
loadConversations();
}, []);
return (
<div className="bg-indigo-300 rounded-md p-3 sm:p-4 flex flex-col gap-1">
{conversationList.map((conversation) => {
return (
<div
key={conversation.id}
className="border-blue-400 bg-indigo-300 hover:bg-indigo-200 cursor-pointer rounded-md p-3 min-h-[44px] flex items-center"
onClick={() => onSelectConversation(conversation)}
>
<p className="text-sm sm:text-base truncate w-full">
{conversation.title}
</p>
</div>
);
})}
<div
className="border-blue-400 bg-indigo-300 hover:bg-indigo-200 cursor-pointer rounded-md p-3 min-h-[44px] flex items-center"
onClick={() => onCreateNewConversation()}
>
<p className="text-sm sm:text-base"> + Start a new thread</p>
</div>
</div>
);
};

View File

@@ -0,0 +1,24 @@
type Conversation = {
title: string;
id: string;
};
type ConversationMenuProps = {
conversations: Conversation[];
};
export const ConversationMenu = ({ conversations }: ConversationMenuProps) => {
return (
<div className="absolute bg-white w-md rounded-md shadow-xl m-4 p-4">
<p className="py-2 px-4 rounded-md w-full text-xl font-bold">askSimba!</p>
{conversations.map((conversation) => (
<p
key={conversation.id}
className="py-2 px-4 rounded-md hover:bg-stone-200 w-full text-xl font-bold cursor-pointer"
>
{conversation.title}
</p>
))}
</div>
);
};

View File

@@ -0,0 +1,130 @@
import { useState, useEffect } from "react";
import { userService } from "../api/userService";
import { oidcService } from "../api/oidcService";
type LoginScreenProps = {
setAuthenticated: (isAuth: boolean) => void;
};
export const LoginScreen = ({ setAuthenticated }: LoginScreenProps) => {
const [error, setError] = useState<string>("");
const [isChecking, setIsChecking] = useState<boolean>(true);
const [isLoggingIn, setIsLoggingIn] = useState<boolean>(false);
useEffect(() => {
const initAuth = async () => {
// First, check for OIDC callback parameters
const callbackParams = oidcService.getCallbackParamsFromURL();
if (callbackParams) {
// Handle OIDC callback
try {
setIsLoggingIn(true);
const result = await oidcService.handleCallback(
callbackParams.code,
callbackParams.state
);
// Store tokens
localStorage.setItem("access_token", result.access_token);
localStorage.setItem("refresh_token", result.refresh_token);
// Clear URL parameters
oidcService.clearCallbackParams();
setAuthenticated(true);
setIsChecking(false);
return;
} catch (err) {
console.error("OIDC callback error:", err);
setError("Login failed. Please try again.");
oidcService.clearCallbackParams();
setIsLoggingIn(false);
setIsChecking(false);
return;
}
}
// Check if user is already authenticated
const isValid = await userService.validateToken();
if (isValid) {
setAuthenticated(true);
}
setIsChecking(false);
};
initAuth();
}, [setAuthenticated]);
const handleOIDCLogin = async () => {
try {
setIsLoggingIn(true);
setError("");
// Get authorization URL from backend
const authUrl = await oidcService.initiateLogin();
// Redirect to Authelia
window.location.href = authUrl;
} catch (err) {
setError("Failed to initiate login. Please try again.");
console.error("OIDC login error:", err);
setIsLoggingIn(false);
}
};
// Show loading state while checking authentication or processing callback
if (isChecking || isLoggingIn) {
return (
<div className="h-screen bg-opacity-20">
<div className="bg-white/85 h-screen flex items-center justify-center">
<div className="text-center">
<p className="text-lg sm:text-xl">
{isLoggingIn ? "Logging in..." : "Checking authentication..."}
</p>
</div>
</div>
</div>
);
}
return (
<div className="h-screen bg-opacity-20">
<div className="bg-white/85 h-screen">
<div className="flex flex-row justify-center py-4">
<div className="flex flex-col gap-4 w-full px-4 sm:w-11/12 sm:max-w-2xl lg:max-w-4xl sm:px-0">
<div className="flex flex-col gap-4">
<div className="flex flex-grow justify-center w-full bg-amber-400 p-2">
<h1 className="text-base sm:text-xl font-bold text-center">
I AM LOOKING FOR A DESIGNER. THIS APP WILL REMAIN UGLY UNTIL A
DESIGNER COMES.
</h1>
</div>
<header className="flex flex-row justify-center gap-2 grow sticky top-0 z-10 bg-white">
<h1 className="text-2xl sm:text-3xl">ask simba!</h1>
</header>
{error && (
<div className="text-red-600 font-semibold text-sm sm:text-base bg-red-50 p-3 rounded-md">
{error}
</div>
)}
<div className="text-center text-sm sm:text-base text-gray-600 py-2">
Click below to login with Authelia
</div>
</div>
<button
className="p-3 sm:p-4 min-h-[44px] border border-blue-400 bg-blue-200 hover:bg-blue-400 cursor-pointer rounded-md flex-grow text-sm sm:text-base font-semibold"
onClick={handleOIDCLogin}
disabled={isLoggingIn}
>
{isLoggingIn ? "Redirecting..." : "Login with Authelia"}
</button>
</div>
</div>
</div>
</div>
);
};

View File

@@ -0,0 +1,43 @@
import { useEffect, useState, useRef } from "react";
type MessageInputProps = {
handleQueryChange: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
handleKeyDown: (event: React.KeyboardEvent<HTMLTextAreaElement>) => void;
handleQuestionSubmit: () => void;
setSimbaMode: (enabled: boolean) => void;
query: string;
}
export const MessageInput = ({query, handleKeyDown, handleQueryChange, handleQuestionSubmit, setSimbaMode}: MessageInputProps) => {
return (
<div className="flex flex-col gap-4 sticky bottom-0 bg-[#3D763A] p-6 rounded-xl">
<div className="flex flex-row justify-between grow">
<textarea
className="p-3 sm:p-4 border border-blue-200 rounded-md grow bg-[#F9F5EB] min-h-[44px] resize-y"
onChange={handleQueryChange}
onKeyDown={handleKeyDown}
value={query}
rows={2}
placeholder="Type your message... (Press Enter to send, Shift+Enter for new line)"
/>
</div>
<div className="flex flex-row justify-between gap-2 grow">
<button
className="p-3 sm:p-4 min-h-[44px] border border-blue-400 bg-[#EDA541] hover:bg-blue-400 cursor-pointer rounded-md flex-grow text-sm sm:text-base"
onClick={() => handleQuestionSubmit()}
type="submit"
>
Submit
</button>
</div>
<div className="flex flex-row justify-center gap-2 grow items-center">
<input
type="checkbox"
onChange={(event) => setSimbaMode(event.target.checked)}
className="w-5 h-5 cursor-pointer"
/>
<p className="text-sm sm:text-base">simba mode?</p>
</div>
</div>
);
}

View File

@@ -0,0 +1,11 @@
type QuestionBubbleProps = {
text: string;
};
export const QuestionBubble = ({ text }: QuestionBubbleProps) => {
return (
<div className="w-2/3 rounded-md bg-stone-200 p-3 sm:p-4 break-words overflow-wrap-anywhere text-sm sm:text-base ml-auto">
🤦: {text}
</div>
);
};

View File

@@ -0,0 +1,56 @@
import { createContext, useContext, useState, ReactNode } from "react";
import { userService } from "../api/userService";
interface AuthContextType {
token: string | null;
login: (username: string, password: string) => Promise<any>;
logout: () => void;
isAuthenticated: () => boolean;
}
const AuthContext = createContext<AuthContextType | undefined>(undefined);
interface AuthProviderProps {
children: ReactNode;
}
export const AuthProvider = ({ children }: AuthProviderProps) => {
const [token, setToken] = useState(localStorage.getItem("access_token"));
const login = async (username: string, password: string) => {
try {
const data = await userService.login(username, password);
setToken(data.access_token);
localStorage.setItem("access_token", data.access_token);
localStorage.setItem("refresh_token", data.refresh_token);
return data;
} catch (error) {
console.error("Login failed:", error);
throw error;
}
};
const logout = () => {
setToken(null);
localStorage.removeItem("access_token");
localStorage.removeItem("refresh_token");
};
const isAuthenticated = () => {
return token !== null && token !== undefined && token !== "";
};
return (
<AuthContext.Provider value={{ token, login, logout, isAuthenticated }}>
{children}
</AuthContext.Provider>
);
};
export const useAuth = () => {
const context = useContext(AuthContext);
if (context === undefined) {
throw new Error("useAuth must be used within an AuthProvider");
}
return context;
};

View File

(image preview omitted: binary image replaced, 3.4 MiB before and after)

View File

(image preview omitted: binary image replaced, 2.1 MiB before and after)

File diff suppressed because it is too large.

services/raggr/startup-dev.sh Executable file
View File

@@ -0,0 +1,29 @@
#!/bin/bash
set -e
echo "Initializing directories..."
mkdir -p /app/chromadb
echo "Waiting for frontend to build..."
while [ ! -f /app/raggr-frontend/dist/index.html ]; do
sleep 1
done
echo "Frontend built successfully!"
echo "Setting up database..."
# Give PostgreSQL a moment to be ready (healthcheck in docker-compose handles this)
sleep 3
if ls migrations/models/0_*.py 1> /dev/null 2>&1; then
echo "Running database migrations..."
aerich upgrade
else
echo "No migrations found, initializing database..."
aerich init-db
fi
echo "Starting reindex process..."
python main.py "" --reindex || echo "Reindex failed, continuing anyway..."
echo "Starting Flask application in debug mode..."
python app.py

View File

@@ -1,5 +1,8 @@
#!/bin/bash
echo "Running database migrations..."
aerich upgrade
echo "Starting reindex process..."
python main.py "" --reindex

View File

@@ -1,7 +1,22 @@
version = 1
revision = 2
revision = 3
requires-python = ">=3.13"
[[package]]
name = "aerich"
version = "0.9.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "asyncclick" },
{ name = "dictdiffer" },
{ name = "tortoise-orm" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c4/60/5d3885f531fab2cecec67510e7b821efc403940ed9eefd034b2c21350f3c/aerich-0.9.2.tar.gz", hash = "sha256:02d58658714eebe396fe7bd9f9401db3a60a44dc885910ad3990920d0357317d", size = 74231, upload-time = "2025-10-10T05:53:49.632Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/87/1a/956c6b1e35881bb9835a33c8db1565edcd133f8e45321010489092a0df40/aerich-0.9.2-py3-none-any.whl", hash = "sha256:d0f007acb21f6559f1eccd4e404fb039cf48af2689e0669afa62989389c0582d", size = 46451, upload-time = "2025-10-10T05:53:48.71Z" },
]
[[package]]
name = "aiofiles"
version = "25.1.0"
@@ -45,6 +60,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/15/b3/9b1a8074496371342ec1e796a96f99c82c945a339cd81a8e73de28b4cf9e/anyio-4.11.0-py3-none-any.whl", hash = "sha256:0287e96f4d26d4149305414d4e3bc32f0dcd0862365a4bddea19d7a1ec38c4fc", size = 109097, upload-time = "2025-09-23T09:19:10.601Z" },
]
[[package]]
name = "asyncclick"
version = "8.3.0.7"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f9/ca/25e426d16bd0e91c1c9259112cecd17b2c2c239bdd8e5dba430f3bd5e3ef/asyncclick-8.3.0.7.tar.gz", hash = "sha256:8a80d8ac613098ee6a9a8f0248f60c66c273e22402cf3f115ed7f071acfc71d3", size = 277634, upload-time = "2025-10-11T08:35:44.841Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/01/d9/782ffcb4c97b889bc12d8276637d2739b99520390ee8fec77c07416c5d12/asyncclick-8.3.0.7-py3-none-any.whl", hash = "sha256:7607046de39a3f315867cad818849f973e29d350c10d92f251db3ff7600c6c7d", size = 109925, upload-time = "2025-10-11T08:35:43.378Z" },
]
[[package]]
name = "attrs"
version = "25.3.0"
@@ -54,6 +81,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" },
]
[[package]]
name = "authlib"
version = "1.6.6"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cryptography" },
]
sdist = { url = "https://files.pythonhosted.org/packages/bb/9b/b1661026ff24bc641b76b78c5222d614776b0c085bcfdac9bd15a1cb4b35/authlib-1.6.6.tar.gz", hash = "sha256:45770e8e056d0f283451d9996fbb59b70d45722b45d854d58f32878d0a40c38e", size = 164894, upload-time = "2025-12-12T08:01:41.464Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/54/51/321e821856452f7386c4e9df866f196720b1ad0c5ea1623ea7399969ae3b/authlib-1.6.6-py2.py3-none-any.whl", hash = "sha256:7d9e9bc535c13974313a87f53e8430eb6ea3d1cf6ae4f6efcd793f2e949143fd", size = 244005, upload-time = "2025-12-12T08:01:40.209Z" },
]
[[package]]
name = "backoff"
version = "2.2.1"
@@ -191,6 +230,51 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/e5/48/1549795ba7742c948d2ad169c1c8cdbae65bc450d6cd753d124b17c8cd32/certifi-2025.8.3-py3-none-any.whl", hash = "sha256:f6c12493cfb1b06ba2ff328595af9350c65d6644968e5d3a2ffd78699af217a5", size = 161216, upload-time = "2025-08-03T03:07:45.777Z" },
]
[[package]]
name = "cffi"
version = "2.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pycparser", marker = "implementation_name != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/eb/56/b1ba7935a17738ae8453301356628e8147c79dbb825bcbc73dc7401f9846/cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", size = 523588, upload-time = "2025-09-08T23:24:04.541Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" },
{ url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = "2025-09-08T23:23:02.231Z" },
{ url = "https://files.pythonhosted.org/packages/b0/1e/d22cc63332bd59b06481ceaac49d6c507598642e2230f201649058a7e704/cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", size = 212446, upload-time = "2025-09-08T23:23:03.472Z" },
{ url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" },
{ url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" },
{ url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" },
{ url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" },
{ url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" },
{ url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" },
{ url = "https://files.pythonhosted.org/packages/eb/6d/bf9bda840d5f1dfdbf0feca87fbdb64a918a69bca42cfa0ba7b137c48cb8/cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", size = 172909, upload-time = "2025-09-08T23:23:14.32Z" },
{ url = "https://files.pythonhosted.org/packages/37/18/6519e1ee6f5a1e579e04b9ddb6f1676c17368a7aba48299c3759bbc3c8b3/cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", size = 183402, upload-time = "2025-09-08T23:23:15.535Z" },
{ url = "https://files.pythonhosted.org/packages/cb/0e/02ceeec9a7d6ee63bb596121c2c8e9b3a9e150936f4fbef6ca1943e6137c/cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", size = 177780, upload-time = "2025-09-08T23:23:16.761Z" },
{ url = "https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" },
{ url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" },
{ url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" },
{ url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" },
{ url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" },
{ url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" },
{ url = "https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" },
{ url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" },
{ url = "https://files.pythonhosted.org/packages/3e/aa/df335faa45b395396fcbc03de2dfcab242cd61a9900e914fe682a59170b1/cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", size = 175328, upload-time = "2025-09-08T23:23:44.61Z" },
{ url = "https://files.pythonhosted.org/packages/bb/92/882c2d30831744296ce713f0feb4c1cd30f346ef747b530b5318715cc367/cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", size = 185650, upload-time = "2025-09-08T23:23:45.848Z" },
{ url = "https://files.pythonhosted.org/packages/9f/2c/98ece204b9d35a7366b5b2c6539c350313ca13932143e79dc133ba757104/cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", size = 180687, upload-time = "2025-09-08T23:23:47.105Z" },
{ url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" },
{ url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" },
{ url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" },
{ url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" },
{ url = "https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" },
{ url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" },
{ url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" },
{ url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" },
{ url = "https://files.pythonhosted.org/packages/a0/1d/ec1a60bd1a10daa292d3cd6bb0b359a81607154fb8165f3ec95fe003b85c/cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", size = 180487, upload-time = "2025-09-08T23:23:40.423Z" },
{ url = "https://files.pythonhosted.org/packages/bf/41/4c1168c74fac325c0c8156f04b6749c8b6a8f405bbf91413ba088359f60d/cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", size = 191726, upload-time = "2025-09-08T23:23:41.742Z" },
{ url = "https://files.pythonhosted.org/packages/ae/3a/dbeec9d1ee0844c679f6bb5d6ad4e9f198b1224f4e7a32825f47f6192b0c/cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", size = 184195, upload-time = "2025-09-08T23:23:43.004Z" },
]
[[package]]
name = "cfgv"
version = "3.4.0"
@@ -306,6 +390,71 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/a7/06/3d6badcf13db419e25b07041d9c7b4a2c331d3f4e7134445ec5df57714cd/coloredlogs-15.0.1-py2.py3-none-any.whl", hash = "sha256:612ee75c546f53e92e70049c9dbfcc18c935a2b9a53b66085ce9ef6a6e5c0934", size = 46018, upload-time = "2021-06-11T10:22:42.561Z" },
]
[[package]]
name = "cryptography"
version = "46.0.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "cffi", marker = "platform_python_implementation != 'PyPy'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9f/33/c00162f49c0e2fe8064a62cb92b93e50c74a72bc370ab92f86112b33ff62/cryptography-46.0.3.tar.gz", hash = "sha256:a8b17438104fed022ce745b362294d9ce35b4c2e45c1d958ad4a4b019285f4a1", size = 749258, upload-time = "2025-10-15T23:18:31.74Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1d/42/9c391dd801d6cf0d561b5890549d4b27bafcc53b39c31a817e69d87c625b/cryptography-46.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:109d4ddfadf17e8e7779c39f9b18111a09efb969a301a31e987416a0191ed93a", size = 7225004, upload-time = "2025-10-15T23:16:52.239Z" },
{ url = "https://files.pythonhosted.org/packages/1c/67/38769ca6b65f07461eb200e85fc1639b438bdc667be02cf7f2cd6a64601c/cryptography-46.0.3-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:09859af8466b69bc3c27bdf4f5d84a665e0f7ab5088412e9e2ec49758eca5cbc", size = 4296667, upload-time = "2025-10-15T23:16:54.369Z" },
{ url = "https://files.pythonhosted.org/packages/5c/49/498c86566a1d80e978b42f0d702795f69887005548c041636df6ae1ca64c/cryptography-46.0.3-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:01ca9ff2885f3acc98c29f1860552e37f6d7c7d013d7334ff2a9de43a449315d", size = 4450807, upload-time = "2025-10-15T23:16:56.414Z" },
{ url = "https://files.pythonhosted.org/packages/4b/0a/863a3604112174c8624a2ac3c038662d9e59970c7f926acdcfaed8d61142/cryptography-46.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6eae65d4c3d33da080cff9c4ab1f711b15c1d9760809dad6ea763f3812d254cb", size = 4299615, upload-time = "2025-10-15T23:16:58.442Z" },
{ url = "https://files.pythonhosted.org/packages/64/02/b73a533f6b64a69f3cd3872acb6ebc12aef924d8d103133bb3ea750dc703/cryptography-46.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5bf0ed4490068a2e72ac03d786693adeb909981cc596425d09032d372bcc849", size = 4016800, upload-time = "2025-10-15T23:17:00.378Z" },
{ url = "https://files.pythonhosted.org/packages/25/d5/16e41afbfa450cde85a3b7ec599bebefaef16b5c6ba4ec49a3532336ed72/cryptography-46.0.3-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:5ecfccd2329e37e9b7112a888e76d9feca2347f12f37918facbb893d7bb88ee8", size = 4984707, upload-time = "2025-10-15T23:17:01.98Z" },
{ url = "https://files.pythonhosted.org/packages/c9/56/e7e69b427c3878352c2fb9b450bd0e19ed552753491d39d7d0a2f5226d41/cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a2c0cd47381a3229c403062f764160d57d4d175e022c1df84e168c6251a22eec", size = 4482541, upload-time = "2025-10-15T23:17:04.078Z" },
{ url = "https://files.pythonhosted.org/packages/78/f6/50736d40d97e8483172f1bb6e698895b92a223dba513b0ca6f06b2365339/cryptography-46.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:549e234ff32571b1f4076ac269fcce7a808d3bf98b76c8dd560e42dbc66d7d91", size = 4299464, upload-time = "2025-10-15T23:17:05.483Z" },
{ url = "https://files.pythonhosted.org/packages/00/de/d8e26b1a855f19d9994a19c702fa2e93b0456beccbcfe437eda00e0701f2/cryptography-46.0.3-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:c0a7bb1a68a5d3471880e264621346c48665b3bf1c3759d682fc0864c540bd9e", size = 4950838, upload-time = "2025-10-15T23:17:07.425Z" },
{ url = "https://files.pythonhosted.org/packages/8f/29/798fc4ec461a1c9e9f735f2fc58741b0daae30688f41b2497dcbc9ed1355/cryptography-46.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:10b01676fc208c3e6feeb25a8b83d81767e8059e1fe86e1dc62d10a3018fa926", size = 4481596, upload-time = "2025-10-15T23:17:09.343Z" },
{ url = "https://files.pythonhosted.org/packages/15/8d/03cd48b20a573adfff7652b76271078e3045b9f49387920e7f1f631d125e/cryptography-46.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0abf1ffd6e57c67e92af68330d05760b7b7efb243aab8377e583284dbab72c71", size = 4426782, upload-time = "2025-10-15T23:17:11.22Z" },
{ url = "https://files.pythonhosted.org/packages/fa/b1/ebacbfe53317d55cf33165bda24c86523497a6881f339f9aae5c2e13e57b/cryptography-46.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a04bee9ab6a4da801eb9b51f1b708a1b5b5c9eb48c03f74198464c66f0d344ac", size = 4698381, upload-time = "2025-10-15T23:17:12.829Z" },
{ url = "https://files.pythonhosted.org/packages/96/92/8a6a9525893325fc057a01f654d7efc2c64b9de90413adcf605a85744ff4/cryptography-46.0.3-cp311-abi3-win32.whl", hash = "sha256:f260d0d41e9b4da1ed1e0f1ce571f97fe370b152ab18778e9e8f67d6af432018", size = 3055988, upload-time = "2025-10-15T23:17:14.65Z" },
{ url = "https://files.pythonhosted.org/packages/7e/bf/80fbf45253ea585a1e492a6a17efcb93467701fa79e71550a430c5e60df0/cryptography-46.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:a9a3008438615669153eb86b26b61e09993921ebdd75385ddd748702c5adfddb", size = 3514451, upload-time = "2025-10-15T23:17:16.142Z" },
{ url = "https://files.pythonhosted.org/packages/2e/af/9b302da4c87b0beb9db4e756386a7c6c5b8003cd0e742277888d352ae91d/cryptography-46.0.3-cp311-abi3-win_arm64.whl", hash = "sha256:5d7f93296ee28f68447397bf5198428c9aeeab45705a55d53a6343455dcb2c3c", size = 2928007, upload-time = "2025-10-15T23:17:18.04Z" },
{ url = "https://files.pythonhosted.org/packages/f5/e2/a510aa736755bffa9d2f75029c229111a1d02f8ecd5de03078f4c18d91a3/cryptography-46.0.3-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:00a5e7e87938e5ff9ff5447ab086a5706a957137e6e433841e9d24f38a065217", size = 7158012, upload-time = "2025-10-15T23:17:19.982Z" },
{ url = "https://files.pythonhosted.org/packages/73/dc/9aa866fbdbb95b02e7f9d086f1fccfeebf8953509b87e3f28fff927ff8a0/cryptography-46.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c8daeb2d2174beb4575b77482320303f3d39b8e81153da4f0fb08eb5fe86a6c5", size = 4288728, upload-time = "2025-10-15T23:17:21.527Z" },
{ url = "https://files.pythonhosted.org/packages/c5/fd/bc1daf8230eaa075184cbbf5f8cd00ba9db4fd32d63fb83da4671b72ed8a/cryptography-46.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:39b6755623145ad5eff1dab323f4eae2a32a77a7abef2c5089a04a3d04366715", size = 4435078, upload-time = "2025-10-15T23:17:23.042Z" },
{ url = "https://files.pythonhosted.org/packages/82/98/d3bd5407ce4c60017f8ff9e63ffee4200ab3e23fe05b765cab805a7db008/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:db391fa7c66df6762ee3f00c95a89e6d428f4d60e7abc8328f4fe155b5ac6e54", size = 4293460, upload-time = "2025-10-15T23:17:24.885Z" },
{ url = "https://files.pythonhosted.org/packages/26/e9/e23e7900983c2b8af7a08098db406cf989d7f09caea7897e347598d4cd5b/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:78a97cf6a8839a48c49271cdcbd5cf37ca2c1d6b7fdd86cc864f302b5e9bf459", size = 3995237, upload-time = "2025-10-15T23:17:26.449Z" },
{ url = "https://files.pythonhosted.org/packages/91/15/af68c509d4a138cfe299d0d7ddb14afba15233223ebd933b4bbdbc7155d3/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:dfb781ff7eaa91a6f7fd41776ec37c5853c795d3b358d4896fdbb5df168af422", size = 4967344, upload-time = "2025-10-15T23:17:28.06Z" },
{ url = "https://files.pythonhosted.org/packages/ca/e3/8643d077c53868b681af077edf6b3cb58288b5423610f21c62aadcbe99f4/cryptography-46.0.3-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:6f61efb26e76c45c4a227835ddeae96d83624fb0d29eb5df5b96e14ed1a0afb7", size = 4466564, upload-time = "2025-10-15T23:17:29.665Z" },
{ url = "https://files.pythonhosted.org/packages/0e/43/c1e8726fa59c236ff477ff2b5dc071e54b21e5a1e51aa2cee1676f1c986f/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:23b1a8f26e43f47ceb6d6a43115f33a5a37d57df4ea0ca295b780ae8546e8044", size = 4292415, upload-time = "2025-10-15T23:17:31.686Z" },
{ url = "https://files.pythonhosted.org/packages/42/f9/2f8fefdb1aee8a8e3256a0568cffc4e6d517b256a2fe97a029b3f1b9fe7e/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:b419ae593c86b87014b9be7396b385491ad7f320bde96826d0dd174459e54665", size = 4931457, upload-time = "2025-10-15T23:17:33.478Z" },
{ url = "https://files.pythonhosted.org/packages/79/30/9b54127a9a778ccd6d27c3da7563e9f2d341826075ceab89ae3b41bf5be2/cryptography-46.0.3-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:50fc3343ac490c6b08c0cf0d704e881d0d660be923fd3076db3e932007e726e3", size = 4466074, upload-time = "2025-10-15T23:17:35.158Z" },
{ url = "https://files.pythonhosted.org/packages/ac/68/b4f4a10928e26c941b1b6a179143af9f4d27d88fe84a6a3c53592d2e76bf/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22d7e97932f511d6b0b04f2bfd818d73dcd5928db509460aaf48384778eb6d20", size = 4420569, upload-time = "2025-10-15T23:17:37.188Z" },
{ url = "https://files.pythonhosted.org/packages/a3/49/3746dab4c0d1979888f125226357d3262a6dd40e114ac29e3d2abdf1ec55/cryptography-46.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d55f3dffadd674514ad19451161118fd010988540cee43d8bc20675e775925de", size = 4681941, upload-time = "2025-10-15T23:17:39.236Z" },
{ url = "https://files.pythonhosted.org/packages/fd/30/27654c1dbaf7e4a3531fa1fc77986d04aefa4d6d78259a62c9dc13d7ad36/cryptography-46.0.3-cp314-cp314t-win32.whl", hash = "sha256:8a6e050cb6164d3f830453754094c086ff2d0b2f3a897a1d9820f6139a1f0914", size = 3022339, upload-time = "2025-10-15T23:17:40.888Z" },
{ url = "https://files.pythonhosted.org/packages/f6/30/640f34ccd4d2a1bc88367b54b926b781b5a018d65f404d409aba76a84b1c/cryptography-46.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:760f83faa07f8b64e9c33fc963d790a2edb24efb479e3520c14a45741cd9b2db", size = 3494315, upload-time = "2025-10-15T23:17:42.769Z" },
{ url = "https://files.pythonhosted.org/packages/ba/8b/88cc7e3bd0a8e7b861f26981f7b820e1f46aa9d26cc482d0feba0ecb4919/cryptography-46.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:516ea134e703e9fe26bcd1277a4b59ad30586ea90c365a87781d7887a646fe21", size = 2919331, upload-time = "2025-10-15T23:17:44.468Z" },
{ url = "https://files.pythonhosted.org/packages/fd/23/45fe7f376a7df8daf6da3556603b36f53475a99ce4faacb6ba2cf3d82021/cryptography-46.0.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:cb3d760a6117f621261d662bccc8ef5bc32ca673e037c83fbe565324f5c46936", size = 7218248, upload-time = "2025-10-15T23:17:46.294Z" },
{ url = "https://files.pythonhosted.org/packages/27/32/b68d27471372737054cbd34c84981f9edbc24fe67ca225d389799614e27f/cryptography-46.0.3-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4b7387121ac7d15e550f5cb4a43aef2559ed759c35df7336c402bb8275ac9683", size = 4294089, upload-time = "2025-10-15T23:17:48.269Z" },
{ url = "https://files.pythonhosted.org/packages/26/42/fa8389d4478368743e24e61eea78846a0006caffaf72ea24a15159215a14/cryptography-46.0.3-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:15ab9b093e8f09daab0f2159bb7e47532596075139dd74365da52ecc9cb46c5d", size = 4440029, upload-time = "2025-10-15T23:17:49.837Z" },
{ url = "https://files.pythonhosted.org/packages/5f/eb/f483db0ec5ac040824f269e93dd2bd8a21ecd1027e77ad7bdf6914f2fd80/cryptography-46.0.3-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:46acf53b40ea38f9c6c229599a4a13f0d46a6c3fa9ef19fc1a124d62e338dfa0", size = 4297222, upload-time = "2025-10-15T23:17:51.357Z" },
{ url = "https://files.pythonhosted.org/packages/fd/cf/da9502c4e1912cb1da3807ea3618a6829bee8207456fbbeebc361ec38ba3/cryptography-46.0.3-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10ca84c4668d066a9878890047f03546f3ae0a6b8b39b697457b7757aaf18dbc", size = 4012280, upload-time = "2025-10-15T23:17:52.964Z" },
{ url = "https://files.pythonhosted.org/packages/6b/8f/9adb86b93330e0df8b3dcf03eae67c33ba89958fc2e03862ef1ac2b42465/cryptography-46.0.3-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:36e627112085bb3b81b19fed209c05ce2a52ee8b15d161b7c643a7d5a88491f3", size = 4978958, upload-time = "2025-10-15T23:17:54.965Z" },
{ url = "https://files.pythonhosted.org/packages/d1/a0/5fa77988289c34bdb9f913f5606ecc9ada1adb5ae870bd0d1054a7021cc4/cryptography-46.0.3-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1000713389b75c449a6e979ffc7dcc8ac90b437048766cef052d4d30b8220971", size = 4473714, upload-time = "2025-10-15T23:17:56.754Z" },
{ url = "https://files.pythonhosted.org/packages/14/e5/fc82d72a58d41c393697aa18c9abe5ae1214ff6f2a5c18ac470f92777895/cryptography-46.0.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:b02cf04496f6576afffef5ddd04a0cb7d49cf6be16a9059d793a30b035f6b6ac", size = 4296970, upload-time = "2025-10-15T23:17:58.588Z" },
{ url = "https://files.pythonhosted.org/packages/78/06/5663ed35438d0b09056973994f1aec467492b33bd31da36e468b01ec1097/cryptography-46.0.3-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:71e842ec9bc7abf543b47cf86b9a743baa95f4677d22baa4c7d5c69e49e9bc04", size = 4940236, upload-time = "2025-10-15T23:18:00.897Z" },
{ url = "https://files.pythonhosted.org/packages/fc/59/873633f3f2dcd8a053b8dd1d38f783043b5fce589c0f6988bf55ef57e43e/cryptography-46.0.3-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:402b58fc32614f00980b66d6e56a5b4118e6cb362ae8f3fda141ba4689bd4506", size = 4472642, upload-time = "2025-10-15T23:18:02.749Z" },
{ url = "https://files.pythonhosted.org/packages/3d/39/8e71f3930e40f6877737d6f69248cf74d4e34b886a3967d32f919cc50d3b/cryptography-46.0.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef639cb3372f69ec44915fafcd6698b6cc78fbe0c2ea41be867f6ed612811963", size = 4423126, upload-time = "2025-10-15T23:18:04.85Z" },
{ url = "https://files.pythonhosted.org/packages/cd/c7/f65027c2810e14c3e7268353b1681932b87e5a48e65505d8cc17c99e36ae/cryptography-46.0.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3b51b8ca4f1c6453d8829e1eb7299499ca7f313900dd4d89a24b8b87c0a780d4", size = 4686573, upload-time = "2025-10-15T23:18:06.908Z" },
{ url = "https://files.pythonhosted.org/packages/0a/6e/1c8331ddf91ca4730ab3086a0f1be19c65510a33b5a441cb334e7a2d2560/cryptography-46.0.3-cp38-abi3-win32.whl", hash = "sha256:6276eb85ef938dc035d59b87c8a7dc559a232f954962520137529d77b18ff1df", size = 3036695, upload-time = "2025-10-15T23:18:08.672Z" },
{ url = "https://files.pythonhosted.org/packages/90/45/b0d691df20633eff80955a0fc7695ff9051ffce8b69741444bd9ed7bd0db/cryptography-46.0.3-cp38-abi3-win_amd64.whl", hash = "sha256:416260257577718c05135c55958b674000baef9a1c7d9e8f306ec60d71db850f", size = 3501720, upload-time = "2025-10-15T23:18:10.632Z" },
{ url = "https://files.pythonhosted.org/packages/e8/cb/2da4cc83f5edb9c3257d09e1e7ab7b23f049c7962cae8d842bbef0a9cec9/cryptography-46.0.3-cp38-abi3-win_arm64.whl", hash = "sha256:d89c3468de4cdc4f08a57e214384d0471911a3830fcdaf7a8cc587e42a866372", size = 2918740, upload-time = "2025-10-15T23:18:12.277Z" },
]

[[package]]
name = "dictdiffer"
version = "0.9.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/61/7b/35cbccb7effc5d7e40f4c55e2b79399e1853041997fcda15c9ff160abba0/dictdiffer-0.9.0.tar.gz", hash = "sha256:17bacf5fbfe613ccf1b6d512bd766e6b21fb798822a133aa86098b8ac9997578", size = 31513, upload-time = "2021-07-22T13:24:29.276Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/47/ef/4cb333825d10317a36a1154341ba37e6e9c087bac99c1990ef07ffdb376f/dictdiffer-0.9.0-py2.py3-none-any.whl", hash = "sha256:442bfc693cfcadaf46674575d2eba1c53b42f5e404218ca2c2ff549f2df56595", size = 16754, upload-time = "2021-07-22T13:24:26.783Z" },
]

[[package]]
name = "distlib"
version = "0.4.0"
@@ -1446,6 +1595,15 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/3c/52/5600104ef7b85f89fb8ec54f73504ead3f6f0294027e08d281f3cafb5c1a/pybase64-1.4.2-cp314-cp314t-win_arm64.whl", hash = "sha256:f25140496b02db0e7401567cd869fb13b4c8118bf5c2428592ec339987146d8b", size = 31600, upload-time = "2025-07-27T13:05:52.24Z" },
]

[[package]]
name = "pycparser"
version = "2.23"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/fe/cf/d2d3b9f5699fb1e4615c8e32ff220203e43b248e1dfcc6736ad9057731ca/pycparser-2.23.tar.gz", hash = "sha256:78816d4f24add8f10a06d6f05b4d424ad9e96cfebf68a4ddc99c65c0720d00c2", size = 173734, upload-time = "2025-09-09T13:23:47.91Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/e3/59cd50310fc9b59512193629e1984c1f95e5c8ae6e5d8c69532ccc65a7fe/pycparser-2.23-py3-none-any.whl", hash = "sha256:e5c6e8d3fbad53479cab09ac03729e0a9faf2bee3db8208a550daf5af81a5934", size = 118140, upload-time = "2025-09-09T13:23:46.651Z" },
]

[[package]]
name = "pydantic"
version = "2.11.9"
@@ -1669,6 +1827,8 @@ name = "raggr"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "aerich" },
{ name = "authlib" },
{ name = "bcrypt" },
{ name = "black" },
{ name = "chromadb" },
@@ -1687,11 +1847,15 @@ dependencies = [
{ name = "python-dotenv" },
{ name = "quart" },
{ name = "quart-jwt-extended" },
{ name = "tomlkit" },
{ name = "tortoise-orm" },
{ name = "tortoise-orm-stubs" },
]

[package.metadata]
requires-dist = [
{ name = "aerich", specifier = ">=0.8.0" },
{ name = "authlib", specifier = ">=1.3.0" },
{ name = "bcrypt", specifier = ">=5.0.0" },
{ name = "black", specifier = ">=25.9.0" },
{ name = "chromadb", specifier = ">=1.1.0" },
@@ -1710,7 +1874,9 @@ requires-dist = [
{ name = "python-dotenv", specifier = ">=1.0.0" },
{ name = "quart", specifier = ">=0.20.0" },
{ name = "quart-jwt-extended", specifier = ">=0.1.0" },
{ name = "tomlkit", specifier = ">=0.13.3" },
{ name = "tortoise-orm", specifier = ">=0.25.1" },
{ name = "tortoise-orm-stubs", specifier = ">=1.0.2" },
]

[[package]]
@@ -1918,14 +2084,23 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b3/46/e33a8c93907b631a99377ef4c5f817ab453d0b34f93529421f42ff559671/tokenizers-0.22.1-cp39-abi3-win_amd64.whl", hash = "sha256:65fd6e3fb11ca1e78a6a93602490f134d1fdeb13bcef99389d5102ea318ed138", size = 2674684, upload-time = "2025-09-19T09:49:24.953Z" },
]

[[package]]
name = "tomlkit"
version = "0.13.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/cc/18/0bbf3884e9eaa38819ebe46a7bd25dcd56b67434402b66a58c4b8e552575/tomlkit-0.13.3.tar.gz", hash = "sha256:430cf247ee57df2b94ee3fbe588e71d362a941ebb545dec29b53961d61add2a1", size = 185207, upload-time = "2025-06-05T07:13:44.947Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/bd/75/8539d011f6be8e29f339c42e633aae3cb73bffa95dd0f9adec09b9c58e85/tomlkit-0.13.3-py3-none-any.whl", hash = "sha256:c89c649d79ee40629a9fda55f8ace8c6a1b42deb912b2a8fd8d942ddadb606b0", size = 38901, upload-time = "2025-06-05T07:13:43.546Z" },
]

[[package]]
name = "tortoise-orm"
version = "0.25.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "aiosqlite" },
{ name = "iso8601", marker = "python_full_version < '4.0'" },
{ name = "pypika-tortoise", marker = "python_full_version < '4.0'" },
{ name = "iso8601", marker = "python_full_version < '4'" },
{ name = "pypika-tortoise", marker = "python_full_version < '4'" },
{ name = "pytz" },
]
sdist = { url = "https://files.pythonhosted.org/packages/d7/9b/de966810021fa773fead258efd8deea2bb73bb12479e27f288bd8ceb8763/tortoise_orm-0.25.1.tar.gz", hash = "sha256:4d5bfd13d5750935ffe636a6b25597c5c8f51c47e5b72d7509d712eda1a239fe", size = 128341, upload-time = "2025-06-05T10:43:31.058Z" }
@@ -1933,6 +2108,18 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/70/55/2bda7f4445f4c07b734385b46d1647a388d05160cf5b8714a713e8709378/tortoise_orm-0.25.1-py3-none-any.whl", hash = "sha256:df0ef7e06eb0650a7e5074399a51ee6e532043308c612db2cac3882486a3fd9f", size = 167723, upload-time = "2025-06-05T10:43:29.309Z" },
]

[[package]]
name = "tortoise-orm-stubs"
version = "1.0.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "tortoise-orm" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ba/49/45b06cda907e55226b8ed4ddc71d13ff61505bfe366d72276462eeee9d2b/tortoise_orm_stubs-1.0.2.tar.gz", hash = "sha256:f4d6a810f295bebd83aa71b05ebd2decd883517f3c9530bd2376b9209b0777c6", size = 4559, upload-time = "2023-11-20T14:48:26.806Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/33/b1/f0b111dcf9381987f8acb143dd95b77934a3e9120a6c63b2cf4255c2934c/tortoise_orm_stubs-1.0.2-py3-none-any.whl", hash = "sha256:5ae3c2b0eb0286669563634b98202bbdf46349966b1c85659f3160de4fb655d6", size = 4681, upload-time = "2023-11-20T14:48:22.536Z" },
]

[[package]]
name = "tqdm"
version = "4.67.1"