reorganization

docs/MIGRATIONS.md (new file, 54 lines)
# Database Migrations with Aerich

## Initial Setup (Run Once)

1. Install dependencies:

```bash
uv pip install -e .
```

2. Initialize Aerich:

```bash
aerich init-db
```

This will:

- Create a `migrations/` directory
- Generate the initial migration based on your models
- Create all tables in the database
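
Aerich locates the Tortoise ORM configuration through the `[tool.aerich]` section in `pyproject.toml`, which points at `TORTOISE_CONFIG` in `app.py`. A minimal sketch of the shape such a config takes — the connection URL and model module names below are placeholder assumptions, not this project's actual values:

```python
# Illustrative Tortoise ORM config of the shape aerich expects.
# The database URL and module names are placeholders, not this project's values.
TORTOISE_CONFIG = {
    "connections": {"default": "postgres://user:pass@localhost:5432/raggr"},
    "apps": {
        "models": {
            # "aerich.models" must be listed so aerich can track applied migrations
            "models": ["models", "aerich.models"],
            "default_connection": "default",
        }
    },
}
```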

## When You Add/Change Models

1. Generate a new migration:

```bash
aerich migrate --name "describe_your_changes"
```

Example:

```bash
aerich migrate --name "add_user_profile_model"
```

2. Apply the migration:

```bash
aerich upgrade
```

## Common Commands

- `aerich init-db` - Initialize database (first time only)
- `aerich migrate --name "description"` - Generate new migration
- `aerich upgrade` - Apply pending migrations
- `aerich downgrade` - Roll back the last migration
- `aerich history` - Show migration history
- `aerich heads` - Show current migration heads

## Docker Setup

In Docker, migrations run automatically on container startup via the startup script.

## Notes

- Migration files are stored in `migrations/models/`
- Always commit migration files to version control
- Don't modify migration files manually after they're created

docs/VECTORSTORE.md (new file, 97 lines)
# Vector Store Management

This document describes how to manage the ChromaDB vector store used for RAG (Retrieval-Augmented Generation).

## Configuration

The vector store location is controlled by the `CHROMADB_PATH` environment variable:

- **Development (local)**: Set in `.env` to a local path (e.g., `/path/to/chromadb`)
- **Docker**: Automatically set to `/app/data/chromadb` and persisted via a Docker volume

## Management Commands

### CLI (Command Line)

Use the `scripts/manage_vectorstore.py` script for vector store operations:

```bash
# Show statistics
python scripts/manage_vectorstore.py stats

# Index documents from Paperless-NGX (incremental)
python scripts/manage_vectorstore.py index

# Clear and reindex all documents
python scripts/manage_vectorstore.py reindex

# List documents
python scripts/manage_vectorstore.py list 10
python scripts/manage_vectorstore.py list 20 --show-content
```
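
The script's actual source isn't shown here; a subcommand CLI like the one above is typically wired up with `argparse` along these lines (a sketch under that assumption, not the script's real code):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Build a subcommand parser mirroring the CLI surface documented above."""
    parser = argparse.ArgumentParser(description="Vector store management")
    sub = parser.add_subparsers(dest="command", required=True)
    sub.add_parser("stats", help="Show collection statistics")
    sub.add_parser("index", help="Incrementally index new documents")
    sub.add_parser("reindex", help="Clear the collection and reindex everything")
    list_cmd = sub.add_parser("list", help="List indexed documents")
    list_cmd.add_argument("limit", type=int, nargs="?", default=10)
    list_cmd.add_argument("--show-content", action="store_true")
    return parser

# Parsing the documented invocation `list 20 --show-content`
args = build_parser().parse_args(["list", "20", "--show-content"])
```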

### Docker

Run commands inside the Docker container:

```bash
# Show statistics
docker compose exec raggr python scripts/manage_vectorstore.py stats

# Reindex all documents
docker compose exec raggr python scripts/manage_vectorstore.py reindex
```

### API Endpoints

The following authenticated endpoints are available:

- `GET /api/rag/stats` - Get vector store statistics
- `POST /api/rag/index` - Trigger indexing of new documents
- `POST /api/rag/reindex` - Clear and reindex all documents
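
These endpoints can be called with any HTTP client; a stdlib sketch that builds an authenticated request (the base URL and token are placeholders, and the request is constructed but not sent):

```python
import urllib.request

def rag_stats_request(base_url: str, jwt: str) -> urllib.request.Request:
    """Build (but don't send) an authenticated request to the stats endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/rag/stats",
        headers={"Authorization": f"Bearer {jwt}"},
        method="GET",
    )

req = rag_stats_request("http://localhost:8080", "example-token")
# Send with urllib.request.urlopen(req) once the server is running.
```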

## How It Works

1. **Document Fetching**: Documents are fetched from Paperless-NGX via the API
2. **Chunking**: Documents are split into chunks of ~1000 characters with 200-character overlap
3. **Embedding**: Chunks are embedded using OpenAI's `text-embedding-3-large` model
4. **Storage**: Embeddings are stored in ChromaDB with metadata (filename, document type, date)
5. **Retrieval**: User queries are embedded and similar chunks are retrieved for RAG
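
The chunking step (~1000 characters with 200-character overlap) can be sketched in plain Python — a simplified character-based splitter, not the project's actual `utils/chunker.py`:

```python
def chunk_text(text: str, size: int = 1000, overlap: int = 200) -> list[str]:
    """Split text into fixed-size chunks; consecutive chunks share `overlap` characters."""
    if size <= overlap:
        raise ValueError("chunk size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap  # advance by the non-overlapping portion
    return chunks

# 2500 chars -> chunks starting at 0, 800, 1600, 2400
chunks = chunk_text("x" * 2500)
```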

## Troubleshooting

### "Error creating hnsw segment reader"

This indicates a corrupted index. Solution:

```bash
python scripts/manage_vectorstore.py reindex
```

### Empty results

Check whether documents are indexed:

```bash
python scripts/manage_vectorstore.py stats
```

If the count is 0, run:

```bash
python scripts/manage_vectorstore.py index
```

### Different results in Docker vs local

Docker and local environments use separate ChromaDB instances. To sync:

1. Index inside Docker: `docker compose exec raggr python scripts/manage_vectorstore.py reindex`
2. Or mount the same volume for both environments

## Production Considerations

1. **Volume Persistence**: Use Docker volumes or persistent storage for ChromaDB
2. **Backup**: Regularly back up the ChromaDB data directory
3. **Reindexing**: Schedule periodic reindexing to keep data fresh
4. **Monitoring**: Monitor the `/api/rag/stats` endpoint for document counts

@@ -170,11 +170,12 @@ docker compose exec raggr bash -c "sleep 5 && aerich upgrade"

 | File | Purpose |
 |------|---------|
-| `services/raggr/pyproject.toml` | Aerich config (`[tool.aerich]` section) |
-| `services/raggr/migrations/models/` | Migration files |
-| `services/raggr/startup.sh` | Production startup (runs `aerich upgrade`) |
-| `services/raggr/startup-dev.sh` | Dev startup (runs `aerich upgrade` or `init-db`) |
-| `services/raggr/app.py` | Contains `TORTOISE_CONFIG` |
+| `pyproject.toml` | Aerich config (`[tool.aerich]` section) |
+| `migrations/models/` | Migration files |
+| `startup.sh` | Production startup (runs `aerich upgrade`) |
+| `startup-dev.sh` | Dev startup (runs `aerich upgrade` or `init-db`) |
+| `app.py` | Contains `TORTOISE_CONFIG` |
+| `aerich_config.py` | Aerich initialization configuration |

 ## Quick Reference

docs/development.md (new file, 258 lines)
# Development Guide

This guide explains how to run SimbaRAG in development mode.

## Quick Start

### Option 1: Local Development (Recommended)

Run PostgreSQL in Docker and the application locally for faster iteration:

```bash
# 1. Start PostgreSQL
docker compose -f docker-compose.dev.yml up -d

# 2. Set environment variables
export DATABASE_URL="postgres://raggr:raggr_dev_password@localhost:5432/raggr"
export CHROMADB_PATH="./chromadb"
export $(grep -v '^#' .env | xargs)  # Load other vars from .env

# 3. Install dependencies (first time)
pip install -r requirements.txt
cd raggr-frontend && yarn install && yarn build && cd ..

# 4. Run migrations
aerich upgrade

# 5. Start the server
python app.py
```
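
The `export $(grep -v '^#' .env | xargs)` trick above loads simple `KEY=VALUE` pairs; the same idea in Python (a minimal sketch that skips comments and blank lines, without quote or multiline handling):

```python
import os

def load_env(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping comments and blanks."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")  # split on the first '=' only
        env[key.strip()] = value.strip()
    return env

env_vars = load_env(
    "# comment\nDATABASE_URL=postgres://localhost/raggr\n\nCHROMADB_PATH=./chromadb"
)
os.environ.update(env_vars)
```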

The application will be available at `http://localhost:8080`.

### Option 2: Full Docker Development

Run everything in Docker with hot reload (slower, but matches production):

```bash
# Uncomment the raggr service in docker-compose.dev.yml first!

# Start all services
docker compose -f docker-compose.dev.yml up --build

# View logs
docker compose -f docker-compose.dev.yml logs -f raggr
```

## Project Structure

```
raggr/
├── app.py                      # Quart application entry point
├── main.py                     # RAG logic and LangChain agent
├── llm.py                      # LLM client (Ollama + OpenAI fallback)
├── aerich_config.py            # Database migration configuration
│
├── blueprints/                 # API route blueprints
│   ├── users/                  # Authentication (OIDC, JWT, RBAC)
│   ├── conversation/           # Chat conversations and messages
│   └── rag/                    # Document indexing (admin only)
│
├── config/                     # Configuration modules
│   └── oidc_config.py          # OIDC authentication settings
│
├── utils/                      # Reusable utilities
│   ├── chunker.py              # Document chunking for embeddings
│   ├── cleaner.py              # PDF cleaning and summarization
│   ├── image_process.py        # Image description with LLM
│   └── request.py              # Paperless-NGX API client
│
├── scripts/                    # Administrative scripts
│   ├── add_user.py             # Create users manually
│   ├── user_message_stats.py   # User message statistics
│   ├── manage_vectorstore.py   # Vector store management
│   ├── inspect_vector_store.py # Inspect ChromaDB contents
│   └── query.py                # Query generation utilities
│
├── raggr-frontend/             # React frontend
│   └── src/                    # Frontend source code
│
├── migrations/                 # Database migrations
└── docs/                       # Documentation
```

## Making Changes

### Backend Changes

**Local development:**

1. Edit Python files
2. Save
3. Restart `python app.py` (or use a tool like `watchdog` for auto-reload)

**Docker development:**

1. Edit Python files
2. Files are synced via Docker watch mode
3. Container automatically restarts

### Frontend Changes

```bash
cd raggr-frontend

# Development mode with hot reload
yarn dev

# Production build (for testing)
yarn build
```

The backend serves built files from `raggr-frontend/dist/`.

### Database Model Changes

When you modify Tortoise ORM models:

```bash
# Generate migration
aerich migrate --name "describe_your_change"

# Apply migration
aerich upgrade

# View history
aerich history
```

See [deployment.md](deployment.md) for detailed migration workflows.

### Adding Dependencies

**Backend:**

```bash
# Add to requirements.txt or use uv
pip install package-name
pip freeze > requirements.txt
```

**Frontend:**

```bash
cd raggr-frontend
yarn add package-name
```

## Useful Commands

### Database

```bash
# Connect to PostgreSQL
docker compose -f docker-compose.dev.yml exec postgres psql -U raggr -d raggr

# Reset database
docker compose -f docker-compose.dev.yml down -v
docker compose -f docker-compose.dev.yml up -d
aerich init-db
```

### Vector Store

```bash
# Show statistics
python scripts/manage_vectorstore.py stats

# Index new documents from Paperless
python scripts/manage_vectorstore.py index

# Clear and reindex everything
python scripts/manage_vectorstore.py reindex
```

See [VECTORSTORE.md](VECTORSTORE.md) for details.

### Scripts

```bash
# Add a new user
python scripts/add_user.py

# View message statistics
python scripts/user_message_stats.py

# Inspect vector store contents
python scripts/inspect_vector_store.py
```

## Environment Variables

Copy `.env.example` to `.env` and configure:

| Variable | Description | Example |
|----------|-------------|---------|
| `DATABASE_URL` | PostgreSQL connection | `postgres://user:pass@localhost:5432/db` |
| `CHROMADB_PATH` | ChromaDB storage path | `./chromadb` |
| `OLLAMA_URL` | Ollama server URL | `http://localhost:11434` |
| `OPENAI_API_KEY` | OpenAI API key (fallback LLM) | `sk-...` |
| `PAPERLESS_TOKEN` | Paperless-NGX API token | `...` |
| `BASE_URL` | Paperless-NGX URL | `https://paperless.example.com` |
| `OIDC_ISSUER` | OIDC provider URL | `https://auth.example.com` |
| `OIDC_CLIENT_ID` | OIDC client ID | `simbarag` |
| `OIDC_CLIENT_SECRET` | OIDC client secret | `...` |
| `JWT_SECRET_KEY` | JWT signing key | `random-secret` |
| `TAVILY_KEY` | Tavily web search API key | `tvly-...` |

## Troubleshooting

### Port Already in Use

```bash
# Find and kill the process on port 8080
lsof -ti:8080 | xargs kill -9

# Or change the port in app.py
```

### Database Connection Errors

```bash
# Check if PostgreSQL is running
docker compose -f docker-compose.dev.yml ps postgres

# View PostgreSQL logs
docker compose -f docker-compose.dev.yml logs postgres
```

### Frontend Not Building

```bash
cd raggr-frontend
rm -rf node_modules dist
yarn install
yarn build
```

### ChromaDB Errors

```bash
# Clear and recreate ChromaDB
rm -rf chromadb/
python scripts/manage_vectorstore.py reindex
```

### Import Errors After Reorganization

Ensure you're in the project root directory when running scripts, or use:

```bash
# Add project root to Python path
export PYTHONPATH="${PYTHONPATH}:$(pwd)"
python scripts/your_script.py
```
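
Equivalently, a script can prepend the project root to `sys.path` itself. A sketch, assuming the snippet lives in a file one level below the project root (e.g., under `scripts/`); whether the project's scripts already do this is not shown here:

```python
import sys
from pathlib import Path

# Resolve the project root relative to this file
# (assumes this file sits in a directory one level below the root).
project_root = Path(__file__).resolve().parent.parent
if str(project_root) not in sys.path:
    sys.path.insert(0, str(project_root))
```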

## Hot Tips

- Use `python -m pdb app.py` for debugging
- Enable Quart debug mode in `app.py`: `app.run(debug=True)`
- Check API logs: they appear in the terminal running `python app.py`
- Frontend logs: open the browser DevTools console
- Use `docker compose -f docker-compose.dev.yml down -v` for a clean slate

docs/index.md (modified, 205 lines)

@@ -1,14 +1,203 @@

 # SimbaRAG Documentation

-SimbaRAG is a RAG-powered conversational AI system with enterprise authentication.
+Welcome to the SimbaRAG documentation! This guide will help you understand, develop, and deploy the SimbaRAG conversational AI system.

-## Architecture
+## Getting Started

-- **Backend**: Quart (async Python) with Tortoise ORM
-- **Vector Store**: LangChain with configurable embeddings
-- **Auth Stack**: LLDAP → Authelia → OAuth2/OIDC
-- **Database**: PostgreSQL
+New to SimbaRAG? Start here:

-## Sections
+1. Read the main [README](../README.md) for project overview and architecture
+2. Follow the [Development Guide](development.md) to set up your environment
+3. Learn about [Authentication](authentication.md) setup with OIDC and LDAP

-- [Authentication](authentication.md) - OIDC flow, user management, and RBAC planning

## Documentation Structure

### Core Guides

- **[Development Guide](development.md)** - Local development setup, project structure, and workflows
- **[Deployment Guide](deployment.md)** - Database migrations, deployment workflows, and troubleshooting
- **[Vector Store Guide](VECTORSTORE.md)** - Managing ChromaDB, indexing documents, and RAG operations
- **[Migrations Guide](MIGRATIONS.md)** - Database migration reference
- **[Authentication Guide](authentication.md)** - OIDC, Authelia, LLDAP configuration and user management

### Quick Reference

| Task | Documentation |
|------|---------------|
| Set up local dev environment | [Development Guide → Quick Start](development.md#quick-start) |
| Run database migrations | [Deployment Guide → Migration Workflow](deployment.md#migration-workflow) |
| Index documents | [Vector Store Guide → Management Commands](VECTORSTORE.md#management-commands) |
| Configure authentication | [Authentication Guide](authentication.md) |
| Run administrative scripts | [Development Guide → Scripts](development.md#scripts) |

## Common Tasks

### Development

```bash
# Start local development
docker compose -f docker-compose.dev.yml up -d
export DATABASE_URL="postgres://raggr:raggr_dev_password@localhost:5432/raggr"
export CHROMADB_PATH="./chromadb"
python app.py
```

### Database Migrations

```bash
# Generate migration
aerich migrate --name "your_change"

# Apply migrations
aerich upgrade

# View history
aerich history
```

### Vector Store Management

```bash
# Show statistics
python scripts/manage_vectorstore.py stats

# Index new documents
python scripts/manage_vectorstore.py index

# Reindex everything
python scripts/manage_vectorstore.py reindex
```

## Architecture Overview

SimbaRAG is built with:

- **Backend**: Quart (async Python), LangChain, Tortoise ORM
- **Frontend**: React 19, Rsbuild, Tailwind CSS
- **Database**: PostgreSQL (users, conversations)
- **Vector Store**: ChromaDB (document embeddings)
- **LLM**: Ollama (primary), OpenAI (fallback)
- **Auth**: Authelia (OIDC), LLDAP (user directory)

See the [README](../README.md#system-architecture) for a detailed architecture diagram.

## Project Structure

```
simbarag/
├── app.py                  # Quart app entry point
├── main.py                 # RAG & LangChain agent
├── llm.py                  # LLM client
├── blueprints/             # API routes
├── config/                 # Configuration
├── utils/                  # Utilities
├── scripts/                # Admin scripts
├── raggr-frontend/         # React UI
├── migrations/             # Database migrations
├── docs/                   # This documentation
├── docker-compose.yml      # Production Docker setup
└── docker-compose.dev.yml  # Development Docker setup
```

## Key Concepts

### RAG (Retrieval-Augmented Generation)

SimbaRAG uses RAG to answer questions about Simba:

1. Documents are fetched from Paperless-NGX
2. Documents are chunked and embedded using OpenAI
3. Embeddings are stored in ChromaDB
4. User queries are embedded and matched against the store
5. Relevant chunks are passed to the LLM for context
6. The LLM generates an answer using the retrieved context
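
Step 4 — matching an embedded query against stored chunk embeddings — boils down to a nearest-neighbor search by vector similarity. A toy cosine-similarity ranking in pure Python (illustrative only; ChromaDB does this internally with an HNSW index, and real embeddings have thousands of dimensions):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: dot product normalized by both vector lengths."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings" for three stored chunks and one query
store = {
    "chunk_a": [1.0, 0.0, 0.0],
    "chunk_b": [0.7, 0.7, 0.0],
    "chunk_c": [0.0, 0.0, 1.0],
}
query = [0.9, 0.1, 0.0]

# Retrieve the chunk whose embedding points in the most similar direction
best = max(store, key=lambda name: cosine(store[name], query))
```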

### LangChain Agent

The conversational agent has two tools:

- **simba_search**: Queries the vector store for Simba's documents
- **web_search**: Searches the web via the Tavily API

The agent automatically selects tools based on the query.

### Authentication Flow

1. User initiates OIDC login via Authelia
2. Authelia authenticates against LLDAP
3. Backend receives OIDC tokens and issues a JWT
4. Frontend stores the JWT in localStorage
5. Subsequent requests use the JWT for authorization

## Environment Variables

Key environment variables (see `.env.example` for the complete list):

| Variable | Purpose |
|----------|---------|
| `DATABASE_URL` | PostgreSQL connection |
| `CHROMADB_PATH` | Vector store location |
| `OLLAMA_URL` | Local LLM server |
| `OPENAI_API_KEY` | OpenAI for embeddings/fallback |
| `PAPERLESS_TOKEN` | Document source API |
| `OIDC_*` | Authentication configuration |
| `TAVILY_KEY` | Web search API |

## API Endpoints

### Authentication
- `GET /api/user/oidc/login` - Start OIDC flow
- `GET /api/user/oidc/callback` - OIDC callback
- `POST /api/user/refresh` - Refresh JWT

### Conversations
- `POST /api/conversation/` - Create conversation
- `GET /api/conversation/` - List conversations
- `POST /api/conversation/query` - Send a chat message

### RAG (Admin Only)
- `GET /api/rag/stats` - Vector store stats
- `POST /api/rag/index` - Index documents
- `POST /api/rag/reindex` - Reindex all documents

## Troubleshooting

### Common Issues

| Issue | Solution |
|-------|----------|
| Port already in use | Check which process holds it: `lsof -ti:8080` |
| Database connection error | Ensure PostgreSQL is running: `docker compose ps` |
| ChromaDB errors | Clear and reindex: `python scripts/manage_vectorstore.py reindex` |
| Import errors | Check you're in the project root directory |
| Frontend not building | `cd raggr-frontend && yarn install && yarn build` |

See the individual guides for detailed troubleshooting.

## Contributing

1. Read the [Development Guide](development.md)
2. Set up your local environment
3. Make changes and test locally
4. Generate migrations if needed
5. Submit a pull request

## Additional Resources

- [LangChain Documentation](https://python.langchain.com/)
- [ChromaDB Documentation](https://docs.trychroma.com/)
- [Quart Documentation](https://quart.palletsprojects.com/)
- [Tortoise ORM Documentation](https://tortoise.github.io/)
- [Authelia Documentation](https://www.authelia.com/)

## Need Help?

- Check the relevant guide in this documentation
- Review the troubleshooting sections
- Check application logs: `docker compose logs -f`
- Inspect the database: `docker compose exec postgres psql -U raggr`

---

**Documentation Version**: 1.0
**Last Updated**: January 2026