yottob/celery_app.py
Ryan Chen cf692d2299 Migrate to Docker Compose with PostgreSQL
- Created docker-compose.yml with 4 services:
  - postgres: PostgreSQL 16 database with persistent volume
  - redis: Redis 7 message broker
  - app: Flask web application (port 5000)
  - celery: Celery worker for async downloads
- Created Dockerfile with Python 3.14, FFmpeg, and uv
- Added psycopg2-binary dependency for PostgreSQL driver
- Updated database.py to use the DATABASE_URL environment variable
  - Supports PostgreSQL in production
  - Falls back to SQLite for local development (see the sketch after this commit message)
- Updated celery_app.py to use environment variables:
  - CELERY_BROKER_URL and CELERY_RESULT_BACKEND
- Created .env.example with all configuration variables
- Created .dockerignore to optimize Docker builds
- Updated .gitignore to exclude .env and Docker files
- Updated CLAUDE.md with comprehensive Docker documentation:
  - Quick start with docker-compose commands
  - Environment variable configuration
  - Local development setup instructions
  - Service architecture overview

All services have health checks and automatic restart configured.
Start the entire stack with: docker-compose up

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
2025-11-26 14:09:40 -05:00
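
The DATABASE_URL handling described in the commit might look roughly like the sketch below, assuming SQLAlchemy is the ORM (the commit adds psycopg2-binary, which SQLAlchemy uses for postgresql:// URLs). The engine/session names and the SQLite filename are illustrative, not taken from the actual database.py:

# Hypothetical sketch of the database.py change; names are illustrative.
import os

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Prefer DATABASE_URL (set to a postgresql:// URL by docker-compose);
# fall back to a local SQLite file for development.
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///yottob.db")

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(bind=engine)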


"""Celery application configuration."""
import os
from celery import Celery
# Get configuration from environment variables
CELERY_BROKER_URL = os.getenv("CELERY_BROKER_URL", "redis://localhost:6379/0")
CELERY_RESULT_BACKEND = os.getenv("CELERY_RESULT_BACKEND", "redis://localhost:6379/0")
# Configure Celery
celery_app = Celery(
"yottob",
broker=CELERY_BROKER_URL,
backend=CELERY_RESULT_BACKEND,
include=["download_service"]
)
# Celery configuration
celery_app.conf.update(
task_serializer="json",
accept_content=["json"],
result_serializer="json",
timezone="UTC",
enable_utc=True,
task_track_started=True,
task_time_limit=3600, # 1 hour max per task
task_soft_time_limit=3300, # 55 minutes soft limit
worker_prefetch_multiplier=1, # Process one task at a time
worker_max_tasks_per_child=50, # Restart worker after 50 tasks
)