Add async video downloads with yt-dlp and Celery
- Added yt-dlp, celery, and redis dependencies to pyproject.toml
- Extended VideoEntry model with download tracking fields:
  - download_status (enum: pending, downloading, completed, failed)
  - download_path, download_started_at, download_completed_at
  - download_error, file_size
- Created celery_app.py with Redis broker configuration
- Created download_service.py with async download tasks:
  - download_video() task downloads as MP4 format
  - Configured yt-dlp for best MP4 quality with fallback
  - Automatic retries on failure (max 3 attempts)
  - Progress tracking and database updates
- Added Flask API endpoints in main.py:
  - POST /api/download/<video_id> to trigger download
  - GET /api/download/status/<video_id> to check status
  - POST /api/download/batch for bulk downloads
- Generated and applied Alembic migration for new fields
- Created downloads/ directory for video storage
- Updated .gitignore to exclude downloads/ directory
- Updated CLAUDE.md with comprehensive documentation:
  - Redis and Celery setup instructions
  - Download workflow and architecture
  - yt-dlp configuration details
  - New API endpoint examples

Videos are downloaded as MP4 files using Celery workers.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
.gitignore | 3 (vendored)
@@ -14,3 +14,6 @@ wheels/
 *.db-journal
 *.db-shm
 *.db-wal
+
+# Downloaded videos
+downloads/
CLAUDE.md | 105
@@ -4,7 +4,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co

 ## Project Overview

-`yottob` is a Flask-based web application for processing YouTube RSS feeds with SQLAlchemy ORM persistence. The project provides both a REST API and CLI interface for fetching and parsing YouTube channel feeds, with filtering logic to exclude YouTube Shorts. All fetched feeds are automatically saved to a SQLite database for historical tracking.
+`yottob` is a Flask-based web application for processing YouTube RSS feeds with SQLAlchemy ORM persistence and async video downloads. The project provides both a REST API and CLI interface for fetching and parsing YouTube channel feeds, with filtering logic to exclude YouTube Shorts. All fetched feeds are automatically saved to a SQLite database for historical tracking. Videos can be downloaded asynchronously as MP4 files using Celery workers and yt-dlp.

 ## Development Setup
@@ -26,6 +26,26 @@ source .venv/bin/activate  # On macOS/Linux
 source .venv/bin/activate && alembic upgrade head
 ```

+**Start Redis (required for Celery):**
+```bash
+# macOS with Homebrew
+brew services start redis
+
+# Linux
+sudo systemctl start redis
+
+# Docker
+docker run -d -p 6379:6379 redis:alpine
+
+# Verify Redis is running
+redis-cli ping  # Should return "PONG"
+```
+
+**Start Celery worker (required for video downloads):**
+```bash
+source .venv/bin/activate && celery -A celery_app worker --loglevel=info
+```
+
 ## Running the Application

 **Run the CLI feed parser:**
@@ -43,6 +63,9 @@ The web server exposes:
 - `/api/feed` - API endpoint for fetching feeds and saving to database
 - `/api/channels` - List all tracked channels
 - `/api/history/<channel_id>` - Get video history for a specific channel
+- `/api/download/<video_id>` - Trigger video download (POST)
+- `/api/download/status/<video_id>` - Check download status (GET)
+- `/api/download/batch` - Batch download multiple videos (POST)

 **API Usage Examples:**
 ```bash
@@ -57,6 +80,20 @@ curl http://localhost:5000/api/channels

 # Get video history for a channel (limit 20 videos)
 curl "http://localhost:5000/api/history/CHANNEL_ID?limit=20"
+
+# Trigger download for a specific video
+curl -X POST http://localhost:5000/api/download/123
+
+# Check download status
+curl http://localhost:5000/api/download/status/123
+
+# Batch download all pending videos for a channel
+curl -X POST "http://localhost:5000/api/download/batch?channel_id=CHANNEL_ID&status=pending"
+
+# Batch download specific video IDs
+curl -X POST http://localhost:5000/api/download/batch \
+  -H "Content-Type: application/json" \
+  -d '{"video_ids": [1, 2, 3, 4, 5]}'
 ```

 ## Architecture
@@ -66,8 +103,10 @@ The codebase follows a clean layered architecture with separation of concerns:

 ### Database Layer
 **`models.py`** - SQLAlchemy ORM models
 - `Base`: Declarative base for all models
+- `DownloadStatus`: Enum for download states (pending, downloading, completed, failed)
 - `Channel`: Stores YouTube channel metadata (channel_id, title, link, last_fetched)
-- `VideoEntry`: Stores individual video entries with foreign key to Channel
+- `VideoEntry`: Stores individual video entries with foreign key to Channel, plus download tracking fields:
+  - `download_status`, `download_path`, `download_started_at`, `download_completed_at`, `download_error`, `file_size`
 - Relationships: One Channel has many VideoEntry records

 **`database.py`** - Database configuration and session management
@@ -76,6 +115,20 @@ The codebase follows a clean layered architecture with separation of concerns:
 - `init_db()`: Creates all tables
 - `get_db_session()`: Context manager for database sessions

+### Async Task Queue Layer
+**`celery_app.py`** - Celery configuration
+- Celery instance configured with Redis broker
+- Task serialization and worker configuration
+- 1-hour task timeout with automatic retries
+
+**`download_service.py`** - Video download tasks
+- `download_video(video_id)`: Celery task to download a single video as MP4
+  - Uses yt-dlp with MP4 format preference
+  - Updates database with download progress and status
+  - Automatic retry on failure (max 3 attempts)
+- `download_videos_batch(video_ids)`: Queue multiple downloads
+- Downloads saved to `downloads/` directory
+
 ### Core Logic Layer
 **`feed_parser.py`** - Reusable YouTube feed parsing module
 - `YouTubeFeedParser`: Main parser class that encapsulates channel-specific logic
@@ -86,13 +139,16 @@ The codebase follows a clean layered architecture with separation of concerns:

 ### Web Server Layer
 **`main.py`** - Flask application and routes
-- `app`: Flask application instance (main.py:9)
+- `app`: Flask application instance (main.py:10)
 - Database initialization on startup (main.py:16)
-- `index()`: Homepage route handler (main.py:20)
-- `get_feed()`: REST API endpoint (main.py:26) that fetches and saves to DB
-- `get_channels()`: Lists all tracked channels (main.py:59)
-- `get_history()`: Returns video history for a channel (main.py:86)
-- `main()`: CLI entry point for testing (main.py:132)
+- `index()`: Homepage route handler (main.py:21)
+- `get_feed()`: REST API endpoint (main.py:27) that fetches and saves to DB
+- `get_channels()`: Lists all tracked channels (main.py:60)
+- `get_history()`: Returns video history for a channel (main.py:87)
+- `trigger_download()`: Queue video download task (main.py:134)
+- `get_download_status()`: Check download status (main.py:163)
+- `trigger_batch_download()`: Queue multiple downloads (main.py:193)
+- `main()`: CLI entry point for testing (main.py:251)

 ### Templates
 **`templates/index.html`** - Frontend HTML (currently static placeholder)
@@ -158,7 +214,37 @@ source .venv/bin/activate && alembic downgrade -1
 - `title`: Video title
 - `link`: Video URL (unique)
 - `created_at`: Timestamp when video was first recorded
+- `download_status`: Enum (pending, downloading, completed, failed)
+- `download_path`: Local file path to downloaded MP4
+- `download_started_at`: When download began
+- `download_completed_at`: When download finished
+- `download_error`: Error message if download failed
+- `file_size`: Size in bytes of downloaded file
 - Index: `idx_channel_created` on (channel_id, created_at) for fast queries
+- Index: `idx_download_status` on download_status for filtering
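`file_size` is stored as a raw byte count (`BigInteger`). A minimal helper for displaying it in an API consumer — illustrative only, not part of this commit; `human_size` is a hypothetical name:

```python
def human_size(n: int) -> str:
    """Format a byte count (as stored in file_size) for display."""
    size = float(n)
    for unit in ("B", "KB", "MB", "GB"):
        if size < 1024:
            # Whole bytes print without a decimal point
            return f"{int(size)} {unit}" if unit == "B" else f"{size:.1f} {unit}"
        size /= 1024
    return f"{size:.1f} TB"
```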
+
+## Video Download System
+
+The application uses Celery with Redis for asynchronous video downloads:
+
+**Download Workflow:**
+1. User triggers download via `/api/download/<video_id>` (POST)
+2. VideoEntry status changes to "downloading"
+3. Celery worker picks up task and uses yt-dlp to download as MP4
+4. Progress updates written to database
+5. On completion, status changes to "completed" with file path
+6. On failure, status changes to "failed" with error message (auto-retry 3x)
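The six steps above can be sketched as a small state machine — a simplified illustration of the lifecycle, not the project's actual task code; `run_download` and `fetch` are hypothetical names:

```python
from enum import Enum

class Status(Enum):
    PENDING = "pending"
    DOWNLOADING = "downloading"
    COMPLETED = "completed"
    FAILED = "failed"

def run_download(fetch, max_attempts=3):
    """Drive one download through the workflow's states.

    fetch is any callable that performs the download and returns a file
    path; it is attempted up to max_attempts times (the auto-retry in step 6).
    """
    status = Status.DOWNLOADING  # step 2: mark the entry as downloading
    for _ in range(max_attempts):
        try:
            path = fetch()  # step 3: the worker downloads the file
            return Status.COMPLETED, path  # step 5: success, record the path
        except Exception:
            status = Status.FAILED  # step 6: record the failure, then retry
    return status, None  # retries exhausted
```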
+
+**yt-dlp Configuration:**
+- Format: `bestvideo[ext=mp4]+bestaudio[ext=m4a]/best[ext=mp4]/best`
+- Output format: MP4 (converted if necessary using FFmpeg)
+- Output location: `downloads/<video_id>_<title>.mp4`
+- Progress hooks for real-time status updates
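The `/`-separated format string is a fallback chain, tried left to right. A toy interpreter for this particular selector — an illustration of the selection order only, not yt-dlp's (far more general) implementation; `pick_format` and the capability flags are hypothetical:

```python
def pick_format(has_mp4_video: bool, has_m4a_audio: bool, has_mp4_combined: bool) -> str:
    """Resolve 'bestvideo[ext=mp4]+bestaudio[ext=m4a]/best[ext=mp4]/best'."""
    if has_mp4_video and has_m4a_audio:
        # First choice: separate streams, merged into one MP4 by FFmpeg
        return "bestvideo[ext=mp4]+bestaudio[ext=m4a]"
    if has_mp4_combined:
        # Second choice: a pre-muxed MP4
        return "best[ext=mp4]"
    # Last resort: best available in any container, converted to MP4 afterwards
    return "best"
```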
+
+**Requirements:**
+- Redis server must be running (localhost:6379)
+- Celery worker must be running to process downloads
+- FFmpeg recommended for format conversion (yt-dlp will use it if available)

 ## Dependencies
@@ -166,4 +252,7 @@ source .venv/bin/activate && alembic downgrade -1
 - **feedparser 6.0.12+**: RSS/Atom feed parsing
 - **SQLAlchemy 2.0.0+**: ORM for database operations
 - **Alembic 1.13.0+**: Database migration tool
+- **Celery 5.3.0+**: Distributed task queue for async jobs
+- **Redis 5.0.0+**: Message broker for Celery
+- **yt-dlp 2024.0.0+**: YouTube video downloader
 - **Python 3.14+**: Required runtime version
@@ -0,0 +1,44 @@
"""Add download tracking fields to VideoEntry

Revision ID: 1b18a0e65b0d
Revises: 270efe6976bc
Create Date: 2025-11-26 14:01:41.900938

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision: str = '1b18a0e65b0d'
down_revision: Union[str, Sequence[str], None] = '270efe6976bc'
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('video_entries', sa.Column('download_status', sa.Enum('PENDING', 'DOWNLOADING', 'COMPLETED', 'FAILED', name='downloadstatus'), nullable=False))
    op.add_column('video_entries', sa.Column('download_path', sa.String(length=1000), nullable=True))
    op.add_column('video_entries', sa.Column('download_started_at', sa.DateTime(), nullable=True))
    op.add_column('video_entries', sa.Column('download_completed_at', sa.DateTime(), nullable=True))
    op.add_column('video_entries', sa.Column('download_error', sa.String(length=2000), nullable=True))
    op.add_column('video_entries', sa.Column('file_size', sa.BigInteger(), nullable=True))
    op.create_index('idx_download_status', 'video_entries', ['download_status'], unique=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index('idx_download_status', table_name='video_entries')
    op.drop_column('video_entries', 'file_size')
    op.drop_column('video_entries', 'download_error')
    op.drop_column('video_entries', 'download_completed_at')
    op.drop_column('video_entries', 'download_started_at')
    op.drop_column('video_entries', 'download_path')
    op.drop_column('video_entries', 'download_status')
    # ### end Alembic commands ###
celery_app.py | 25 (new file)
@@ -0,0 +1,25 @@
"""Celery application configuration."""

from celery import Celery

# Configure Celery
celery_app = Celery(
    "yottob",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
    include=["download_service"]
)

# Celery configuration
celery_app.conf.update(
    task_serializer="json",
    accept_content=["json"],
    result_serializer="json",
    timezone="UTC",
    enable_utc=True,
    task_track_started=True,
    task_time_limit=3600,  # 1 hour max per task
    task_soft_time_limit=3300,  # 55 minutes soft limit
    worker_prefetch_multiplier=1,  # Process one task at a time
    worker_max_tasks_per_child=50,  # Restart worker after 50 tasks
)
download_service.py | 165 (new file)
@@ -0,0 +1,165 @@
"""Video download service using yt-dlp and Celery."""

import os
from datetime import datetime
from pathlib import Path

import yt_dlp
from celery import Task

from celery_app import celery_app
from database import SessionLocal
from models import VideoEntry, DownloadStatus


# Download configuration
DOWNLOAD_DIR = Path("downloads")
DOWNLOAD_DIR.mkdir(exist_ok=True)


class DatabaseTask(Task):
    """Base task with database session management."""

    _session = None

    def after_return(self, *args, **kwargs):
        """Close database session after task completion."""
        if self._session is not None:
            self._session.close()

    @property
    def session(self):
        """Get or create database session."""
        if self._session is None:
            self._session = SessionLocal()
        return self._session


@celery_app.task(base=DatabaseTask, bind=True, max_retries=3)
def download_video(self, video_id: int) -> dict:
    """Download a video using yt-dlp.

    Args:
        video_id: Database ID of the VideoEntry to download

    Returns:
        Dictionary with download result information
    """
    session = self.session

    # Get video entry from database
    video = session.query(VideoEntry).filter_by(id=video_id).first()
    if not video:
        return {"error": f"Video ID {video_id} not found"}

    # Update status to downloading
    video.download_status = DownloadStatus.DOWNLOADING
    video.download_started_at = datetime.utcnow()
    session.commit()

    try:
        # Extract video ID from YouTube URL
        youtube_url = video.link

        # Configure yt-dlp options for MP4 output
        ydl_opts = {
            'format': 'bestvideo[ext=mp4]+bestaudio[ext=m4a]/best[ext=mp4]/best',
            'outtmpl': str(DOWNLOAD_DIR / f'{video_id}_%(title)s.%(ext)s'),
            'merge_output_format': 'mp4',  # Ensure output is MP4
            'postprocessors': [{
                'key': 'FFmpegVideoConvertor',
                'preferedformat': 'mp4',  # Convert to MP4 if needed
            }],
            'quiet': False,
            'no_warnings': False,
            'progress_hooks': [lambda d: _progress_hook(d, video_id, session)],
        }

        # Download the video
        with yt_dlp.YoutubeDL(ydl_opts) as ydl:
            info = ydl.extract_info(youtube_url, download=True)
            filename = ydl.prepare_filename(info)

        # Handle cases where extension might change
        if not filename.endswith('.mp4'):
            # Find the actual file with .mp4 extension
            base = os.path.splitext(filename)[0]
            mp4_file = f"{base}.mp4"
            if os.path.exists(mp4_file):
                filename = mp4_file

        # Get file size
        file_size = os.path.getsize(filename) if os.path.exists(filename) else None

        # Update video entry with success
        video.download_status = DownloadStatus.COMPLETED
        video.download_path = filename
        video.download_completed_at = datetime.utcnow()
        video.file_size = file_size
        video.download_error = None
        session.commit()

        return {
            "video_id": video_id,
            "status": "completed",
            "path": filename,
            "file_size": file_size
        }

    except Exception as e:
        # Update video entry with error
        video.download_status = DownloadStatus.FAILED
        video.download_error = str(e)
        video.download_completed_at = datetime.utcnow()
        session.commit()

        # Retry if we haven't exceeded max retries
        if self.request.retries < self.max_retries:
            raise self.retry(exc=e, countdown=60)  # Retry after 60 seconds

        return {
            "video_id": video_id,
            "status": "failed",
            "error": str(e)
        }


def _progress_hook(d: dict, video_id: int, session) -> None:
    """Progress hook for yt-dlp downloads.

    Args:
        d: Progress dictionary from yt-dlp
        video_id: Database ID of the video
        session: Database session
    """
    if d['status'] == 'finished':
        print(f"Download finished for video {video_id}, now converting...")
    elif d['status'] == 'downloading':
        if 'total_bytes' in d:
            percent = d['downloaded_bytes'] / d['total_bytes'] * 100
            print(f"Downloading video {video_id}: {percent:.1f}%")


@celery_app.task
def download_videos_batch(video_ids: list[int]) -> dict:
    """Download multiple videos in batch.

    Args:
        video_ids: List of VideoEntry IDs to download

    Returns:
        Dictionary with batch download results
    """
    results = []
    for video_id in video_ids:
        # Queue each download as a separate task
        task = download_video.delay(video_id)
        results.append({
            "video_id": video_id,
            "task_id": task.id
        })

    return {
        "total_queued": len(results),
        "tasks": results
    }
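`download_videos_batch` only fans out: one `download_video` task per ID, returning the task handles. Its return shape, sketched with the queueing call injected so it runs without a broker — `queue_batch` and `delay` are illustrative stand-ins, not project code:

```python
def queue_batch(video_ids, delay):
    """Mirror download_videos_batch's return shape; delay stands in for download_video.delay."""
    results = [{"video_id": vid, "task_id": delay(vid)} for vid in video_ids]
    return {"total_queued": len(results), "tasks": results}
```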
main.py | 121
@@ -3,7 +3,8 @@
 from flask import Flask, render_template, request, jsonify
 from feed_parser import YouTubeFeedParser
 from database import init_db, get_db_session
-from models import Channel, VideoEntry
+from models import Channel, VideoEntry, DownloadStatus
+from download_service import download_video, download_videos_batch


 app = Flask(__name__)
@@ -129,6 +130,124 @@ def get_history(channel_id: str):
         return jsonify({"error": f"Failed to fetch history: {str(e)}"}), 500


+@app.route("/api/download/<int:video_id>", methods=["POST"])
+def trigger_download(video_id: int):
+    """Trigger video download for a specific video.
+
+    Args:
+        video_id: Database ID of the VideoEntry
+
+    Returns:
+        JSON response with task information
+    """
+    try:
+        with get_db_session() as session:
+            video = session.query(VideoEntry).filter_by(id=video_id).first()
+            if not video:
+                return jsonify({"error": "Video not found"}), 404
+
+            # Queue download task
+            task = download_video.delay(video_id)
+
+            return jsonify({
+                "video_id": video_id,
+                "task_id": task.id,
+                "status": "queued",
+                "message": "Download task queued successfully"
+            })
+    except Exception as e:
+        return jsonify({"error": f"Failed to queue download: {str(e)}"}), 500
+
+
+@app.route("/api/download/status/<int:video_id>", methods=["GET"])
+def get_download_status(video_id: int):
+    """Get download status for a specific video.
+
+    Args:
+        video_id: Database ID of the VideoEntry
+
+    Returns:
+        JSON response with download status
+    """
+    try:
+        with get_db_session() as session:
+            video = session.query(VideoEntry).filter_by(id=video_id).first()
+            if not video:
+                return jsonify({"error": "Video not found"}), 404
+
+            return jsonify({
+                "video_id": video_id,
+                "title": video.title,
+                "download_status": video.download_status.value,
+                "download_path": video.download_path,
+                "download_started_at": video.download_started_at.isoformat() if video.download_started_at else None,
+                "download_completed_at": video.download_completed_at.isoformat() if video.download_completed_at else None,
+                "download_error": video.download_error,
+                "file_size": video.file_size
+            })
+    except Exception as e:
+        return jsonify({"error": f"Failed to fetch download status: {str(e)}"}), 500
+
+
+@app.route("/api/download/batch", methods=["POST"])
+def trigger_batch_download():
+    """Trigger batch download for multiple videos.
+
+    Query parameters:
+        channel_id: Download all pending videos for this channel (optional)
+        status: Filter by download status (default: pending)
+
+    Request body (alternative to query params):
+        video_ids: List of video IDs to download
+
+    Returns:
+        JSON response with batch task information
+    """
+    try:
+        with get_db_session() as session:
+            # Check if video_ids provided in request body
+            data = request.get_json(silent=True)
+            if data and 'video_ids' in data:
+                video_ids = data['video_ids']
+            else:
+                # Filter by channel and/or status
+                channel_id = request.args.get("channel_id")
+                status_str = request.args.get("status", "pending")
+
+                try:
+                    status = DownloadStatus(status_str)
+                except ValueError:
+                    return jsonify({"error": f"Invalid status: {status_str}"}), 400
+
+                query = session.query(VideoEntry).filter_by(download_status=status)
+
+                if channel_id:
+                    channel = session.query(Channel).filter_by(
+                        channel_id=channel_id
+                    ).first()
+                    if not channel:
+                        return jsonify({"error": "Channel not found"}), 404
+                    query = query.filter_by(channel_id=channel.id)
+
+                videos = query.all()
+                video_ids = [v.id for v in videos]
+
+            if not video_ids:
+                return jsonify({"message": "No videos to download", "total_queued": 0})
+
+            # Queue batch download task
+            task = download_videos_batch.delay(video_ids)
+
+            return jsonify({
+                "task_id": task.id,
+                "total_queued": len(video_ids),
+                "video_ids": video_ids,
+                "message": "Batch download queued successfully"
+            })
+    except Exception as e:
+        return jsonify({"error": f"Failed to queue batch download: {str(e)}"}), 500
+
+
 def main():
     """CLI entry point for testing feed parser."""
     parser = YouTubeFeedParser(DEFAULT_CHANNEL_ID)
models.py | 36
@@ -1,12 +1,21 @@
 """Database models for YouTube feed storage."""

 from datetime import datetime
-from typing import List
+from typing import List, Optional
+from enum import Enum as PyEnum

-from sqlalchemy import String, DateTime, ForeignKey, Index
+from sqlalchemy import String, DateTime, ForeignKey, Index, Enum, BigInteger
 from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship


+class DownloadStatus(PyEnum):
+    """Download status enumeration."""
+    PENDING = "pending"
+    DOWNLOADING = "downloading"
+    COMPLETED = "completed"
+    FAILED = "failed"
+
+
 class Base(DeclarativeBase):
     """Base class for all database models."""
     pass
@@ -41,16 +50,29 @@ class VideoEntry(Base):
     link: Mapped[str] = mapped_column(String(500), unique=True, nullable=False)
     created_at: Mapped[datetime] = mapped_column(DateTime, nullable=False, default=datetime.utcnow)

+    # Download tracking fields
+    download_status: Mapped[DownloadStatus] = mapped_column(
+        Enum(DownloadStatus),
+        nullable=False,
+        default=DownloadStatus.PENDING
+    )
+    download_path: Mapped[Optional[str]] = mapped_column(String(1000), nullable=True)
+    download_started_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
+    download_completed_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
+    download_error: Mapped[Optional[str]] = mapped_column(String(2000), nullable=True)
+    file_size: Mapped[Optional[int]] = mapped_column(BigInteger, nullable=True)
+
     # Relationship to channel
     channel: Mapped["Channel"] = relationship("Channel", back_populates="videos")

     # Index for faster queries
     __table_args__ = (
         Index('idx_channel_created', 'channel_id', 'created_at'),
+        Index('idx_download_status', 'download_status'),
     )

     def __repr__(self) -> str:
-        return f"<VideoEntry(id={self.id}, title='{self.title}', link='{self.link}')>"
+        return f"<VideoEntry(id={self.id}, title='{self.title}', link='{self.link}', status='{self.download_status.value}')>"

     def to_dict(self) -> dict:
         """Convert to dictionary for API responses."""
@@ -58,5 +80,11 @@ class VideoEntry(Base):
             "id": self.id,
             "title": self.title,
             "link": self.link,
-            "created_at": self.created_at.isoformat()
+            "created_at": self.created_at.isoformat(),
+            "download_status": self.download_status.value,
+            "download_path": self.download_path,
+            "download_started_at": self.download_started_at.isoformat() if self.download_started_at else None,
+            "download_completed_at": self.download_completed_at.isoformat() if self.download_completed_at else None,
+            "download_error": self.download_error,
+            "file_size": self.file_size
         }
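Because the `DownloadStatus` members' values are the lowercase strings, the batch endpoint's `DownloadStatus(status_str)` call parses a `?status=` query parameter directly: value lookup succeeds for a known value and raises `ValueError` otherwise. A standalone illustration (`parse_status` is a hypothetical name):

```python
from enum import Enum

class DownloadStatus(Enum):
    PENDING = "pending"
    DOWNLOADING = "downloading"
    COMPLETED = "completed"
    FAILED = "failed"

def parse_status(status_str: str) -> DownloadStatus:
    """Validate a ?status= query value, as the batch endpoint does."""
    return DownloadStatus(status_str)  # raises ValueError on unknown values
```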
pyproject.toml
@@ -6,7 +6,10 @@ readme = "README.md"
 requires-python = ">=3.14"
 dependencies = [
     "alembic>=1.13.0",
+    "celery>=5.3.0",
     "feedparser>=6.0.12",
     "flask>=3.1.2",
+    "redis>=5.0.0",
     "sqlalchemy>=2.0.0",
+    "yt-dlp>=2024.0.0",
 ]
uv.lock | 185 (generated)
@@ -16,6 +16,27 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/ba/88/6237e97e3385b57b5f1528647addea5cc03d4d65d5979ab24327d41fb00d/alembic-1.17.2-py3-none-any.whl", hash = "sha256:f483dd1fe93f6c5d49217055e4d15b905b425b6af906746abb35b69c1996c4e6", size = 248554, upload-time = "2025-11-14T20:35:05.699Z" },
 ]

+[[package]]
+name = "amqp"
+version = "5.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "vine" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/79/fc/ec94a357dfc6683d8c86f8b4cfa5416a4c36b28052ec8260c77aca96a443/amqp-5.3.1.tar.gz", hash = "sha256:cddc00c725449522023bad949f70fff7b48f0b1ade74d170a6f10ab044739432", size = 129013, upload-time = "2024-11-12T19:55:44.051Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/26/99/fc813cd978842c26c82534010ea849eee9ab3a13ea2b74e95cb9c99e747b/amqp-5.3.1-py3-none-any.whl", hash = "sha256:43b3319e1b4e7d1251833a93d672b4af1e40f3d632d479b98661a95f117880a2", size = 50944, upload-time = "2024-11-12T19:55:41.782Z" },
+]
+
+[[package]]
+name = "billiard"
+version = "4.2.3"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url = "https://files.pythonhosted.org/packages/6a/50/cc2b8b6e6433918a6b9a3566483b743dcd229da1e974be9b5f259db3aad7/billiard-4.2.3.tar.gz", hash = "sha256:96486f0885afc38219d02d5f0ccd5bec8226a414b834ab244008cbb0025b8dcb", size = 156450, upload-time = "2025-11-16T17:47:30.281Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/b3/cc/38b6f87170908bd8aaf9e412b021d17e85f690abe00edf50192f1a4566b9/billiard-4.2.3-py3-none-any.whl", hash = "sha256:989e9b688e3abf153f307b68a1328dfacfb954e30a4f920005654e276c69236b", size = 87042, upload-time = "2025-11-16T17:47:29.005Z" },
+]
+
 [[package]]
 name = "blinker"
 version = "1.9.0"
@@ -25,6 +46,25 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/10/cb/f2ad4230dc2eb1a74edf38f1a38b9b52277f75bef262d8908e60d957e13c/blinker-1.9.0-py3-none-any.whl", hash = "sha256:ba0efaa9080b619ff2f3459d1d500c57bddea4a6b424b60a91141db6fd2f08bc", size = 8458, upload-time = "2024-11-08T17:25:46.184Z" },
 ]

+[[package]]
+name = "celery"
+version = "5.5.3"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "billiard" },
+    { name = "click" },
+    { name = "click-didyoumean" },
+    { name = "click-plugins" },
+    { name = "click-repl" },
+    { name = "kombu" },
+    { name = "python-dateutil" },
+    { name = "vine" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/bb/7d/6c289f407d219ba36d8b384b42489ebdd0c84ce9c413875a8aae0c85f35b/celery-5.5.3.tar.gz", hash = "sha256:6c972ae7968c2b5281227f01c3a3f984037d21c5129d07bf3550cc2afc6b10a5", size = 1667144, upload-time = "2025-06-01T11:08:12.563Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/c9/af/0dcccc7fdcdf170f9a1585e5e96b6fb0ba1749ef6be8c89a6202284759bd/celery-5.5.3-py3-none-any.whl", hash = "sha256:0b5761a07057acee94694464ca482416b959568904c9dfa41ce8413a7d65d525", size = 438775, upload-time = "2025-06-01T11:08:09.94Z" },
+]
+
 [[package]]
 name = "click"
 version = "8.3.1"
@@ -37,6 +77,43 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/98/78/01c019cdb5d6498122777c1a43056ebb3ebfeef2076d9d026bfe15583b2b/click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6", size = 108274, upload-time = "2025-11-15T20:45:41.139Z" },
 ]

+[[package]]
+name = "click-didyoumean"
+version = "0.3.1"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "click" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/30/ce/217289b77c590ea1e7c24242d9ddd6e249e52c795ff10fac2c50062c48cb/click_didyoumean-0.3.1.tar.gz", hash = "sha256:4f82fdff0dbe64ef8ab2279bd6aa3f6a99c3b28c05aa09cbfc07c9d7fbb5a463", size = 3089, upload-time = "2024-03-24T08:22:07.499Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/1b/5b/974430b5ffdb7a4f1941d13d83c64a0395114503cc357c6b9ae4ce5047ed/click_didyoumean-0.3.1-py3-none-any.whl", hash = "sha256:5c4bb6007cfea5f2fd6583a2fb6701a22a41eb98957e63d0fac41c10e7c3117c", size = 3631, upload-time = "2024-03-24T08:22:06.356Z" },
+]
+
+[[package]]
+name = "click-plugins"
+version = "1.1.1.2"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "click" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/c3/a4/34847b59150da33690a36da3681d6bbc2ec14ee9a846bc30a6746e5984e4/click_plugins-1.1.1.2.tar.gz", hash = "sha256:d7af3984a99d243c131aa1a828331e7630f4a88a9741fd05c927b204bcf92261", size = 8343, upload-time = "2025-06-25T00:47:37.555Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/3d/9a/2abecb28ae875e39c8cad711eb1186d8d14eab564705325e77e4e6ab9ae5/click_plugins-1.1.1.2-py2.py3-none-any.whl", hash = "sha256:008d65743833ffc1f5417bf0e78e8d2c23aab04d9745ba817bd3e71b0feb6aa6", size = 11051, upload-time = "2025-06-25T00:47:36.731Z" },
+]
+
+[[package]]
+name = "click-repl"
+version = "0.3.0"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "click" },
+    { name = "prompt-toolkit" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/cb/a2/57f4ac79838cfae6912f997b4d1a64a858fb0c86d7fcaae6f7b58d267fca/click-repl-0.3.0.tar.gz", hash = "sha256:17849c23dba3d667247dc4defe1757fff98694e90fe37474f3feebb69ced26a9", size = 10449, upload-time = "2023-06-15T12:43:51.141Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/52/40/9d857001228658f0d59e97ebd4c346fe73e138c6de1bce61dc568a57c7f8/click_repl-0.3.0-py3-none-any.whl", hash = "sha256:fb7e06deb8da8de86180a33a9da97ac316751c094c6899382da7feeeeb51b812", size = 10289, upload-time = "2023-06-15T12:43:48.626Z" },
+]
+
 [[package]]
 name = "colorama"
 version = "0.4.6"
@@ -113,6 +190,21 @@ wheels = [
     { url = "https://files.pythonhosted.org/packages/62/a1/3d680cbfd5f4b8f15abc1d571870c5fc3e594bb582bc3b64ea099db13e56/jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67", size = 134899, upload-time = "2025-03-05T20:05:00.369Z" },
 ]

+[[package]]
+name = "kombu"
+version = "5.5.4"
+source = { registry = "https://pypi.org/simple" }
+dependencies = [
+    { name = "amqp" },
+    { name = "packaging" },
+    { name = "tzdata" },
+    { name = "vine" },
+]
+sdist = { url = "https://files.pythonhosted.org/packages/0f/d3/5ff936d8319ac86b9c409f1501b07c426e6ad41966fedace9ef1b966e23f/kombu-5.5.4.tar.gz", hash = "sha256:886600168275ebeada93b888e831352fe578168342f0d1d5833d88ba0d847363", size = 461992, upload-time = "2025-06-01T10:19:22.281Z" }
+wheels = [
+    { url = "https://files.pythonhosted.org/packages/ef/70/a07dcf4f62598c8ad579df241af55ced65bed76e42e45d3c368a6d82dbc1/kombu-5.5.4-py3-none-any.whl", hash = "sha256:a12ed0557c238897d8e518f1d1fdf84bd1516c5e305af2dacd85c2015115feb8", size = 210034, upload-time = "2025-06-01T10:19:20.436Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "mako"
|
||||
version = "1.3.10"
|
||||
@@ -155,12 +247,63 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "packaging"
|
||||
version = "25.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "prompt-toolkit"
|
||||
version = "3.0.52"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "wcwidth" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/a1/96/06e01a7b38dce6fe1db213e061a4602dd6032a8a97ef6c1a862537732421/prompt_toolkit-3.0.52.tar.gz", hash = "sha256:28cde192929c8e7321de85de1ddbe736f1375148b02f2e17edd840042b1be855", size = 434198, upload-time = "2025-08-27T15:24:02.057Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/84/03/0d3ce49e2505ae70cf43bc5bb3033955d2fc9f932163e84dc0779cc47f48/prompt_toolkit-3.0.52-py3-none-any.whl", hash = "sha256:9aac639a3bbd33284347de5ad8d68ecc044b91a762dc39b7c21095fcd6a19955", size = 391431, upload-time = "2025-08-27T15:23:59.498Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "python-dateutil"
|
||||
version = "2.9.0.post0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "six" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/66/c0/0c8b6ad9f17a802ee498c46e004a0eb49bc148f2fd230864601a86dcf6db/python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", size = 342432, upload-time = "2024-03-01T18:36:20.211Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/ec/57/56b9bcc3c9c6a792fcbaf139543cee77261f3651ca9da0c93f5c1221264b/python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427", size = 229892, upload-time = "2024-03-01T18:36:18.57Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "redis"
|
||||
version = "7.1.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/43/c8/983d5c6579a411d8a99bc5823cc5712768859b5ce2c8afe1a65b37832c81/redis-7.1.0.tar.gz", hash = "sha256:b1cc3cfa5a2cb9c2ab3ba700864fb0ad75617b41f01352ce5779dabf6d5f9c3c", size = 4796669, upload-time = "2025-11-19T15:54:39.961Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/89/f0/8956f8a86b20d7bb9d6ac0187cf4cd54d8065bc9a1a09eb8011d4d326596/redis-7.1.0-py3-none-any.whl", hash = "sha256:23c52b208f92b56103e17c5d06bdc1a6c2c0b3106583985a76a18f83b265de2b", size = 354159, upload-time = "2025-11-19T15:54:38.064Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "sgmllib3k"
|
||||
version = "1.0.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/9e/bd/3704a8c3e0942d711c1299ebf7b9091930adae6675d7c8f476a7ce48653c/sgmllib3k-1.0.0.tar.gz", hash = "sha256:7868fb1c8bfa764c1ac563d3cf369c381d1325d36124933a726f29fcdaa812e9", size = 5750, upload-time = "2010-08-24T14:33:52.445Z" }
|
||||
|
||||
[[package]]
|
||||
name = "six"
|
||||
version = "1.17.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/94/e7/b2c673351809dca68a0e064b6af791aa332cf192da575fd474ed7d6f16a2/six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81", size = 34031, upload-time = "2024-12-04T17:35:28.174Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "sqlalchemy"
|
||||
version = "2.0.44"
|
||||
@@ -183,6 +326,33 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tzdata"
|
||||
version = "2025.2"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/95/32/1a225d6164441be760d75c2c42e2780dc0873fe382da3e98a2e1e48361e5/tzdata-2025.2.tar.gz", hash = "sha256:b60a638fcc0daffadf82fe0f57e53d06bdec2f36c4df66280ae79bce6bd6f2b9", size = 196380, upload-time = "2025-03-23T13:54:43.652Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/5c/23/c7abc0ca0a1526a0774eca151daeb8de62ec457e77262b66b359c3c7679e/tzdata-2025.2-py2.py3-none-any.whl", hash = "sha256:1a403fada01ff9221ca8044d701868fa132215d84beb92242d9acd2147f667a8", size = 347839, upload-time = "2025-03-23T13:54:41.845Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "vine"
|
||||
version = "5.1.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/bd/e4/d07b5f29d283596b9727dd5275ccbceb63c44a1a82aa9e4bfd20426762ac/vine-5.1.0.tar.gz", hash = "sha256:8b62e981d35c41049211cf62a0a1242d8c1ee9bd15bb196ce38aefd6799e61e0", size = 48980, upload-time = "2023-11-05T08:46:53.857Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/03/ff/7c0c86c43b3cbb927e0ccc0255cb4057ceba4799cd44ae95174ce8e8b5b2/vine-5.1.0-py3-none-any.whl", hash = "sha256:40fdf3c48b2cfe1c38a49e9ae2da6fda88e4794c810050a728bd7413811fb1dc", size = 9636, upload-time = "2023-11-05T08:46:51.205Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "wcwidth"
|
||||
version = "0.2.14"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/24/30/6b0809f4510673dc723187aeaf24c7f5459922d01e2f794277a3dfb90345/wcwidth-0.2.14.tar.gz", hash = "sha256:4d478375d31bc5395a3c55c40ccdf3354688364cd61c4f6adacaa9215d0b3605", size = 102293, upload-time = "2025-09-22T16:29:53.023Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/af/b5/123f13c975e9f27ab9c0770f514345bd406d0e8d3b7a0723af9d43f710af/wcwidth-0.2.14-py2.py3-none-any.whl", hash = "sha256:a7bb560c8aee30f9957e5f9895805edd20602f2d7f720186dfd906e82b4982e1", size = 37286, upload-time = "2025-09-22T16:29:51.641Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "werkzeug"
|
||||
version = "3.1.3"
|
||||
@@ -201,15 +371,30 @@ version = "0.1.0"
|
||||
source = { virtual = "." }
|
||||
dependencies = [
|
||||
{ name = "alembic" },
|
||||
{ name = "celery" },
|
||||
{ name = "feedparser" },
|
||||
{ name = "flask" },
|
||||
{ name = "redis" },
|
||||
{ name = "sqlalchemy" },
|
||||
{ name = "yt-dlp" },
|
||||
]
|
||||
|
||||
[package.metadata]
|
||||
requires-dist = [
|
||||
{ name = "alembic", specifier = ">=1.13.0" },
|
||||
{ name = "celery", specifier = ">=5.3.0" },
|
||||
{ name = "feedparser", specifier = ">=6.0.12" },
|
||||
{ name = "flask", specifier = ">=3.1.2" },
|
||||
{ name = "redis", specifier = ">=5.0.0" },
|
||||
{ name = "sqlalchemy", specifier = ">=2.0.0" },
|
||||
{ name = "yt-dlp", specifier = ">=2024.0.0" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "yt-dlp"
|
||||
version = "2025.11.12"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/cf/41/53ad8c6e74d6627bd598dfbb8ad7c19d5405e438210ad0bbaf1b288387e7/yt_dlp-2025.11.12.tar.gz", hash = "sha256:5f0795a6b8fc57a5c23332d67d6c6acf819a0b46b91a6324bae29414fa97f052", size = 3076928, upload-time = "2025-11-12T01:00:38.43Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/5f/16/fdebbee6473473a1c0576bd165a50e4a70762484d638c1d59fa9074e175b/yt_dlp-2025.11.12-py3-none-any.whl", hash = "sha256:b47af37bbb16b08efebb36825a280ea25a507c051f93bf413a6e4a0e586c6e79", size = 3279151, upload-time = "2025-11-12T01:00:35.813Z" },
|
||||
]
|
||||
|
||||