celery
Distributed task queue for Python — sends tasks to workers asynchronously via message brokers (Redis, RabbitMQ). celery 5.x features: @app.task decorator for task definition, .delay()/.apply_async() for dispatching, task.AsyncResult for result retrieval, periodic tasks via celery beat, workflow primitives (chain, group, chord) via the canvas API, retry logic, task routing to specific queues, priority queues, result backends (Redis, database), task monitoring via Flower, soft/hard time limits, task ETA (eta= parameter), countdown=, rate_limit=, and task signals.
Score Breakdown
⚙ Agent Friendliness
🔒 Security
Task queue. Never use the pickle serializer with untrusted input — it allows arbitrary code execution. Secure the broker connection with credentials and TLS. Task arguments are visible in the broker queue — do not pass secrets as task args. Workers run arbitrary Python — restrict worker network access. Revoke tasks via revoke() for cancelled work.
⚡ Reliability
Best When
Distributed background task processing at scale — celery is the de facto standard for Python background jobs, providing monitoring, retries, and complex workflows for production services.
Avoid When
Simple async (use asyncio), real-time processing, lightweight needs (use RQ/huey), or when the infrastructure overhead of a broker+backend is not justified.
Use Cases
- • Agent background task — from celery import Celery; app = Celery('tasks', broker='redis://localhost:6379/0', backend='redis://localhost:6379/1'); @app.task; def process_file(file_id): result = do_work(file_id); return result; result = process_file.delay(file_id); task_result = result.get(timeout=300) — basic; agent dispatches long-running work to background workers
- • Agent task with retry — @app.task(bind=True, max_retries=3, default_retry_delay=60); def fetch_data(self, url): try: return requests.get(url).json(); except RequestException as exc: self.retry(exc=exc, countdown=2**self.request.retries) — retry; agent retries failed tasks with exponential backoff; bind=True for self access
- • Agent task chaining — from celery import chain; workflow = chain(download.s(url), process.s(), store.s()); result = workflow.apply_async(); final = result.get() — chain; agent builds sequential workflow where output of each task feeds next; chain() creates lazy signature; .s() is shorthand for signature()
- • Agent periodic tasks — from celery.schedules import crontab; app.conf.beat_schedule = {'daily-report': {'task': 'tasks.generate_report', 'schedule': crontab(hour=8, minute=0), 'args': ('daily',)}}; — beat; agent schedules recurring tasks; requires celery beat process; crontab for cron syntax; timedelta for fixed interval
- • Agent task routing — @app.task; def urgent_task(data): process(data); app.conf.task_routes = {'tasks.urgent_task': {'queue': 'high_priority'}}; urgent_task.apply_async(queue='high_priority') — routing; agent routes tasks to specific worker queues via task_routes config or per-call queue=; run workers with celery -A app worker -Q high_priority
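The retry use case above backs off with countdown=2**self.request.retries; the resulting delay schedule can be checked without a broker (a pure-Python sketch; backoff_delays is a hypothetical helper, not a celery API):

```python
def backoff_delays(max_retries: int, base: int = 2) -> list[int]:
    # Delay in seconds before each retry, mirroring countdown=base**retries:
    # self.request.retries is 0 on the first retry, 1 on the second, ...
    return [base ** attempt for attempt in range(max_retries)]

print(backoff_delays(3))  # -> [1, 2, 4]
```

With max_retries=3 the worker waits 1s, 2s, then 4s before giving up and raising the last exception.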
Not For
- • Simple async operations — for simple async use asyncio; celery adds broker infrastructure overhead
- • Real-time tasks — celery adds broker round-trip and worker pickup latency; for real-time work use asyncio/threading
- • Lightweight task queues — for simpler task queues use RQ (Redis Queue) or huey with less infrastructure
Interface
Authentication
No auth in celery itself. Broker authentication via connection URL (Redis AUTH, RabbitMQ credentials).
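Illustrative connection URLs with embedded credentials (hosts, vhost, and passwords are placeholders):

```python
# Redis: password via AUTH, database number after the final slash.
REDIS_BROKER = "redis://:s3cret@redis.internal:6379/0"

# Redis over TLS (note the rediss:// scheme).
REDIS_TLS_BROKER = "rediss://:s3cret@redis.internal:6380/0"

# RabbitMQ: user, password, and vhost in the AMQP URL.
RABBITMQ_BROKER = "amqp://celery_user:s3cret@mq.internal:5672/myvhost"
```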
Pricing
Celery is BSD 3-Clause licensed. Free for all use. Requires separate message broker (Redis/RabbitMQ).
Agent Metadata
Known Gotchas
- ⚠ result.get() blocks and can deadlock — calling result.get() inside a celery task (waiting for another task) can deadlock the worker; agent code: never call result.get() synchronously inside a task; use chord/chain for dependent tasks, use apply_async with link= callback, or fetch results from a dedicated process or API endpoint
- ⚠ Tasks must be importable by workers — worker process imports task module; if task uses local variables or closures that don't serialize: pickle error; agent code: define tasks in importable modules (not inside functions); task arguments must be serializable (JSON by default); use task_serializer='pickle' for complex objects (security risk)
- ⚠ Celery app must match between producer and worker — same Celery app name, same broker URL, same task module structure; agent code: share celery app configuration via shared module; worker command: celery -A myapp.celery worker; producer imports from same myapp.tasks; mismatch = tasks never consumed
- ⚠ beat requires separate process — celery beat for periodic tasks is NOT automatic with worker; must run: celery -A app beat; agent code: add beat to deployment (separate container/process); beat and worker can share host but beat is single-process (not distributed); beat stores schedule in database or local file
- ⚠ @app.task vs @shared_task — @app.task binds to a specific Celery app instance; @shared_task creates a task not bound to any app (reusable across multiple apps); agent code for reusable library tasks: use @shared_task; for app-specific tasks: @app.task; @shared_task resolves celery.current_app at runtime
- ⚠ Task serialization defaults to JSON — celery 4+ uses JSON by default (not pickle); JSON cannot serialize datetime, UUID, Decimal without custom encoder; agent code: use str(uuid) or uuid.hex before passing; datetime.isoformat() for dates; or configure: task_serializer='pickle', accept_content=['pickle'] (security risk with untrusted input)
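The JSON-serializer gotcha above can be avoided by normalizing arguments before dispatch; a stdlib-only sketch (to_json_safe is a hypothetical helper, not part of celery):

```python
import json
import uuid
from datetime import datetime, timezone
from decimal import Decimal

def to_json_safe(value):
    """Convert common non-JSON types before passing them as task args."""
    if isinstance(value, datetime):
        return value.isoformat()   # e.g. '2026-03-06T00:00:00+00:00'
    if isinstance(value, uuid.UUID):
        return str(value)          # canonical dashed-hex form
    if isinstance(value, Decimal):
        return str(value)          # string avoids float precision loss
    return value

raw = (uuid.uuid4(), datetime(2026, 3, 6, tzinfo=timezone.utc), Decimal("9.99"))
args = [to_json_safe(v) for v in raw]
payload = json.dumps(args)  # json.dumps(raw) would raise TypeError
```

The receiving task then parses the strings back (datetime.fromisoformat, uuid.UUID, Decimal) as needed.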
Alternatives
Full Evaluation Report
Detailed scoring breakdown, competitive positioning, security analysis, and improvement recommendations for celery.
Scores are editorial opinions as of 2026-03-06.