
Celery Setup


Set up Celery for distributed task processing in Python


You are a Python backend developer. The user wants to set up Celery for distributed task processing, including broker configuration, task definitions, and worker execution.

What to check first

  • Run pip list | grep -i celery to verify Celery is installed; if not, run pip install "celery[redis]" (or pip install "celery[amqp]" for RabbitMQ; quote the extras so the shell doesn't expand the brackets)
  • Confirm a message broker is running: redis-cli ping (Redis) or check RabbitMQ status with sudo rabbitmqctl status
  • Verify Python is 3.8 or newer (required by current Celery 5.x releases): python --version

Steps

  1. Create a Celery app instance in a new file (e.g., celery_app.py) with broker and result backend URLs
  2. Configure the broker connection string (Redis example: redis://localhost:6379/0 or RabbitMQ: amqp://guest:guest@localhost//)
  3. Define task functions in a separate module using the @app.task decorator
  4. Configure task settings (serializer, timezone, task routing) in the Celery app config
  5. Start a Celery worker process using the celery -A celery_app worker --loglevel=info command
  6. Send tasks asynchronously from your application code using task_name.delay() or task_name.apply_async()
  7. Optionally set up a Celery Beat scheduler for periodic tasks with celery_app.conf.beat_schedule
  8. Monitor task execution with Celery Flower: pip install flower, then run celery -A celery_app flower and open http://localhost:5555 in a browser

Code

# celery_app.py
from celery import Celery
from kombu import Exchange, Queue
import os

# Initialize Celery app
app = Celery(__name__)

# Broker configuration (Redis example)
app.conf.broker_url = os.getenv('CELERY_BROKER_URL', 'redis://localhost:6379/0')
app.conf.result_backend = os.getenv('CELERY_RESULT_BACKEND', 'redis://localhost:6379/0')

# Task serialization and timezone
app.conf.update(
    task_serializer='json',
    accept_content=['json'],
    result_serializer='json',
    timezone='UTC',
    enable_utc=True,
    task_track_started=True,
    task_time_limit=30 * 60,  # Hard limit: 30 minutes
    task_soft_time_limit=25 * 60,  # Soft limit: 25 minutes
)

# Define queues and routing
app.conf.task_queues = (
    Queue('default', Exchange('default'), routing_key='default'),
    Queue('high_priority', Exchange('priority'), routing_key='high'),
)

app.conf.task_routes = {
    # Example route; point your own task module paths at the queues above
    'tasks.*': {'queue': 'default'},
}

Note: this example was truncated in the source. See the GitHub repo for the latest full version.

Common Pitfalls

  • Forgetting to restart workers after editing task code; workers load task modules at startup, so stale code keeps running until they restart
  • Calling result.get() without a result backend configured; there is nowhere to fetch results from, so it raises an error
  • Passing non-JSON-serializable arguments (datetimes, ORM objects) while task_serializer='json'; pass primitive IDs instead and re-fetch the data inside the task

When NOT to Use This Skill

  • When a simpler manual approach would take less than 10 minutes
  • On critical production systems without testing in staging first
  • When you don't have permission or authorization to make these changes

How to Verify It Worked

  • Run celery -A celery_app inspect ping; each live worker should answer with pong
  • Send a known task (e.g. your_task.delay(...)) and confirm result.get(timeout=10) returns the expected value
  • Check worker logs for "Task ... succeeded" entries and any warnings or errors — silent failures are the worst kind

Production Considerations

  • Test in staging before deploying to production
  • Have a rollback plan — every change should be reversible
  • Monitor the affected systems for at least 24 hours after the change
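
A typical production worker invocation looks like the sketch below. The flags are standard Celery CLI options, but the values are assumptions to tune for your workload.

```shell
# Run a worker that consumes both queues defined in celery_app.py.
# Values are illustrative; tune concurrency to your CPU/IO profile.
celery -A celery_app worker \
  --loglevel=info \
  -Q default,high_priority \
  --concurrency=4 \
  --max-tasks-per-child=100
```

--max-tasks-per-child recycles worker processes periodically to contain memory growth; run the worker under a process supervisor (systemd, supervisord) so crashed workers restart automatically.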

Quick Info

Category: Python
Difficulty: intermediate
Version: 1.0.0
Author: Claude Skills Hub
Tags: python, celery, async

Install command:

curl -o ~/.claude/skills/celery-setup.md https://claude-skills-hub.vercel.app/skills/python/celery-setup.md
