rabbitmq - Celery, route all tasks in a Django project to a specific queue
I have a machine that runs 2 copies of the same Django project, let's call them A and B, and I want to use Celery to process background tasks.
I set up Supervisor to launch 2 workers, one for each project, but since the tasks have the same names in both projects, tasks end up running on the wrong worker.
My next step was to use a different queue for each worker, using the -Q queuename
parameter. Using rabbitmqctl list_queues
I can see that both queues get created. The command I'm using to start the workers is
python3 -m celery worker -A project -l info -Q q1 --hostname=celery1@ubuntu
and
python3 -m celery worker -A project -l info -Q q2 --hostname=celery2@ubuntu
The question is: how do I route all tasks from project A to queue A, and all tasks from project B to queue B? Yes, I've seen that you can add a parameter to the task
decorator to select a queue, but I'm looking for a global setting or something like that.
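For reference, this is a minimal sketch of the per-task approach mentioned above, the one I'd rather avoid; it assumes a Celery app object named app, that the queue option passed to the task decorator is picked up by apply_async, and the task and queue names are made up for illustration:

# tasks.py in project A -- per-task routing instead of a global setting
from celery import Celery

app = Celery('project')

@app.task(queue='q1')          # every call to this task is sent to q1
def refresh_data():
    ...

# or, equivalently, per call:
# refresh_data.apply_async(queue='q1')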
Edit 1: I've tried using CELERY_DEFAULT_QUEUE
but it doesn't work, the setting gets ignored. I've also tried creating a dumb router, like this:

class MyRouter(object):
    def route_for_task(self, task, args=None, kwargs=None):
        return 'q1'

CELERY_ROUTES = (MyRouter(), )

and it works (obviously I return different queues in each project), but I'm baffled: why is the CELERY_DEFAULT_QUEUE
setting ignored?
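As a side note, a sketch of how "return different queues in each project" could be done without duplicating the router, assuming a hypothetical per-project Django setting PROJECT_TASK_QUEUE (e.g. 'q1' in project A and 'q2' in project B):

# routers.py -- hypothetical variant of the dumb router above
from django.conf import settings

class MyRouter(object):
    def route_for_task(self, task, args=None, kwargs=None):
        # send every task of this project to its own queue,
        # same string return value as the router that worked above
        return settings.PROJECT_TASK_QUEUE

# in each project's settings module:
# CELERY_ROUTES = (MyRouter(), )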
In the end it was easier than expected. I had to set both the default queue and the default routing key (and optionally the default exchange, as long as it's a direct
exchange).
CELERY_DEFAULT_QUEUE = 'q2'
CELERY_DEFAULT_EXCHANGE = 'q2'
CELERY_DEFAULT_ROUTING_KEY = 'q2'
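For what it's worth, on Celery 4+ the same settings exist under lowercase names; a sketch of the equivalent configuration set directly on the app object, assuming a Celery app named app:

# celery.py -- equivalent new-style (Celery 4+) setting names
from celery import Celery

app = Celery('project')
app.conf.task_default_queue = 'q2'
app.conf.task_default_exchange = 'q2'
app.conf.task_default_routing_key = 'q2'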
I had a few concepts unclear, but after following the official RabbitMQ tutorials things got clearer and I was able to fix the problem.