Starting a celery worker throws "no attribute 'worker_state_db'"

The bug appears if an exception is raised while the settings module is being parsed, such as when we set Django's SECRET_KEY (or any other setting) from an environment variable that may be unset:

    SECRET_KEY = os.environ['SECRET_KEY']

To solve the problem you can switch back to a hard-coded value:

    SECRET_KEY = "asdfasdfasdf"

or use a default fallback:

    SECRET_KEY = os.environ.get('SECRET_KEY', '')

You can also find which setting caused the … Read more
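If you want to keep reading the value from the environment but fail with a readable message instead of the misleading worker error, one option (a minimal sketch, not from the original answer) is to raise Django's ImproperlyConfigured yourself:

    import os
    from django.core.exceptions import ImproperlyConfigured

    try:
        SECRET_KEY = os.environ['SECRET_KEY']
    except KeyError:
        # Fail at settings-import time with a message that names the
        # real problem instead of the confusing celery attribute error.
        raise ImproperlyConfigured('Set the SECRET_KEY environment variable')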

InterfaceError: connection already closed (using django + celery + Scrapy)

Unfortunately this is a problem with the django + psycopg2 + celery combo. It's an old and unsolved problem. Take a look at this thread to understand it: https://github.com/celery/django-celery/issues/121 Basically, when celery starts a worker, it forks the database connection from the django.db framework. If this connection drops for some reason, it doesn't create a new one. Celery … Read more
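A common workaround is to discard any stale connection at the start of each task so Django opens a fresh one on the first query. A sketch, assuming Django >= 1.6 (the task name here is made up for illustration):

    from celery import shared_task
    from django.db import close_old_connections  # Django >= 1.6

    @shared_task
    def save_scraped_items(items):
        # Drop the connection the forked worker inherited if it has
        # gone stale; Django reconnects lazily on the next query.
        close_old_connections()
        ...  # ORM work goes here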

celery – Tasks that need to run with priority

Celery does not support task priorities (as of v3.0): http://docs.celeryproject.org/en/master/faq.html#does-celery-support-task-priorities

You may solve this problem by routing tasks: http://docs.celeryproject.org/en/latest/userguide/routing.html

Prepare a default and a priority_high queue:

    from kombu import Queue

    CELERY_DEFAULT_QUEUE = 'default'
    CELERY_QUEUES = (
        Queue('default'),
        Queue('priority_high'),
    )

Run two daemons:

    user@x:/$ celery worker -Q priority_high
    user@y:/$ celery worker -Q default,priority_high

And route the task:

    your_task.apply_async(args=['…'], queue='priority_high')
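Alternatively, the routing can be declared once in the settings so call sites don't have to pass queue= explicitly. A short sketch, assuming the task lives in a module named myapp.tasks:

    # Celery 3.x settings-style routing (module path is an assumption):
    CELERY_ROUTES = {
        'myapp.tasks.your_task': {'queue': 'priority_high'},
    }

    # A plain call is then enough:
    # your_task.delay('…')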

How to log exceptions occurring in a django celery task

The question: I'd like Celery to catch exceptions and write them to a log file instead of apparently swallowing them… The current top answer here is so-so as a professional solution. Many Python developers consider wrapping each task in a blanket try/except a red flag. A reasonable aversion to this was well-articulated … Read more
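One way to get failures into a log without wrapping every task body in try/except is a minimal sketch using Celery's task_failure signal (logging configuration is assumed to be done elsewhere):

    import logging
    from celery.signals import task_failure

    logger = logging.getLogger(__name__)

    @task_failure.connect
    def log_failed_task(sender=None, task_id=None, exception=None,
                        einfo=None, **kwargs):
        # Fires once for every task that raises, in one central place.
        logger.error('Task %s (%s) failed: %r',
                     sender.name if sender else '?', task_id, exception,
                     exc_info=einfo.exc_info if einfo else None)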

celery – chaining groups and subtasks -> out-of-order execution

So, as it turns out, you cannot chain two groups together in celery. I suspect this is because a group chained with another task is automatically upgraded to a chord. From the Celery docs (http://docs.celeryproject.org/en/latest/userguide/canvas.html): "Chaining a group together with another task will automatically upgrade it to be a chord." Groups return a parent task. When chaining two groups together, … Read more
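To make the upgrade concrete, here is a minimal sketch (the add and summarize task names are made up for illustration):

    from celery import chain, group

    # add and summarize are assumed to be existing celery tasks.
    # The group becomes the chord header; the following task becomes
    # the chord callback and runs only after every header task is done.
    workflow = chain(
        group(add.s(2, 2), add.s(4, 4)),
        summarize.s(),
    )
    workflow.apply_async()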

Celery task schedule (Ensuring a task is only executed one at a time)

It is not safe to rely on a local variable, since you can have several celery workers running tasks, and those workers might even be on different hosts. So, basically, there are as many is_locked variable instances as there are Celery workers running your async_work task. Thus, even though your code won't raise any errors, you wouldn't get … Read more
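A cross-worker alternative is a lock held in a backend every worker can see, along the lines of the cache-based recipe in the Celery docs. A sketch; the names app and async_work mirror the question and are assumptions:

    from contextlib import contextmanager
    from django.core.cache import cache

    LOCK_EXPIRE = 60 * 5  # seconds; the lock frees itself if a worker dies

    @contextmanager
    def task_lock(lock_id):
        # cache.add is atomic on backends such as memcached:
        # it succeeds only if the key does not already exist.
        acquired = cache.add(lock_id, 'locked', LOCK_EXPIRE)
        try:
            yield acquired
        finally:
            if acquired:
                cache.delete(lock_id)

    @app.task  # app is your Celery instance (assumed)
    def async_work(item_id):
        with task_lock('async_work-%s' % item_id) as acquired:
            if not acquired:
                return  # another worker already holds the lock
            ...  # do the actual work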
