celerybeat - multiple instances & monitoring

DmitrySemenov · Jan 27, 2015 · Viewed 7.4k times

I have an application built using Celery, and recently we got a requirement to run certain tasks on a schedule.

I think celerybeat is perfect for this, but I have a few questions:

  1. Is it possible to run multiple celerybeat instances, so that tasks are not duplicated?
  2. How to make sure that celerybeat is always up & running?

So far I read this: https://github.com/celery/celery/issues/251 and https://github.com/ybrs/single-beat

It looks like only a single instance of celerybeat should be running.

I'm running the application inside AWS Elastic Beanstalk Docker containers, and the Celery workers are also Docker containers (so they can be scaled quickly when needed).

It would be best to have celerybeat run through supervisord along with the Celery workers, but it seems this is not the proper way to do it.

At the same time, a single celerybeat instance would require manual provisioning/startup and monitoring.

Answer

NoamG · Jan 27, 2015

To answer your 2 questions:

  1. If you run several celerybeat instances you get duplicated tasks, so as far as I know you should run only a single celerybeat instance.

  2. I'm using supervisord, as you mentioned, to run the Celery workers and celerybeat as daemons, so they should always be up and running.

My supervisord config:

[program:my_regular_worker]
command=python2.7 /home/ubuntu/workspace/src/manage.py celery worker -Q my_regular_worker-queue_name -c 1 -l info --without-mingle
process_name=my_regular_worker
directory=/home/ubuntu/workspace/src
autostart=true
autorestart=true
user=ubuntu
stdout_logfile=/tmp/my_regular_worker.log
redirect_stderr=true



[program:my_celerybeat_worker]
command=python2.7 /home/ubuntu/workspace/src/manage.py celery worker -Q my_celerybeat_worker-queue_name -c 1 -l info --without-mingle -B -s /tmp/celerybeat-schedule
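The `-B` flag in the command above embeds the beat scheduler in that worker, and `-s` stores its schedule state file in `/tmp`. The schedule entries themselves live in the Django settings; a minimal sketch, assuming the pre-Celery-4.0 setting name `CELERYBEAT_SCHEDULE` and a placeholder task name `tasks.cleanup`:

```python
# Sketch of a beat schedule entry (Celery 3.x style settings).
# "tasks.cleanup" and the entry name are placeholders; the queue
# matches the -Q name used in the supervisord command.
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    "cleanup-every-15-minutes": {
        "task": "tasks.cleanup",
        "schedule": timedelta(minutes=15),
        "options": {"queue": "my_celerybeat_worker-queue_name"},
    },
}
```

Because only this one worker runs with `-B`, the schedule fires from a single place even as the plain worker containers scale out.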