I've already looked around a lot for this but couldn't find a good answer. I'm using Celery 3.1.7 and Django 1.5.1, without the django-celery package since newer versions of Celery no longer require it. I managed to set up tasks and execute them using RabbitMQ, and everything works as it should there.

However, I'm integrating this into an existing, quite large, Django project that has several Django settings files rather than just one. We run a different one depending on the environment, for instance one for local machines and one for the server. My problem is that I can't seem to track down which settings file is "active" from the celery worker, which runs the celery.py file in my project root (as the documentation specifies). There the documentation says to specify the Django settings file like this:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', "project.settings.server")
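For context, here is roughly what my celery.py looks like, a minimal sketch following the Celery 3.1 Django guide (the project name and settings paths are placeholders for my real ones):

from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

# This hard-coded line is the problem: it pins the worker to the server
# settings module regardless of which environment it runs in.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.server')

app = Celery('project')

# Pull Celery configuration from the Django settings module.
app.config_from_object('django.conf:settings')

# Auto-discover tasks.py modules in all installed apps (Celery 3.1 style).
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)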
Now this works, but if I move things to my local machine I have to change it to settings.local to make it work, every single time. Reading the settings object at runtime, the way I do in standard Django files, doesn't work since the celery worker runs in a separate process.
So, given this situation, does anyone have an idea how to dynamically fetch the active Django settings file from the celery worker? Or perhaps pass it in as a variable when starting the celery worker (like Django's --settings=project.settings.local)?
Thanks!
When starting the celery worker on the command line, just set the environment variable before the celery command:
DJANGO_SETTINGS_MODULE='proj.settings' celery -A proj worker -l info
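This works because os.environ.setdefault in celery.py only applies its value when the variable isn't already set, so whatever you export on the command line wins. That also means you can keep a sensible fallback in celery.py; a sketch, assuming your local settings live at project.settings.local:

import os

# The DJANGO_SETTINGS_MODULE exported on the command line (if any) takes
# precedence; 'project.settings.local' is only used as the fallback default.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings.local')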