Celery Tasks Not Being Processed

Nat Dempkowski · Apr 15, 2014 · Viewed 7.8k times

I'm trying to process some tasks using celery, and I'm not having much luck. I'm running celeryd and celerybeat as daemons. I have a tasks.py file that looks like this, with a simple app and task defined:

from celery import Celery

app = Celery('tasks', broker='amqp://user:pass@hostname:5672/vhostname')

@app.task
def process_file(f):
    # do some stuff with the file
    # and log the results
    pass  # placeholder so the example parses; the real work is elided

This file is referenced from another file, process.py, which I use to monitor for file changes and which looks like:

from tasks import process_file

file_name = '/file/to/process'
result = process_file.delay(file_name)
result.get()
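
As an aside, result.get() only works when a result backend is configured, and the app above only defines a broker. A minimal sketch of adding one, with the rpc:// backend as an illustrative assumption rather than part of my actual setup:

from celery import Celery

# without some result backend, result.get() has nowhere to fetch
# task results from and raises an error
app = Celery('tasks',
             broker='amqp://user:pass@hostname:5672/vhostname',
             backend='rpc://')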

With that little bit of code, celery never seems to see or process the tasks. I can execute similar code from the python interpreter, though, and celery processes it:

>>> from tasks import process_file
>>> process_file.delay('/file/to/process')
<AsyncResult: 8af23a4e-3f26-469c-8eee-e646b9d28c7b>

When I run the tasks from the interpreter, however, beat.log and worker1.log don't show any indication that the tasks were received, yet using logging I can confirm that the task code was executed. There are also no obvious errors in the .log files. Any ideas what could be causing this problem?
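
One way to check whether the daemonized workers can even see the task is to ask them over the broker (a diagnostic sketch; it assumes the broker in tasks.py is reachable from where this runs, and each call returns None if no workers respond):

from tasks import app

insp = app.control.inspect()
print(insp.registered())  # tasks each worker knows about
print(insp.active())      # tasks currently executing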

My /etc/default/celerybeat looks like:

CELERY_BIN="/usr/local/bin/celery"
CELERYBEAT_CHDIR="/opt/dirwithpyfiles"
CELERYBEAT_OPTS="--schedule=/var/run/celery/celerybeat-schedule"

And /etc/default/celeryd:

CELERYD_NODES="worker1"
CELERY_BIN="/usr/local/bin/celery"
CELERYD_CHDIR="/opt/dirwithpyfiles"
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
CELERY_CREATE_DIRS=1
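
Since the daemon chdirs into /opt/dirwithpyfiles before importing tasks, one sanity check is to import the module from that directory the same way the worker would (a quick sketch, run from that directory):

import tasks

# the app's task registry should list 'tasks.process_file'
# alongside celery's built-in tasks
print(sorted(tasks.app.tasks.keys()))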

Answer

Nat Dempkowski · Apr 16, 2014

So I figured out my issue by running celery from the cli instead of as a daemon, which let me see more detailed error output. I did this by running:

user@hostname   /opt/dirwithpyfiles $ su celery
celery@hostname /opt/dirwithpyfiles $ celery -A tasks worker --loglevel=info

There I could see a permissions error that occurred when running as the celery user, but not when I ran the commands from the python interpreter as my normal user. I fixed this by changing the permissions of /file/to/process so that both users could read it.
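
For anyone debugging something similar, here is a quick sketch to compare what each user can see (run it once as your normal user and again as the celery user; the path is the one from above):

import grp
import os
import pwd

path = '/file/to/process'
st = os.stat(path)
print('owner:', pwd.getpwuid(st.st_uid).pw_name)
print('group:', grp.getgrgid(st.st_gid).gr_name)
print('mode: ', oct(st.st_mode & 0o777))

# True only if the current user has read access
print('readable:', os.access(path, os.R_OK))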