I would like to add multiple tasks to the Celery queue and wait for the results. I have various ideas for how I could achieve this using some form of shared storage (memcached, redis, a database, etc.); however, I would have thought this is something Celery can handle automatically, but I can't find any resources on it online.
Code example:
def do_tasks(b):
    for a in b:
        c.delay(a)
    return c.all_results_some_how()
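To spell out what that pseudocode is after, here is a minimal manual sketch, assuming c is a registered Celery task and b is an iterable as above: fire off one task per item, keep the AsyncResult handles, and block on each of them.

results = [c.delay(a) for a in b]    # each delay() call returns an AsyncResult
values = [r.get() for r in results]  # get() blocks until that task has finished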
For Celery >= 3.0, TaskSet is deprecated in favour of group.
from celery import group
from tasks import add

job = group([
    add.s(2, 2),
    add.s(4, 4),
    add.s(8, 8),
    add.s(16, 16),
    add.s(32, 32),
])
Start the group in the background:
result = job.apply_async()
Wait for it to finish and collect the results:
result.join()
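join() returns the task return values in the same order the signatures were listed, so for the group above it would yield [4, 8, 16, 32, 64]. A short usage sketch:

values = result.join()   # [4, 8, 16, 32, 64] for the group above
print(values)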
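The example imports add from a tasks module that the answer does not show. For the snippet to run, something like the following assumed module would need to exist; the Redis broker/backend URLs are placeholders, and a result backend of some kind must be configured or join() cannot retrieve anything.

# tasks.py -- assumed minimal task module; broker/backend URLs are placeholders
from celery import Celery

app = Celery('tasks',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y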