Multiprocessing Pools with different functions

dorvak · Aug 8, 2011

Most examples of multiprocessing worker pools execute a single function across different processes, e.g.

import multiprocessing

def foo(args):
    pass

if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=30)
    res = pool.map_async(foo, args)

Is there a way to handle two different and independent functions within the pool? So that you could assign, e.g., 15 processes for foo() and 15 processes for bar(), or is a pool bound to a single function? Or do you have to create separate processes for different functions manually with

from multiprocessing import Process

p = Process(target=foo, args=(whatever,))
q = Process(target=bar, args=(whatever,))
q.start()
p.start()

and forget about the worker pool?

Answer

Ocaj Nires · Aug 8, 2011

To run different functions in the same pool, you can simply call map_async multiple times.

Here is an example to illustrate:

from multiprocessing import Pool

def square(x):
    return x * x

def cube(y):
    return y * y * y

pool = Pool(processes=20)

result_squares = pool.map_async(square, range(10))
result_cubes = pool.map_async(cube, range(10))

The result will be:

>>> print(result_squares.get(timeout=1))
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

>>> print(result_cubes.get(timeout=1))
[0, 1, 8, 27, 64, 125, 216, 343, 512, 729]
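
If you want to reserve a fixed number of workers per function, as the question suggests (15 for foo() and 15 for bar()), one option is simply to create two separate pools. This is a minimal sketch, not from the original answer; the foo/bar bodies are placeholders and the 15/15 split just mirrors the numbers in the question:

from multiprocessing import Pool

def foo(x):
    return x + 1

def bar(x):
    return x * 2

if __name__ == '__main__':
    # Two independent pools, so foo and bar each get their own 15 workers.
    foo_pool = Pool(processes=15)
    bar_pool = Pool(processes=15)

    foo_results = foo_pool.map_async(foo, range(10))
    bar_results = bar_pool.map_async(bar, range(10))

    print(foo_results.get(timeout=10))
    print(bar_results.get(timeout=10))

    foo_pool.close()
    bar_pool.close()
    foo_pool.join()
    bar_pool.join()

Whether the split buys you anything over a single 30-worker pool depends on the workload; with one shared pool, tasks from both map_async calls are simply handed to whichever workers are free. For one-off calls of arbitrary functions, Pool.apply_async works as well.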