Python multicore programming

Larry · May 8, 2014 · Viewed 52k times

Please consider a class as follows:

class Foo:
    def __init__(self, data):
        self.data = data

    def do_task(self):
        # do something with self.data
        pass

In my application I have a list containing several instances of the Foo class. The aim is to execute do_task for every Foo object. A first implementation is simply:

# execute do_task for every Foo object instantiated
for f_obj in my_foo_obj_list:
    f_obj.do_task()

I'd like to take advantage of a multi-core architecture by sharing the for loop between the 4 CPUs of my machine.

What's the best way to do it?

Answer

timrau · May 8, 2014

You can use the process pools provided by the multiprocessing module.

from multiprocessing import Pool

def work(foo):
    foo.do_task()

if __name__ == '__main__':
    pool = Pool()                        # defaults to one worker process per CPU
    pool.map(work, my_foo_obj_list)      # blocks until every task has finished
    pool.close()
    pool.join()
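
One caveat: pool.map pickles each Foo object and sends a copy to a worker process, so anything do_task changes on self is lost when the worker exits; the parent process only sees what the worker function returns. If you need results back, return them from the worker and collect the list that pool.map produces. Below is a minimal, self-contained sketch of that pattern; the do_task body here is hypothetical (it returns a value instead of mutating state), and the sample list is just for illustration.

from multiprocessing import Pool

class Foo:
    def __init__(self, data):
        self.data = data

    def do_task(self):
        # hypothetical body: return a processed value rather than mutating in place
        return self.data * 2

def work(foo):
    # runs in a worker process; the parent only receives this return value
    return foo.do_task()

if __name__ == '__main__':
    my_foo_obj_list = [Foo(i) for i in range(10)]
    with Pool(4) as pool:                      # Pool is a context manager on Python 3.3+
        results = pool.map(work, my_foo_obj_list)
    print(results)                             # [0, 2, 4, ..., 18]

pool.map preserves input order, so results[i] corresponds to my_foo_obj_list[i].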