I was sure there was something like this in the standard library, but it seems I was wrong. I have a bunch of URLs that I want to fetch with urlopen in parallel. I want something like the builtin map function, except that the work is done in parallel by a pool of threads. Is there a good module for this?
There is a map method on multiprocessing.Pool that distributes the work across multiple processes. And if multiple processes aren't your dish, you can use multiprocessing.dummy, which exposes the same API but is backed by threads.
import multiprocessing.dummy
import urllib.request  # Python 3; in Python 2 this was urllib.urlopen

def f(post):
    # Each call blocks on network I/O, so threads overlap nicely here
    return urllib.request.urlopen('http://stackoverflow.com/questions/%u' % post)

p = multiprocessing.dummy.Pool(5)  # a pool of 5 worker threads
print(p.map(f, range(3329361, 3329361 + 5)))
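To see the pattern without hitting the network, here is a minimal sketch of the same thread-pool map; the worker function and inputs are made up for illustration:

```python
import multiprocessing.dummy

def square(n):
    # Stand-in for a blocking task such as urlopen
    return n * n

pool = multiprocessing.dummy.Pool(4)   # 4 worker threads
results = pool.map(square, range(5))   # behaves like builtin map, run in parallel
pool.close()
pool.join()
print(results)  # [0, 1, 4, 9, 16]
```

map blocks until every item has been processed and returns the results in input order, regardless of which thread finished first.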