I have two pieces of code that I'm using to learn about multiprocessing in Python 3.1. My goal is to use 100% of all the available processors. However, the code snippets here only reach 30% - 50% on all processors.
Is there any way to 'force' Python to use all 100%? Is the OS (Windows 7, 64-bit) limiting Python's access to the processors? While the code snippets below are running, I open the Task Manager and watch the processors spike, but they never reach and maintain 100%. In addition, I can see multiple python.exe processes created and destroyed along the way. How do these processes relate to processors? For example, if I spawn 4 processes, each process isn't using its own core. Instead, what are the processes using? Are they sharing all cores? And if so, is it the OS that is forcing the processes to share the cores?
import multiprocessing

def worker():
    # worker function
    print('Worker')
    x = 0
    while x < 1000:
        print(x)
        x += 1
    return

if __name__ == '__main__':
    jobs = []
    for i in range(50):
        p = multiprocessing.Process(target=worker)
        jobs.append(p)
        p.start()
from multiprocessing import Process, Lock

def f(l, i):
    l.acquire()
    print('worker ', i)
    x = 0
    while x < 1000:
        print(x)
        x += 1
    l.release()

if __name__ == '__main__':
    lock = Lock()
    for num in range(50):
        Process(target=f, args=(lock, num)).start()
To use 100% of all cores, do not repeatedly create and destroy processes.
Create a few processes per core and link them with a pipeline.
At the OS level, all pipelined processes run concurrently.
The less you write (and the more you delegate to the OS), the more likely you are to use as many resources as possible.
python p1.py | python p2.py | python p3.py | python p4.py ...
will make maximal use of your CPU.
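As a rough illustration of that approach, here is a minimal sketch of what one pipeline stage (a p1.py-style script) might look like. It assumes each stage reads lines from stdin, does some CPU-bound work on each line, and writes a result to stdout; the crunch function and its repeated hashing are just placeholders for whatever real work your stages do.

import sys
import hashlib

def crunch(line):
    # Placeholder CPU-bound work: hash the line many times over.
    data = line.encode('utf-8')
    for _ in range(10000):
        data = hashlib.sha256(data).digest()
    return hashlib.sha256(data).hexdigest()

if __name__ == '__main__':
    # Read from the previous stage, write to the next one.
    for line in sys.stdin:
        sys.stdout.write(crunch(line.rstrip('\n')) + '\n')

Chained together as some_producer | python p1.py | python p2.py | ..., each stage is a separate OS process, so the scheduler is free to put each one on its own core; as long as every stage has input to chew on, they all run at the same time.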