I am currently trying to implement a job queue in PHP. The queue will then be processed as a batch job and should be able to process some jobs in parallel.
I already did some research and found several ways to implement it, but I am not really aware of their advantages and disadvantages.
For example, doing the parallel processing by calling a script several times through fsockopen, as explained here:
Easy parallel processing in PHP
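A rough sketch of that fsockopen approach (the host, the /worker.php path, and the job IDs are assumptions for illustration) would fire one HTTP request per job and close the socket without waiting for the response:

```php
<?php
// Fire one request per job to a worker script and do not wait for replies.
// The worker script would need ignore_user_abort(true) so it keeps running
// after the socket is closed.
$jobs = [1, 2, 3];

foreach ($jobs as $jobId) {
    $fp = fsockopen('127.0.0.1', 80, $errno, $errstr, 5);
    if (!$fp) {
        continue; // could not reach the web server
    }
    $request  = "GET /worker.php?job=" . urlencode($jobId) . " HTTP/1.1\r\n";
    $request .= "Host: 127.0.0.1\r\n";
    $request .= "Connection: Close\r\n\r\n";
    fwrite($fp, $request);
    fclose($fp); // close immediately so the loop does not block per job
}
```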
Another way I found was using the curl_multi functions:
curl_multi_exec PHP docs
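For comparison, a minimal curl_multi sketch (again with an assumed /worker.php endpoint) that runs all requests concurrently and waits for them to finish:

```php
<?php
// Register one easy handle per job on a multi handle, then drive them all
// to completion; the transfers run in parallel inside libcurl.
$jobs = [1, 2, 3];
$mh = curl_multi_init();
$handles = [];

foreach ($jobs as $jobId) {
    $ch = curl_init('http://127.0.0.1/worker.php?job=' . $jobId);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$jobId] = $ch;
}

do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // sleep until there is socket activity
    }
} while ($running && $status === CURLM_OK);

foreach ($handles as $jobId => $ch) {
    $result = curl_multi_getcontent($ch); // worker output, if any
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```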
But I think those two approaches add quite a lot of overhead for batch processing on a queue that should mainly run in the background.
I also read about pcntl_fork, which also seems to be a way to handle the problem. But that looks like it can get really messy if you don't know exactly what you are doing (like me at the moment).
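For what it's worth, the basic pcntl_fork pattern looks like this (assuming the pcntl extension is loaded and the script runs from the CLI; processJob() is a hypothetical placeholder):

```php
<?php
// Fork one child per job; the parent reaps the children to avoid zombies.
$jobs = [1, 2, 3];

foreach ($jobs as $jobId) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("could not fork\n");
    } elseif ($pid === 0) {
        // Child process: do the work, then exit so it never re-enters the loop.
        processJob($jobId);
        exit(0);
    }
    // Parent process: keep looping and fork the next child.
}

// Parent: wait for all children to finish.
while (pcntl_wait($status) !== -1) {
    // optionally inspect pcntl_wexitstatus($status) here
}

function processJob(int $jobId): void
{
    echo "processing job $jobId in pid " . getmypid() . "\n";
    sleep(1); // stand-in for real work
}
```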
I also had a look at Gearman, but there I would also need to spawn the worker threads dynamically as needed, rather than just starting a few and letting the Gearman job server send jobs to the free workers. In particular, the workers should exit cleanly after executing one job, to avoid eventual memory leaks (the code may not be perfect in that respect).
Gearman Getting Started
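For reference, a Gearman worker that handles exactly one job per process (and relies on a supervisor such as cron or supervisord to start new instances) could look roughly like this, assuming the PECL gearman extension and an illustrative task name of "process_job":

```php
<?php
// One-shot Gearman worker: take a single job, finish it, then exit so all
// memory is released. Whatever starts the workers is responsible for
// spawning replacements.
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730); // default gearmand host/port

$worker->addFunction('process_job', function (GearmanJob $job) {
    $payload = $job->workload();
    // ... do the actual work with $payload here ...
    return 'done'; // returned to the client as the job result
});

// work() blocks until one job has been assigned and processed.
$worker->work();
```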
So my question is: how do you handle parallel processing in PHP? Why did you choose your method, and what advantages/disadvantages do the different methods have?
I use exec(). It's easy and clean. You basically need to build a thread manager and thread scripts that will do what you need.
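A rough sketch of what I mean (worker.php and the job IDs are placeholders; a Unix-like shell is assumed):

```php
<?php
// manager.php -- launch one detached worker process per job; the OS runs
// them in parallel. Each worker.php processes a single job and exits, so
// memory is freed after every job.
$jobs = [101, 102, 103];

foreach ($jobs as $jobId) {
    // Redirecting output and appending "&" detaches the worker, so exec()
    // returns immediately instead of waiting for the script to finish.
    exec(sprintf('php worker.php %d > /dev/null 2>&1 &', $jobId));
}
```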
I don't like fsockopen() because it opens a web server connection for each job; those connections build up and may hit Apache's connection limit.
I don't like the curl functions for the same reason.
I don't like pcntl because it needs the pcntl extension to be available, and you have to keep track of parent/child relations.
Never played with Gearman...