Limiting certain processes to CPU % - Linux

asparagus · Dec 22, 2008

I have the following problem: some processes, generated dynamically, have a tendency to eat 100% of CPU. I would like to limit all processes matching some criterion (e.g. process name) to a certain CPU percentage.

The specific problem I'm trying to solve is reining in folding@home worker processes. The best solution I could think of is a Perl script that's executed periodically and uses the cpulimit utility to throttle the processes (if you're interested in more details, check this blog post). It works, but it's a hack :/
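
In case it helps anyone, this is roughly what the wrapper boils down to as a one-shot shell pass run from cron (a sketch only: the FahCore process name pattern and the 25% cap are placeholders, and it assumes the cpulimit binary is on the PATH):

    #!/bin/bash
    # Attach a cpulimit watcher to every matching worker process
    # that doesn't already have one. Run this periodically from cron.
    for pid in $(pgrep -f 'FahCore'); do
        if ! pgrep -f "cpulimit.*--pid $pid" >/dev/null; then
            cpulimit --pid "$pid" --limit 25 -z &   # -z: exit when the worker dies
        fi
    done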

Any ideas? I would like to leave the handling of processes to the OS :)


Thanks again for the suggestions, but we're still missing the point :)

The "slowDown" solution is essentially what the "cpulimit" utility does. I still have to take care about what processes to slow down, kill the "slowDown" process once the worker process is finished and start new ones for new worker processes. It's precisely what I did with the Perl script and a cron job.

The main problem is that I don't know beforehand what processes to limit. They are generated dynamically.

Maybe there's a way to limit all of one user's processes to a certain CPU percentage? I already set up a user for executing the folding@home jobs, hoping that I could restrict that account via the /etc/security/limits.conf file. But the nearest I could get there is a limit on total CPU time per user...
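
For the record, the nearest thing in limits.conf is a hard RLIMIT_CPU cap, which is total CPU time in minutes (and applied per process) rather than a percentage; the fah user name and the 60-minute value here are just placeholders:

    # /etc/security/limits.conf
    # <domain>  <type>  <item>  <value>
    fah         hard    cpu     60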

It would be cool to have something that lets you say: "The sum of the CPU % usage of this user's processes cannot exceed 50%." The processes could then fight over that 50% of CPU according to their priorities...


Guys, thanks for your suggestions, but it's not about priorities - I want to limit the CPU % even when there's plenty of CPU time available. The processes are already low priority, so they don't cause any performance issues.

I would just like to prevent the CPU from running at 100% for extended periods...

Answer

Datageek · Dec 14, 2011

I had a slightly similar issue with gzip.

Assuming we want to decrease the CPU usage of a gzip process:

    gzip backup.tar & sleep 2 && cpulimit --limit 10 -e gzip -z

Options:

  • I found the sleep useful, as cpulimit sometimes didn't pick up the new gzip process immediately (note the &&, so cpulimit only starts after the sleep finishes)
  • --limit 10 limits gzip's CPU usage to 10%
  • -z makes cpulimit exit automatically once the gzip process finishes
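
If you'd rather not guess at the timing with sleep, cpulimit can also target a PID directly with -p/--pid, so you can hand it the backgrounded gzip's PID via the shell's $! (a sketch, same 10% cap assumed):

    gzip backup.tar & cpulimit --limit 10 --pid $! -z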

Another option is to run the cpulimit daemon.
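
As far as I can tell, without -z cpulimit keeps waiting for a process with the given name, so a minimal "daemon" can be as simple as leaving one instance running in the background (a sketch; note it throttles one matching process at a time, and the 10% figure is just an example):

    # start once, e.g. from rc.local; cpulimit keeps looking for a
    # gzip process and throttles whichever one it finds to 10%
    cpulimit --exe gzip --limit 10 &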