
Is there any limit on parallel execution? If yes, how do I find the maximum limit?

I am writing a script that builds a string of commands joined by '&' and uses eval to execute them all at once. Something like this:

scriptBuilder="ksh -x script1.sh & ksh -x script2.sh & ksh -x script3.sh";
eval $scriptBuilder;

I just want to know the maximum limit for parallel execution on the server.

    how to find out the maximum limit? -> you could start with a fork bomb. Just remember to save any work you have open first ;) The limit is really just the limit on overall resources (processing power, memory). – goldilocks Nov 12 '14 at 21:06
    Are you just looking for the max number of processes you can spawn, or the max that can actually run concurrently? If the latter, it will be a function of how many processor cores you have and how CPU bound your process are. You'll have to benchmark that, we cannot tell you. – casey Nov 12 '14 at 21:23
  • @casey max number of processes I can spawn? Just want to know if there is any limit on that. – Sas Nov 12 '14 at 22:44

1 Answer


The command ulimit -u shows the maximum number of processes that you can start. However, do not actually start that many processes in the background: your machine would spend time switching between processes and wouldn't get around to getting actual work done.
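For example, a quick check of that limit (the exact number varies by system and by user, and some systems report "unlimited"):

```shell
# Print the per-user limit on the number of processes.
# ulimit is a shell builtin, so the value applies to the current session.
max_procs=$(ulimit -u)
echo "This user may start at most $max_procs processes"
```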

For CPU-bound tasks, run as many tasks as there are cores on your machine, or one more. That assumes there's enough RAM to accommodate all these processes and their file cache: if the parallel processes compete for I/O bandwidth to keep reloading their data, they'll run slower than running them sequentially. You can find the number of cores in /proc/cpuinfo.
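A sketch of counting the cores: on Linux, each core shows up as a "processor" line in /proc/cpuinfo (where coreutils is available, `nproc` is a shorter alternative):

```shell
# Count the "processor" entries in /proc/cpuinfo (Linux-specific).
cores=$(grep -c '^processor' /proc/cpuinfo)
echo "Detected $cores cores"
```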

The easy way to run one task per processor is to use GNU parallel.
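If installing GNU parallel is not an option, a plain POSIX-shell batching loop gets close: start at most $jobs background jobs, wait for the batch to finish, then start the next. The sleep commands below are stand-ins for the real ksh -x script1.sh, script2.sh, and so on:

```shell
# Run at most $jobs commands concurrently using only POSIX sh features.
jobs=2                                # cap; set this to your core count
set -- 'sleep 1' 'sleep 1' 'sleep 1'  # stand-ins for ksh -x scriptN.sh
while [ "$#" -gt 0 ]; do
    i=0
    # Launch up to $jobs commands in the background.
    while [ "$i" -lt "$jobs" ] && [ "$#" -gt 0 ]; do
        sh -c "$1" &
        shift
        i=$((i + 1))
    done
    wait    # block until every job in this batch has exited
done
echo "all batches finished"
```

Note that this waits for the slowest job in each batch before refilling, whereas GNU parallel starts a new job as soon as any slot frees up.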

If the tasks are I/O-bound and use the same peripherals (e.g. they access files on the same disk), it's usually best to run them sequentially.