I have a group of jobs to run (a few hundred of them) with a variety of command names and parameters. Each one runs anywhere from a few minutes to a few hours, and I don't really know in advance which ones will take longer than others.
They are mostly CPU-bound, but some do significant I/O, so I want to run four at a time (one per CPU core).
Is there already a simple command to do this? Something like runbatch -n 4 filename that would read the file and keep n jobs running until the list is done. Since some of the command lines contain special characters and spaces in their parameters, getting everything escaped properly when using xargs -P to run them under bash -c seems a little fragile.
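For what it's worth, the closest I've come up with leans on GNU xargs's -d option to sidestep most of the quoting (jobs.txt is just a made-up name for the file of command lines), but I'm not sure it covers every case:

    # jobs.txt: one complete command line per line.
    # -d '\n' stops xargs doing its own quote/whitespace splitting, so each
    # whole line is handed to "sh -c" as a single argument; -P 4 keeps
    # four jobs running at once.
    xargs -a jobs.txt -d '\n' -n 1 -P 4 sh -c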
I've thought about just generating a makefile and using make -j, or writing a simple Perl job scheduler that runs jobs in the background and dispatches a new one as each finishes, but I wanted to see whether a more elegant solution already exists.
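If I do go the makefile route, I'm picturing something roughly like this (assuming I first dump each command line into its own little script, job-001.sh, job-002.sh, and so on), which I'd then run with make -j4:

    # Sketch of a generated makefile: one stamp-file target per job script.
    JOBS := $(wildcard job-*.sh)
    DONE := $(JOBS:.sh=.done)

    all: $(DONE)

    %.done: %.sh
    	sh $< && touch $@

That still feels like more plumbing than the problem deserves, though.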