It is possible to run jobs in parallel in the background. I was recently working on similar scripts that I needed to execute in parallel, and the link referenced in this answer led me to the explanation below.
GNU Parallel can work as a counting semaphore. This is slower and less efficient than its normal mode. An alias for parallel --semaphore is sem. The default is to allow only one program to run at a time (technically called a mutex). The program is started in the background. Use --wait for all 'sem's to finish:
sem 'sleep 1; echo The first finished' &&
echo The first is now running in the background &&
sem 'sleep 1; echo The second finished' &&
echo The second is now running in the background
sem --wait
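Since sem is a counting semaphore, the limit can be raised above the default mutex of one with the standard GNU Parallel -j jobs switch. A minimal sketch, assuming a hypothetical urls.txt with one URL per line and a cap of three concurrent downloads:

# Queue each download; sem only starts a new wget when fewer than 3 are running
while read -r url; do
  sem -j 3 wget -q "$url"
done < urls.txt
sem --wait   # block until every queued download has finished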
If you look through the examples at the link I provided, you should find exactly what you are looking for.
Comments on that thread:

wget's in parallel will not work; all you'll be doing is dividing up the fixed amount of bandwidth over many connections vs. 1 or 2. Rather than bother with parallel here, you should just run wget backgrounded directly. Running 1-3 will likely saturate your network connection. – slm Apr 10 '14 at 15:12

axel might be a better fit here too if you're downloading a file but want to download it over multiple connections. – slm Apr 10 '14 at 15:17

…wget. I wish I could write some sort of implementation in C++ to do all the operations and have the program send the URLs to a fixed number of wgets. I've been reading a bit on Swift to do this, but I'm no master programmer, unfortunately. – Dominique Apr 10 '14 at 16:56
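As slm suggests, you do not need parallel at all just to background wget. A hedged sketch in plain bash (urls.txt is again a hypothetical input file; the cap of 3 follows slm's saturation estimate, and wait -n needs bash 4.3 or later):

#!/bin/bash
while read -r url; do
  wget -q "$url" &                        # start the download in the background
  while [ "$(jobs -rp | wc -l)" -ge 3 ]; do
    wait -n                               # all slots full: wait for any one job to exit
  done
done < urls.txt
wait                                      # block until the remaining downloads finish

If the goal is one large file rather than many files, axel -n 3 URL splits that single download across three connections, which is what slm's second comment is pointing at.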