You could try installing GNU parallel; some GNU parallel examples are available here.
Testing
I installed gnu-parallel from source on my machine and got it to work. You can install it from source from here. I have a Red Hat system, so I downloaded the Fedora package and then ran ./configure, make, and make install to get parallel installed on my system.
Now, after the successful installation, I created a directory named checking and ran the command below.
seq 10 | parallel -n0 wget http://www.betaservice.domain.host.com/web/hasChanged?ver=0
As expected, the above command downloaded 10 copies of the web page for me. You can set whatever number you wish with seq.
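To see why seq controls the job count: each line seq emits becomes one job for parallel, so counting the lines tells you how many times the command will run. A minimal check (echo stands in for the wget command so this runs without network access):

```shell
# Each line from seq becomes one job; counting lines shows the job count.
count=$(seq 10 | wc -l)
echo "jobs that would be launched: $count"
```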
For more information on how to run the same command in parallel, you can check the examples provided by gnu-parallel here. From the example page:
If you want to run the same command with the same arguments 10 times
in parallel you can do:
seq 10 | parallel -n0 my_command my_args
EDIT
Now, to take real advantage of parallel execution, you could use the command as:
seq 70 | parallel -j70 wget http://www.betaservice.domain.host.com/web/hasChanged?ver=0
The -j option specifies how many jobs to run in parallel. By default, parallel runs one job per CPU core, so -j70 allows all 70 downloads to run simultaneously rather than being limited by your core count.
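If GNU parallel is not available, a roughly similar effect can be had with xargs -P, a different (widely available) tool not mentioned in the original answer. This is a hedged sketch; echo stands in for the wget command so it runs without network access:

```shell
# Sketch using xargs -P instead of GNU parallel (an alternative tool):
# -P 4 runs up to 4 jobs concurrently; -I{} substitutes each input line.
seq 4 | xargs -P 4 -I{} echo "downloading copy {}"
```

Unlike parallel -j, xargs -P does not reorder or buffer output per job, so interleaved output is possible with real concurrent commands.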
Comments:

david (Sep 08 '14 at 23:09): I get pgrep: invalid option -- 'c'. I am not sure why; can you think of what might be the reason?

terdon (Sep 08 '14 at 23:13): That depends on your pgrep implementation. Is this Linux? Which one? Anyway, you can just change that line to if [ $(pgrep wget | wc -l) -lt 70 ]; then
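The replacement terdon suggests works because pgrep with no options prints one matching PID per line, and wc -l counts those lines, which is exactly what pgrep -c does on implementations that support -c. A hypothetical throttle loop built on that idiom (names and the 70-job cap are assumptions for illustration; echo stands in for wget so the sketch runs without network access):

```shell
# Hypothetical sketch: only start a new job while fewer than MAX
# wget processes are running. `pgrep wget | wc -l` is the portable
# substitute for `pgrep -c wget`: list matching PIDs, count the lines.
MAX=70
URL='http://www.betaservice.domain.host.com/web/hasChanged?ver=0'
for i in $(seq "$MAX"); do
    if [ "$(pgrep wget | wc -l)" -lt "$MAX" ]; then
        echo "would run: wget $URL"   # replace echo with wget for real use
    fi
done
```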