I have a file containing the URLs of files to download.
For example:
https://url-of/file1.zip
https://url-of/file2.zip
https://url-of/file3.zip
...
The command I am currently using to download a file is:
wget --continue --tries=0 https://url-of/file.zip
Now I need a bash script that will read the URLs from the file and download them, two at a time.
The script I have come up with so far is:
#!/bin/sh
cat file.txt |while read url
do
wget --continue --tries=0 "$url"
done
But it downloads only a single file at a time. How can I edit it so that it downloads two files at a time?
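For reference, one way to do this with plain bash job control is to background each wget and block whenever two jobs are already running. A minimal sketch, assuming the same file.txt layout as above; echo stands in for the real wget (and the results go to a demo log file, downloads.log, a name made up here) so the sketch is side-effect free:

```shell
#!/bin/bash
# Demo input: the URLs from the question, one per line.
printf '%s\n' https://url-of/file1.zip https://url-of/file2.zip https://url-of/file3.zip > file.txt
: > downloads.log

# Keep at most two download jobs running at once (needs bash >= 4.3 for 'wait -n').
# 'echo ... >> downloads.log' stands in for the real wget; for a real run use:
#   wget --continue --tries=0 "$url" &
while IFS= read -r url; do
  echo wget --continue --tries=0 "$url" >> downloads.log &
  # If two jobs are already running, block until one of them finishes.
  while (( $(jobs -rp | wc -l) >= 2 )); do
    wait -n
  done
done < file.txt
wait   # let the last jobs finish
```

This avoids any extra tools, at the cost of starting replacements only when a slot frees up; xargs -P (below in the comments) does the same bookkeeping for you.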
Comments:

Use xargs -P or GNU parallel for this: xargs -a file.txt -n1 -P 2 wget --continue --tries=0 – muru Sep 11 '23 at 16:08

There is no need for cat in your loop. Just feed the file directly into the loop: while IFS= read -r url; do wget ...; done < file.txt – Chris Davies Sep 11 '23 at 16:13

Could you show what the GNU parallel command would look like? – Ahmad Ismail Sep 12 '23 at 11:42

I tried cat batch-file.txt | parallel -j2 wget --continue --tries=0 --timeout=60 --waitretry=60 --quiet --show-progress {}. However, it does not show any output. I would like to see the progress bar for both. – Ahmad Ismail Sep 12 '23 at 11:50

You may have parallel from moreutils (which has a different syntax than GNU parallel). The example command I gave for xargs should work with GNU xargs. A progress bar for both downloads simultaneously is going to be tricky, because you will get overlapping output. What I'd suggest is doing something like xargs -a file.txt -n1 -P 2 --process-slot-var INDEX sh -c 'exec wget --continue --tries=0 "$1" 2>>"wget-$INDEX.log"' _ and running tail -f wget-1.log and tail -f wget-2.log in separate terminals. – muru Sep 12 '23 at 12:58

I changed the command to cat batch-file.txt | parallel -j2 --lb wget --continue --tries=0 --timeout=60 --waitretry=60 --quiet --show-progress {}. After adding --lb it now shows output. – Ahmad Ismail Sep 12 '23 at 13:06
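muru's per-slot logging trick can be tried without downloading anything. In this sketch echo replaces wget, and the file names urls-demo.txt and demo-wget-N.log are made up for illustration; --process-slot-var makes GNU xargs export a slot index (0 or 1 with -P 2) into each child's environment, so every slot writes to its own log:

```shell
#!/bin/sh
# Demo input: the URLs from the question.
printf '%s\n' https://url-of/file1.zip https://url-of/file2.zip https://url-of/file3.zip > urls-demo.txt
rm -f demo-wget-*.log

# GNU xargs sets $INDEX to the slot number (0 or 1 when -P 2) for each child,
# so the two parallel slots append to separate log files. 'echo' stands in for
# the real wget; with wget you would redirect stderr instead: 2>>"demo-wget-$INDEX.log"
xargs -a urls-demo.txt -n1 -P 2 --process-slot-var=INDEX \
  sh -c 'echo wget --continue --tries=0 "$1" >>"demo-wget-$INDEX.log"' _
```

Note that the slot indices start at 0, so with -P 2 the logs to follow with tail -f are wget-0.log and wget-1.log.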