I have a file containing URLs of files to download.
For example:
```
https://url-of/file1.zip
https://url-of/file2.zip
https://url-of/file3.zip
...
```

The command I am currently using to download files is:

```
wget --continue --tries=0 https://url-of/file.zip
```

But now I need a bash script which will read the URLs from the file and download them, two at a time.
The script I have come up with so far is:
```sh
#!/bin/sh
cat file.txt | while read url
do
    wget --continue --tries=0 "$url"
done
```

But it downloads only one file at a time. How can I edit it so that it downloads two files at a time?
Use `xargs -P` or GNU `parallel` for this:

```
xargs -a file.txt -n1 -P 2 wget --continue --tries=0
```

Also, you don't need `cat` in your loop; just feed the file directly into it:

```sh
while IFS= read -r url
do
    wget --continue --tries=0 "$url"
done < file.txt
```

Comment: the GNU parallel command would look like

```
cat batch-file.txt | parallel -j2 wget --continue --tries=0 --timeout=60 --waitretry=60 --quiet --show-progress {}
```

However, it does not show any output. I would like to see the progress bar for both.

Reply: you may have `parallel` from `moreutils`, which has a different syntax than GNU parallel. The example command I gave for `xargs` should work with GNU xargs. Showing a progress bar for both downloads simultaneously is going to be tricky, because you will get overlapping output. What I'd suggest is doing something like:

```
xargs -a file.txt -n1 -P 2 --process-slot-var INDEX sh -c 'exec wget --continue --tries=0 "$1" 2>>"wget-$INDEX.log"' _
```

and running `tail -f wget-0.log` and `tail -f wget-1.log` in separate terminals.
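If you want to see the `-P` parallelism in action without touching the network, here is a minimal sketch. The file path `/tmp/urls-demo.txt` and the placeholder words are made up for the demo, and `printf` stands in for `wget`, so it is safe to run anywhere with GNU xargs:

```shell
#!/bin/sh
# A list of placeholder "URLs" (plain words; nothing is downloaded in this demo).
printf '%s\n' one two three four > /tmp/urls-demo.txt

# -a FILE : read arguments from FILE instead of stdin (GNU xargs)
# -n1     : pass one argument per command invocation
# -P 2    : keep at most two invocations running at once
# Output order may vary, since two invocations run concurrently.
xargs -a /tmp/urls-demo.txt -n1 -P 2 printf 'got %s\n'
```

Swapping `printf 'got %s\n'` for `wget --continue --tries=0` and pointing `-a` at your real URL list gives the actual download behavior.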