
I have a file which has URLs of files to download.

For example:

```
https://url-of/file1.zip
https://url-of/file2.zip
https://url-of/file3.zip
...
```

The command I am currently using to download a file is:

```shell
wget --continue --tries=0 https://url-of/file.zip
```

But now I need a bash script which will read URLs from the file and download them, two at a time.

The script I came up with so far is:

```shell
#!/bin/sh
cat file.txt | while read url
do
    wget --continue --tries=0 "$url"
done
```

But it downloads only a single file at a time.

How can I edit it so that it downloads two files at a time?

  • Don't reinvent the wheel - use xargs -P or GNU parallel for this: xargs -a file.txt -n1 -P 2 wget --continue --tries=0 Commented Sep 11, 2023 at 16:08
  • Just for future reference, you don't need cat in your loop. Just feed the file directly into the loop: while IFS= read -r url; do wget ...; done < file.txt Commented Sep 11, 2023 at 16:13
  • @muru I am currently using your solution. I am interested to know what the GNU parallel command would look like. Commented Sep 12, 2023 at 11:42
  • @muru I am assuming it is something like cat batch-file.txt | parallel -j2 wget --continue --tries=0 --timeout=60 --waitretry=60 --quiet --show-progress {}. However, it does not show any output. I would like to see the progress bar for both. Commented Sep 12, 2023 at 11:50
  • 1
    If it doesn't show any output at all, check if you have parallel from moreutils (which has a different syntax than GNU parallel). The example command I gave for xargs should work with GNU xargs. Progress bar for both simultaneously is going to be tricky, because you will get overlapping output. What I'd suggest is doing something like: xargs -a file.txt -n1 -P 2 --process-slot-var INDEX sh -c 'exec wget --continue --tries=0 "$1" 2>>"wget-$INDEX.log" _ and running tail -f wget-1.log, tail -f wget-2.log in separate terminals. Commented Sep 12, 2023 at 12:58
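The xargs approach from the comments can be sketched as follows. This assumes GNU xargs (for the -a option), and printf stands in for wget here so the sketch runs without touching the network; file.txt and its URLs are the made-up examples from the question.

```shell
# Create a sample URL list (one URL per line), as in the question.
printf '%s\n' https://url-of/file1.zip \
              https://url-of/file2.zip \
              https://url-of/file3.zip > file.txt

# -a file.txt : read arguments from the file instead of stdin
# -n1         : pass one URL to each command invocation
# -P 2        : run at most two invocations at a time
# printf is a stand-in for: wget --continue --tries=0
xargs -a file.txt -n1 -P 2 printf 'downloading %s\n'
```

With -P 2, two downloads run concurrently and xargs starts the next one as soon as a slot frees up, so the output order is not guaranteed.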

1 Answer


You can put multiple read commands in the while loop. For "two at a time", put them both in the background and then wait for them to complete:

```shell
while read -r first; do
    wget "$first" &
    # guard the second read so a file with an odd number of
    # URLs doesn't silently drop the last one
    if read -r second; then
        wget "$second" &
    fi
    wait
done < input.file
```

But @muru's comment is preferable.
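To see the pairing behaviour of this loop without hitting the network, it can be run with echo standing in for wget; urls.txt and its contents are made-up sample data:

```shell
# Sample input: three URLs, one per line (an odd count on purpose).
printf '%s\n' url1 url2 url3 > urls.txt

while read -r first; do
    echo "fetching $first" &          # stand-in for wget "$first"
    if read -r second; then
        echo "fetching $second" &     # stand-in for wget "$second"
    fi
    wait    # block until this pair finishes before starting the next
done < urls.txt
```

Note that wait makes each pair finish before the next pair starts, so one slow download holds up its partner's slot; the xargs -P approach from the comments refills slots independently and keeps two transfers going at all times.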
