I'm downloading files in parallel from a remote server with lftp, using these commands:
```
lftp ftp://<url>
cd <directory>
mirror --parallel=10
```

I'm downloading a total of 365 files, spread across 12 directories, and each file is a little over a gigabyte. With this command the whole job takes a few hours, which isn't a deal breaker, but I figured the more files I download in parallel, the faster it would go. Obviously there has to be a point of diminishing returns; I expect downloading all 365 files at once would overload the system. And indeed, as I've increased the number of parallel downloads, I increasingly get messages like `waiting to reconnect... trying in 30s`, which I take to mean I'm overtaxing the server.
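For reference, here is a non-interactive sketch of what I could run instead, based on my reading of the lftp man page (the host and directory paths are placeholders, and the specific numbers are guesses on my part, not tested values):

```sh
#!/bin/sh
# Sketch only: example.com and the directory paths are placeholders.
lftp ftp://example.com <<'EOF'
# Cap total concurrent connections so the server stops forcing reconnects
set net:connection-limit 12
# Retry sooner than the default 30-second back-off
set net:reconnect-interval-base 10
# 6 files at a time, each fetched in 2 segments; -c resumes partial transfers
mirror --parallel=6 --use-pget-n=2 -c /remote/dir /local/dir
quit
EOF
```

My thinking is that the total connection count is roughly parallel transfers × pget segments (6 × 2 = 12 here), so keeping that product at or below `net:connection-limit` should avoid the reconnect storms, but I don't know what a sensible product is for gigabyte-sized files.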
Does anyone have thoughts on an efficient way to download files in parallel? Thanks in advance.