Here's my take on it. By avoiding several commands you should see a minor improvement in speed, though it might not be noticeable. I also added error checking, which can save you time on broken URLs.
```bash
# file that contains youtube links
FILE="/srv/backup/temp/youtube.txt"

while read URL ; do
    [ -z "$URL" ] && continue
    # get video name
    if NAME=$(youtube-dl --get-filename -o "%(title)s.%(ext)s" "$URL" --restrict-filenames) ; then
        # get the real video url
        if vURL=$(youtube-dl --get-url "$URL") ; then
            # download the file
            axel -n 10 -o "$NAME" "$vURL" &
        else
            echo "Could not get vURL from $URL"
        fi
    else
        echo "Could not get NAME from $URL"
    fi
done < "$FILE"
```
By request, here's my proposal for parallelizing the vURL and NAME fetching as well as the download. Note: since the download depends on both vURL and NAME, there is no point in creating three processes; two gives you about the best return. Below I've put the NAME fetch in its own process, but if it turned out that vURL was consistently faster, there might be a small payoff in swapping it with the NAME fetch. (That way the while loop in the download process won't waste even a second sleeping.) Note 2: this is fairly crude and untested; it's just off the cuff and probably needs work. And there's probably a much cooler way in any case. Be afraid...
```bash
#!/bin/bash
# file that contains youtube links
FILE="/srv/backup/temp/youtube.txt"

GetName () {  # args: URL, filename
    if NAME=$(youtube-dl --get-filename -o "%(title)s.%(ext)s" "$1" --restrict-filenames) ; then
        # Create a sourceable file holding the NAME value
        echo "NAME='$NAME'" > "$2"
    else
        echo "Could not get NAME from $1"
    fi
}

Download () {  # args: URL, filename
    if vURL=$(youtube-dl --get-url "$1") ; then
        # Wait to see if GetName's file appears
        timeout=300  # Wait up to 5 minutes, adjust this if needed
        while (( timeout-- )) ; do
            if [ -f "$2" ] ; then
                source "$2"
                rm "$2"
                # download the file
                if axel -n 10 -o "$NAME" "$vURL" ; then
                    echo "Download of $NAME from $1 finished"
                    return 0
                else
                    echo "Download of $NAME from $1 failed"
                    return 1
                fi
            fi
            sleep 1
        done
        echo "Download timed out waiting for file $2"
    else
        echo "Could not get vURL from $1"
    fi
    return 1
}

filebase="tempfile${$}_"
filecount=0
while read URL ; do
    [ -z "$URL" ] && continue
    filename="$filebase$filecount"
    [ -f "$filename" ] && rm "$filename"  # Just in case
    (( filecount++ ))
    ( GetName "$URL" "$filename" ) &
    ( Download "$URL" "$filename" ) &
done < "$FILE"
```
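If you want to see the sourceable-file handshake between the two processes in isolation, here is a minimal sketch with stub commands standing in for youtube-dl and axel (the file name and `demo_video.mp4` value are made up for the demo):

```shell
#!/bin/bash
# Minimal sketch of the handshake: a producer writes a sourceable
# file with a NAME value; a consumer polls for it, sources it, and
# removes it. Stubs replace the real youtube-dl/axel calls.

handshake_file="/tmp/tempfile${$}_demo"

# Producer: simulates GetName finishing after a short delay.
( sleep 0.2; echo "NAME='demo_video.mp4'" > "$handshake_file" ) &

# Consumer: simulates Download polling for the file.
timeout=10
while (( timeout-- )) ; do
    if [ -f "$handshake_file" ] ; then
        source "$handshake_file"
        rm "$handshake_file"
        echo "Got NAME=$NAME"
        break
    fi
    sleep 1
done
wait
```

Running it prints the NAME picked up from the file; in the real script that value is what axel uses for `-o`.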