
I am using Linux Mint 20.

I am using a vpn with a kill switch (protonvpn-cli ks --on).

So, if the VPN connection drops for some reason, the network gets disconnected.

When the network gets disconnected, my youtube-dl download stops permanently with the error:

ERROR: Unable to download JSON metadata: <urlopen error [Errno -2] Name or service not known> (caused by URLError(gaierror(-2, 'Name or service not known')))

The issue is that I want youtube-dl to pause instead of exiting, and resume when the connection is back.

I checked "Retry when connection disconnect not working", but I do not think it is relevant to my problem.

My config file looks like

--abort-on-error
--no-warnings
--console-title
--batch-file='batch-file.txt'
--socket-timeout 10
--retries 10
--continue
--fragment-retries 10 

As I use batch files, I do not want to restart the process from the beginning. I just want to pause the youtube-dl process until I am connected again, and then continue.

How can I do that?

Update 1:

So far, what I have found is, to pause a process we can do something like:

$ kill -STOP 16143

To resume a process we can do something like:

$ kill -CONT 16143
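The two signals above can be wrapped in small helpers that target the process by name instead of a hard-coded PID (a sketch; the function names are mine, and `pkill -x` matches the exact process name):

```shell
#!/bin/bash
# Sketch: wrap the STOP/CONT signals above so they target youtube-dl
# by name instead of a hard-coded PID. The helper names are my own;
# pkill -x signals every process whose name matches exactly.

pause_by_name() {
  # SIGSTOP suspends every matching process; it cannot be caught or ignored.
  pkill -STOP -x "$1"
}

resume_by_name() {
  # SIGCONT wakes processes previously stopped with SIGSTOP.
  pkill -CONT -x "$1"
}
```

For example, `pause_by_name youtube-dl` while the VPN is down, then `resume_by_name youtube-dl` once it is back.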

I am not sure, but I think we can detect whether my network is up by pinging:

#!/bin/bash
HOSTS="cyberciti.biz theos.in router"

COUNT=4

for myHost in $HOSTS
do
  count=$(ping -c $COUNT $myHost | grep 'received' | awk -F',' '{ print $2 }' | awk '{ print $1 }')
  if [ $count -eq 0 ]; then
    # 100% failed
    echo "Host : $myHost is down (ping failed) at $(date)"
  fi
done

However, it does not seem like an efficient solution.

Linux: execute a command when network connection is restored suggested using ifplugd or using /etc/network/if-up.d/.

There is another question and a blog post which mention using /etc/NetworkManager/dispatcher.d.

As I am using Linux Mint, I think any solution revolving around NetworkManager will be easier for me.
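For instance, a dispatcher script could suspend youtube-dl on `vpn-down` and resume it on `vpn-up` (a sketch I have not tested against NetworkManager itself; the file name `90-youtube-dl` and the `dispatch` function are my own choices):

```shell
#!/bin/bash
# Sketch of /etc/NetworkManager/dispatcher.d/90-youtube-dl (the file name
# is my choice). NetworkManager calls each executable dispatcher script
# with two arguments: the interface ($1) and the event ($2), e.g. up,
# down, vpn-up, vpn-down. The script must be owned by root and executable.

dispatch() {
  case "$2" in
    down|vpn-down)
      # Suspend every youtube-dl process while the connection is gone.
      pkill -STOP -x youtube-dl
      ;;
    up|vpn-up)
      # Resume them once the connection (or VPN) is back.
      pkill -CONT -x youtube-dl
      ;;
  esac
}

dispatch "$1" "$2"
```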

  • I have been using --socket-timeout 3600 and my download did not stop for a long time now. However, the documentation is not clear and I am not sure what --socket-timeout actually does. I am not even sure whether (in the mean time) the vpn connection dropped and I got disconnected or not. – Ahmad Ismail Jan 05 '21 at 06:05

2 Answers


Here is a script I wrote that reads each line of batch-file.txt and runs youtube-dl on it.
If there is no connection to the web site you are trying to download from, it loops until the connection is restored (you should probably add a timeout, since otherwise it will never stop).
I used curl and check for a 200 status code, since if that fails you probably will not be able to download either.

The content of batch-file.txt is the same as before.

Running the script:

download.sh ./batch-file.txt {download_web_site_url}
# Example:
download.sh ./batch-file.txt google.com
#!/bin/bash
# B"H

# Instead of using youtube-dl's batch-file option, read the file with bash.
URLS=$1 # "batch-file.txt"

# Pass the URL of the site you are trying to download from.
SITE_URL=$2

# This function checks whether the URL returns a 200 status code.
check_site_connection() {
  # curl SITE_URL with -I so only the response headers are fetched;
  # pipe the response to awk with 'NR < 2' so only the first line,
  # the one with the status code, is printed.
  # Return "up" if the response is 200.
  if [[ $(curl -Is $SITE_URL | awk 'NR < 2 {print $2}') == "200" ]]; then
    echo up
  else
    echo down
  fi
}

# Read the batch file.
while IFS= read -r line
do
  # For each line in the batch file, loop until the connection
  # to the site is up, with a 2-second delay between checks.
  while [[ $(check_site_connection) == "down" ]]
  do
    echo "waiting for internet connection"
    sleep 2
  done
  # Run youtube-dl, where $line is the URL to download.
  ./youtube-dl "$line"
done < "$URLS"

Edit:

I just found this in the README.md and tested it; it works.
This is for the case where each line is a playlist rather than a separate video.

youtube-dl --download-archive archive.txt URL_TO_PLAYLIST

This downloads only new videos each time you run it. So if you add --download-archive archive.txt to the script above, i.e. ./youtube-dl --download-archive archive.txt "$line", then when the script starts again it will go over the whole playlist but will only start downloading from where it stopped.
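Combined with the loop above, the change is a single flag on the youtube-dl line (a sketch; `DL_CMD` is a stand-in variable of my own, defaulting to youtube-dl, so the loop can be exercised without youtube-dl installed):

```shell
#!/bin/bash
# Sketch: the download loop from the script above, with --download-archive
# added. DL_CMD is my stand-in for the youtube-dl binary so the loop
# itself can be tried with any command.
DL_CMD=${DL_CMD:-youtube-dl}

download_all() {
  # $1 = batch file; each line is a playlist or video URL.
  while IFS= read -r line
  do
    # --download-archive records finished video IDs in archive.txt, so a
    # rerun skips everything already downloaded instead of starting over.
    $DL_CMD --download-archive archive.txt "$line"
  done < "$1"
}
```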

Shmuel
  • Why can't you use pgrep youtube-dl with kill -STOP and kill -CONT? Why don't you use https://developer.gnome.org/NetworkManager/unstable/NetworkManager.html — it has vpn-up, vpn-down and /etc/NetworkManager/dispatcher.d. I think that would be a much more reasonable solution. – Ahmad Ismail Jan 24 '21 at 16:50
  • First: curl runs against the site without a username or password, just the main page or something like it, so your user should not be associated with anything, except maybe your IP. – Shmuel Jan 24 '21 at 16:54
  • Second: the curl command runs in a loop while the status code is not ok (200), and if that is the case, assuming there is no connection, the site is not getting any requests. – Shmuel Jan 24 '21 at 16:56
  • Also, you can use a different site for the up check, like https://google.com, if you are worried, but I think there is no problem using the download site's URL. – Shmuel Jan 24 '21 at 16:58
  • Please correct me if I am wrong. I am assuming youtube-dl "$line" will start the playlist over from the beginning if the network goes down. Suppose I am on the 55th video of the playlist and it gets disconnected. It will then start over from the beginning of that playlist and go through each video link until it quickly reaches the 56th. In that case, it will also block my account and trigger a captcha. – Ahmad Ismail Jan 24 '21 at 16:59

Please use the code below, YoutubeDown.bash:

#!/bin/bash
# $1 = playlist URL, $2 = mode (0 = resume, 1 = full run)
if [[ $2 == 0 ]]
then
  # Cache the playlist's video titles, one "index,title" pair per line.
  if [[ ! -a youtubePlayListNames ]]
  then
    youtube-dl --get-filename -o '%(title)s' $1 --restrict-filenames | awk 'BEGIN{OFS=",";}{print NR,$0}' > youtubePlayListNames
  fi
  # Find the first title that has not been downloaded yet.
  for q in $(cat youtubePlayListNames)
  do
    row=$(echo $q | awk -F',' '{print $1}')
    fName=$(echo "$q" | awk -F',' '{print $2}')
    [[ $(find . -name "*$fName*" | wc -l) -eq 1 ]] && continue || break
  done
  # Resume the download from that playlist position.
  youtube-dl --playlist-start $row -o '%(uploader)s/%(playlist)s/%(playlist_index)s - %(title)s.%(ext)s' $1 --restrict-filenames
elif [[ $2 == 1 ]]
then
  # Download (or verify) the whole playlist.
  youtube-dl -o '%(uploader)s/%(playlist)s/%(playlist_index)s - %(title)s.%(ext)s' $1 --restrict-filenames
fi

I used it like below:

  • The first time: YoutubeDown.bash https://youtubeURL 0
  • After that, when the download finished, I passed 1 as the second parameter to make sure everything was downloaded. I changed my code because the previous version unfortunately did not work; I tested this one and it works well.