youtube-dl download stops on playlist with video missing

I am using youtube-dl to download videos from a playlist for offline viewing. The operators of the playlist have started putting a scheduled video in the playlist, which causes the downloads to fail: when youtube-dl reaches the video that isn't available yet (the scheduled one), it errors out and the whole download aborts.
How can I have the playlist download continue when there is a missing video?
My command:
/share/Multimedia/temp/youtube-dl -f 'best[ext=mp4]' -o "/share/Multimedia/YouTube/TheNational/%(upload_date)s.%(title)s.%(ext)s" --restrict-filenames --dateafter today-3day --no-mtime --download-archive "/share/Multimedia/temp/dllist-thenational.txt" --playlist-end 10 https://www.youtube.com/playlist?list=PLvntPLkd9IMcbAHH-x19G85v_RE-ScYjk
The download results from today:
[youtube:playlist] PLvntPLkd9IMcbAHH-x19G85v_RE-ScYjk: Downloading webpage
[download] Downloading playlist: The National | Full Show | Live Streaming Nightly at 9PM ET
[youtube:playlist] playlist The National | Full Show | Live Streaming Nightly at 9PM ET: Downloading 10 videos
[download] Downloading video 1 of 10
[youtube] pZ2AG5roG-A: Downloading webpage
[youtube] pZ2AG5roG-A: Downloading video info webpage
ERROR: This video is unavailable.
I want the playlist download to ignore the missing video and continue to the next available one.
Thanks.

I would add these options before -f:
-i, --ignore-errors
Continue on download errors, for example to skip unavailable videos in a playlist
-c, --continue
Force resume of partially downloaded files. By default, youtube-dl will resume downloads if possible.
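For example, the full command from the question with both options added:
/share/Multimedia/temp/youtube-dl -i -c -f 'best[ext=mp4]' -o "/share/Multimedia/YouTube/TheNational/%(upload_date)s.%(title)s.%(ext)s" --restrict-filenames --dateafter today-3day --no-mtime --download-archive "/share/Multimedia/temp/dllist-thenational.txt" --playlist-end 10 https://www.youtube.com/playlist?list=PLvntPLkd9IMcbAHH-x19G85v_RE-ScYjk
With -i, youtube-dl logs the error for the unavailable video and moves on to the next entry instead of aborting.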

Related

Youtube-dl error messages & infinite looping

I've used youtube-dl successfully for quite a while, until recently. Now it seems to me to be useless for grabbing videos off of YouTube or Odysee, which are the only platforms I really use.
I have already tried uninstalling & reinstalling.
Here are the error messages; similar errors occur regardless of the way I format the command, or which video I'm attempting to grab.
Here is an example from Odysee:
will@will-Inspiron-15-7579:~$ youtube-dl https://odysee.com/@betterthanfood:4/brave-new-world-aldous-huxley-book:f
[generic] brave-new-world-aldous-huxley-book:f: Requesting header
WARNING: Falling back on generic information extractor.
[generic] brave-new-world-aldous-huxley-book:f: Downloading webpage
[generic] brave-new-world-aldous-huxley-book:f: Extracting information
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10?: Requesting header
[redirect] Following redirect to https://odysee.com/$/embed/brave-new-world-aldous-huxley-book/f0338bc5ed0263d3832d5288807d43fb222e0a10
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10: Requesting header
WARNING: Falling back on generic information extractor.
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10: Downloading webpage
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10: Extracting information
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10?: Requesting header
[redirect] Following redirect to https://odysee.com/$/embed/brave-new-world-aldous-huxley-book/f0338bc5ed0263d3832d5288807d43fb222e0a10
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10: Requesting header
WARNING: Falling back on generic information extractor.
[generic] f0338bc5ed0263d3832d5288807d43fb222e0a10: Downloading webpage
^C
ERROR: Interrupted by user
As you can see, this command triggered an infinite loop that would have continued indefinitely; I had to terminate the program manually with Ctrl+C.
Here's an example from YouTube, which has a different problem:
will@will-Inspiron-15-7579:~$ youtube-dl https://www.youtube.com/watch?v=U5afsxvz75c
[youtube] U5afsxvz75c: Downloading webpage
[youtube] U5afsxvz75c: Downloading video info webpage
WARNING: unable to download video info webpage: HTTP Error 404: Not Found
WARNING: unable to download video info webpage: HTTP Error 404: Not Found
WARNING: unable to download video info webpage: HTTP Error 404: Not Found
WARNING: unable to download video info webpage: HTTP Error 404: Not Found
WARNING: unable to download video info webpage: HTTP Error 404: Not Found
Traceback (most recent call last):
  File "/usr/bin/youtube-dl", line 6, in <module>
    youtube_dl.main()
  File "/usr/lib/python3/dist-packages/youtube_dl/__init__.py", line 476, in main
    _real_main(argv)
  File "/usr/lib/python3/dist-packages/youtube_dl/__init__.py", line 466, in _real_main
    retcode = ydl.download(all_urls)
  File "/usr/lib/python3/dist-packages/youtube_dl/YoutubeDL.py", line 1989, in download
    url, force_generic_extractor=self.params.get('force_generic_extractor', False))
  File "/usr/lib/python3/dist-packages/youtube_dl/YoutubeDL.py", line 785, in extract_info
    ie_result = ie.extract(url)
  File "/usr/lib/python3/dist-packages/youtube_dl/extractor/common.py", line 440, in extract
    ie_result = self._real_extract(url)
  File "/usr/lib/python3/dist-packages/youtube_dl/extractor/youtube.py", line 1607, in _real_extract
    token = video_info.get('token') or video_info.get('account_playback_token')
AttributeError: 'NoneType' object has no attribute 'get'
Again, I just uninstalled and reinstalled youtube-dl using apt, so this should be the latest version. Output of youtube-dl --version is 2018.03.14.
Thanks for any help you may be able to provide :)
You are using a very outdated version. First, uninstall the apt package (assuming it was installed as the distro's youtube-dl package):
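sudo apt-get remove youtube-dl
Then run: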
sudo curl -L https://yt-dl.org/downloads/latest/youtube-dl -o /usr/local/bin/youtube-dl
and
sudo chmod a+rx /usr/local/bin/youtube-dl
The latest version is now 2021.05.16 and works well with your URL.

Does youtube-dl still work (newest version youtube-dl-2020.2.16)?

Used command:
youtube-dl --max-filesize 30m -f m4a -o "/home/dc3014b3c6a1a23bba88b2a1fbcc1447.m4a" "https://www.youtube.com/watch?v=_Xa0ydtx8PM"
youtube-dl doesn't work at all for me. I get errors like these:
ERROR: Unable to download webpage: <urlopen error EOF occurred in violation of protocol (_ssl.c:618)> (caused by URLError(SSLEOFError(8, u'EOF occurred in violation of protocol (_ssl.c:618)'),))
OR
ERROR: Unable to download webpage: HTTP Error 429: Too Many Requests (caused by HTTPError()); please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
But when I use a curl command to fetch the URL content, it works fine:
curl -v https://www.youtube.com/watch?v=_Xa0ydtx8PM
How can I resolve it?
Judging from the error message, you should first make sure you are using the latest version of youtube-dl; you might want to update it. I am assuming you are using a *nix system. Depending on how you first installed it, there are several options for updating; here are a few:
For manual installations, you can simply run youtube-dl -U or, on Linux, sudo youtube-dl -U.
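If you installed it via pip instead, the youtube-dl README suggests the equivalent:
sudo pip install --upgrade youtube-dl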
If you are already running an updated version, you may want to consider the commands below as good ways to download videos. Note that with newer versions of youtube-dl, the best format is selected automatically, so you do not need to specify one, although you still can to be sure.
# Download best mp4 format available or any other best if no mp4 available
$ youtube-dl -f 'bestvideo[ext=mp4]+bestaudio[ext=m4a]/best[ext=mp4]/best'
# Download best format available but no better than 480p
$ youtube-dl -f 'bestvideo[height<=480]+bestaudio/best[height<=480]'
# Download best format available but no bigger than 50 MB
$ youtube-dl -f 'best[filesize<50M]'
# Download best format available via direct link over HTTP/HTTPS protocol
$ youtube-dl -f '(bestvideo+bestaudio/best)[protocol^=http]'
# Download the best video format and the best audio format without merging them
$ youtube-dl -f 'bestvideo,bestaudio' -o '%(title)s.f%(format_id)s.%(ext)s'
Here is a link for reference and further instructions/support.
I hope this helps. If not, let me know; glad to help further.
Just wanted to share a youtube-dl alternative (it is just a fork), yt-dlp, which for now (September 2022) works fine and supports almost all youtube-dl features.
I switched to this project and am still using the same scripts I had previously.
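If you want to try it, installation is a one-liner (assuming Python 3 and pip are available):
python3 -m pip install -U yt-dlp
The command-line interface is largely compatible, which is why existing youtube-dl scripts usually keep working with just the binary name swapped.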

How to get the uploader name of a downloaded video from youtube-dl?

I have downloaded some videos from YouTube using youtube-dl, from many different playlists. Now I want every video's title to include its uploader's or channel's name, without downloading all the videos again. Which command do I need? I am using Windows 10.
You can extract the uploader name from the JSON metadata that the -j option prints, e.g.:
youtube-dl.exe -j https://www.youtube.com/watch?v=YOUR-URL | python.exe -c "import sys, json; print(json.load(sys.stdin)['uploader'])"
The -j option doesn't download the video itself.
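For future downloads you can also bake the uploader into the filename directly via the output template, e.g.:
youtube-dl.exe -o "%(uploader)s - %(title)s.%(ext)s" https://www.youtube.com/watch?v=YOUR-URL
Adding --get-filename to that command prints the resulting filename without downloading anything, which is handy when renaming files you already have.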

How to get kaggle competition data via command line on virtual machine?

I am looking for the easiest way to download the Kaggle competition data (train and test) onto the virtual machine using bash, so I can train there without pushing the data to git.
Fast-forward three years, and you can use Kaggle's API via the CLI; for example:
kaggle competitions download favorita-grocery-sales-forecasting
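If the kaggle command isn't set up yet, the one-time setup looks like this (assuming pip is available and you have generated an API token from your Kaggle account page):
pip install kaggle
mkdir -p ~/.kaggle
# move the kaggle.json token downloaded from your account page into ~/.kaggle
mv /path/to/kaggle.json ~/.kaggle/
chmod 600 ~/.kaggle/kaggle.json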
First you need to copy your cookie information for the Kaggle site into a text file; there is a Chrome extension that will help you do this. Save the cookie information as cookies.txt.
Now transfer the file to the EC2 instance using the command
scp -i /path/my-key-pair.pem /path/cookies.txt user-name@ec2-xxx-xx-xxx-x.compute-1.amazonaws.com:~
Accept the competition rules and copy the URLs of the datasets you want to download from kaggle.com. For example, the URL to download the sample_submission.csv file of the Intel & MobileODT Cervical Cancer Screening competition is: https://kaggle.com/c/intel-mobileodt-cervical-cancer-screening/download/sample_submission.csv.zip
Now, from the terminal use the following command to download the dataset into the instance.
wget -x --load-cookies cookies.txt https://kaggle.com/c/intel-mobileodt-cervical-cancer-screening/download/sample_submission.csv.zip
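The file arrives as a zip archive, so unpack it afterwards (assuming unzip is installed):
unzip sample_submission.csv.zip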
Install the CurlWget Chrome extension.
Start downloading your Kaggle dataset; CurlWget will give you the full wget command. Paste this command into the terminal (with sudo if needed).
Job done.
Install the cookies.txt extension on Chrome and enable it.
Log in to Kaggle.
Go to the challenge page that you want the data from.
Click the cookies.txt extension at the top right; it will download the current page's cookies into a cookies.txt file.
Transfer the file to the remote machine using scp or another method.
Copy the data link shown on the Kaggle page (right-click and copy the link address).
Run wget -x --load-cookies cookies.txt <datalink>

Process stops when one URL in file causes error

I use youtube-dl -a filename to download the videos. However, when one URL in the list fails, the process exits. Is there a way to skip the failing URL and proceed with the remaining URLs?
The man page of youtube-dl says:
-i, --ignore-errors Continue on download errors, for example to skip unavailable
videos in a playlist
Thus:
youtube-dl -i -a filename
Edit: I strongly advise you to run
youtube-dl -U
prior to any download, as the world of online video changes fast and updates often fix download errors. Moreover, some errors are due to content restrictions and can be solved by giving the tool a login and password:
youtube-dl -u USERNAME -p PASSWORD
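Combined with the batch file from the question, a robust invocation would then look something like:
youtube-dl -i -u USERNAME -p PASSWORD -a filename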