How to embed auto-generated English subtitles with either youtube-dl or yt-dlp? - youtube-dl

The key here is to be able to embed the auto-generated subtitles.
The tool can download the auto-generated subtitles, but it can only embed a real (manually created) subtitle file, if one exists. I wondered if there would be a way to embed, or capture in the video, the auto-generated subs. Thanks!

Embed Subtitles Into the Video
--embed-subs: Embed subtitles in the video (only for mp4, webm and mkv videos)
When --embed-subs and --write-subs are used together, the subtitles are written to disk and also embedded in the media file. You can use just --embed-subs to embed the subs and automatically delete the separate file.
Don't forget to list the available subtitles first using --list-subs.
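Putting the flags together for the original question: yt-dlp (unlike older youtube-dl) can request the auto-generated captions with --write-auto-subs and embed them in the same run. A sketch; the URL is a placeholder:

```shell
# Fetch the video plus YouTube's auto-generated English captions
# and embed them as a soft subtitle track in the output file.
yt-dlp --write-auto-subs --sub-langs en --embed-subs "https://www.youtube.com/watch?v=VIDEO_ID"
```

With only --embed-subs given, no separate subtitle file is kept next to the video.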
You can find more options like these in the yt-dlp documentation.
Burn Subtitles To Video
Burning subtitles into the video is also easy. You can do it with the commonly used VLC Media Player, which can be downloaded from the VideoLAN site; there are short video tutorials that walk you through it in about four easy steps.
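If you prefer the command line, ffmpeg can also hard-burn a subtitle file into the frames. A minimal sketch, assuming input.mp4 and subs.srt already exist (burning requires re-encoding the video):

```shell
# Hard-burn (render) the subtitles into the video frames;
# the video is re-encoded, the audio is copied unchanged.
ffmpeg -i input.mp4 -vf "subtitles=subs.srt" -c:a copy output.mp4
```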

Related

Saving recorded audio as .wav via openAL

I am trying to record audio using C++ with OpenAL and save it as a .wav file. So far I have succeeded with the first part, but I can't find a way to save the audio to a file. I read the documentation and didn't find any way to do so. Am I missing something?
To save a *.wav file, you don't really need OpenAL. Look at the *.wav format specifications, which are all over the web. You just have to create a header that describes your data and then append your recorded samples.

Gstreamer subtitles while recording

I am new to GStreamer and am trying to encode a video stream (for now, v4l2src) together with a subtitle stream, muxed into an MPEG-TS container. I am able to use 'textoverlay' to set the data, but I don't want to burn the data into the image. Instead, I want to use the subtitle stream to encode 'metadata' that is generated while the video is being recorded.
Is there a way I can add subtitles into the MPEG TS as time passes? The content of the subtitle text is not known beforehand; for example, the GPS coordinates of a moving camera.
There is the 'subtitleoverlay' plugin, but I do not fully understand it. Does it burn the text into the image like 'textoverlay', or does it add a separate stream?
I think that subtitleoverlay renders and burns the text into the video frames. Check the example pipeline; there is no magic: after subtitleoverlay there is a videoconvert, which works on video frames.
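To illustrate the burn-in behaviour described above, a rough gst-launch sketch (the overlay text and the x264enc encoder choice are my assumptions; textoverlay renders the text into the frames before encoding, so it cannot be switched off on playback):

```shell
# textoverlay draws the text into the video frames themselves,
# so the result is a single video stream with the text baked in.
gst-launch-1.0 v4l2src ! videoconvert \
    ! textoverlay text="GPS: 51.50,-0.12" \
    ! videoconvert ! x264enc ! mpegtsmux ! filesink location=out.ts
```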
I guess you can just attach a subtitle stream to the mpegtsmux element. I hope this is possible now; there is a bug/feature request that, I hope, makes this possible.
I checked the capabilities of mpegtsmux and it supports:
subpicture/x-dvb
application/x-teletext
If you can somehow manage to feed in subtitles in the form of subpicture/x-dvb, then later, on the receiver, you can use the dvbsuboverlay element to display them.
I didn't find a way to actually create such a stream from a text file (I found this question, but no answer; maybe ask on IRC).
I have a feeling that teletext was capable of showing subtitles, but this may not be what you want (I don't know).
In both cases, I think that if you had a stream containing only the rendered subtitles (say, subtitles.mpg), you could use that; I guess there are some tools out there in the wild for that.
Hope you can use that somehow.

QMultimedia - Which Video/Audio encodings and containers are supported?

I'm trying to make a multimedia database system using Qt. I'm using QMultimedia to play back videos in a QVideoWidget.
The following is the code I am using to play a video file in a QVideoWidget:
mMediaPlayer = new QMediaPlayer();
mMediaPlaylist = new QMediaPlaylist();
// Queue the renamed video file (e.g. data/1.dat) on the playlist.
mMediaPlaylist->addMedia(QUrl::fromLocalFile(
    QDir(QString("data")).absoluteFilePath(QString("%1.dat").arg(mMedia.GetUID()))));
mMediaPlayer->setPlaylist(mMediaPlaylist);
// Route the player's video output into a widget embedded in the window.
mVideoWidget = new QVideoWidget();
mMediaPlayer->setVideoOutput(mVideoWidget);
this->setCentralWidget(mVideoWidget);
mVideoWidget->show();
mMediaPlayer->play();
Basically, it plays a file called 1.dat for example, which is just a renamed video file (video.mp4 for example). However, playing the video never works, and the following error is produced:
DirectShowPlayerService::doRender: Unresolved error code 80040266
With some Google searching, I found that this error occurs because QMultimedia doesn't have the required codecs/filters to play the video's format. I've tried converting my videos to many different formats using ffmpeg, trying the formats listed under Supported Formats in DirectShow and Supported Media Formats in Media Foundation. I've also tried installing DirectShow filters for Ogg Vorbis, Speex, Theora, FLAC, and WebM, and converting my video to Theora/Vorbis in an Ogg container. Still no go.
I should note that I did manage to play one mpg file, so I do know QMultimedia is working. But I tried converting another video to mimic the properties of that mpg file, and it didn't seem to work, so it seems QMultimedia is extremely specific in what formats it supports.
What system is QMultimedia using for its backend decoding? How can I find out what types of encodings and containers it supports? Is it possible to write my own decoder in Qt?
Thanks
This document describes the features supported by the QMultimedia backends; rendering to a widget is not supported right now.
I recommend using another library (e.g. ffmpeg) for encoding/decoding multimedia; QMultimedia is not stable at the moment, and I think only the examples from the documentation work correctly :(

c++ convert/play videos and images

I'm looking for a built-in library for converting videos/images. I've heard something about DirectShow. Do you know of any library you have used to convert videos/images?
For transcoding (converting one video format to another), DirectShow is a bit tricky; you want to use Media Foundation for this job.
Media Foundation provides a Transcode API for exactly this task; its documentation has more details, tutorials, and samples to get you started.
You can use DirectShow for grabbing images from a video stream. For that, you must create your own filter node. It is a complex task, because a filter is a COM object that works within a chain (the DirectShow filter graph) of other filter nodes (codecs), and after creating it you need to register your filter in the system. Still, I think it is worth trying, because you can use all the codecs registered in the system and receive the decompressed/final image in your filter. As another solution, you could try to reuse modules from an open-source media player, for example VideoLAN, but as far as I know it is a big project and not easy to use.
Good luck!

Creating web-browser playable webm files with vp8 SDK?

I'm using the vp8 SDK (www.webmproject.org) to create a vp8-encoded video file. However, the SDK sample produces an IVF file, which the browser doesn't play.
I know the WebM format is a Matroska container, so I guess I should store the video data in that format, but the MKV format specification is lengthy and complex, and I don't think I should reinvent the wheel by figuring it out myself.
So I would like to know if someone can recommend a sample of how to encode and produce a playable WebM VP8 file.
If there is no such sample (as my searches on Google suggest), at least point me to a simple and usable Matroska library that is proven to work in browsers.
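One practical workaround for the container mismatch described above, assuming ffmpeg is available and encoded.ivf is the SDK sample's output, is to remux the IVF output into a WebM container without touching the VP8 data:

```shell
# Copy the VP8 stream from the IVF container into WebM; no re-encoding.
ffmpeg -i encoded.ivf -c:v copy output.webm
```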