Playing a video with subtitles (no audio) using a GStreamer pipeline gives errors

gst-launch-1.0 filesrc location=subtitleseng.srt ! subparse ! overlay. filesrc location=video.mp4 ! qtdemux ! queue ! theoradec ! ffmpegcolorspace ! subtitleoverlay name=overlay ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstSubtitleOverlay:overlay: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstsubtitleoverlay.c(799): _pad_blocked_cb (): /GstPipeline:pipeline0/GstSubtitleOverlay:overlay:
Subtitle sink is blocked but we have no subtitle caps
ERROR: from element /GstPipeline:pipeline0/GstQTDemux:qtdemux0: GStreamer encountered a general stream error.
Additional debug info:
qtdemux.c(3891): gst_qtdemux_loop (): /GstPipeline:pipeline0/GstQTDemux:qtdemux0:
streaming stopped, reason not-linked
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
I even tried
gst-launch-1.0 filesrc location=/subtitleseng.srt ! subparse ! input-selector ! sub. filesrc location=video.mp4 ! decodebin ! input-selector ! streamsynchronizer name=sub ! subtitleoverlay name=sub ! xvimagesink
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc1: Resource not found.
Additional debug info:
gstfilesrc.c(508): gst_file_src_start (): /GstPipeline:pipeline0/GstFileSrc:filesrc1:
No such file "home/usr/Downloads/video.mp4"
Setting pipeline to NULL ...
Freeing pipeline ...
It gives this error even though the file is present.
Please help me solve this, or point me in a direction that would help me get it working.

It's working for me like this, and I do see the subtitles:
gst-launch-1.0 filesrc location=cartoon.mp4 ! decodebin ! video/x-raw ! videoconvert ! subtitleoverlay name=over ! autovideosink filesrc location=subs.srt ! subparse ! over.
The trick was putting videoconvert before subtitleoverlay.
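Applied to the files from the question, the same idea would look roughly like this (an untested sketch, with decodebin in place of the hand-built qtdemux/theoradec chain):
gst-launch-1.0 filesrc location=video.mp4 ! decodebin ! videoconvert ! subtitleoverlay name=overlay ! autovideosink filesrc location=subtitleseng.srt ! subparse ! overlay.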
HTH

Related

GStreamer: Demux & Remux MKV, preserving video

I am trying to re-encode the audio part of an MKV file that contains some video/x-h264 and some audio/x-raw. I can't manage to just demux the MKV and remux it. Even simply:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! mux.video_00 \
demux.audio_00 ! mux.audio_00
fails miserably with:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad video_00 of GstMatroskaDemux named demux to pad video_00 of GstMatroskaMux named mux
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad audio_00 of GstMatroskaDemux named demux to pad audio_00 of GstMatroskaMux named mux
ERROR: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Internal data stream error.
Additional debug info:
../gst-plugins-good/gst/matroska/matroska-demux.c(5715): gst_matroska_demux_loop (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
My best attempt at the transcoding mentioned above goes:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_00 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
with the same result. Removing the pad name audio_00 leads to gst being stuck at PREROLLING.
I have seen a few people facing similar problems:
http://gstreamer-devel.966125.n4.nabble.com/Putting-h264-file-inside-a-container-td4668158.html
http://gstreamer-devel.966125.n4.nabble.com/Changing-the-container-format-td3576914.html
As reported there, keeping only video or only audio works.
I think rawaudioparse should not be there. I tried your pipeline and had trouble with it too. I just put something together the way I would have done it, and it seems to work:
filesrc location=test.mkv ! matroskademux \
matroskademux0. ! queue ! audioconvert ! avenc_aac ! matroskamux ! filesink location=test2.mkv \
matroskademux0. ! queue ! h264parse ! matroskamux0.
Audio in my case was:
Stream #0:0(eng): Audio: pcm_f32le, 44100 Hz, 2 channels, flt, 2822 kb/s (default)
Another format may require additional transformations.
The problem is that the pads video_00 and audio_00 have been renamed video_0 and audio_0. This can be seen with gst-inspect-1.0 matroskademux, which shows that the pad templates now read video_%u and audio_%u. Note that some GStreamer documentation pages have not been updated to reflect this.
The first command (MKV to MKV) should read:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! mux.video_0 \
demux.audio_0 ! queue ! mux.audio_0
(Note the added queues)
The second command (MKV to MKV, re-encoding the audio) should read:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_0 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
The same result could have been achieved without specifying the pad names, using caps filters where needed.
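For example, a sketch of the plain remux with no pad names at all, using caps filters to steer each branch to a compatible muxer pad (untested against the file above):
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux. ! queue ! video/x-h264 ! mux. \
demux. ! queue ! audio/x-raw ! mux.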
Thanks go to user Florian Zwoch for providing a working pipeline.

GStreamer: how to link decodebin to encodebin? (error: failed delayed linking some pad of ...)

Naively, I am trying to link decodebin to encodebin:
$ gst-launch-1.0 filesrc location="/tmp/sound.wav" ! decodebin ! encodebin profile="application/ogg:video/x-theora:audio/x-vorbis" ! filesink location="/tmp/sound.ogg"
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Delayed linking failed.
Additional debug info:
./grammar.y(510): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
failed delayed linking some pad of GstDecodeBin named decodebin0 to some pad of GstEncodeBin named encodebin0
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstWavParse:wavparse0: Internal data stream error.
Additional debug info:
gstwavparse.c(2293): gst_wavparse_loop (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstWavParse:wavparse0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
Surely, there is something missing. What is it?
Note, this works:
gst-launch-1.0 filesrc location="/tmp/sound.wav" ! decodebin ! audioconvert ! vorbisenc ! oggmux ! filesink location="/tmp/sound.ogg"
encodebin only has request sink pads, so gst-launch doesn't know on its own which pad to request (video or audio). You can explicitly ask for one of them using:
gst-launch-1.0 filesrc location="/tmp/sound.wav" ! decodebin ! enc.audio_%u encodebin name=enc profile="application/ogg:video/x-theora:audio/x-vorbis" ! filesink location="/tmp/sound.ogg"
Notice how we give encodebin the name "enc" and then link decodebin to its audio pad, since we know this is an audio-only file.
If we had both video and audio, you would need to explicitly link decodebin to both the audio and the video pad of encodebin. You would give decodebin a name as well and link the two branches; since decodebin's source pads are simply named src_%u, each branch is picked by its caps rather than by a pad name. Something like: decodebin name=dec dec. ! queue ! enc.audio_%u dec. ! queue ! enc.video_%u
As a final suggestion, it is recommended to have a queue after every element that can branch off into multiple paths, like decodebin. It is mandatory when you have more than one output branch, but it doesn't hurt even if you only have one.
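To make this concrete, an untested sketch for a file with both audio and video might look like the following; the input and output filenames are made up, and the queues plus caps-based branch selection follow the suggestions above:
gst-launch-1.0 filesrc location="/tmp/movie.mkv" ! decodebin name=dec \
encodebin name=enc profile="application/ogg:video/x-theora:audio/x-vorbis" ! filesink location="/tmp/movie.ogg" \
dec. ! queue ! audioconvert ! enc.audio_%u \
dec. ! queue ! videoconvert ! enc.video_%u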

GStreamer opusdec: trying to decode an Opus bitstream fails

Initial Issue
I would like to use the GStreamer plugin opusdec to decode an Opus bitstream. The final purpose is to build glue around it with appsrc and appsink as input/output, to decode 20 ms Opus packets coming from an RTP packet payload and provide PCM samples.
Remark: I can't use the GStreamer rtpopusdepay element.
The following pipeline works:
gst-launch-1.0 filesrc location=testvector01.bit.opus ! oggdemux !
opusdec ! fakesink
In my final application I am not expecting Ogg-contained data, so I did the following:
1) De-encapsulate the Opus bitstream
gst-launch-1.0 filesrc location=testvector01.bit.opus ! oggdemux !
filesink location = testvector01.bit.demux
That works. And then:
2) Decode Opus bitstream
gst-launch-1.0 filesrc location=testvector01.bit.demux ! opusdec !
fakesink
and I have the following error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming task paused, reason error (-5)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ..
Input File
testvector01.bit.opus
From the Opus test vectors: https://people.xiph.org/~greg/opus_testvectors/
My question is:
What is the proper way to use the GStreamer opusdec plugin without a transport container?
Update
GStreamer version 1.2.4
As recommended, I tried adding opusparse after filesrc and got the following error:
Pipeline is PREROLLING ...
(gst-launch-1.0:5147): GStreamer-WARNING **: gstpad.c:4555:store_sticky_event:<opusparse0:src> Sticky event misordering, got 'caps' before 'stream-start'
(gst-launch-1.0:5147): GStreamer-WARNING **: gstpad.c:4555:store_sticky_event:<opusdec0:sink> Sticky event misordering, got 'caps' before 'stream-start'
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
ERROR: from element /GstPipeline:pipeline0/GstOpusDec:opusdec0: Decoding error: -4
Additional debug info:
gstopusdec.c(460): opus_dec_chain_parse_data (): /GstPipeline:pipeline0/GstOpusDec:opusdec0
Execution ended after 0:00:00.063372478
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
GStreamer 1.8.1
The following pipeline
gst-launch-1.0 filesrc location = testvector01.bit.demux ! opusparse !
opusdec ! audioconvert ! alsasink
halts here:
Setting pipeline to PAUSED ... Pipeline is PREROLLING ...
GStreamer 1.13.1
gst-launch-1.0 filesrc location = testvector01.bit.demux ! opusparse !
opusdec ! alsasink
Playback just produces a short audio glitch, and no GStreamer error is raised.
gst-launch-1.0 filesrc location = testvector01.bit.opus ! oggdemux ! opusparse !
opusdec ! alsasink
Playback is choppy, and no GStreamer error is raised.
Regards,
This works fine for me with GStreamer 1.14.1:
appsrc is-live=true do-timestamp=true name=audiosrc ! opusparse ! oggmux ! filesink location=test.ogg
You need to have a parser (opusparse) in between, as opusdec doesn't know what format it is. Try the following pipeline:
gst-launch-1.0 filesrc location=testvector01.bit.demux ! opusparse !
opusdec ! fakesink dump=true
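For the appsrc/appsink glue mentioned in the question, the pipeline string handed to gst_parse_launch() could then look roughly like the sketch below; the element names and the caps are assumptions, and the application code that pushes the 20 ms packets and pulls the PCM samples is not shown:
appsrc name=opus_in is-live=true format=time caps="audio/x-opus,channel-mapping-family=(int)0" ! opusparse ! opusdec ! audioconvert ! audioresample ! appsink name=pcm_out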

Play an encoded stream in GStreamer

I used the following GStreamer pipeline to store my encoded stream in a binary file:
gst-launch v4l2src ! videorate ! video/x-raw-yuv, framerate=\(fraction\)10/1 \
! videoscale ! video/x-raw-yuv, format=\(fourcc\)I420, width=640, height=480\
! ffmpegcolorspace ! x264enc ! fdsink > vid.bin
Now I want to play the previously recorded file in GStreamer using the following pipeline:
cat vid.bin | gst-launch fdsrc ! ffdec_h264 ! autovideosink
But then it gives the following error:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstffmpegdec.c(2804): gst_ffmpegdec_chain (): /GstPipeline:pipeline0/ffdec_h264:ffdec_h2640:
ffdec_h264: input format was not set before data start
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
I know that the best way to capture video is to use a muxer, but is there any way to play my previously recorded files?
Thanks
I'm not sure your pipeline is right.
If you want to write to a file, why not simply use filesink and filesrc?
fdsink > vid.bin will not work well, because the messages gst-launch prints to stdout also end up in the file. (Just open vid.bin in a text editor and you will see what I mean.)
Also, for an x264 stream to be stored without a muxer, you need to set byte-stream=1 on x264enc so that the stream is written in Annex B format and stays decodable.
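Putting those two points together, the capture pipeline from the question would become something like this (a 0.10-style sketch, untested; vid.h264 is just an example filename):
gst-launch v4l2src ! videorate ! video/x-raw-yuv, framerate=\(fraction\)10/1 \
! videoscale ! video/x-raw-yuv, format=\(fourcc\)I420, width=640, height=480 \
! ffmpegcolorspace ! x264enc byte-stream=1 ! filesink location=vid.h264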
To play back a raw x264 stream you need a colorspace converter before the video sink:
gst-launch filesrc location=inputfile ! legacyh264parse ! ffdec_h264 ! queue ! ffmpegcolorspace ! autovideosink
plays just fine here on my end.
Or, to play back a raw H.264 file with GStreamer 1.0:
gst-launch-1.0 filesrc location=/tmp/video.h264 ! h264parse ! avdec_h264 ! autovideosink

Music visualization error with GStreamer

Hi, I am trying to visualize a music file in GStreamer using the following command:
gst-launch filesrc location=file.mp3 ! decodebin ! audioconvert !
tee name=myT myT. ! queue ! autoaudiosink myT. ! queue ! goom !
colorspace ! autovideosink
But I get this error: "There may be a timestamping problem, or this computer is too slow."
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSinkClock
WARNING: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstDshowVideoSink:autovideosink0-actual-sink-dshowvideo: A lot of buffers are being dropped.
Additional debug info:
..\Source\gstreamer\libs\gst\base\gstbasesink.c(2572): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstDshowVideoSink:autovideosink0-actual-sink-dshowvideo:
There may be a timestamping problem, or this computer is too slow.
ERROR: from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0
Assuming this has something to do with threading, I tried the following command:
gst-launch filesrc location=file.mp3 ! decodebin ! audioconvert ! tee name=myT
{ ! queue ! autoaudiosink } { tee. ! queue ! goom ! colorspace ! autovideosink }
But then it gives the following link error:
** (gst-launch-0.10:5308): WARNING **: Trying to connect elements that don't share a common ancestor: tee and queue1
0:00:00.125000000 5308 003342F0 ERROR GST_PIPELINE grammar.tab.c:656:gst_parse_perform_link: could not link tee to queue1
WARNING: erroneous pipeline: could not link tee to queue1
Can anyone tell me what is wrong? Thanks
I cannot give you an exact answer because I don't have Windows installed.
To debug this, use your first pipeline (it works on Linux). Use the -v parameter with gst-launch and put an identity element just before autovideosink. This will print information about the buffers passing through identity; look for anything strange.
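That debugging variant of the first pipeline might look like this (a sketch; identity silent=false only prints information about the buffers heading towards the video sink):
gst-launch -v filesrc location=file.mp3 ! decodebin ! audioconvert ! tee name=myT myT. ! queue ! autoaudiosink myT. ! queue ! goom ! colorspace ! identity silent=false ! autovideosink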
You could also try using directdrawsink instead of autovideosink. Another test I would do is to generate the audio with audiotestsrc.
Remember that if you find a bug you can open a report in GNOME Bugzilla so the GStreamer developers are aware of the problem. You could even fix it yourself and send a patch.
For the "There may be a timestamping problem, or this computer is too slow." error, try sync=false, like:
`gst-launch filesrc location=file.mp3 ! decodebin ! audioconvert ! tee name=myT myT. ! queue ! autoaudiosink myT. ! queue ! goom ! colorspace ! autovideosink sync=false`
or you may have to set it on both sink ends of the tee, like:
`gst-launch filesrc location=file.mp3 ! decodebin ! audioconvert ! tee name=myT myT. ! queue ! autoaudiosink sync=false myT. ! queue ! goom ! colorspace ! autovideosink sync=false`
I also observed that if you replace autovideosink with xvimagesink or ximagesink, the timestamping problem seems to go away.
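For example (a sketch of that substitution applied to the original pipeline; xvimagesink assumes an X11/Xv setup, so it will not help on Windows):
gst-launch filesrc location=file.mp3 ! decodebin ! audioconvert ! tee name=myT myT. ! queue ! autoaudiosink myT. ! queue ! goom ! colorspace ! xvimagesink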