GStreamer pipeline for converting files with optional audio/video

I am using the following pipeline to convert an FLV file to MP4:
gst-launch-1.0 -vvv -e filesrc location="c.flv" ! flvdemux name=demux \
demux.audio ! queue ! decodebin ! audioconvert ! faac bitrate=32000 ! mux. \
demux.video ! queue ! decodebin ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=superfast tune=zerolatency psy-tune=grain sync-lookahead=5 bitrate=480 key-int-max=50 ref=2 ! mux. \
mp4mux name=mux ! filesink location="c.mp4"
The problem is when (for example) audio is missing, the pipeline gets stuck. (Same thing happens if just hooking a fakesink to demux.audio).
I need a way for the filters to ignore missing tracks, or produce empty tracks.
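For reference, dropping the audio branch entirely gives a video-only variant that should complete for files already known to lack audio; this is only a static workaround, not the dynamic behaviour the question asks for:
gst-launch-1.0 -vvv -e filesrc location="c.flv" ! flvdemux name=demux \
demux.video ! queue ! decodebin ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=superfast tune=zerolatency psy-tune=grain sync-lookahead=5 bitrate=480 key-int-max=50 ref=2 ! mux. \
mp4mux name=mux ! filesink location="c.mp4"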

Related

gstreamer saved files have no audio

I'm trying to use this command to create multiple files from a stream, but they have no audio playback. I think decodebin should be dealing with it. What am I doing wrong?
gst-launch-1.0 -e filesrc location=video.mp4 ! queue ! decodebin ! queue ! videoconvert ! queue ! timeoverlay ! x264enc key-int-max=10 ! h264parse ! splitmuxsink location=videos/test%02d.mp4 max-size-time=1000000000000
Why do you assume that decodebin will handle it? decodebin decodes the audio track to raw audio and exposes an audio pad. If you don't make use of that pad, the audio will not make it into the file.
Since you transcode you will have to re-encode the audio too:
gst-launch-1.0 -e filesrc location=video.mp4 ! queue ! decodebin ! queue ! \
videoconvert ! queue ! timeoverlay ! x264enc key-int-max=10 ! h264parse ! \
splitmuxsink location=videos/test%02d.mp4 max-size-time=1000000000000 \
decodebin0. ! queue ! voaacenc ! aacparse ! splitmuxsink0.
If you don't want to re-encode but just pass the audio through, decodebin is the wrong tool; parsebin may be a better fit in that case.
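As a rough, untested sketch of that idea (assuming an H.264/AAC input; avdec_h264 is the gst-libav decoder, and pad selection through the named parse. element is order-dependent, just like the decodebin pipelines elsewhere on this page):
gst-launch-1.0 -e filesrc location=video.mp4 ! parsebin name=parse \
parse. ! queue ! avdec_h264 ! videoconvert ! timeoverlay ! x264enc key-int-max=10 ! h264parse ! \
splitmuxsink name=smux location=videos/test%02d.mp4 max-size-time=1000000000000 \
parse. ! queue ! smux.audio_0
Here the AAC stream coming out of parsebin goes straight into the muxer without being decoded and re-encoded.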

capture segmented audio and video with gstreamer

I'm trying to record audio and video from internal webcam and mic to segmented files with gstreamer.
It works when writing to a single file:
gst-launch-1.0 -e avfvideosrc ! \
video/x-raw ! vtenc_h264 ! h264parse ! queue ! \
mpegtsmux name=mux ! filesink location=test.mp4 osxaudiosrc ! \
decodebin ! audioconvert ! faac ! aacparse ! queue ! mux.
It doesn't work when doing:
gst-launch-1.0 -e avfvideosrc ! \
video/x-raw ! vtenc_h264 ! h264parse ! queue ! \
splitmuxsink muxer=mpegtsmux location=test%04d.mp4 max-size-time=1000000000 name=mux \
osxaudiosrc ! decodebin ! audioconvert ! faac ! aacparse ! queue ! mux.
It fails, saying erroneous pipeline: could not link queue1 to mux.
I'm using GStreamer 1.12.3 on macOS Sierra.
Note: The H264/AAC encoding isn't necessary for what I want to achieve, so if there are solutions that only work with e.g. avimux, for whatever reason, that's fine.
EDIT: I've tried this on a Windows machine with the same error.
gst-launch-1.0 -ev ksvideosrc ! video/x-raw ! \
videoconvert ! queue ! \
splitmuxsink max-size-time=1000000000 muxer=avimux name=mux location=video%04d.avi \
autoaudiosrc ! decodebin ! audioconvert ! queue ! mux.
Just like on Mac, replacing splitmuxsink with avimux ! filesink works. I'm sure I'm just missing out on some 'pipeline' logic, so any clarification that can push me in the right direction would be helpful.
I needed to send the audio stream to the audio track of the muxer like so: mux.audio_0
gst-launch-1.0 -ev ksvideosrc ! video/x-raw ! \
videoconvert ! queue ! \
splitmuxsink max-size-time=1000000000 muxer=avimux name=mux location=video%04d.avi \
autoaudiosrc ! decodebin ! audioconvert ! queue ! mux.audio_0
This is one of those cases where the documentation should be clear enough, but you're missing some basic knowledge about how to interpret it.
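For reference, the same fix applied to the original macOS pipeline would presumably look like this (untested, same elements as in the question):
gst-launch-1.0 -e avfvideosrc ! \
video/x-raw ! vtenc_h264 ! h264parse ! queue ! \
splitmuxsink muxer=mpegtsmux location=test%04d.mp4 max-size-time=1000000000 name=mux \
osxaudiosrc ! decodebin ! audioconvert ! faac ! aacparse ! queue ! mux.audio_0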

Gstreamer picture-in-picture - two files playing in parallel

I need to compose a pipeline for a "picture-in-picture" effect that combines media from two files:
1) video content from the first file is shown in the full window,
2) video from the second file is resized and shown in the top-left corner of the window,
3) audio from both files is mixed,
4) the content from both files is played simultaneously.
So far I got the following pipeline:
gst-launch-1.0 -e \
filesrc name="src0" location=$FILE0 \
! decodebin name="decodebin0" ! queue ! videoscale ! capsfilter caps="video/x-raw,width=120" ! videoconvert ! videomixer.sink_0 decodebin0. ! queue ! audioconvert ! audiomixer.sink_0 \
filesrc name="src1" location=$FILE1 \
! decodebin name="decodebin1" ! queue ! videoscale ! capsfilter caps="video/x-raw" ! videoconvert ! videomixer.sink_1 decodebin1. ! queue ! audioconvert ! audiomixer.sink_1 \
videomixer name="videomixer" ! autovideosink \
audiomixer name="audiomixer" ! autoaudiosink
However, it plays the streams one by one, not in parallel. Does anyone know what should be changed here in order to play the streams simultaneously?
PS: a visualized diagram of this pipeline is attached.
Surprisingly, the order of the sources in the pipeline does matter. After a slight modification of the pipeline, placing the source with the "larger" frame first, I was able to get the expected result:
gst-launch-1.0 -ev \
filesrc name="src1" location=$FILE1 \
! decodebin name="decodebin1" ! queue ! videoscale ! capsfilter caps="video/x-raw,framerate=15/1" ! videoconvert ! videomixer.sink_1 decodebin1. ! queue ! audioconvert name="ac1" \
filesrc name="src0" location=$FILE0 \
! decodebin name="decodebin0" ! queue ! videoscale ! capsfilter caps="video/x-raw,width=120,framerate=15/1" ! videoconvert ! videomixer.sink_0 decodebin0. ! queue ! audioconvert name="ac0"\
ac0. ! audiomixer.sink_0 \
ac1. ! audiomixer.sink_1 \
videomixer name="videomixer" ! autovideosink \
audiomixer name="audiomixer" ! autoaudiosink \

How to use gstreamer for transcoding and resizing from mp4(h264/aac) to mp4(h264/mp3)?

I want to transcode and resize an mp4 (mp4-h264_1920x1080/aac => mp4-h264_640x480/mp3) using gstreamer. I wrote this command:
$ gst-launch-0.10 filesrc location=./gain_1.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_h264 ! videoscale ! 'video/x-raw-yuv,width=640,height=480' ! x264enc ! queue ! qtmux name=mux mux.video_0 demux.audio_00 ! queue ! ffdec_aac ! lame bitrate=128 ! queue ! mux.audio_0 mux. ! filesink location=0000.mp4 -v -e
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:17958): GLib-CRITICAL **: Source ID 1 was not found when attempting to remove it
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
which didn't work.
Transcoding video only works:
gst-launch-0.10 filesrc location=./gain_1.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_h264 ! videoscale ! 'video/x-raw-yuv, width=640, height=480' ! x264enc ! queue ! mux. mp4mux name=mux ! filesink location=0000.mp4 -v -e
And transcoding audio only works too:
gst-launch-0.10 filesrc location=./gain_1.mp4 ! qtdemux name=demux demux.audio_00 ! ffdec_aac ! lame bitrate=128 ! queue ! mux. mp4mux name=mux ! filesink location=0000.mp4 -v -e
How can I transcode audio and video with the same command?
#Lionel.J I would like to suggest two improvements:
1) if possible, use gstreamer-1
2) your solution reads the source file twice. That's not necessary, and the audio and video streams are not synchronized when you do this. You can read both the audio and video streams out of qtdemux.
This is a pipeline which does the job with gstreamer-1 and reads the source only once:
gst-launch-1.0 -e filesrc location=/path/to/big_buck_bunny_720p_h264.mov ! \
decodebin name=decode ! \
videoscale ! 'video/x-raw,width=640,height=480' ! \
x264enc ! queue ! mp4mux name=mp4mux ! filesink location=0000.mp4 \
decode. ! audioconvert ! lamemp3enc bitrate=128 ! queue ! mp4mux.
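For completeness, a hedged sketch of the qtdemux route suggested above, which also reads the file only once (untested; avdec_h264 and avdec_aac are the gst-libav decoders, and 1.0-style pad names are assumed):
gst-launch-1.0 -e filesrc location=./gain_1.mp4 ! qtdemux name=demux \
demux.video_0 ! queue ! avdec_h264 ! videoscale ! 'video/x-raw,width=640,height=480' ! x264enc ! queue ! mux.video_0 \
demux.audio_0 ! queue ! avdec_aac ! audioconvert ! lamemp3enc bitrate=128 ! queue ! mux.audio_0 \
mp4mux name=mux ! filesink location=0000.mp4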
Oh, I solved this problem. The following command did the job:
gst-launch-0.10 ffmux_mp4 name=mux ! \
filesink location=0000.mp4 \
filesrc location=./gain_1.mp4 ! qtdemux name=vdemux vdemux.video_00 ! queue ! ffdec_h264 ! videoscale ! 'video/x-raw-yuv, width=640, height=480' ! x264enc ! queue ! mux. \
filesrc location=./gain_1.mp4 ! qtdemux name=ademux ademux.audio_00 ! ffdec_aac ! lame bitrate=128 ! queue ! mux.

play and record a stream at the same time using gstreamer

Hi all, I'm trying to play and record an mp3 souphttpsrc stream at the same time, but I don't get a good result. Can someone help, please?
gst-launch-1.0 -e filesrc location=/dev/fd/0 ! h264parse ! tee name=myvid \
! queue ! decodebin ! xvimagesink sync=false \
myvid. ! queue ! mux.video_0 \
alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1,depth=24" ! audioconvert ! queue ! filesink location=/tmp/out.mp4
Thank you.
Hi, your pipeline is slightly wrong:
1) There is no encoding happening for the audio, so you're saving raw audio into the container.
2) There is no muxer, so mux.video_0 does not resolve to any pad on any element.
Here is a pipeline without these issues:
gst-launch-1.0 -e mp4mux name=mux ! filesink location=/tmp/out.mp4 \
filesrc location=/dev/fd/0 ! h264parse ! tee name=myvid ! queue ! decodebin ! xvimagesink sync=false \
myvid. ! queue ! mux.video_0 \
alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1" ! audioconvert ! queue ! lamemp3enc ! mux.audio_0
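And since the question mentions playing and recording an mp3 souphttpsrc stream, here is a minimal sketch for that case (the URL is a placeholder): tee splits the stream so one branch plays it while the other writes the undecoded mp3 to disk:
gst-launch-1.0 -e souphttpsrc location=http://example.com/stream.mp3 ! tee name=t \
t. ! queue ! decodebin ! audioconvert ! audioresample ! autoaudiosink \
t. ! queue ! filesink location=/tmp/out.mp3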