How to mux H.264 video with audio from a webcam in GStreamer?
gst-launch-1.0 -v v4l2src device=/dev/video2 ! video/x-h264,framerate=30/1,width=1920,height=1080 \
! queue ! mux. \
alsasrc device=hw:1 ! queue ! audioconvert ! fdkaacenc \
! mux. matroskamux name=mux ! filesink location=video.mkv
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstAudioSrcClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.001309727
Setting pipeline to NULL ...
Freeing pipeline ...
Preview works
gst-launch-1.0 -v \
v4l2src device=/dev/video2 ! video/x-h264,framerate=30/1,width=1920,height=1080 ! decodebin ! autovideosink
Audio works
gst-launch-1.0 -v alsasrc device=hw:1 ! queue ! audioconvert ! fdkaacenc ! fdkaacdec ! autoaudiosink
It turns out h264parse is needed before the muxer:
gst-launch-1.0 -v \
v4l2src device=/dev/video2 ! queue ! video/x-h264,framerate=30/1,width=1920,height=1080 \
! h264parse ! mux. \
alsasrc device=hw:1 ! queue ! audioconvert ! fdkaacenc ! mux. \
matroskamux name=mux ! filesink location=video.mkv
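To sanity-check the recording afterwards, something like this should play it back (the path is just a placeholder):
gst-launch-1.0 playbin uri=file:///path/to/video.mkv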
Related
I'm using gstreamer to make a picture-in-picture composition of two rtmp inputs into an rtmp output.
I've managed to create a pipeline that works very well when both streams are live.
However, when one of the RTMP streams is not live when starting the pipeline, the pipeline does not start.
Does anyone know how to overcome this issue, and make sure the pipeline is not blocked if one rtmp source is offline?
You may have to insert an identity element into each compositor input sub-pipeline.
Simulating 2 sources on localhost with:
gst-launch-1.0 videotestsrc ! x264enc insert-vui=1 ! h264parse config-interval=1 ! mpegtsmux ! rtpmp2tpay ! udpsink port=5004
(and a second source doing the same to port 5005), the following runs fine with 0, 1 or 2 sources active at launch time:
gst-launch-1.0 -v \
udpsrc port=5004 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! identity ! queue ! comp.sink_0 \
udpsrc port=5005 ! application/x-rtp,media=video,encoding-name=MP2T,clock-rate=90000,payload=33 ! rtpjitterbuffer latency=300 ! rtpmp2tdepay ! tsdemux ! h264parse ! avdec_h264 ! videoconvert ! identity ! queue ! comp.sink_1 \
compositor name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=320 sink_0::height=240 sink_1::xpos=0 sink_1::ypos=240 sink_1::width=320 sink_1::height=240 ! video/x-raw,width=320,height=480 ! videoconvert ! xvimagesink
For rtmp, with mpeg audio from first source, it would be something like:
gst-launch-1.0 -v \
rtmpsrc <your source1 and options> ! flvdemux name=demux0 ! queue ! h264parse ! avdec_h264 ! videoconvert ! identity ! queue ! comp.sink_0 \
rtmpsrc <your source2 and options> ! flvdemux name=demux1 ! queue ! h264parse ! avdec_h264 ! videoconvert ! identity ! queue ! comp.sink_1 \
compositor name=comp sink_0::xpos=0 sink_0::ypos=0 sink_0::width=320 sink_0::height=240 sink_1::xpos=0 sink_1::ypos=240 sink_1::width=320 sink_1::height=240 ! video/x-raw,width=320,height=480 ! videoconvert ! autovideosink \
demux0. ! queue ! audio/mpeg ! decodebin ! audioconvert ! audioresample ! autoaudiosink
I am trying to reencode the audio part of a MKV file that contains some video/x-h264 and some audio/x-raw. I can't manage to just demux the MKV and remux it. Even simply:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! mux.video_00 \
demux.audio_00 ! mux.audio_00
fails miserably with:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad video_00 of GstMatroskaDemux named demux to pad video_00 of GstMatroskaMux named mux
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad audio_00 of GstMatroskaDemux named demux to pad audio_00 of GstMatroskaMux named mux
ERROR: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Internal data stream error.
Additional debug info:
../gst-plugins-good/gst/matroska/matroska-demux.c(5715): gst_matroska_demux_loop (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
My best attempt at the transcoding mentioned above goes:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_00 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
with the same result. Removing the pad name audio_00 leads to gst being stuck at PREROLLING.
I have seen a few people facing similar problems:
http://gstreamer-devel.966125.n4.nabble.com/Putting-h264-file-inside-a-container-td4668158.html
http://gstreamer-devel.966125.n4.nabble.com/Changing-the-container-format-td3576914.html
As therein, keeping only video or only audio works.
I think the rawaudioparse should not be there. I tried your pipeline and had trouble with it too. I just put something together the way I would have done it, and it seems to work:
filesrc location=test.mkv ! matroskademux \
matroskademux0. ! queue ! audioconvert ! avenc_aac ! matroskamux ! filesink location=test2.mkv \
matroskademux0. ! queue ! h264parse ! matroskamux0.
Audio in my case was:
Stream #0:0(eng): Audio: pcm_f32le, 44100 Hz, 2 channels, flt, 2822 kb/s (default)
Another format may require additional transformations.
The problem is that the pads video_00 and audio_00 have been renamed to video_0 and audio_0. This can be seen with gst-inspect-1.0 matroskademux, which shows that the pad template now reads video_%u. Note that some GStreamer documentation pages have not been updated to reflect this.
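For reference, this is how to check on your own installation (the source pad templates should be listed as video_%u, audio_%u and subtitle_%u):
gst-inspect-1.0 matroskademux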
The first command, MKV to MKV should read:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! mux.video_0 \
demux.audio_0 ! queue ! mux.audio_0
(Note the added queues)
The second command, MKV to MKV reencoding audio should read:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_0 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
The same result could have been achieved without specifying the pad names, using caps filters where needed to select the right streams.
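For example, a rough (untested) sketch of that approach, assuming the file contains exactly one H.264 video stream and one raw audio stream as described above:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux. ! 'video/x-h264' ! queue ! h264parse ! mux. \
demux. ! 'audio/x-raw' ! queue ! audioconvert ! avenc_aac ! mux.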
Thanks go to user Florian Zwoch for providing a working pipeline.
I'm trying to stream an arbitrary file with GStreamer. I have the following command line, but it does not work (I will use Python once I get this working):
gst-launch-1.0 uridecodebin uri=file:///tmp/File.mkv name=decoder name=decbin \
! queue\
! videoconvert ! x264enc \
! mp4mux name=muxer ! udpsink host=127.0.0.1 port=1234 decbin. \
! queue \
! audioconvert ! lamemp3enc ! muxer.
and playing with
gst-launch-1.0 udpsrc port=1234 ! 'application/x-rtp,payload=96'\
! rtph264depay ! decodebin ! xvimagesink sync=false
I know I have to add rtph264pay and rtpmpapay but I don't know where.
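For what it's worth, here is a rough, untested sketch of where the payloaders could go: drop mp4mux (a plain MP4 stream over raw UDP won't play this way), payload each stream right after its encoder, and send the audio on a second port (1235 is just an assumption):
gst-launch-1.0 uridecodebin uri=file:///tmp/File.mkv name=decbin \
decbin. ! queue ! videoconvert ! x264enc tune=zerolatency ! rtph264pay pt=96 ! udpsink host=127.0.0.1 port=1234 \
decbin. ! queue ! audioconvert ! lamemp3enc ! rtpmpapay ! udpsink host=127.0.0.1 port=1235
The existing receiver would then pick up the video on port 1234 via rtph264depay; the MP3 audio would need a matching rtpmpadepay branch listening on port 1235.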
I want to transcode and resize an MP4 (MP4/H.264 1920x1080 + AAC => MP4/H.264 640x480 + MP3) using GStreamer. I wrote this command:
$ gst-launch-0.10 filesrc location=./gain_1.mp4 ! qtdemux name=demux \
demux.video_00 ! queue ! ffdec_h264 ! videoscale ! 'video/x-raw-yuv,width=640,height=480' ! x264enc ! queue ! qtmux name=mux mux.video_0 \
demux.audio_00 ! queue ! ffdec_aac ! lame bitrate=128 ! queue ! mux.audio_0 \
mux. ! filesink location=0000.mp4 -v -e
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
^CCaught interrupt -- handling interrupt.
Interrupt: Stopping pipeline ...
(gst-launch-0.10:17958): GLib-CRITICAL **: Source ID 1 was not found when attempting to remove it
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
which didn't work.
Transcoding video-only works:
gst-launch-0.10 filesrc location=./gain_1.mp4 ! qtdemux name=demux demux.video_00 ! queue ! ffdec_h264 ! videoscale ! 'video/x-raw-yuv, width=640, height=480' ! x264enc ! queue ! mux. mp4mux name=mux ! filesink location=0000.mp4 -v -e
And transcoding audio-only too:
gst-launch-0.10 filesrc location=./gain_1.mp4 ! qtdemux name=demux demux.audio_00 ! ffdec_aac ! lame bitrate=128 ! queue ! mux. mp4mux name=mux ! filesink location=0000.mp4 -v -e
How can I transcode audio and video with the same command?
@Lionel.J, I would like to suggest two improvements:
1. If possible, use gstreamer-1.
2. Your solution reads the source file twice. That's not necessary. Furthermore, the audio and video streams are not synchronized when you do this. You can read both the audio and video streams out of qtdemux.
This is a pipeline which does the job with gstreamer-1 and reads the source only once:
gst-launch-1.0 -e filesrc location=/path/to/big_buck_bunny_720p_h264.mov ! \
decodebin name=decode ! \
videoscale ! 'video/x-raw,width=640,height=480' ! \
x264enc ! queue ! mp4mux name=mp4mux ! filesink location=0000.mp4 \
decode. ! audioconvert ! lamemp3enc bitrate=128 ! queue ! mp4mux.
Oh, I solved this problem. The following command did the job:
gst-launch-0.10 ffmux_mp4 name=mux ! \
filesink location=0000.mp4 \
filesrc location=./gain_1.mp4 ! qtdemux name=vdemux vdemux.video_00 ! queue ! ffdec_h264 ! videoscale ! 'video/x-raw-yuv, width=640, height=480' ! x264enc ! queue ! mux. \
filesrc location=./gain_1.mp4 ! qtdemux name=ademux ademux.audio_00 ! ffdec_aac ! lame bitrate=128 ! queue ! mux.
Hi all, I'm trying to play and record an mp3 souphttpsrc stream at the same time, but I don't get a good result. Can someone help, please?
gst-launch-1.0 -e filesrc location=/dev/fd/0 ! h264parse ! tee name=myvid \
! queue ! decodebin ! xvimagesink sync=false \
myvid. ! queue ! mux.video_0 \
alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1,depth=24" ! audioconvert ! queue ! filesink location=/tmp/out.mp4
thank you
Hi, your pipeline is slightly wrong.
There is no encoding happening on the audio, so you're saving raw audio into the container.
There is no muxer, so mux.video_0 does not resolve to any pad on any element.
Here is a pipeline without these issues:
gst-launch-1.0 -e mp4mux name=mux ! filesink location=/tmp/out.mp4 \
filesrc location=/dev/fd/0 ! h264parse ! tee name=myvid ! queue ! decodebin ! xvimagesink sync=false \
myvid. ! queue ! mux.video_0 \
alsasrc device="plughw:2,0" ! "audio/x-raw,rate=44100,channels=1,depth=24" ! audioconvert ! queue ! lamemp3enc ! mux.audio_0