How to stream a file.avi over a network using gstreamer

I'm trying to use gstreamer to send a sample .avi file over a network. The command that I'm using to build my pipeline is the following:
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! decodebin name=dec \
dec. ! queue ! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false \
name=vrtcpsink udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
When I try to execute this command I'm getting this error:
gstavidemux.c(5383): gst_avi_demux_loop ():
/GstPipeline:pipeline0/GstDecodeBin:dec/GstAviDemux:avidemux0:
streaming stopped, reason not-linked
Execution ended after 0:00:00.032906515
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Can you help me to solve this?

I think you would need a demux element before the decodebin, since an avi file contains both audio and video but you are using just one decodebin to decode them.
Take a look at the avidemux element here. You need it to get the video stream out of the file.
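For example, a minimal sketch of that approach (untested; it reuses the path and ports from your command and assumes the AVI carries one video stream, exposed on pad video_0):
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! avidemux name=demux \
demux.video_0 ! queue ! decodebin ! videoconvert ! x264enc bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false name=vrtcpsink \
udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
If the demuxer still stops with not-linked because of the unconsumed audio stream, give that pad a consumer too, e.g. demux.audio_0 ! queue ! fakesink.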

Related

Gstreamer - how to capture a single frame from udp source

When I stream over udp and autovideosink on the client side (in this example both sender and receiver are on the same host), everything works fine, but when I try to filesink it and capture a single frame, all my attempts fail. The file is created, but it's empty.
Here is the source:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5000 -e
One of the non-working clients:
gst-launch-1.0 udpsrc port=5000 num-buffers=1 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! jpegenc ! filesink location=test.jpeg -e
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.112594509
Setting pipeline to NULL ...
Freeing pipeline ...
How do I capture it?
Use the snapshot property of jpegenc and remove num-buffers=1 from udpsrc. With num-buffers=1, udpsrc stops after a single RTP packet, which is usually only a fragment of one JPEG frame, so no complete image ever reaches the filesink; with snapshot=TRUE, jpegenc instead sends EOS after the first complete frame it encodes.
https://gstreamer.freedesktop.org/documentation/jpeg/jpegenc.html?gi-language=c
Does something like this work for you?
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! jpegenc snapshot=TRUE ! filesink location=test.jpeg

gstreamer: Demux & Remux MKV, preserving video

I am trying to reencode the audio part of a MKV file that contains some video/x-h264 and some audio/x-raw. I can't manage to just demux the MKV and remux it. Even simply:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! mux.video_00 \
demux.audio_00 ! mux.audio_00
fails miserably with:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad video_00 of GstMatroskaDemux named demux to pad video_00 of GstMatroskaMux named mux
WARNING: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Delayed linking failed.
Additional debug info:
../gstreamer/gst/parse/grammar.y(506): gst_parse_no_more_pads (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
failed delayed linking pad audio_00 of GstMatroskaDemux named demux to pad audio_00 of GstMatroskaMux named mux
ERROR: from element /GstPipeline:pipeline0/GstMatroskaDemux:demux: Internal data stream error.
Additional debug info:
../gst-plugins-good/gst/matroska/matroska-demux.c(5715): gst_matroska_demux_loop (): /GstPipeline:pipeline0/GstMatroskaDemux:demux:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
My best attempt at the transcoding mentioned above goes:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_00 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_00 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
with the same result. Removing the pad name audio_00 leads to gst being stuck at PREROLLING.
I have seen a few people facing similar problems:
http://gstreamer-devel.966125.n4.nabble.com/Putting-h264-file-inside-a-container-td4668158.html
http://gstreamer-devel.966125.n4.nabble.com/Changing-the-container-format-td3576914.html
As therein, keeping only video or only audio works.
I think the rawaudioparse should not be there. I tried your pipeline and had trouble with it too. I just put the pipeline together the way I would have done it, and it seemed to work:
filesrc location=test.mkv ! matroskademux \
matroskademux0. ! queue ! audioconvert ! avenc_aac ! matroskamux ! filesink location=test2.mkv \
matroskademux0. ! queue ! h264parse ! matroskamux0.
Audio in my case was:
Stream #0:0(eng): Audio: pcm_f32le, 44100 Hz, 2 channels, flt, 2822 kb/s (default)
Another format may require additional transformations.
The problem is that the pads video_00 and audio_00 have been renamed video_0 and audio_0. This can be seen using gst-inspect-1.0 matroskademux, which indicates that the format for the pads now reads video_%u. Note that some documentation pages of gstreamer are not updated to reflect that.
The first command, MKV to MKV, should read:
gst-launch-1.0 filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! mux.video_0 \
demux.audio_0 ! queue ! mux.audio_0
(Note the added queues)
The second command, MKV to MKV re-encoding the audio, should read:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux.video_0 ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux.audio_0 ! queue ! rawaudioparse ! audioconvert ! audioresample ! avenc_aac ! mux.
The same result could have been achieved by not specifying the pads and using caps filters where needed.
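For instance, an untested sketch of that variant, where each caps filter selects the matching demuxer pad instead of naming it:
gst-launch-1.0 -v filesrc location=test.mkv ! matroskademux name=demux \
matroskamux name=mux ! filesink location=test2.mkv \
demux. ! queue ! 'video/x-h264' ! h264parse ! mux. \
demux. ! queue ! 'audio/x-raw' ! audioconvert ! audioresample ! avenc_aac ! mux.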
Thanks go to user Florian Zwoch for providing a working pipeline.

rtpbin receiver doesn't output X-GST encoded streams

I'm trying to make a server and client application that sends and receives a raw video stream using rtpbin. In order to send an uncompressed video stream I'm using rtpgstpay and rtpgstdepay to payload the data.
The server application can successfully send the video stream with the following pipeline:
gst-launch-1.0 -vvv rtpbin name=rtpbin \
videotestsrc ! \
rtpgstpay ! application/x-rtp,media=application,payload=96,encoding-name=X-GST ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5002 host=127.0.0.1 sync=false async=false name=vrtcpsink
The client pipeline looks like this:
gst-launch-1.0 -vvv rtpbin name=rtpbin \
udpsrc caps="application/x-rtp,payload=96,media=application,encoding-name=X-GST" port=5000 ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! rtpgstdepay ! videoconvert ! autovideosink \
udpsrc port=5002 ! rtpbin.recv_rtcp_sink_0
rtpbin successfully creates a sink and links to the udpsrc, but no stream comes out of the rtp source pad.
The same pipeline without rtpbin can successfully display the stream:
gst-launch-1.0 -vvv \
udpsrc caps="application/x-rtp,payload=96,media=application,encoding-name=X-GST" port=5000 ! \
rtpgstdepay ! videoconvert ! autovideosink
What am I doing wrong that rtpbin doesn't want to output the stream?
I also tried to replace the rtp_source part of the client with a fakesink to see if it would output anything, but still nothing comes out of rtpbin.
I found the solution to my problem. If anyone comes across the same problem, this is how to fix it:
First of all, rtpbin needs the clock-rate to be specified in the caps.
Second, when using rtpgst(de)pay, you need to specify the caps event string in your caps filter at the receiver. You can find it by printing the caps of the rtpgstpay element at the transmitter, e.g.:
application/x-rtp, media=(string)application, clock-rate=(int)90000, encoding-name=(string)X-GST, caps=(string)"dmlkZW8veC1yYXcsIGZvcm1hdD0oc3RyaW5nKUdSQVk4LCB3aWR0aD0oaW50KTY0MCwgaGVpZ2h0PShpbnQpNDYwLCBpbnRlcmxhY2UtbW9kZT0oc3RyaW5nKXByb2dyZXNzaXZlLCBwaXhlbC1hc3BlY3QtcmF0aW89KGZyYWN0aW9uKTEvMSwgY29sb3JpbWV0cnk9KHN0cmluZykxOjQ6MDowLCBmcmFtZXJhdGU9KGZyYWN0aW9uKTI1LzE\=", capsversion=(string)0, payload=(int)96, ssrc=(uint)2501988797, timestamp-offset=(uint)1970605309, seqnum-offset=(uint)2428, a-framerate=(string)25
So here the caps event string is
dmlkZW8veC1yYXcsIGZvcm1hdD0oc3RyaW5nKUdSQVk4LCB3aWR0aD0oaW50KTY0MCwgaGVpZ2h0PShpbnQpNDYwLCBpbnRlcmxhY2UtbW9kZT0oc3RyaW5nKXByb2dyZXNzaXZlLCBwaXhlbC1hc3BlY3QtcmF0aW89KGZyYWN0aW9uKTEvMSwgY29sb3JpbWV0cnk9KHN0cmluZykxOjQ6MDowLCBmcmFtZXJhdGU9KGZyYWN0aW9uKTI1LzE\=
When adding this to the caps at the receiver, you have to add a null terminator ( \0 ) at the end of the string.
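Putting both fixes together, the receiver could look like this sketch (untested; the '...' stands for the full base64 caps string printed by your own sender, with its trailing '=' escaped as '\=' exactly as in the output above):
gst-launch-1.0 -vvv rtpbin name=rtpbin \
udpsrc port=5000 caps="application/x-rtp,media=application,clock-rate=90000,encoding-name=X-GST,payload=96,capsversion=(string)0,caps=(string)dmlkZW8veC1yYXcs...\=" ! rtpbin.recv_rtp_sink_0 \
rtpbin. ! rtpgstdepay ! videoconvert ! autovideosink \
udpsrc port=5002 ! rtpbin.recv_rtcp_sink_0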

Extract h264 stream from USB webcam (logitech C920)

So, I'm starting to play around with gstreamer and I'm able to do very simple pipes such as
gst-launch-1.0 -v v4l2src device=/dev/video1 ! video/x-raw,format=YUY2,width=640,height=480,framerate=10/1 ! videoconvert ! autovideosink
Now, as my USB webcam (which is video1, video0 being the computer's built-in camera) supports h264 (I have checked using lsusb), I would like to try to get the h264 feed directly. I understand that this feed is muxed into the mjpeg one, but looking around on the web it seems that gstreamer is able to get at it nonetheless.
Since my end goal is to stream it from a Beaglebone, I made an attempt using the solution given in this post (adding a listener from a different terminal):
#sender
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-264,width=320,height=90,framerate=10/1 ! tcpserversink host=192.168.148.112 port=9999
But this yields the following error :
WARNING: erroneous pipeline: could not link v4l2src0 to tcpserversink0
I also tried something similar to my first command, changing the source caps from raw to h264 (based on that post; trying the full command given there gives the same error message):
gst-launch-1.0 -v v4l2src device=/dev/video1 ! video/x-h264,width=640,height=480,framerate=10/1 ! h264parse ! avdec_h264 ! autovideosink
But again, this did not work either:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.036309961
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I admit this is driving me pretty crazy: looking on SO or elsewhere on the web, there seem to be a lot of people who made it work with exactly the same webcam as the one I have (Logitech C920), but I keep running into one issue after another.
What would be an example of a correct pipeline to extract the h264 feed from that webcam?
You definitely need to use a payloader before it hits the wire, for example rtph264pay. Here is an example that I cannot test, as I don't have your hardware available. I have working udp examples from alternate sources if this doesn't steer you in the right direction.
server
gst-launch-1.0 v4l2src device=/dev/video1 \
! video/x-raw,width=320,height=90,framerate=10/1 \
! x264enc \
! queue \
! rtph264pay config-interval=3 pt=96 mtu=1500 \
! queue \
! tcpserversink host=127.0.0.1 port=9002
client
gst-launch-1.0 tcpclientsrc host=127.0.0.1 port=9002 \
! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 \
! rtph264depay \
! video/x-h264 \
! queue \
! avdec_h264 \
! queue \
! xvimagesink
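Alternatively, since the C920 can deliver H.264 directly over UVC, you may be able to skip the software encoder entirely. Here is an untested sketch along the lines of your second attempt; the 1280x720/30fps mode is an assumption, so first check which H.264 modes the camera actually offers with v4l2-ctl --list-formats-ext, and substitute your own host and port:
#sender
gst-launch-1.0 -v v4l2src device=/dev/video1 ! video/x-h264,width=1280,height=720,framerate=30/1 ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=192.168.148.112 port=9999
#receiver
gst-launch-1.0 udpsrc port=9999 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! autovideosink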

Streaming an mpeg2-ts video over RTP using gstreamer

I am trying to stream an mpeg2-ts video over RTP using gstreamer. I am using the following pipeline for the server:
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=false
The problem I am facing is that I immediately get an EOS event, as described below:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0: timestamp = 3878456990
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0: seqnum = 50764
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 126835285 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I understand that it is running too fast, but how do I fix it?
You have set sync=FALSE, which translates to "do not sync on timestamps, but process the buffers as fast as possible". Try changing it to TRUE, like so:
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=1
I had the same problem as you and my coworker suggested I insert a tsparse set-timestamps=true between the filesrc and rtpmp2tpay. It worked for me, so try changing your pipeline to
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! \
tsparse set-timestamps=true ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=false
Have you tried demuxing it and then muxing it again, such as:
server:
gst-launch-0.10 -v filesrc location=file_to_stream.ts ! tsdemux program-number=811 ! mpegtsmux ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=1
client:
gst-launch-0.10 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES" ! gstrtpbin ! rtpmp2tdepay ! tsdemux ! mpeg2dec ! ffmpegcolorspace ! autovideosink