I am trying to stream an MPEG2-TS video over RTP using GStreamer. I am using the following pipeline for the server:
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=false
The problem I am facing is that I immediately get an EOS event, as shown below:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0: timestamp = 3878456990
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0: seqnum = 50764
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 126835285 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I understand that it is pushing the data out as fast as possible, but how do I fix it?
You have set sync=false, which translates to "do not sync on timestamps, but process the buffers as fast as possible." Try changing it to true, like so:
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=true
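For completeness, the stream can be checked on the same host with a receiver along these lines (a sketch; the caps values are assumptions based on rtpmp2tpay's defaults, with payload type 33 for MP2T):
gst-launch-0.10 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES,payload=(int)33" ! rtpmp2tdepay ! mpegtsdemux ! mpeg2dec ! ffmpegcolorspace ! autovideosink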
I had the same problem as you, and my coworker suggested inserting tsparse set-timestamps=true between the filesrc and rtpmp2tpay; as far as I understand, it timestamps the TS packets based on the stream's PCR, so downstream elements can pace the data. It worked for me, so try changing your pipeline to
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! \
tsparse set-timestamps=true ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=false
Have you tried demuxing it and then remuxing it? For example:
server:
gst-launch-0.10 -v filesrc location=file_to_stream.ts ! tsdemux program-number=811 ! mpegtsmux ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=1
client:
gst-launch-0.10 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES" ! gstrtpbin ! rtpmp2tdepay ! tsdemux ! mpeg2dec ! ffmpegcolorspace ! autovideosink
Related
When I stream over UDP with autovideosink on the client side (in this example both sender and receiver are on the same host), everything works fine, but when I try to write to a filesink and capture a single frame, all my attempts fail. The file is created, but it's empty.
Here is the source:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.0 port=5000 -e -v
One of the non-working clients:
$ gst-launch-1.0 udpsrc port=5000 num-buffers=1 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! jpegenc ! filesink location=test.jpeg -e
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.112594509
Setting pipeline to NULL ...
Freeing pipeline ...
How do I capture it?
Use the snapshot property of jpegenc and remove num-buffers=1 from the udpsrc:
https://gstreamer.freedesktop.org/documentation/jpeg/jpegenc.html?gi-language=c
Does something like this work for you?
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! jpegenc snapshot=TRUE ! filesink location=test.jpeg
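If you want to keep every frame rather than a single snapshot, something like this might also work (an untested sketch: multifilesink writes one file per buffer by default, and rtpjpegdepay outputs one complete JPEG image per frame):
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! multifilesink location=frame-%05d.jpeg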
I want to create an RTP stream of an MP4 file with GStreamer.
I am using GStreamer 1.18.4 on Debian Bullseye.
To create the MP4 file, I recorded an RTSP stream from my webcam using the following command:
gst-launch-1.0 -e rtspsrc location="rtsp://192.168.111.146/axis-media/media.amp" port-range=28000-38000 buffer-mode=0 latency=80 ! rtph264depay ! h264parse ! mp4mux ! filesink location=filename.mp4
After recording the file filename.mp4 I tried to stream it using RTP:
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
And the playback of the stream could be started using the following command on the same machine:
gst-launch-1.0 udpsrc address=127.0.0.1 port=50000 auto-multicast=true ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
Everything works as expected!
But since I don't want to transcode the file, I wanted to skip the decoding and encoding part. Therefore, I created the following pipelines:
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
and
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! rtph264pay ! udpsink port=50000 host=127.0.0.1
However, when I retry the playback pipeline (the one with udpsrc) against either of these pipelines, no video is displayed.
Interestingly, nload shows network traffic on lo.
What is wrong with the streaming pipelines?
Am I missing some magic plugin in between?
Meanwhile I found an answer to my question.
Changing the stream-server-pipeline from
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
to
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse config-interval=-1 ! rtph264pay ! udpsink port=50000 host=127.0.0.1
solves the issue.
Thus, the difference is the parameter config-interval=-1 on h264parse, which makes h264parse re-insert the SPS/PPS parameter sets in-band before every IDR frame. In an MP4 file those parameter sets live in the container header, which is never transmitted over RTP, so without them the receiver cannot initialize its decoder.
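Setting config-interval on the payloader instead should be equivalent, if I read the rtph264pay documentation correctly (a sketch, not tested here):
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay config-interval=-1 ! udpsink port=50000 host=127.0.0.1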
I have an issue sending H.264 streams to Linphone.
When I play the following pipeline:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! rtph264pay ! udpsink host=127.0.0.1 port=9078
everything is OK and the video plays on the Linphone screen.
But what I want to do is first save the video stream to a file and then send that file to Linphone. What I did is the following:
Saving to a file:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! filesink location=C:/videosample
Sending to Linphone:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=9078
The packets are received, but Linphone shows a black screen.
I want to know what I am missing in my pipelines, or whether there is a specific parameter to set.
Note that it works when I play it with a GStreamer receiver:
gst-launch-1.0 -v udpsrc port=9078 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink
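As in the previous question, one possible cause is that the raw H.264 dump carries no in-band SPS/PPS by the time Linphone joins the stream. Under that assumption (purely a sketch), re-inserting them on the sender side might help:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse config-interval=-1 ! rtph264pay ! udpsink host=127.0.0.1 port=9078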
I'm trying to use GStreamer to send a sample .avi file over a network. The code that I'm using to build my pipeline is the following:
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! decodebin name=dec \
dec. ! queue ! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false \
name=vrtcpsink udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
When I try to execute this command I'm getting this error:
gstavidemux.c(5383): gst_avi_demux_loop ():
/GstPipeline:pipeline0/GstDecodeBin:dec/GstAviDemux:avidemux0:
streaming stopped, reason not-linked
Execution ended after 0:00:00.032906515
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Can you help me solve this?
I think you need a demuxer element before the decodebin, since an AVI file typically contains both an audio and a video stream, but you are using just one decodebin to decode them.
Take a look at the avidemux element. You need it to get the video stream out of the file.
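A sketch of that suggestion (untested; demux.video_0 assumes the usual pad name for the first video stream of an AVI file):
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! avidemux name=demux \
demux.video_0 ! queue ! decodebin ! videoconvert ! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false name=vrtcpsink \
udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0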
Here is what I'm trying:
gst-launch -v udpsrc port=1234 ! fakesink dump=1
I test with:
gst-launch -v audiotestsrc ! udpsink host=127.0.0.1 port=1234
And everything works fine; I can see the packets arriving from the audiotestsrc.
Now let's test with the webcam source:
gst-launch -v v4l2src device=/dev/video0 ! queue ! videoscale method=1 ! "video/x-raw-yuv,width=320,height=240" ! queue ! videorate ! "video/x-raw-yuv,framerate=(fraction)15/1" ! queue ! udpsink host=127.0.0.1 port=1234
And nothing happens; no packets appear in the dump.
Here is a log dump of what verbose shows on the server.
Does anyone have a clue about this?
Try these (you may have to install the gstreamer-ugly plugins for this one):
UDP streaming from Webcam (stream over the network)
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
UDP Streaming received from webcam (receive over the network)
gst-launch udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
Update
To determine the payload type at the streaming end, simply use the verbose option: gst-launch -v ...
Maybe the packets are too large for UDP? UDP datagrams are limited to 64 KB. Try resizing the frames to a really small size to check whether this is the reason. If so, you may be interested in compression and payloaders/depayloaders (gst-inspect | grep pay).
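For instance, a JPEG-compressed variant of the webcam pipeline might look like this (a sketch, untested; payload type 26 is the static RTP payload type for JPEG):
gst-launch -v v4l2src device=/dev/video0 ! "video/x-raw-yuv,width=320,height=240" ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=1234
and on the receiving side:
gst-launch udpsrc port=1234 ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)JPEG,payload=(int)26" ! rtpjpegdepay ! jpegdec ! xvimagesink sync=false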
gstreamer1-1.16.0-1.fc30
gst-launch-1.0 -v filesrc location=/.../.../.../sample-mp4-file.mp4 ! qtdemux ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink port=8888 host=127.0.0.1
https://en.wikipedia.org/wiki/RTP_audio_video_profile
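A matching receiver for the pipeline above could look like this (a sketch; the caps mirror the pt=96 set on the payloader, a dynamic payload type per the RTP A/V profile linked above):
gst-launch-1.0 -v udpsrc port=8888 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink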