I'm trying to stream an H.264-encoded movie file from a server to multiple clients at once by sending the RTP stream to the broadcast address.
The solution I've got works, but playback is very slow. Playing the video locally works fine.
Here's my Server:
gst-launch-0.10 -v filesrc location=/home/zeroc8/Videos/bunny.mov \
! qtdemux ! h264parse ! rtph264pay pt=96 ! udpsink host=192.168.1.255 port=5000
This is the Client:
gst-launch-0.10 udpsrc port=5000 \
caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"J01AHqkYGwe83gDUBAQG2wrXvfAQ\\,KN4JyA\\=\\=\", payload=(int)96, ssrc=(uint)786848209, clock-base=(uint)101553131, seqnum-base=(uint)64602"
! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink
Am I doing something wrong here? Why is this so slow?
I just got the answer from the GStreamer mailing list.
In case anyone else is having the same problem, adding the gstrtpjitterbuffer element fixes it.
gst-launch-0.10 udpsrc port=5000 \
caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"J01AHqkYGwe83gDUBAQG2wrXvfAQ\\,KN4JyA\\=\\=\", payload=(int)96, ssrc=(uint)786848209, clock-base=(uint)101553131, seqnum-base=(uint)64602" \
! gstrtpjitterbuffer latency=1000 \
! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink
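For reference, a rough GStreamer 1.0 equivalent of the fixed client (an untested sketch; rtpjitterbuffer, avdec_h264, and videoconvert are the 1.0 names of the 0.10 elements used above, and the caps are taken from the original pipeline):
gst-launch-1.0 udpsrc port=5000 \
caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"J01AHqkYGwe83gDUBAQG2wrXvfAQ\\,KN4JyA\\=\\=\", payload=(int)96" \
! rtpjitterbuffer latency=1000 \
! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink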
I wanted to create an RTP stream of an MP4 file with GStreamer.
I am using GStreamer 1.18.4 on Debian Bullseye.
To create an MP4 file, I recorded an RTSP stream from my webcam using the following command:
gst-launch-1.0 -e rtspsrc location="rtsp://192.168.111.146/axis-media/media.amp" port-range=28000-38000 buffer-mode=0 latency=80 ! rtph264depay ! h264parse ! mp4mux ! filesink location=filename.mp4
After recording the file filename.mp4 I tried to stream it using RTP:
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
And the playback of the stream could be started using the following command on the same machine:
gst-launch-1.0 udpsrc address=127.0.0.1 port=50000 auto-multicast=true ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
Everything works as expected!
But since I don't want to transcode the file, I just wanted to skip the decoding and encoding part. Therefore, I created the following pipelines:
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
and
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! rtph264pay ! udpsink port=50000 host=127.0.0.1
However, when I run the playback pipeline (the one with udpsrc) against either of these, no stream is displayed.
Interestingly, nload shows network traffic on lo.
What is wrong with the streaming pipelines?
Did I miss some magic plugin in between?
Meanwhile I found an answer to my question.
Changing the stream-server-pipeline from
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
to
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse config-interval=-1 ! rtph264pay ! udpsink port=50000 host=127.0.0.1
solves the issue.
Thus, the difference is setting the parameter config-interval=-1 for h264parse. With config-interval=-1, h264parse re-inserts the SPS/PPS headers into the stream before every IDR frame, so a receiver that joins mid-stream (or misses the very first packets) still gets the decoder configuration it needs; otherwise those headers only appear once at the very start.
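As an aside, rtph264pay has an equivalent config-interval property, so setting it on the payloader instead should also work (an untested variant, not from the original answer):
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay config-interval=-1 ! udpsink port=50000 host=127.0.0.1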
I have an issue sending H.264 streams to linphone.
When I play the following pipeline:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! rtph264pay ! udpsink host=127.0.0.1 port=9078
Everything is OK and the video plays on the linphone screen.
But what I want to do is first save the video stream to a file and then send that file to linphone. What I did is the following:
Saving into a file
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! filesink location=C:/videosample
Send to linphone:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=9078
The packets are received, but linphone plays a black screen.
I want to know what I am missing in my pipelines, or whether there is a specific parameter to set.
Note that it works when I play it with a GStreamer receiver:
gst-launch-1.0 -v udpsrc port=9078 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink
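Given the config-interval fix from the question above, one thing worth trying (an untested guess: the black screen is consistent with linphone never receiving the SPS/PPS parameter sets mid-stream) is having h264parse re-insert them before every IDR frame:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse config-interval=-1 ! rtph264pay ! udpsink host=127.0.0.1 port=9078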
I'm trying to stream an arbitrary file with GStreamer. I have the following command line, but it does not work (I will use Python once I get this working):
gst-launch-1.0 uridecodebin uri=file:///tmp/File.mkv name=decbin \
! queue \
! videoconvert ! x264enc \
! mp4mux name=muxer ! udpsink host=127.0.0.1 port=1234 decbin. \
! queue \
! audioconvert ! lamemp3enc ! muxer.
and playing with
gst-launch-1.0 udpsrc port=1234 ! 'application/x-rtp,payload=96' \
! rtph264depay ! decodebin ! xvimagesink sync=false
I know I have to add rtph264pay and rtpmpapay, but I don't know where.
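A starting point for wiring them in (an untested sketch: mp4mux is dropped, since a regular MP4 container can't be written to a plain UDP stream, and video and audio are instead payloaded separately onto their own ports; the second port number is made up):
gst-launch-1.0 uridecodebin uri=file:///tmp/File.mkv name=decbin \
decbin. ! queue ! videoconvert ! x264enc \
! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=1234 \
decbin. ! queue ! audioconvert ! audioresample ! lamemp3enc \
! rtpmpapay pt=97 ! udpsink host=127.0.0.1 port=1235
The matching video player then needs full RTP caps on udpsrc:
gst-launch-1.0 udpsrc port=1234 ! 'application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96' \
! rtph264depay ! decodebin ! xvimagesink sync=false
and the audio side can be received analogously with rtpmpadepay on port 1235.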
I'm trying to decode an H.264 video and re-encode it to transfer to a client through UDP.
On the transmitter side:
gst-launch-1.0 -v filesrc location=/home/ubuntu/Videos/test.mp4 ! qtdemux name=demux ! h264parse ! omxh264dec ! nvvidconv ! omxh264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
On the receiver side:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink
I run this locally, as a test, on an NVIDIA Tegra TK1, but nothing is displayed, although no error is raised.
Does anybody see something to add? Thanks in advance.
OK, I finally made it work, but using another network protocol: TCP with GDP framing (gdppay/gdpdepay) instead of plain UDP.
Just for testing, send and receive locally on the Tegra TK1 itself:
Send:
gst-launch-1.0 filesrc location=/home/ubuntu/Videos/test.mp4 ! qtdemux name=demux ! h264parse ! omxh264dec ! nvvidconv ! omxh264enc ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=127.0.0.1 port=5000
Receive:
gst-launch-1.0 -v tcpclientsrc host=127.0.0.1 port=5000 ! gdpdepay ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink sync=false
The result is choppy, but I don't care at this stage. I receive something!
Receive on Ubuntu PC:
gst-launch-1.0 -v tcpclientsrc host=<Tegra IP> port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
I'm trying to create a GStreamer pipeline with rtpbin to stream a webcam both ways (a videophone). However, I am not even able to make rtpbin work with a simple snippet like the one below, which just takes the webcam source and streams it out, while another udpsrc captures the RTP packets and displays them, all on localhost. When split into two pipelines and launched separately, it works; as a single pipeline, it does not. I suspect it has something to do with threading, but I am stuck, as no queue placement has worked for me so far.
Basically, what I need is to display the incoming video stream and stream my webcam video out to the remote party.
gst-launch -v \
gstrtpbin name=rtpbin \
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! rtpbin. \
rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink \
v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
rtpbin. ! udpsink port=5000
OK, I have to answer myself: it was enough to add sync=false async=false to the udpsink. With async=false the sink no longer blocks waiting for preroll, and with sync=false it pushes packets without waiting on the pipeline clock, which is what was stalling the combined send/receive pipeline:
gst-launch -v \
gstrtpbin name=rtpbin udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! queue ! rtpbin. \
rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink \
v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
rtpbin. ! udpsink port=5000 sync=false async=false
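For the actual two-party case, each side would point its udpsink at the remote peer and use distinct send and receive ports; a hypothetical variant (the peer address 192.168.1.20 and port 5002 are made up, untested):
gst-launch -v \
gstrtpbin name=rtpbin udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263" port=5000 ! queue ! rtpbin. \
rtpbin. ! rtph263depay ! ffdec_h263 ! ffmpegcolorspace ! xvimagesink \
v4l2src ! video/x-raw-yuv, framerate=30/1, width=320, height=240 ! videoscale ! videorate ! "video/x-raw-yuv,width=352,height=288,framerate=30/1" ! ffenc_h263 ! rtph263pay ! rtpbin. \
rtpbin. ! udpsink host=192.168.1.20 port=5002 sync=false async=false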