GStreamer: get video from RTP, setting the format automatically - c++

Let's say I've got a simple RTP video server:
ximagesrc ! video/x-raw,framerate=30/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=slow ! rtph264pay ! udpsink host=127.0.0.1 port=5000
And I'm receiving it just fine:
udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
But if my server was sending H265 (or any other format), I'd have to use another pipeline. Considering there are a ton of possible formats, that is definitely something I'd like to avoid. Is there any way to get my video decoded from any format?

No. RTP streams are supposed to be described by SDP and set up via control protocols such as RTSP, which tell the receiver what kind of stream it is dealing with. If you want self-describing media, perhaps look at MPEG-TS instead.
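For illustration, a sketch of the MPEG-TS approach applied to the pipelines above (element names are standard GStreamer 1.x; the host/port values are carried over from the question):

```shell
# Sender: mux the encoded video into MPEG-TS, which carries its own
# program/stream metadata, then pay it as RTP (MP2T, static payload type 33)
gst-launch-1.0 ximagesrc ! video/x-raw,framerate=30/1 ! videoscale ! videoconvert \
  ! x264enc tune=zerolatency bitrate=500 speed-preset=slow \
  ! mpegtsmux ! rtpmp2tpay ! udpsink host=127.0.0.1 port=5000

# Receiver: tsdemux + decodebin pick the right parser/decoder from the
# stream metadata, so the same pipeline works whether the sender used
# H.264, H.265, or another codec mpegtsmux supports
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T,payload=(int)33" \
  ! rtpmp2tdepay ! tsdemux ! decodebin ! videoconvert ! autovideosink
```

The trade-off is some extra container overhead and latency compared to raw RTP payloading.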

Related

Transform RTP into RTSP with gstreamer

I have a third-party application that reads data from a thermal camera and sends an RTP stream to a given UDP port. I am trying to wrap this RTP stream in RTSP, but I am running into problems...
The third party application basically runs gstreamer with this command
appsrc format=GST_FORMAT_TIME is-live=true block=true caps=video/x-raw,width=640,height=480,format=GRAY8,clock-rate=90000,framerate=10/1 ! openjpegenc ! rtpj2kpay ! udpsink host=127.0.0.1 port=3000
Using the command below I can visualize the stream on my machine
gst-launch-1.0 udpsrc port=3000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG2000, sampling=(string)GRAYSCALE, width=(int)640, height=(int)480, payload=(int)96" ! queue ! rtpj2kdepay ! openjpegdec ! videoconvert ! xvimagesink
However, when I try to simply forward it over RTSP using the default RTP-to-RTSP application example from https://github.com/freedesktop/gstreamer-gst-rtsp-server/blob/master/examples/test-launch.c, the connection fails in VLC. Command below:
./rtp-src-to-rtsp '( udpsrc port=3000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG2000, sampling=(string)GRAYSCALE, width=(int)640, height=(int)480, payload=(int)96" ! queue ! rtpj2kdepay ! rtpj2kpay )'
Any light on what I am doing wrong? VLC gives only a non-descriptive error
live555 error: Nothing to play for rtsp://{IP}:{PORT}/test
It might be a lack of J2K support in VLC (I'm using revision 3.0.8-0).
Simulating your source with:
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,framerate=10/1,format=GRAY8 ! openjpegenc ! rtpj2kpay ! udpsink host=127.0.0.1 port=3000
and relaying as RTSP with:
./test-launch "udpsrc port=3000 auto-multicast=0 ! application/x-rtp,encoding-name=JPEG2000,sampling=GRAYSCALE ! queue ! rtpj2kdepay ! image/x-jpc ! jpeg2000parse ! rtpj2kpay name=pay0 "
works on Linux with X using:
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, encoding-name=JPEG2000,sampling=GRAYSCALE ! rtpj2kdepay ! jpeg2000parse ! openjpegdec ! videoconvert ! xvimagesink -v
However, I haven't been able to receive it with VLC, nor to craft a correct J2K/RTP SDP for VLC or ffmpeg. Someone better skilled may be able to advise further.

Why is encoding required for video transfer? (gstreamer)

The video I am transferring is already encoded. Why do I encode it again when transferring?
example: gst-launch-1.0 -v filesrc location=123.mp4 ! decodebin ! x264enc ! rtph264pay ! udpsink host=192.168.10.186 port=9001
Can I just send the video without re-encoding it and view it on the other side?
for example:
server: gst-launch-1.0 -v filesrc location=123.mp4 ! udpsink host=192.168.10.186 port=9001
(123.mp4 is encoded as H.265)
client: gst-launch-1.0 udpsrc port=9001 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96" ! rtph265depay ! h265parse ! nvh265dec ! autovideosink
Okay, with the clarification that your input is assumed to be an MPEG-4 file containing an H.265 stream: yes, then it's possible (if this assumption doesn't hold, this will not work).
The following should do the trick:
gst-launch-1.0 filesrc location=123.mp4 ! qtdemux ! h265parse config-interval=-1 ! rtph265pay ! udpsink host=192.168.10.186 port=9001
Explanation:
the qtdemux will demux the MPEG-4 container into the contained video/audio/subtitle streams (if there's more than one stream inside that container, you'll need to link to it multiple times, or GStreamer will error out)
the h265parse config-interval=-1 will make sure that your stream contains the correct SPS/PPS parameter sets (with -1, they are re-inserted before every key frame). If the stream inside your original input file is not an H.265 video stream, this will fail to link.
the rtph265pay will translate this into RTP packets,
... which the udpsink can then send over the specified socket
P.S.: you might also be interested in the rtpsink (which used to live out-of-tree, but is now included in the latest master of GStreamer)
P.P.S.: you should use an even port to send an RTP stream

How to add audio to a h264 video stream using gstreamer

I can successfully stream HD video using following pipelines:
stream server:
gst-launch-1.0 filesrc location="Gravity.2013.720p.BluRay.x264.YIFY.mp4" ! decodebin ! x264enc ! rtph264pay pt=96 ssrc=0 timestamp-offset=0 seqnum-offset=0 ! gdppay ! tcpclientsink host=192.168.1.93 port=5000
client:
gst-launch-1.0 tcpserversrc host=192.168.1.93 port=5000 ! gdpdepay ! rtph264depay ! decodebin ! autovideosink
I want to add the audio stream too.
I guess it is possible to use a different port and another tcpserver/tcpclient pair to stream the audio in parallel with the video. But I am not certain how GStreamer would synchronize the two streams properly to play the movie on the client end. Apart from this method, are there any others, such as muxing the two streams before sending and demuxing them on the client end?
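One way to sketch the mux/demux idea (untested against this exact setup; the audio encoder name depends on which plugins are installed, e.g. avenc_aac from gst-libav): mux video and audio into a single MPEG-TS on the server, so both streams share the muxer's timestamps, and demux on the client:

```shell
# Server: decode the file once, re-encode video and audio, mux into one MPEG-TS
gst-launch-1.0 filesrc location="Gravity.2013.720p.BluRay.x264.YIFY.mp4" ! decodebin name=d \
  d. ! queue ! videoconvert ! x264enc tune=zerolatency ! mux. \
  d. ! queue ! audioconvert ! avenc_aac ! mux. \
  mpegtsmux name=mux ! tcpclientsink host=192.168.1.93 port=5000

# Client: demux the transport stream; the shared timestamps keep
# audio and video in sync
gst-launch-1.0 tcpserversrc host=192.168.1.93 port=5000 ! tsdemux name=d \
  d. ! queue ! decodebin ! videoconvert ! autovideosink \
  d. ! queue ! decodebin ! audioconvert ! autoaudiosink
```

The queue elements after each demuxer/decoder branch matter: without them, one branch can block the other and stall the pipeline.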

gstreamer rtpvp8depay cannot decode stream

I have two GStreamer instances: a sender and a receiver. I want to stream RTP/VP8 video. It works perfectly fine if I stream via UDP, like this:
sender
gst-launch-0.10 -v videotestsrc ! vp8enc ! rtpvp8pay ! udpsink host=127.0.0.1 port=9001
receiver
gst-launch-0.10 udpsrc port=9001 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)VP8-DRAFT-IETF-01, payload=(int)96" ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! autovideosink
That works fine. But it fails when I try to stream through a FIFO / named pipe (created with mkfifo()):
sender
gst-launch-0.10 -v videotestsrc ! vp8enc ! rtpvp8pay ! filesink location = myPipe
receiver
gst-launch-0.10 filesrc location = myPipe ! capsfilter caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)VP8-DRAFT-IETF-01, payload=(int)96 ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! autovideosink
My receiver continuously outputs:
WARNING: from element /GstPipeline:pipeline0/GstRtpVP8Depay:rtpvp8depay0: Could not decode stream.
Additional debug info:
gstbasertpdepayload.c(387): gst_base_rtp_depayload_chain (): /GstPipeline:pipeline0/GstRtpVP8Depay:rtpvp8depay0:
Received invalid RTP payload, dropping
I think I read somewhere (but can't find it again) that this is because UDP preserves the boundaries between RTP packets, while writing to a named pipe like this "chains" the packets together into one byte stream, so GStreamer doesn't know how many bytes to read to get one RTP packet.
Is this correct, and if yes, how can I change that ?
Thanks in advance!
When going through a named pipe, the RTP packets are not delimited properly. You could either:
Send the encoded stream directly through as a byte-stream, without using the rtpvp8pay element.
Use another RTP element in GStreamer that handles byte-stream format, such as rtpstreampay or rtpgdppay. (I believe rtpstreampay might be a GStreamer 1.0-only element, though.)
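For option 2, a sketch using rtpstreampay/rtpstreamdepay (these are GStreamer 1.x elements, so the 0.10 pipelines above would need porting; the FIFO name myPipe is carried over from the question):

```shell
# Sender: rtpstreampay frames each RTP packet with a 2-byte length prefix
# (RFC 4571 style), so packet boundaries survive the byte-stream pipe
gst-launch-1.0 videotestsrc ! vp8enc ! rtpvp8pay ! rtpstreampay ! filesink location=myPipe

# Receiver: rtpstreamdepay recovers the packet boundaries from the length
# prefixes, then the normal RTP depayloader takes over
gst-launch-1.0 filesrc location=myPipe \
  ! "application/x-rtp-stream,media=(string)video,clock-rate=(int)90000,encoding-name=(string)VP8" \
  ! rtpstreamdepay ! rtpvp8depay ! vp8dec ! videoconvert ! autovideosink
```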
I finally solved my problem.
I did not manage to get the byte-stream approach working through a pipe, but I managed to use an appsrc to feed the GStreamer pipeline.
So my whole pipeline (might be useful for other people) looks like this: appsrc -> rtpvp8depay -> vp8dec -> videoconvert -> videoscale -> appsink (I'm using GStreamer 1.0 on Arch Linux).
Hope this helps !

A lot of buffers are being dropped

I'm trying to stream with RTP, and the client reports that a lot of buffers are being dropped.
Server pipeline:
gst-launch videotestsrc ! x264enc ! rtph264pay ! udpsink host=192.168.1.16 port=5000
Client pipeline:
gst-launch udpsrc uri=udp://192.168.1.16:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z01AFeygoP2AiAAAAwALuaygAHixbLA\\=\\,aOvssg\\=\\=\", payload=(int)96, ssrc=(uint)1645090291, clock-base=(uint)1778021115, seqnum-base=(uint)28353" ! rtph264depay ! queue ! ffdec_h264 ! autovideosink
I can see the video on the client, but it is very slow, with a lot of dropped buffers. The error on the client side:
from element /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage:
A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow
If someone could tell me what am I doing wrong here, that would be great!
Instead of using "autovideosink" try "xvimagesink sync=false".
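Applied to the client pipeline above (same caps string as before; sync=false makes the sink render buffers as they arrive instead of dropping the ones that miss their timestamps, which masks the symptom when the decoder or clock is the bottleneck):

```shell
# Client pipeline with the auto sink replaced by xvimagesink sync=false
gst-launch udpsrc uri=udp://192.168.1.16:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z01AFeygoP2AiAAAAwALuaygAHixbLA\\=\\,aOvssg\\=\\=\", payload=(int)96, ssrc=(uint)1645090291, clock-base=(uint)1778021115, seqnum-base=(uint)28353" ! rtph264depay ! queue ! ffdec_h264 ! xvimagesink sync=false
```

If the machine genuinely can't keep up, also consider a faster x264enc preset (e.g. tune=zerolatency) on the server side.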