Transform RTP into RTSP with gstreamer

I have a third-party application that reads data from a thermal camera and sends an RTP stream to a given UDP address and port. I am trying to wrap this RTP into an RTSP stream, but I am running into problems...
The third-party application basically runs GStreamer with this pipeline:
appsrc format=GST_FORMAT_TIME is-live=true block=true caps=video/x-raw,width=640,height=480,format=GRAY8,clock-rate=90000,framerate=10/1 ! openjpegenc ! rtpj2kpay ! udpsink host=127.0.0.1 port=3000
Using the command below, I can visualize the stream on my machine:
gst-launch-1.0 udpsrc port=3000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG2000, sampling=(string)GRAYSCALE, width=(int)640, height=(int)480, payload=(int)96" ! queue ! rtpj2kdepay ! openjpegdec ! videoconvert ! xvimagesink
However, when I try to use the default RTP-to-RTSP example from https://github.com/freedesktop/gstreamer-gst-rtsp-server/blob/master/examples/test-launch.c to simply forward it as an RTSP stream, the connection fails in VLC. Command below:
./rtp-src-to-rtsp '( udpsrc port=3000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG2000, sampling=(string)GRAYSCALE, width=(int)640, height=(int)480, payload=(int)96" ! queue ! rtpj2kdepay ! rtpj2kpay )'
Any light on what I am doing wrong? VLC gives only a non-descriptive error:
live555 error: Nothing to play for rtsp://{IP}:{PORT}/test

It might be a lack of support of J2K in VLC (I'm using revision 3.0.8-0).
Simulating your source with:
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,framerate=10/1,format=GRAY8 ! openjpegenc ! rtpj2kpay ! udpsink host=127.0.0.1 port=3000
and relaying as RTSP with:
./test-launch "udpsrc port=3000 auto-multicast=0 ! application/x-rtp,encoding-name=JPEG2000,sampling=GRAYSCALE ! queue ! rtpj2kdepay ! image/x-jpc ! jpeg2000parse ! rtpj2kpay name=pay0 "
works on Linux with X using:
gst-launch-1.0 rtspsrc location=rtsp://127.0.0.1:8554/test ! application/x-rtp, encoding-name=JPEG2000,sampling=GRAYSCALE ! rtpj2kdepay ! jpeg2000parse ! openjpegdec ! videoconvert ! xvimagesink -v
However, I haven't been able to receive it with VLC, nor to craft a correct J2K/RTP SDP for VLC or ffmpeg. Someone more skilled may be able to advise further.
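If VLC's missing J2K/RTP support is indeed the culprit, one workaround (an untested sketch along the lines of the pipelines above) would be to transcode to H.264 inside the RTSP server, which VLC should be able to play:
./test-launch "( udpsrc port=3000 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=JPEG2000,sampling=GRAYSCALE,payload=96 ! queue ! rtpj2kdepay ! openjpegdec ! videoconvert ! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )"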

Related

GStreamer: get video from RTP, setting the format automatically

Let's say I've got a simple RTP video server:
ximagesrc ! video/x-raw,framerate=30/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=slow ! rtph264pay ! udpsink host=127.0.0.1 port=5000
And I'm receiving it just fine:
udpsrc port=5000 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
But if my server was sending H265 (or any other format), I'd have to use another pipeline. Considering there are a ton of possible formats, that is definitely something I'd like to avoid. Is there any way to get my video decoded from any format?
No. RTP streams are supposed to be set up by SDP and RTSP control protocols, which tell you what kind of stream you are dealing with. If you want self-describing media, perhaps try looking at MPEG-TS instead.
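For example (a rough, untested sketch; mpegtsmux and tsdemux come from gst-plugins-bad), the sender could mux into MPEG-TS and the receiver could let decodebin pick the decoder from the demuxed stream:
gst-launch-1.0 ximagesrc ! video/x-raw,framerate=30/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 ! mpegtsmux ! udpsink host=127.0.0.1 port=5000
gst-launch-1.0 udpsrc port=5000 caps="video/mpegts, systemstream=(boolean)true" ! tsdemux ! decodebin ! videoconvert ! autovideosink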

Sending H264 streams to linphone through gstreamer

I have an issue in sending H264 streams to linphone.
When I play the following pipeline:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! rtph264pay ! udpsink host=127.0.0.1 port=9078
Everything is OK and the video plays on the linphone screen.
But what I want to do is first save the video stream into a file and then send that file to linphone. What I did is the following:
Saving into a file
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! filesink location=C:/videosample
Send to linphone:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=9078
The packets are received, but linphone plays a black screen.
I want to know what I am missing in my pipelines or if there is a specific parameter to set.
Note that it works when I play it with a GStreamer receiver:
gst-launch-1.0 -v udpsrc port=9078 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink
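One thing that might be missing (a guess, not verified against linphone): with a raw H.264 file there is no container carrying the SPS/PPS, so the decoder only gets them if the payloader re-sends them in-band. Setting config-interval on rtph264pay when replaying the file could help:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=9078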

Getting solid green image with Gstreamer, VP8, and RTP

I don't understand why I'm getting a green image. I'd appreciate any insights.
Producer:
gst-launch-0.10 -v videotestsrc ! vp8enc ! rtpvp8pay ! udpsink host=127.0.0.1 port=9001
Consumer:
gst-launch-0.10 udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)VP8-DRAFT-IETF-01, payload=(int)96, ssrc=(uint)2990747501, clock-base=(uint)275641083, seqnum-base=(uint)34810" ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! autovideosink
This is using the gstreamer-sdk-2013.6-universal.pkg package published by GStreamer, on OS X Mavericks.
REF: http://delog.wordpress.com/2011/04/14/stream-webm-video-over-rtp-with-gstreamer/
REF: http://delog.wordpress.com/2011/05/20/vp8-video-streaming-over-rtp-using-the-rtpbin-plugin-of-gstreamer/
I've tried it and it works on my machine (Arch Linux).
Have you tried tweaking the ssrc and clock-base to make them equal? Not sure whether that will solve your problem, because it works as-is on my machine.
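One more thing worth trying (just a guess): the ssrc, clock-base and seqnum-base values in your receiver caps come from a specific sender run, so stale values may not match the current stream. A receiver without them should still negotiate:
gst-launch-0.10 udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)VP8-DRAFT-IETF-01, payload=(int)96" ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! autovideosink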

Gstreamer mux, caps refused

I'm using the following pipeline (SIMPLIFIED) in Gstreamer OSS Build 0.10.7 on Win 7 x64:
udpsrc ! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 !
gstrtpjitterbuffer latency=200 ! rtph264depay ! tee name=h264Tee
h264Tee. ! queue ! h264parse ! mux.
matroskamux name=mux ! filesink location=rec.mkv sync=false // same for avimux/mp4/qt
h264Tee. ! queue ! ffdec_h264 ! tee name=videoTee
//.videoTee ! queue ! dx9videosink
//.videoTee ! queue ! appsink
//udpsrc ! queue ! directsoundsink
audiotestsrc ! mux. //only for testing, should be connected to udpsrc
The pipeline is launched via Gstreamer-Sharp.
Here's the console output of the pipeline:
WARN default xoverlay.c:354:gst_x_overlay_set_xwindow_id:<videoSink> Using deprecated gst_x_overlay_set_xwindow_id()
ERROR d3dvideosink d3dvideosink.c:2204:gst_d3dvideosink_release_swap_chain: Direct3D device has not been initialized
WARN bin gstbin.c:2378:gst_bin_do_latency_func:<pipeline0> failed to query latency
WARN matroskamux matroska-mux.c:970:gst_matroska_mux_video_pad_setcaps:<mux> pad video_0 refused caps 05370C40
Both video and audio play just fine as long as I leave out the muxer. When I include the muxer in the pipeline, the video freezes immediately and no sound can be heard. What's wrong? Why does the muxer refuse the caps?
OK, solved it myself:
The video caps above don't contain sprop-parameter-sets, which aren't needed for playback. For muxing, however, they are needed, since various properties of the stream are encoded within them:
udpsrc !
application/x-rtp, media=(string)video, clock-rate=(int)90000,
encoding-name=(string)H264,
sprop-parameter-sets= (string)\"Z0LADdkBQfsBEAAAAwAQAAADAyjxQqSA\\,aMuMTIA\\=\",
payload=(int)96,
ssrc=(uint)2332354585,
clock-base=(uint)1158355497,
seqnum-base=(uint)10049 !
gstrtpjitterbuffer latency=200 ! rtph264depay ! tee name=h264Tee
...

Webcam streaming using gstreamer over UDP

Here is what I'm trying:
gst-launch -v udpsrc port=1234 ! fakesink dump=1
I test with:
gst-launch -v audiotestsrc ! udpsink host=127.0.0.1 port=1234
And everything works fine; I can see the packets arriving from the audiotestsrc.
Now let's test with the webcam source:
gst-launch -v v4l2src device=/dev/video0 ! queue ! videoscale method=1 ! "video/x-raw-yuv,width=320,height=240" ! queue ! videorate ! "video/x-raw-yuv,framerate=(fraction)15/1" ! queue ! udpsink host=127.0.0.1 port=1234
And nothing happens; no packets appear in the dump.
Here is a logdump of what verbose shows in the server.
Does anyone have a clue on this?
Try these (you may have to install the gstreamer-ugly plugins for this one):
UDP streaming from Webcam (stream over the network)
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
UDP Streaming received from webcam (receive over the network)
gst-launch udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
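If you are on GStreamer 1.x rather than 0.10, roughly equivalent pipelines would be (a sketch; avdec_h264 comes from gst-libav, replacing ffdec_h264):
gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! videoconvert ! x264enc tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
gst-launch-1.0 udpsrc port=1234 ! "application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false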
Update
To determine the payload at the streaming end, simply use the verbose option, gst-launch -v ...
Maybe the packets are too large for UDP? UDP datagrams are limited to 64 KB. Try resizing the frames to a really small size to check whether this is the reason. If so, you may be interested in compression and payloaders/depayloaders (gst-inspect | grep pay).
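For instance, assuming the negotiated format is I420, a 320x240 raw frame is roughly 115 KB, already above that limit, while 160x120 (about 28 KB) fits in a single datagram, so a quick check could be:
gst-launch -v v4l2src device=/dev/video0 ! videoscale ! "video/x-raw-yuv,width=160,height=120" ! videorate ! "video/x-raw-yuv,framerate=(fraction)15/1" ! udpsink host=127.0.0.1 port=1234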
gstreamer1-1.16.0-1.fc30
gst-launch-1.0 -v filesrc location=/.../.../.../sample-mp4-file.mp4 ! qtdemux ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink port=8888 host=127.0.0.1
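A matching receiver for that sender would be roughly (a sketch; avdec_h264 comes from gst-libav):
gst-launch-1.0 udpsrc port=8888 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink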
https://en.wikipedia.org/wiki/RTP_audio_video_profile