GStreamer - how to capture a single frame from a UDP source

When I stream over UDP and use autovideosink on the client side (in this example both sender and receiver are on the same host), everything works fine, but when I try to filesink it and capture a single frame, all my attempts fail. The file is created, but it's empty.
Here is the source:
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480,framerate=30/1 ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.0 port=5000 -e -v
One of the non-working clients:
$ gst-launch-1.0 udpsrc port=5000 num-buffers=1 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! jpegenc ! filesink location=test.jpeg -e
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:00.112594509
Setting pipeline to NULL ...
Freeing pipeline ...
How do I capture it?

Use the snapshot property of jpegenc and remove num-buffers=1 from the udpsrc:
https://gstreamer.freedesktop.org/documentation/jpeg/jpegenc.html?gi-language=c
Does something like this work for you?
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! jpegenc snapshot=TRUE ! filesink location=test.jpeg
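If re-encoding is unnecessary, the depayloader already outputs complete JPEG frames, so a variant like the following might also work (an untested sketch; multifilesink and the frame-%05d.jpeg pattern are assumptions on my part, and it writes one numbered file per received frame):
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! multifilesink location=frame-%05d.jpeg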

Related

Sending H264 streams to linphone through gstreamer

I have an issue sending H264 streams to linphone.
When I play the following pipeline:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! rtph264pay ! udpsink host=127.0.0.1 port=9078
Everything is OK and the video plays on the linphone screen.
But what I want to do is to first save the video stream into a file and then send this file to linphone. What I did is the following:
Saving into a file
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! filesink location=C:/videosample
Send to linphone:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=9078
The packets are received, but linphone plays a black screen.
I want to know what I am missing in my pipelines or if there is a specific parameter to set.
Note that it is working when I play a gst receiver:
gst-launch-1.0 -v udpsrc port=9078 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink
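One thing that stands out is that the saved file is a bare H264 elementary stream with no container and no timestamps. An untested sketch of an alternative (the .mkv file name, matroskamux, and config-interval=1 are assumptions, not taken from your pipelines) would be to keep a container on disk and let the payloader re-send SPS/PPS periodically:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! h264parse ! matroskamux ! filesink location=C:/videosample.mkv
gst-launch-1.0 -v filesrc location=C:/videosample.mkv ! matroskademux ! h264parse ! rtph264pay config-interval=1 ! udpsink host=127.0.0.1 port=9078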

How to stream an .avi file over the network using gstreamer

I'm trying to use gstreamer to send a sample .avi file over a network. The command that I'm using to build my pipeline is the following:
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! decodebin name=dec \
dec. ! queue ! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false \
name=vrtcpsink udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
When I try to execute this command I'm getting this error:
gstavidemux.c(5383): gst_avi_demux_loop ():
/GstPipeline:pipeline0/GstDecodeBin:dec/GstAviDemux:avidemux0:
streaming stopped, reason not-linked
Execution ended after 0:00:00.032906515
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Can you help me to solve this?
I think you would need a demux element before the decodebin, since an AVI file contains both audio and video streams, but you are using just one decodebin to decode them.
Take a look at the avidemux element. You need it to get the video stream out of the file.
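Following that suggestion, a video-only variant might look like this (an untested sketch; the demux.video_0 pad name and the extra videoconvert are assumptions):
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! avidemux name=demux \
demux.video_0 ! queue ! decodebin ! videoconvert ! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false name=vrtcpsink \
udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0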

Feed a video file to v4l2sink using gstreamer

I would like to feed a video file to my virtual video device using gstreamer and v4l2loopback.
Using videotestsrc, something like this works (i.e. I can open my virtual device from VLC):
gst-launch -v videotestsrc ! queue ! decodebin2 name=dec ! queue ! ffmpegcolorspace ! v4l2sink device=/dev/video0
However, the exact same code does not work with my video file:
gst-launch filesrc location=~/Documents/my_video.ogv ! queue ! decodebin2 name=dec ! queue ! ffmpegcolorspace ! v4l2sink device=/dev/video0
It actually gets stuck in the "PREROLLING" phase:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Can anybody see why? Am I missing some conversion between filesrc and decodebin2?
I don't know exactly why, but I was missing the ! videoscale ! step, and the ! queue ! elements are apparently not necessary.
Here is the working line:
gst-launch filesrc location=~/Documents/my_video.ogv ! decodebin2 ! ffmpegcolorspace ! videoscale ! ffmpegcolorspace ! v4l2sink device=/dev/video0
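For reference, on GStreamer 1.x (where decodebin2 and ffmpegcolorspace have been replaced by decodebin and videoconvert) the rough equivalent would be something like this sketch, assuming the v4l2loopback device is still /dev/video0:
gst-launch-1.0 filesrc location=~/Documents/my_video.ogv ! decodebin ! videoconvert ! videoscale ! v4l2sink device=/dev/video0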

Streaming an mpeg2-ts video over RTP using gstreamer

I am trying to stream an mpeg2-ts video over RTP using gstreamer. I am using the following pipeline for the server:
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=false
The problem I am facing is that I immediately get an EOS event, as described below:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0: timestamp = 3878456990
/GstPipeline:pipeline0/GstRTPMP2TPay:rtpmp2tpay0: seqnum = 50764
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 126835285 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I understand that it is running too fast, but how do I fix it?
You have set sync=FALSE, which translates to "do not sync on timestamps; process the buffers as fast as possible". Try changing it to TRUE, like so:
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=1
I had the same problem as you and my coworker suggested I insert a tsparse set-timestamps=true between the filesrc and rtpmp2tpay. It worked for me, so try changing your pipeline to
gst-launch-0.10 -v filesrc location=/home/…/miracast_sample.mpeg ! \
tsparse set-timestamps=true ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=false
Have you tried demuxing it and then muxing it again, such as:
server:
gst-launch-0.10 -v filesrc location=file_to_stream.ts ! tsdemux program-number=811 ! mpegtsmux ! rtpmp2tpay ! udpsink host=localhost port=5000 sync=1
client:
gst-launch-0.10 udpsrc port=5000 caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)MP2T-ES" ! gstrtpbin ! rtpmp2tdepay ! tsdemux ! mpeg2dec ! ffmpegcolorspace ! autovideosink

Webcam streaming using gstreamer over UDP

Here is what I'm trying:
gst-launch -v udpsrc port=1234 ! fakesink dump=1
I test with:
gst-launch -v audiotestsrc ! udpsink host=127.0.0.1 port=1234
And everything works fine; I can see the packets arriving from the audiotestsrc.
Now let's test with the webcam source:
gst-launch -v v4l2src device=/dev/video0 ! queue ! videoscale method=1 ! "video/x-raw-yuv,width=320,height=240" ! queue ! videorate ! "video/x-raw-yuv,framerate=(fraction)15/1" ! queue ! udpsink host=127.0.0.1 port=1234
And nothing happens; no packets appear in the dump.
Here is a logdump of what verbose shows in the server.
Does anyone have a clue on this?
Try these (you may have to install the gstreamer-ugly plugins for this one):
UDP streaming from Webcam (stream over the network)
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
UDP Streaming received from webcam (receive over the network)
gst-launch udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
Update
To determine the payload at the streaming end, simply use the verbose option: gst-launch -v ...
Maybe the packets are too large for UDP? UDP datagrams are limited to 64 KB. Try resizing the frames to a really small size to check whether this is the reason. If so, you may be interested in compression and payloaders/depayloaders (gst-inspect | grep pay).
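As a sketch of the payloader route without re-encoding (untested, 1.0 syntax; rtpvrawpay fragments each raw frame into MTU-sized RTP packets, and the caps values are assumptions):
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! video/x-raw,format=I420,width=320,height=240,framerate=15/1 ! rtpvrawpay ! udpsink host=127.0.0.1 port=1234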
With gstreamer1-1.16.0-1.fc30, the following sender works:
gst-launch-1.0 -v filesrc location=/.../.../.../sample-mp4-file.mp4 ! qtdemux ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink port=8888 host=127.0.0.1
https://en.wikipedia.org/wiki/RTP_audio_video_profile
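A matching receiver for that last sender would presumably look something like this (a sketch; the caps simply mirror what rtph264pay with pt=96 produces):
gst-launch-1.0 udpsrc port=8888 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink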