GStreamer with two udpsinks

I'd like to know how to receive a GStreamer live video stream that is sent to two udpsinks.
Here is a single-udpsink transmitter and receiver pair that works absolutely fine:
Sender: "raspivid -t 999999 -h 480 -w 640 -fps 25 -b 2000000 -o - | gst-launch-0.10 -v fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=192.168.0.105 port=5000"
Receiver: "gst-launch-0.10 udpsrc port=5000 caps=application/x-rtp buffer-size=100000 ! rtph264depay ! ffdec_h264 ! queue ! autovideosink sync=false"
Dual-udpsink sender:
raspivid -t 999999 -h 480 -w 640 -fps 25 -b 2000000 -o - | gst-launch-0.10 -v fdsrc fd=0 ! tee name=tp \
tp. h264parse ! rtph264pay ! udpsink host=192.168.0.105 port=5000 \
tp. ! h264parse ! rtph264pay ! udpsink host=192.168.0.100 port=5005
Now I am unable to receive on either receiver. Can anyone help me? Thanks in advance.

Use a single multiudpsink element instead:
raspivid -t 999999 -h 480 -w 640 -fps 25 -b 2000000 -o - | gst-launch-0.10 -v fdsrc fd=0 ! h264parse ! rtph264pay ! multiudpsink clients=192.168.0.105:5000,192.168.0.100:5005
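For reference, the tee-based sender from the question can also work; as written it fails because the first branch is missing the `!` after `tp.`, and each tee branch should normally get its own queue. A sketch, untested:
raspivid -t 999999 -h 480 -w 640 -fps 25 -b 2000000 -o - | gst-launch-0.10 -v fdsrc fd=0 ! tee name=tp \
tp. ! queue ! h264parse ! rtph264pay ! udpsink host=192.168.0.105 port=5000 \
tp. ! queue ! h264parse ! rtph264pay ! udpsink host=192.168.0.100 port=5005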

Related

Corrupted H264 video when streaming via RTP/UDP

I'm trying to stream a video encoded in H264 over RTP/UDP.
Sending:
gst-launch-1.0 \
videotestsrc ! \
video/x-raw,format=RGBx,width=960,height=540,framerate=25/1 ! \
videoconvert ! \
x264enc bitrate=2000 ! \
rtph264pay config-interval=1 pt=96 ! \
udpsink port=5000
Receiving:
gst-launch-1.0 \
udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! \
rtph264depay ! \
decodebin ! \
videoconvert ! \
ximagesink
If I start the receiver before the sender, everything works as intended.
However, if I start the receiver after the sender, the image "breaks".
[Example of a corrupted image]
How can I fix this problem?
The problem was solved by specifying caps after videoconvert:
...
videoconvert ! video/x-raw,format=I420
...
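The likely reason: without the explicit caps, videoconvert and x264enc are free to negotiate a 4:4:4 format from the RGBx input, producing a high-profile stream that a decoder joining mid-stream handles badly; pinning I420 forces plain 4:2:0 output. In full, the sender then becomes (the same pipeline as above with the capsfilter spliced in, untested):
gst-launch-1.0 \
videotestsrc ! \
video/x-raw,format=RGBx,width=960,height=540,framerate=25/1 ! \
videoconvert ! \
video/x-raw,format=I420 ! \
x264enc bitrate=2000 ! \
rtph264pay config-interval=1 pt=96 ! \
udpsink port=5000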

Recording multiple RTSP streams (H.265) to Kinesis Video Streams using GStreamer and kvssink

I need to record 4 RTSP streams into a single Kinesis Video Streams stream.
The streams must be placed in the video like this:
-----------------------
|          |          |
| STREAM 1 | STREAM 2 |
|          |          |
|----------|----------|
|          |          |
| STREAM 3 | STREAM 4 |
|          |          |
-----------------------
I was able to insert a single stream and make it work perfectly, using the command below:
gst-launch-1.0 rtspsrc user-id="admin" user-pw="password" location="rtsp://admin:password#192.168.0.1:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! rtph265depay ! h265parse ! video/x-h265, alignment=au ! kvssink stream-name="test-stream" storage-size=512 access-key="access-key" secret-key="secret-key" aws-region="us-east-1"
However, my goal is to insert an array of streams into the same stream in the Kinesis Video Streams.
For this, I found the videomixer example below:
gst-launch-1.0 -e rtspsrc location=rtsp_url1 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! m.sink_0 \
rtspsrc location=rtsp_url2 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! m.sink_1 \
rtspsrc location=rtsp_url3 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! m.sink_2 \
rtspsrc location=rtsp_url4 ! rtph264depay ! h264parse ! decodebin ! videoconvert ! m.sink_3 \
videomixer name=m sink_1::xpos=1280 sink_2::ypos=720 sink_3::xpos=1280 sink_3::ypos=720 ! x264enc ! mp4mux ! filesink location=./out.mp4 sync=true
I adapted the example to just two streams and made it work inside the container, using a command like the one below:
gst-launch-1.0 -e rtspsrc user-id="admin" user-pw="password" location="rtsp://password#192.168.0.1:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! rtph265depay ! h265parse ! video/x-h265, alignment=au ! libde265dec ! videoconvert ! m.sink_0 \
rtspsrc user-id="admin" user-pw="password" location="rtsp://password#192.168.0.2:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! rtph265depay ! h265parse ! video/x-h265, alignment=au ! libde265dec ! videoconvert ! m.sink_1 \
videomixer name=m sink_0::xpos=1080 sink_1::ypos=1080 ! x265enc ! h265parse ! video/x-h265, alignment=au ! kvssink stream-name="test-stream" storage-size=512 access-key="access-key" secret-key="secret-key" aws-region="us-east-1"
And another variant:
gst-launch-1.0 -e videomixer name=mix sink_0::xpos=0 sink_0::ypos=0 sink_0::alpha=0 sink_1::xpos=0 sink_1::ypos=0 \
rtspsrc user-id="admin" user-pw="password" location="rtsp://password#192.168.0.1:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! rtph265depay ! h265parse ! video/x-h265, alignment=au ! libde265dec ! videoconvert ! videoscale ! video/x-raw,width=1920,height=1080 ! mix.sink_0 \
rtspsrc user-id="admin" user-pw="password" location="rtsp://password#192.168.0.2:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! rtph265depay ! h265parse ! video/x-h265, alignment=au ! libde265dec ! videoconvert ! videoscale ! video/x-raw,width=1920,height=1080 ! mix.sink_1 \
mix. ! queue ! videoconvert ! x265enc ! queue ! kvssink stream-name="test-stream" storage-size=512 access-key="access-key" secret-key="secret-key" aws-region="us-east-1"
The container in question is from: https://github.com/awslabs/amazon-kinesis-video-streams-producer-sdk-cpp
However, when I log into Kinesis Video Streams and try to download a clip (GetClip), in both cases I get this error:
MissingCodecPrivateDataException
Missing codec private data in fragment for track 1.
Status code: 400
The logs with GST_DEBUG=1 can be found at https://gist.github.com/vbbandeira/b15ec8af6986237a4cd7e382e4ede261
And the logs with GST_DEBUG=4 can be found at https://gist.github.com/vbbandeira/6bd4b7a014a69da5f46cd036eaf32aec
Can you guys please let me know what is going on there?
Or if possible, help me find the solution to this error.
Thanks!
For those looking for the same solution: I managed to make it work by replacing videomixer, which is deprecated, with compositor. Below is an example of the command I used, and it worked:
gst-launch-1.0 rtspsrc location="rtsp://password#192.168.0.1:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! decodebin ! videoconvert ! comp.sink_0 \
rtspsrc location="rtsp://password#192.168.0.2:554/cam/realmonitor?channel=1&subtype=0" short-header=TRUE ! decodebin ! videoconvert ! comp.sink_1 \
compositor name=comp sink_0::xpos=0 sink_1::xpos=1280 ! x264enc ! kvssink stream-name="test-stream" storage-size=512 access-key="access-key" secret-key="secret-key" aws-region="us-east-1"
However, I was only able to get this working with H.264.
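Extending that working command to the question's 2x2 grid should only take two more branches plus the pad offsets from the videomixer example above; a sketch, untested (offsets assume 1280x720 inputs, rtsp_url placeholders as before):
gst-launch-1.0 rtspsrc location=rtsp_url1 short-header=TRUE ! decodebin ! videoconvert ! comp.sink_0 \
rtspsrc location=rtsp_url2 short-header=TRUE ! decodebin ! videoconvert ! comp.sink_1 \
rtspsrc location=rtsp_url3 short-header=TRUE ! decodebin ! videoconvert ! comp.sink_2 \
rtspsrc location=rtsp_url4 short-header=TRUE ! decodebin ! videoconvert ! comp.sink_3 \
compositor name=comp sink_1::xpos=1280 sink_2::ypos=720 sink_3::xpos=1280 sink_3::ypos=720 ! x264enc ! kvssink stream-name="test-stream" storage-size=512 access-key="access-key" secret-key="secret-key" aws-region="us-east-1"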

gstreamer-1.8.3 v4l2src to multicast

I used gstreamer-1.2.4 to send a stream from v4l2src to multicast and to write it to shared memory. The pipeline is:
v4l2src device=${device} do-timestamp=true blocksize=400000 typefind=true \
! "video/x-h264, width=(int)${width}, height=(int)${height}, framerate=(fraction)${framerate}, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1" \
! h264parse \
! "video/x-h264, alignment=(string)au, stream-format=(string)avc, parsed=(boolean)true" \
! tee name=video1tee \
video1tee. ! queue \
! rtph264pay config-interval=1 \
! application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96 \
! queue \
! udpsink multicast-iface=${IFACE_OUT} host=${MULTICAST_OUT_IP_ADDR} port=${VIDEO_OUT_UDP_PORT} ttl-mc=10 auto-multicast=false sync=true async=false
video1tee. ! queue leaky=downstream \
! shmsink socket-path=\"/tmp/camera_shmsink\" shm-size=20000000 wait-for-connection=false max-lateness=5000000000 sync=false async=false > ${logfile} 2>&1
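(Aside: the shared-memory branch is read back with shmsrc, which must be given the stream caps explicitly, since the shm socket does not carry them. A hypothetical reader, assuming the writer's branch is switched to byte-stream alignment so that no codec_data blob has to be written on the command line:)
gst-launch-1.0 shmsrc socket-path=/tmp/camera_shmsink do-timestamp=true \
! "video/x-h264, stream-format=(string)byte-stream, alignment=(string)au" \
! h264parse ! avdec_h264 ! autovideosink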
Now I have updated to gstreamer-1.8.3 and the original pipeline no longer works: when I try to view the multicast stream, I get nothing. The hardware has not changed.
I also get these warnings:
v4l2src gstv4l2src.c:862:gst_v4l2src_create:<v4l2src0> lost frames detected: count = 18446744073709551615 - ts: 0:00:09.8444373543
h264parse gsth264parse.c:1205:gst_h264_parse_handle_frame:<h264parse0> broken/invalid nal Type: 1 Slice, Size: 22638 will be dropped
If I instead delete
! h264parse \
! "video/x-h264, alignment=(string)au, stream-format=(string)avc, parsed=(boolean)true" \
I get a green screen.
Pipeline to view video:
udpsrc do-timestamp=true buffer-size=30000000 address=227.1.1.11 auto-multicast=true port=51012 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! rtph264depay ! h264parse ! avdec_h264 output-corrupt=false ! autovideoconvert ! videocrop ! videoscale ! videobalance ! video/x-raw, format=(string)YV12, colorimetry=(string)bt601 ! autovideoconvert ! fpsdisplaysink
Why did it work in the old version but not in the new one?

Encoding an audio file using ffenc_aac

I am trying to encode an audio file using GStreamer, with this command:
gst-launch filesrc location=s.pcm ! audio/x-raw-int, rate=4000, channels=2, endianness=1234, width=16, depth=16, signed=true ! ffenc_aac ! filesink location=file.wav
And I am getting an error message:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstFileSrc:filesrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2625): gst_base_src_loop (): /GstPipeline:pipeline0/GstFileSrc:filesrc0:
streaming task paused, reason not-negotiated (-4)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ... Freeing pipeline ...
Can anyone guide me on how to overcome this issue?
Don't confuse encoding with containers. You cannot have an AAC-encoded WAV; WAVs are PCM. You can have a 4 kHz WAV, or you can have an AAC-encoded file in an MP4 or M4A container. Both examples are below. Note that in these examples the AAC encoders get very picky if you try to set the sample rate below 48000.
Create raw audio file
gst-launch audiotestsrc num-buffers=100 \
! audio/x-raw-int, rate=48000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! filesink location=foo.pcm
Encode it as a WAV
gst-launch filesrc location=foo.pcm \
! audio/x-raw-int, rate=48000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! audioresample \
! audio/x-raw-int, rate=4000 \
! wavenc \
! filesink location=foo.wav
Encode it as AAC and mux into mp4
I don't really know why I had to encode and then decode again, but nothing else worked, even though I could go directly from the audiotestsrc (presumably wavparse supplies caps and timestamps that ffenc_aac can't negotiate from the raw file).
gst-launch filesrc location=foo.pcm \
! audio/x-raw-int, rate=48000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! wavenc \
! wavparse \
! ffenc_aac \
! mp4mux \
! filesink location=foo.mp4
Alternatively, using faac (the pipeline was a lot cleaner and the output file was smaller):
gst-launch filesrc location=foo.pcm \
! audio/x-raw-int, rate=48000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! faac \
! mp4mux \
! filesink location=foo.mp4
Or using voaacenc: it wouldn't work below 48000 even though it looks to have the most flexible capabilities. I tried 8k, 16k, 48k, 96k, and 44100, which anecdotally changed the pitch of the test tone.
gst-launch filesrc location=foo.pcm \
! audio/x-raw-int, rate=48000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! voaacenc \
! mp4mux \
! filesink location=foo.mp4
Low bit rate AAC
The lowest rate I was successful with was 16000; here are those tests, again noting that faac produced the smallest file size.
gst-launch audiotestsrc num-buffers=100 \
! audio/x-raw-int, rate=16000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! ffenc_aac \
! mp4mux \
! filesink location=foo.mp4
gst-launch audiotestsrc num-buffers=100 \
! audio/x-raw-int, rate=16000, channels=2, endianness=1234, width=16, depth=16, signed=true \
! faac \
! mp4mux \
! filesink location=foo.mp4
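As a sanity check on any of these, the gst-discoverer tool can report the codec and container it actually finds in the output (assuming a gst-plugins-base recent enough to ship it; the 0.10-era binary is gst-discoverer-0.10):
gst-discoverer-0.10 foo.mp4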

GStreamer videotestsrc to RTP

I'm just trying to get an RTP sample working, but every example I've seen fails to run due to missing plugins or incorrect pads.
This one seems the most promising, but although the server and client appear to launch properly and go to PLAYING, nothing happens:
Server:
gst-launch -v videotestsrc ! \
video/x-raw-rgb, format=\(fourcc\)RGB, width=4, height=4, frame-rate=1/1 ! rtpvrawpay ! \
udpsink host=127.0.0.1
Server output:
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
Client:
gst-launch-0.10 -v udpsrc caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390" ! rtpvrawdepay ! xvimagesink
Client output:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:src: caps = video/x-raw-rgb, width=(int)4, height=(int)4, format=(fourcc)0x00000000, framerate=(fraction)0/1, endianness=(int)4321, red_mask=(int)16711680, green_mask=(int)65280, blue_mask=(int)255, bpp=(int)24, depth=(int)24
/GstPipeline:pipeline0/GstRtpVRawDepay:rtpvrawdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, ssrc=(uint)3779397700, clock-base=(uint)1161131286, seqnum-base=(uint)43390
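Two likely problems with the raw-video pair above, for what it's worth: the caps field is framerate, not frame-rate (and 0.10's video/x-raw-rgb takes bpp/depth/masks rather than a fourcc format), and xvimagesink generally wants YUV input, so the depayloaded RGB needs a converter in front of it. A corrected sketch, untested, with the port made explicit on both ends:
Server:
gst-launch-0.10 -v videotestsrc ! \
video/x-raw-rgb, width=4, height=4, framerate=1/1 ! rtpvrawpay ! \
udpsink host=127.0.0.1 port=5004
Client:
gst-launch-0.10 -v udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)RAW, sampling=(string)RGB, depth=(string)8, width=(string)4, height=(string)4, colorimetry=(string)SMPTE240M, payload=(int)96" ! rtpvrawdepay ! ffmpegcolorspace ! xvimagesink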
These work:
Server:
gst-launch-0.10 -v \
gstrtpbin name=rtpbin1 \
videotestsrc ! x264enc ! rtph264pay ! rtpbin1.send_rtp_sink_0 \
rtpbin1.send_rtp_src_0 ! udpsink host=127.0.0.1 port=5011 \
rtpbin1.send_rtcp_src_0 ! udpsink host=127.0.0.1 port=5012 \
udpsrc port=5013 ! rtpbin1.recv_rtcp_sink_0
Client:
gst-launch-0.10 -v \
videomixer name=mix ! ffmpegcolorspace ! autovideosink sync=false async=false \
gstrtpbin name=rtpbin1 \
udpsrc port=5011 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, sprop-parameter-sets=(string)\"Z0LAFdkBQfsBEAAAAwAXc1lAAPFi5IAA\\,aMuMsg\\=\\=\", ssrc=(uint)595281375, payload=(int)96, clock-base=(uint)3105254905, seqnum-base=(uint)59233" ! rtpbin1.recv_rtp_sink_0 rtpbin1. ! rtph264depay ! queue ! ffdec_h264 ! videobox border-alpha=0 top=0 left=0 ! mix. \
udpsrc port=5012 ! rtpbin1.recv_rtcp_sink_0 \
rtpbin1.send_rtcp_src_0 ! udpsink port=5013 host=127.0.0.1