How to read an RTMP stream and send it to Kinesis Video Streams? - gstreamer

I'm trying to read an RTMP video stream (I followed this) and send it to Kinesis Video Streams with the dockerized GStreamer plugin, using this command:
gst-launch-1.0 rtmpsrc location=rtmp://172.17.0.2/live/test \
! decodebin ! videoconvert \
! video/x-raw,format=I420,width=640,height=480,framerate=30/1 \
! x264enc bframes=0 key-int-max=45 bitrate=500 \
! video/x-h264,stream-format=avc,alignment=au,profile=baseline \
! kvssink stream-name="STREAM_NAME" storage-size=512 \
access-key="ACCESS_KEY" \
secret-key="SECRET_KEY" \
aws-region="REGION"
but it fails, and the log says:
...
2023-01-05 20:08:41 [139732104787520] INFO - getStreamingTokenResultEvent(): Get streaming token result event.
2023-01-05 20:08:41 [139732104787520] DEBUG - stepStateMachine(): State Machine - Current state: 0x0000000000000010, Next state: 0x0000000000000040
2023-01-05 20:08:41 [139732104787520] DEBUG - streamReadyHandler invoked
2023-01-05 20:08:41 [139732104787520] Stream is ready
Pipeline is PREROLLING ...
ERROR: from element /GstPipeline:pipeline0/GstRTMPSrc:rtmpsrc0: Internal data stream error.
Additional debug info:
../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTMPSrc:rtmpsrc0:
streaming stopped, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
Freeing pipeline ...
INFO - Freeing Kinesis Video Stream test
2023-01-05 20:08:43 [139732691388224] INFO - freeKinesisVideoStream(): Freeing Kinesis Video stream.
2023-01-05 20:08:43 [139732691388224] DEBUG - curlApiCallbacksShutdownActiveRequests(): pActiveRequests hashtable is empty
What am I missing?
How can I solve this?
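For context, `reason not-linked` means two pads could not be connected, usually because a fixed capsfilter cannot be satisfied by what the upstream element produces. One variant I considered (an untested sketch; the `videoscale`/`videorate` elements, the `h264parse`, and the `GST_DEBUG` setting are additions not in my original command):

```shell
# Untested sketch: videoscale/videorate give the fixed 640x480@30 caps a
# chance to be met whatever the RTMP stream's native size/framerate is,
# and GST_DEBUG=4 logs which pad link fails during negotiation.
GST_DEBUG=4 gst-launch-1.0 rtmpsrc location=rtmp://172.17.0.2/live/test \
! decodebin ! videoconvert ! videoscale ! videorate \
! video/x-raw,format=I420,width=640,height=480,framerate=30/1 \
! x264enc bframes=0 key-int-max=45 bitrate=500 \
! video/x-h264,stream-format=avc,alignment=au,profile=baseline \
! h264parse \
! kvssink stream-name="STREAM_NAME" storage-size=512 \
access-key="ACCESS_KEY" \
secret-key="SECRET_KEY" \
aws-region="REGION"
```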
EDIT:
With this GStreamer command:
gst-launch-1.0 rtmp2src location="rtmp://172.17.0.2/live/test live=1" \
! decodebin3 ! videoconvert \
! video/x-raw,format=I420,width=640,height=480,framerate=30/1 \
! x264enc bframes=0 key-int-max=45 bitrate=500 \
! video/x-h264,stream-format=avc,alignment=au \
! kvssink stream-name="STREAM_NAME" storage-size=512 \
access-key="ACCESS_KEY" \
secret-key="SECRET_KEY" \
aws-region="REGION"
I get no errors, and the log says:
...
2023-01-06 05:59:47 [139898318763584] Stream is ready
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
But on Kinesis I can't see the stream.
Why?
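One way to dig further (an untested sketch; the only changes from the command above are the `GST_DEBUG` setting, which raises the verbosity of kvssink's own log category, and an added `h264parse` so the sink receives properly aligned access units):

```shell
# Untested sketch: kvssink:5 shows whether fragments are actually being
# sent to the service; also double-check that REGION matches the console
# region you are looking at.
GST_DEBUG=2,kvssink:5 gst-launch-1.0 rtmp2src location="rtmp://172.17.0.2/live/test live=1" \
! decodebin3 ! videoconvert \
! video/x-raw,format=I420,width=640,height=480,framerate=30/1 \
! x264enc bframes=0 key-int-max=45 bitrate=500 \
! video/x-h264,stream-format=avc,alignment=au \
! h264parse \
! kvssink stream-name="STREAM_NAME" storage-size=512 \
access-key="ACCESS_KEY" \
secret-key="SECRET_KEY" \
aws-region="REGION"
```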

Related

gstreamer pipeline for rtp audio forwarding to multiple sinks with different delays

I'm trying to receive an rtp audio stream using gstreamer and forward it to multiple target hosts with different delays.
To insert a delay, I use the queue element with min-threshold-time as suggested here: https://stackoverflow.com/a/17218113/4881938
This works fine so far; however, if I want multiple output streams with different delays (or one with no delay at all), no data is sent (i.e. the pipeline stays paused) until the queue with the longest min-threshold-time is full.
This is not what I want: I want all forwarded streams to start as soon as possible, so if I have target1 with no delay and target2 with a 10s delay, target1 should receive data immediately rather than having to wait 10s.
I tried different sink options (sync=false, async=true) and the tee option allow-not-linked=true, to no avail; the pipeline remains paused until the longest queue delay has elapsed.
What am I missing? How do I get gstreamer to activate the branch with no delay immediately? (and, in case I have multiple different delays: activate each delayed branch as soon as the buffer is full, not only after the longest buffer is filled?)
This is the complete test command I used:
% gst-launch-1.0 \
udpsrc port=10212 caps='application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS' ! \
tee name=t1 allow-not-linked=true \
t1. ! queue name=q1 leaky=downstream max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=0 q1. ! \
udpsink host=target1 port=10214 sync=false async=true \
t1. ! queue name=q2 leaky=downstream max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=5000000000 q2. ! \
udpsink host=target2 port=10215 sync=false async=true
version: GStreamer 1.18.4
Thanks everyone for even reading this far! :)
Following @SeB's comment, I tried out interpipes:
Thank you very much for your input. I tried it out, and the problem seems similar: if I omit the queue elements or leave min-threshold-time at 0, it works, but as soon as I add any delay to one or more of the queue elements, the whole pipeline does nothing and the time counter never moves past 0:00:00.0
I tried out different combinations of the interpipe sink/source options forward-/accept-events and forward-/accept-eos but it didn't change anything.
What am I doing wrong? As I understand interpipe, it should decouple any sink/source elements from each other so one stalling pipe doesn't affect the rest(?).
command and output:
% gst-launch-1.0 \
udpsrc port=10212 caps='application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS' ! \
interpipesink name=t1 \
interpipesrc listen-to="t1" ! queue leaky=downstream max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=5000000000 ! \
udpsink host=targethost1 port=10214 async=true sync=false \
interpipesrc listen-to="t1" ! queue leaky=downstream max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=10000000000 ! \
udpsink host=targethost2 port=10215 async=true sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.0 / 99:99:99..
I also tried shmsink/shmsrc, but this also fails: as soon as I add a delay to the pipeline reading from the shmsrc, it remains stuck in the PREROLLING state:
shmsink:
% gst-launch-1.0 \
udpsrc port=10212 caps='application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS' ! \
queue ! shmsink socket-path=/tmp/blah shm-size=20000000 wait-for-connection=false
shmsource (without is-live):
% gst-launch-1.0 \
shmsrc socket-path=/tmp/blah ! 'application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS' ! \
queue leaky=downstream max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=5000000000 ! \
udpsink host=targethost port=10215 async=true sync=false
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
shmsource (with is-live):
% gst-launch-1.0 \
shmsrc is-live=true socket-path=/tmp/blah ! 'application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS' ! \
queue leaky=downstream max-size-buffers=0 max-size-time=0 max-size-bytes=0 min-threshold-time=50 ! \
udpsink host=targethost port=10215 async=true sync=false
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Depending on whether is-live is set on the source, the behavior differs, but in both cases no data is actually sent. Without min-threshold-time on the queue element, both shmsrc commands work.
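One more idea I haven't verified: instead of stalling a branch with min-threshold-time (which blocks preroll), keep the branches synchronized and shift the delayed branch's render time with the sink's ts-offset property (it comes from GstBaseSink and is in nanoseconds). An untested sketch:

```shell
# Untested sketch: with sync=true, each udpsink renders at the buffer
# timestamp plus ts-offset, so target2 lags 5s behind target1 while
# target1 starts immediately. The delayed branch's queue is unbounded
# (max-size-* = 0) so it can absorb the 5s of backlog.
gst-launch-1.0 \
udpsrc port=10212 caps='application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS' ! \
tee name=t1 \
t1. ! queue ! udpsink host=target1 port=10214 sync=true \
t1. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! \
udpsink host=target2 port=10215 sync=true ts-offset=5000000000
```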

gstreamer 1.0 input-selector

I want to switch between RTSP stream and test stream.
I use input-selector and this pipeline:
input-selector name=selector ! rtph264depay ! h264parse ! matroskamux \
streamable=false min-index-interval=100000 ! \
filesink location=test.mkv videotestsrc ! \
video/x-raw, width=1024, height=768, framerate=30/1, clock-rate=90000 ! \
x264enc ! rtph264pay ! selector.sink_0 rtspsrc name=rtspsrc \
location=rtsp://admin:admin@192.168.88.231:554/h264 retry=100 \
udp-buffer-size=30000000 latency=200 caps="application/x-rtp, \
media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! \
selector.sink_1
I wrote Python code that switches the source every 5 seconds, but only the test stream is written to the file. I also see warnings in the logs:
0:00:00.353575129 13560 0x23f5540 WARN basesrc gstbasesrc.c:2948:gst_base_src_loop:<udpsrc3> error: Internal data flow error.
0:00:00.353602076 13560 0x23f5540 WARN basesrc gstbasesrc.c:2948:gst_base_src_loop:<udpsrc3> error: streaming task paused, reason not-linked (-1)
0:00:02.177975961 13560 0x23f2ed0 WARN x264enc :0::<x264enc0> VBV underflow (frame 298, -14405 bits)
0:00:02.338993437 13560 0x23f2ed0 WARN x264enc :0::<x264enc0> VBV underflow (frame 328, -4872 bits)
How can I fix it?
When I replace selector.sink_0 with fakesink, the RTSP stream is recorded normally, so apparently the problem is caused by switching streams.
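For reference, a minimal sketch of the kind of switching code I mean (untested; it needs PyGObject and GStreamer installed, and the two-videotestsrc pipeline here is a stand-in for the real one in the question):

```python
# Untested sketch: switching an input-selector is done by setting its
# "active-pad" property, not by relinking elements.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)
desc = (
    "input-selector name=selector ! videoconvert ! autovideosink "
    "videotestsrc is-live=true pattern=ball ! selector.sink_0 "
    "videotestsrc is-live=true pattern=smpte ! selector.sink_1"
)
pipeline = Gst.parse_launch(desc)
selector = pipeline.get_by_name("selector")
pads = [selector.get_static_pad("sink_0"), selector.get_static_pad("sink_1")]
current = {"idx": 0}

def switch_source():
    # Flip between sink_0 and sink_1 every time the timeout fires.
    current["idx"] = 1 - current["idx"]
    selector.set_property("active-pad", pads[current["idx"]])
    return True  # keep the GLib timeout firing

pipeline.set_state(Gst.State.PLAYING)
GLib.timeout_add_seconds(5, switch_source)
GLib.MainLoop().run()
```

If the inactive branch must keep consuming data while deselected, the selector's sync-streams and cache-buffers properties are also worth a look.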

kurento can't receive rtp from gstreamer correctly

I installed Kurento Media Server and ran the Kurento Java tutorial (RTP receiver). Kurento offers a GStreamer pipeline:
PEER_V=23490 PEER_IP=10.0.176.127 SELF_V=5004 SELF_VSSRC=112233
bash -c 'gst-launch-1.0 -t \
rtpbin name=r \
v4l2src device=/dev/video0 ! videoconvert ! x264enc tune=zerolatency \
! rtph264pay ! "application/x-rtp,payload=(int)103,clock-rate=(int)90000,ssrc=(uint)$SELF_VSSRC" \
! r.send_rtp_sink_1 \
r.send_rtp_src_1 ! udpsink host=$PEER_IP port=$PEER_V bind-port=$SELF_V \
'
This is the pipeline I simplified from the official one, and it runs successfully.
But there is a problem when I implement this pipeline in C or C++ code:
Kurento can't receive the RTP stream, although I can receive it with my own RTP receiver written in C++.
The Kurento media server log warnings:
[screenshot of Kurento media server log warnings]
It looks like Kurento treats the stream as audio rather than video, but I never send an audio stream.
So I want to know how to change the C code to suit Kurento, so that my video stream reaches it. My code: [link]
Yes, after a few days of fiddling, I figured out this problem today.
PEER_V=23490 PEER_IP=10.0.176.127 SELF_V=5004 SELF_VSSRC=112233
bash -c 'gst-launch-1.0 -t \
rtpbin name=r \
v4l2src device=/dev/video0 ! videoconvert ! x264enc tune=zerolatency \
! rtph264pay ! "application/x-rtp,payload=(int)103,clock-rate=(int)90000,ssrc=(uint)$SELF_VSSRC" \
! r.send_rtp_sink_1 \
r.send_rtp_src_1 ! udpsink host=$PEER_IP port=$PEER_V bind-port=$SELF_V \
'
In this pipeline, if you change the payload to 96, Kurento Media Server reports the same warning as the picture in the question, so I suspected my payload setting was the error.
I then added a pad probe to inspect the pad's caps, and indeed that was it: the payload type from my caps was not taking effect.
I don't know why setting the caps had no effect, but when I set the "pt" property of rtph264pay instead, it runs successfully.
The code is here: [link]
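In gst-launch form, the fix described above boils down to setting the payload type (and SSRC) directly on the payloader instead of through downstream caps. A sketch based on the pipeline above (untested; pt=103 is the value Kurento negotiated in my case):

```shell
# Untested sketch: pt and ssrc are properties inherited from
# GstRTPBasePayload, so rtph264pay accepts them directly.
PEER_V=23490 PEER_IP=10.0.176.127 SELF_V=5004 SELF_VSSRC=112233
gst-launch-1.0 -t \
rtpbin name=r \
v4l2src device=/dev/video0 ! videoconvert ! x264enc tune=zerolatency \
! rtph264pay pt=103 ssrc=$SELF_VSSRC \
! r.send_rtp_sink_1 \
r.send_rtp_src_1 ! udpsink host=$PEER_IP port=$PEER_V bind-port=$SELF_V
```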

How to stream a file.avi over network using gstreamer

I'm trying to use gstreamer to send a sample file .avi over a network. The code that I'm using to build my pipeline is the following:
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! decodebin name=dec \
dec. ! queue ! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false \
name=vrtcpsink udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
When I try to execute this command I'm getting this error:
gstavidemux.c(5383): gst_avi_demux_loop ():
/GstPipeline:pipeline0/GstDecodeBin:dec/GstAviDemux:avidemux0:
streaming stopped, reason not-linked
Execution ended after 0:00:00.032906515
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Can you help me to solve this?
I think you need a demux element before the decoder, since an AVI file contains both audio and video but you are using just one decodebin to decode them.
Take a look at the avidemux element here. You need it to get the video stream out of the file.
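Something like this might work (an untested sketch; demux.video_0 assumes avidemux's video_%u pad naming, and decodebin here only has the video elementary stream to decode):

```shell
# Untested sketch: avidemux splits the AVI container, only the video pad
# is linked onward, so no unlinked audio branch can stall the pipeline.
gst-launch-1.0 -v rtpbin name=rtpbin latency=200 \
filesrc location=/home/enry/drop.avi ! avidemux name=demux \
demux.video_0 ! queue ! decodebin ! videoconvert \
! x264enc byte-stream=false bitrate=300 ! rtph264pay ! rtpbin.send_rtp_sink_0 \
rtpbin.send_rtp_src_0 ! udpsink port=5000 host=127.0.0.1 ts-offset=0 name=vrtpsink \
rtpbin.send_rtcp_src_0 ! udpsink port=5001 host=127.0.0.1 sync=false async=false \
name=vrtcpsink udpsrc port=5005 name=vrtpsrc ! rtpbin.recv_rtcp_sink_0
```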

Gstreamer souphttpsrc to rtp h263 encoded stream

I am trying to create a pipeline for streaming a jpeg stream into h263 encoded stream over RTP. When I execute:
gst-launch -v \
souphttpsrc \
location=http://192.168.1.54:8080 \
do-timestamp=true \
! multipartdemux ! image/jpeg,width=352,height=288 \
! ffmpegcolorspace ! video/x-raw-yuv,framerate=15/1 \
! videoscale \
! ffenc_h263 ! rtph263pay \
! udpsink host=192.168.1.31 port=1234
gstreamer reports:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstCapsFilter:capsfilter2: caps = image/jpeg, width=(int)352, height=(int)288
ERROR: from element /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2507): gst_base_src_loop (): /GstPipeline:pipeline0/GstSoupHTTPSrc:souphttpsrc0:
streaming task paused, reason not-linked (-1)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstMultipartDemux:multipartdemux0.GstPad:src_0: caps = NULL
Freeing pipeline ...
I've checked that the elements exist: I've run gst-inspect on ffenc_h263, ffmpegcolorspace, and the rest of the elements in this command, and it does not report any errors.
Is there something I'm missing?
You need jpegdec after multipartdemux to decode jpeg stream into raw video.
You don't need ffmpegcolorspace because jpegdec converts to video/x-raw-yuv.
videoscale is useless here because you do not specify a width/height for the outgoing stream.
Try this:
gst-launch -v \
souphttpsrc \
location=http://192.168.1.54:8080 \
do-timestamp=true \
! multipartdemux \
! image/jpeg,width=352,height=288,framerate=15/1 \
! jpegdec ! ffenc_h263 ! rtph263pay \
! udpsink host=192.168.1.31 port=1234