I have an issue sending H.264 streams to Linphone.
When I run the following pipeline:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! rtph264pay ! udpsink host=127.0.0.1 port=9078
everything is OK and the video plays on the Linphone screen.
But what I want to do is first save the video stream to a file, then send that file to Linphone. What I did is the following:
Saving to a file:
gst-launch-1.0 -v filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192 ! filesink location=C:/videosample
Sending to Linphone:
gst-launch-1.0 -v filesrc location=C:/videosample ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=9078
The packets are received, but Linphone shows a black screen.
I want to know what I am missing in my pipelines, or whether there is a specific parameter to set.
Note that it works when I play the stream with a GStreamer receiver:
gst-launch-1.0 -v udpsrc port=9078 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! avdec_h264 ! autovideosink
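One likely explanation (an assumption, not confirmed in the question): the saved file is a bare H.264 elementary stream, and the SPS/PPS headers are never repeated in-band, so Linphone cannot initialize its decoder and shows black. A minimal sketch of both fixed pipelines, assuming the same paths as above; the script only prints the commands (a dry run), since executing them needs the real file and a listening receiver:

```shell
# Assumption: the black screen comes from missing in-band SPS/PPS headers.
# Save the encoded stream as an H.264 byte-stream so h264parse can re-parse it:
SAVE="filesrc location=C:/test.mp4 ! qtdemux ! avdec_h264 ! x264enc bitrate=192"
SAVE="$SAVE ! h264parse ! video/x-h264,stream-format=byte-stream ! filesink location=C:/videosample"

# On replay, ask h264parse to re-insert SPS/PPS before every IDR frame:
SEND="filesrc location=C:/videosample ! h264parse config-interval=-1 ! rtph264pay ! udpsink host=127.0.0.1 port=9078"

# Dry run: print the commands instead of executing them.
echo "gst-launch-1.0 -v $SAVE"
echo "gst-launch-1.0 -v $SEND"
```

config-interval=-1 tells h264parse to re-emit the stream headers with every keyframe, so a receiver that joins mid-stream can still configure its decoder.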
I wanted to create an RTP stream of an MP4 file with GStreamer.
I am using GStreamer 1.18.4 on Debian Bullseye.
To create an MP4 file, I recorded an RTSP stream from my webcam using the following command:
gst-launch-1.0 -e rtspsrc location="rtsp://192.168.111.146/axis-media/media.amp" port-range=28000-38000 buffer-mode=0 latency=80 ! rtph264depay ! h264parse ! mp4mux ! filesink location=filename.mp4
After recording the file filename.mp4 I tried to stream it using RTP:
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! avdec_h264 ! x264enc ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
And the playback of the stream could be started using the following command on the same machine:
gst-launch-1.0 udpsrc address=127.0.0.1 port=50000 auto-multicast=true ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! avdec_h264 ! autovideosink
Everything works as expected!
But since I don't want to transcode the file, I just wanted to skip the decoding and encoding part. Therefore, I created the following pipelines:
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
and
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! rtph264pay ! udpsink port=50000 host=127.0.0.1
However, if I retry the playback pipeline (the pipeline with udpsrc) against either of these sender pipelines, the stream is not displayed.
Interestingly, nload shows network traffic on lo.
What is wrong with the streaming pipelines?
Did I miss some magic plugin in between?
Meanwhile I found an answer to my question.
Changing the stream-server-pipeline from
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse ! rtph264pay ! udpsink port=50000 host=127.0.0.1
to
gst-launch-1.0 filesrc location=filename.mp4 ! qtdemux ! h264parse config-interval=-1 ! rtph264pay ! udpsink port=50000 host=127.0.0.1
solves the issue.
Thus, the difference is setting the parameter config-interval=-1 for h264parse.
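The same effect can be had from the payloader instead of the parser: rtph264pay also has a config-interval property. A sketch of that variant, using the same file and ports as above; the script only prints the command (a dry run), since running it needs the actual file:

```shell
# Alternative sketch: let the payloader re-send SPS/PPS with every IDR frame
# via its own config-interval property instead of setting it on h264parse.
PIPE="filesrc location=filename.mp4 ! qtdemux ! h264parse"
PIPE="$PIPE ! rtph264pay config-interval=-1 ! udpsink port=50000 host=127.0.0.1"
echo "gst-launch-1.0 $PIPE"   # dry run; needs the actual file to execute
```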
Using the following 2 commands I can stream a videotestsrc source over SRT.
gst-launch-1.0 -v videotestsrc ! queue ! x264enc ! queue ! mpegtsmux alignment=7 ! identity silent=false ! queue leaky=downstream ! srtsink uri="srt://:8888" sync=false async=false
gst-launch-1.0 -v srtsrc uri="srt://127.0.0.1:8888" ! identity silent=false ! fakesink async=false
And play it in this way:
gst-play-1.0 srt://127.0.0.1:8888
Now I want to stream an RTSP source, which I receive in the following way:
gst-launch-1.0 rtspsrc location=rtsp://localhost:8554/main latency=100 ! queue ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! videoscale ! video/x-raw,width=640,height=480 ! srtsink uri="srt://:8888" sync=false async=false
gst-launch-1.0 -v srtsrc uri="srt://127.0.0.1:8888" ! identity silent=false ! fakesink async=false
However, when I try to play it back, I get this error:
gst-play-1.0 srt://127.0.0.1:8888
Press 'k' to see a list of keyboard shortcuts.
Now playing srt://127.0.0.1:8888
Pipeline is live.
ERROR Could not determine type of stream. for srt://127.0.0.1:8888
ERROR debug information: ../subprojects/gstreamer/plugins/elements/gsttypefindelement.c(999): gst_type_find_element_chain_do_typefinding (): /GstPlayBin:playbin/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind
Reached end of play list.
How can I solve it?
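One likely cause (an assumption based on the pipelines above): the failing sender pushes raw, decoded video straight into srtsink, so gst-play-1.0 has no container format to typefind, whereas the working videotestsrc pipeline wrapped the stream in MPEG-TS via mpegtsmux. A sketch that re-encodes and muxes before srtsink, assuming the same RTSP URL; the script only prints the command (a dry run), since it needs the live RTSP source:

```shell
# Sketch: re-encode the scaled video and wrap it in MPEG-TS, as the working
# videotestsrc pipeline did, so gst-play-1.0 can typefind the SRT stream.
PIPE="rtspsrc location=rtsp://localhost:8554/main latency=100 ! queue ! rtph264depay"
PIPE="$PIPE ! h264parse ! avdec_h264 ! videoconvert ! videoscale"
PIPE="$PIPE ! video/x-raw,width=640,height=480 ! x264enc tune=zerolatency"
PIPE="$PIPE ! mpegtsmux alignment=7 ! srtsink uri=srt://:8888 sync=false async=false"
echo "gst-launch-1.0 $PIPE"   # dry run; needs the live RTSP source to execute
```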
I'm trying to write an RTSP stream into shared memory, and then write it to a .mkv file.
I use this command to write the stream to a .mkv file directly:
gst-launch-1.0 rtspsrc location=rtsp://admin:admin@192.168.88.248:554/h264 ! rtph264depay ! h264parse ! matroskamux ! filesink location=file.mkv
It works.
Now I add shared memory:
gst-launch-1.0 rtspsrc location=rtsp://admin:admin@192.168.88.248:554/h264 ! shmsink socket-path=/tmp/foo shm-size=2000000
And
gst-launch-1.0 shmsrc socket-path=/tmp/foo ! rtph264depay ! h264parse ! matroskamux ! filesink location=file.mkv
And I get message:
Input buffers need to have RTP caps set on them.
OK, so I write:
gst-launch-1.0 rtspsrc location=rtsp://admin:admin@192.168.88.248:554/h264 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! shmsink socket-path=/tmp/foo shm-size=2000000
And I get this message again.
What am I doing wrong?
You need to set the caps after shmsrc. For example, the following is my sending pipeline:
gst-launch-1.0 -v rtspsrc location=rtsp://192.168.1.150:8554/VBoxVideo ! shmsink socket-path=/tmp/foo shm-size=2000000 wait-for-connection=false
You have to note down the caps printed for the shmsink above; the following are the caps from my shmsink:
/GstPipeline:pipeline0/GstShmSink:shmsink0.GstPad:sink: caps =
"application/x-rtp\,\ media\=(string)video\,\ payload\=(int)96\,\
clock-rate\=(int)90000\,\ encoding-name\=(string)H264\,\
packetization-mode\=(string)1\,\
profile-level-id\=(string)64002a\,\
sprop-parameter-sets\=(string)\"J2QAKqwbKgHgCJ+WEAAAPoAADqYOAAEZABGQve6wgA\\=\\=\\,KP4Briw\\=\"\,\
a-tool\=(string)GStreamer\,\ a-type\=(string)broadcast\,\
a-framerate\=(string)30\,\ a-ts-refclk\=(string)local\,\
a-mediaclk\=(string)sender\,\ ssrc\=(uint)4083957277\,\
clock-base\=(uint)1018840792\,\ seqnum-base\=(uint)13685\,\
npt-start\=(guint64)0\,\ play-speed\=(double)1\,\
play-scale\=(double)1"
Now, to use shmsrc:
gst-launch-1.0 -vm shmsrc socket-path=/tmp/foo do-timestamp=true is-live=true num-buffers=1000 ! "application/x-rtp,media=(string)video,payload=(int)96,packetization-mode=(string)1" ! rtph264depay ! h264parse ! mp4mux ! filesink location=file.mp4
Note: I have set the caps from the output above. Also note that I have set num-buffers=1000: since I am using mp4mux, the pipeline needs to send an EOS for the file to be playable.
So in your case:
gst-launch-1.0 -v rtspsrc location=rtsp://admin:admin@192.168.88.248:554/h264 ! shmsink socket-path=/tmp/foo shm-size=2000000
Note down the caps from the -v output for shmsink0, and later use them (quoted, in place of CAPS) in your pipeline:
gst-launch-1.0 shmsrc socket-path=/tmp/foo is-live=true num-buffers=1000 ! 'CAPS' ! rtph264depay ! h264parse ! mp4mux ! filesink location=file.mp4
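One practical detail: the caps printed by -v are shell-escaped (backslashes before spaces, commas, and equals signs), so they have to be un-escaped before being pasted after shmsrc. A small sketch using sed on an abbreviated sample of the caps shown above:

```shell
# The -v output escapes spaces/commas/equals with backslashes; strip them
# before pasting the caps string after shmsrc. (Abbreviated sample caps.)
ESCAPED='application/x-rtp\,\ media\=(string)video\,\ payload\=(int)96\,\ clock-rate\=(int)90000\,\ encoding-name\=(string)H264'
CAPS=$(printf '%s' "$ESCAPED" | sed 's/\\//g')
echo "$CAPS"
```

The cleaned string is what goes in quotes after shmsrc, e.g. shmsrc socket-path=/tmp/foo ! "$CAPS" ! rtph264depay ...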
I'm trying to decode an H.264 video and re-encode it to transfer it to a client through UDP:
On the transmitter side:
gst-launch-1.0 -v filesrc location=/home/ubuntu/Videos/test.mp4 ! qtdemux name=demux ! h264parse ! omxh264dec ! nvvidconv ! omxh264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000
On the receiver side:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink
I do this locally, as a test, on an NVIDIA Tegra TK1, but nothing is displayed, although no error is raised.
Does anybody see something to add? Thanks in advance.
OK, I finally made it work, but using another network protocol:
Just for testing, send and receive locally on the Tegra TK1 itself:
Send:
gst-launch-1.0 filesrc location=/home/ubuntu/Videos/test.mp4 ! qtdemux name=demux ! h264parse ! omxh264dec ! nvvidconv ! omxh264enc ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=127.0.0.1 port=5000
Receive:
gst-launch-1.0 -v tcpclientsrc host=127.0.0.1 port=5000 ! gdpdepay ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink sync=false
Result is choppy, but I don't care at this stage. I receive something!!
Receive on Ubuntu PC:
gst-launch-1.0 -v tcpclientsrc host=<Tegra IP> port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false
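A possible reason the original UDP attempt showed nothing (a guess, not verified): the working TCP variant added config-interval=1 to rtph264pay, which the UDP sender lacked, so the receiver may never have seen SPS/PPS. A sketch of the original UDP sender with only that property added; the script only prints the command (a dry run), since it needs the Tegra's OMX elements to execute:

```shell
# Guess: the original UDP sender lacked config-interval on rtph264pay, so the
# receiver never got SPS/PPS. Same pipeline with config-interval=1 added:
PIPE="filesrc location=/home/ubuntu/Videos/test.mp4 ! qtdemux name=demux ! h264parse"
PIPE="$PIPE ! omxh264dec ! nvvidconv ! omxh264enc"
PIPE="$PIPE ! rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=5000"
echo "gst-launch-1.0 -v $PIPE"   # dry run; needs the OMX hardware elements to execute
```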
Here is what I'm trying:
gst-launch -v udpsrc port=1234 ! fakesink dump=1
I test with:
gst-launch -v audiotestsrc ! udpsink host=127.0.0.1 port=1234
And everything works fine; I can see the packets arriving from the audiotestsrc.
Now let's test with the webcam source:
gst-launch -v v4l2src device=/dev/video0 ! queue ! videoscale method=1 ! "video/x-raw-yuv,width=320,height=240" ! queue ! videorate ! "video/x-raw-yuv,framerate=(fraction)15/1" ! queue ! udpsink host=127.0.0.1 port=1234
And nothing happens; no packets appear in the dump.
Here is a log dump of what the verbose output shows on the server.
Does anyone have a clue on this?
Try these (you may have to install the gstreamer-ugly plugins for this one):
UDP streaming from Webcam (stream over the network)
gst-launch v4l2src device=/dev/video0 ! 'video/x-raw-yuv,width=640,height=480' ! x264enc pass=qual quantizer=20 tune=zerolatency ! rtph264pay ! udpsink host=127.0.0.1 port=1234
Receiving the UDP stream from the webcam (receive over the network):
gst-launch udpsrc port=1234 ! "application/x-rtp, payload=127" ! rtph264depay ! ffdec_h264 ! xvimagesink sync=false
Update
To determine the payload at the streaming end, simply use the verbose option: gst-launch -v ...
Maybe the packets are too large for UDP? UDP datagrams are limited to 64 KB. Try resizing the frames to a really small size to check whether this is the reason. If so, you may be interested in compression and payloaders/depayloaders (gst-inspect | grep pay).
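The size argument above can be checked with quick arithmetic: a single raw frame at the question's 320x240 resolution in a 2-bytes-per-pixel YUV format already exceeds what fits in one UDP datagram (65535 bytes minus IP/UDP headers, roughly 65507 bytes of payload):

```shell
# Why the raw webcam frames never arrive: one 320x240 YUY2 frame is
# width * height * 2 bytes, which already exceeds the maximum UDP payload.
FRAME=$((320 * 240 * 2))
UDP_MAX=65507
echo "frame=$FRAME bytes, udp_max=$UDP_MAX bytes"
if [ "$FRAME" -gt "$UDP_MAX" ]; then
  echo "one frame does not fit in one datagram -> encode and use an RTP payloader"
fi
```

This is why the working answers encode with x264enc and packetize with rtph264pay instead of pushing raw video into udpsink.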
gstreamer1-1.16.0-1.fc30
gst-launch-1.0 -v filesrc location=/.../.../.../sample-mp4-file.mp4 ! qtdemux ! h264parse ! queue ! rtph264pay config-interval=10 pt=96 ! udpsink port=8888 host=127.0.0.1
https://en.wikipedia.org/wiki/RTP_audio_video_profile
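A matching receiver for the sender above can be sketched in the same style as the receivers used earlier in this document; the port is taken from the sender, and payload=96 matches its pt=96 setting. The script only prints the command (a dry run), since it needs the sender running:

```shell
# Matching receiver sketch for the udpsink port=8888 sender above; the caps
# mirror the receivers used elsewhere in this thread (payload=96 from pt=96).
PIPE='udpsrc port=8888 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96"'
PIPE="$PIPE ! rtph264depay ! avdec_h264 ! autovideosink"
echo "gst-launch-1.0 -v $PIPE"   # dry run; needs the sender running to execute
```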