Following a Portuguese-language tutorial on YouTube, these are the commands I executed:
Source:
gst-launch-1.0 videotestsrc \
! decodebin \
! x264enc \
! rtph264pay \
! udpsink host=localhost port=7001
Sink:
gst-launch-1.0 \
udpsrc port=7001 caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
! rtph264depay \
! decodebin \
! videoconvert \
! autovideosink
It worked for him: the video was displayed over the UDP connection. In my case, nothing is shown.
Results:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Setting pipeline to PLAYING ...
New clock: GstSystemClock
I am running a fresh install of Ubuntu 20.04.
Fixed by changing host=localhost to host=127.0.0.1 — likely because localhost resolved to the IPv6 address ::1 while the receiving udpsrc was listening on IPv4 only.
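A quick way to check how localhost resolves on a given machine (a minimal sketch; the exact addresses returned depend on /etc/hosts and resolver configuration):

```python
import socket

# Ask the resolver for every address that "localhost" maps to.
# If an IPv6 entry (::1) comes back first, a sender that takes the
# first result may transmit over IPv6 while the receiver listens on
# IPv4 only — and the packets silently go nowhere.
results = socket.getaddrinfo("localhost", 7001, proto=socket.IPPROTO_UDP)
for family, _, _, _, sockaddr in results:
    label = "IPv6" if family == socket.AF_INET6 else "IPv4"
    print(label, sockaddr[0])
```

Using the literal 127.0.0.1 sidesteps name resolution entirely, which is why it fixes the problem.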
Related
I am trying to stream an mp4 over an RTSP URL and read it from another terminal, but I am facing a Service Unavailable (503) error.
Server Code (from a reference example):
./test-launch "filesrc location=./sample.mp4 \
! qtdemux \
! h264parse \
! decodebin \
! videoconvert \
! omxh264enc insert-sps-pps=true bitrate=16000000 \
! rtph264pay name=pay0"
Server Response :
stream ready at rtsp://127.0.0.1:8554/test
Client Code :
sudo gst-launch-1.0 -v rtspsrc location=rtsp://127.0.0.1:8554/test ! rtph264depay ! h264parse ! decodebin ! videoconvert ! autovideosink sync=false
Client Response :
Setting pipeline to PAUSED ...
error: XDG_RUNTIME_DIR not set in the environment.
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://127.0.0.1:8554/test
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0: Unhandled error
Additional debug info:
gstrtspsrc.c(6161): gst_rtspsrc_send (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0:
Service Unavailable (503)
ERROR: pipeline doesn't want to preroll.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
Janus has a feature to forward RTP, so I've made a listener with GStreamer using this command:
gst-launch-1.0 --gst-debug=4 rtpbin name=rtpbin -v udpsrc port=5104 caps="application/x-rtp, clock-rate=(int)90000, encoding-name=H264, payload=100" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
The question is: how can I show it in an HTML5 browser?
I'm very new to gstreamer but after a lot of research I've now managed to create my own working pipeline streaming a webcam over a network from a Raspberry PI Zero to a PC via a UDP transport. I am pleased with my progress! :)
But I'm struggling to create a TCP transport...
This pipeline works perfectly over UDP:
(note: simplified using a test video source and JPEG encoding):
Server UDP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! udpsink host=192.168.2.13 port=7001
Client UDP (192.168.2.13):
gst-launch-1.0 udpsrc port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
...but when I use a TCP sink/source with exactly the same elements I receive nothing but errors.
The modified pipeline using tcpserversink and tcpclientsrc:
Server TCP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! tcpserversink port=7001
Client TCP (192.168.2.13):
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
Attempt 1: tcpserversink port=7001
ERROR: Failed to connect to host '192.168.2.1:7001': No connection could be made because the target machine actively refused it.
Attempt 2: tcpserversink host=localhost port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 3: tcpserversink host=127.0.0.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 4: tcpserversink host=192.168.2.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
Attempt 5: tcpserversink host=0.0.0.0 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
I figured I should be able to replace src & sink elements without the pipeline breaking so I must just be missing something.
I would be grateful for any light you could shed on this.
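The root cause of the errors above is that RTP is packet-oriented while TCP is a byte stream: a TCP receiver gets an unframed run of bytes with no record of where one RTP packet ends and the next begins, so a depayloader fed straight from tcpclientsrc cannot negotiate. A minimal sketch of the boundary loss, using plain sockets rather than GStreamer:

```python
import socket

# Two "RTP packets" written to a TCP-style stream socket...
a, b = socket.socketpair(socket.AF_UNIX, socket.SOCK_STREAM)
a.sendall(b"PACKET-ONE")
a.sendall(b"PACKET-TWO")

# ...arrive as one undifferentiated byte run: the receiver cannot
# tell where the first packet ends without explicit framing.
data = b""
while len(data) < 20:
    data += b.recv(1024)
print(data)  # b'PACKET-ONEPACKET-TWO'

a.close()
b.close()
```

UDP preserves datagram boundaries for free, which is why the original pipeline works there and breaks on TCP.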
You can solve it in one of two ways (at least). The first is to add the rtpstreampay element after the RTP payloader for your media type.
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-rtpstreampay.html
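rtpstreampay supplies the framing the TCP byte stream lacks: each RTP packet is prefixed with a 16-bit big-endian length field (the RFC 4571 scheme), so the receiver's rtpstreamdepay can recover packet boundaries. A rough sketch of that framing:

```python
import struct

def frame(packet: bytes) -> bytes:
    # RFC 4571: 2-byte big-endian length, then the packet itself.
    return struct.pack(">H", len(packet)) + packet

def deframe(stream: bytes) -> list:
    # Walk the byte stream, peeling off one length-prefixed packet at a time.
    packets, pos = [], 0
    while pos + 2 <= len(stream):
        (length,) = struct.unpack_from(">H", stream, pos)
        packets.append(stream[pos + 2 : pos + 2 + length])
        pos += 2 + length
    return packets

stream = frame(b"rtp-pkt-1") + frame(b"rtp-pkt-2")
print(deframe(stream))  # [b'rtp-pkt-1', b'rtp-pkt-2']
```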
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! rtpjpegpay \
! rtpstreampay \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! application/x-rtp-stream,encoding-name=JPEG \
! rtpstreamdepay \
! rtpjpegdepay \
! jpegdec \
! autovideosink
The second way is to use a muxer rather than an rtp payloader, something like matroskamux which is pretty generic.
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! matroskamux \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! matroskademux \
! jpegdec \
! autovideosink
You might also want to look into GstRtspServer if you want client/server RTP connections. A simple Python script like this will act as the server.
rtspserver.py
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GLib, GstRtspServer

Gst.init(None)
mainloop = GLib.MainLoop()

server = GstRtspServer.RTSPServer()
factory = GstRtspServer.RTSPMediaFactory()
# The launch string must end in a payloader named pay0.
factory.set_launch((
    'videotestsrc is-live=true '
    '! jpegenc '
    '! rtpjpegpay name=pay0 pt=26'
))
# allow multiple connections
factory.set_shared(True)

mounts = server.get_mount_points()
mounts.add_factory('/live', factory)
server.attach(None)
mainloop.run()
And you can use a pipeline like this to view the output.
gst-launch-1.0 \
rtspsrc location=rtsp://localhost:8554/live latency=100 \
! rtpjpegdepay \
! jpegdec \
! autovideosink
I need to live-stream audio over the Internet. After a quick search I decided to use GStreamer. I have streamed successfully using udpsink, but it only works on the LAN. I tested with tcpserversink, but it is not working:
Server (IP: 113.160.166.87)
gst-launch-1.0 filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! decodebin ! mulawenc ! tcpserversink port=7001 host=0.0.0.0
Client:
gst-launch-1.0 tcpclientsrc host=113.160.166.87 port=7001 ! "audio/x-mulaw, channels=1, depth=16, width=16, rate=44100" ! mulawdec ! autoaudiosink
Can someone help me? Thank you.
Why are you encoding again in the sender? Can you try the following pipeline?
Sender:
gst-launch-1.0 -v filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! mpegaudioparse ! tcpserversink port=7001 host=0.0.0.0
Receiver:
gst-launch-1.0 tcpclientsrc port=7001 host=113.160.166.87 ! decodebin ! autoaudiosink
I am trying to stream ADPCM (G726) 32 kbps audio from a host to a client through RTP.
I have tried the following commands from the client (receiver) and the host (sender). Both boards are connected through IP.
I am getting "internal data flow error" at the receiver once I run the commands on the sender:
RECEIVER:
gst-launch -v udpsrc port=3004 caps="application/x-rtp" ! rtpg726depay ! ffdec_g726 ! alsasink
SENDER:
gst-launch -v autoaudiosrc ! audioconvert ! audioresample ! ffenc_g726 bitrate=32000 ! rtpg726pay ! udpsink host=192.168.1.104 port=3004
If I try the same with PCM (alaw encoder and decoder), then streaming works fine. I can hear the live audio (when I speak into the sender's microphone) through the receiver's speakerphone.
The commands I am running in this case:
RECEIVER:
gst-launch udpsrc port=3004 caps="application/x-rtp" ! rtppcmadepay ! alawdec ! alsasink
SENDER:
gst-launch autoaudiosrc ! audioconvert ! audioresample ! alawenc ! rtppcmapay ! udpsink host=192.168.1.104 port=3004
Does this seem to be an issue in the pipeline?
Here is the full error I am getting at the receiver:
///////////////
root@am335x-evm:~/EVM4# gst-launch udpsrc port=3004 caps="application/x-rtp" ! rtpg726depay ! ffdec_g726 ! alsasink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2625): gst_base_src_loop ():
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 20828810227 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...