I'm very new to GStreamer, but after a lot of research I've managed to create my own working pipeline streaming a webcam over a network from a Raspberry Pi Zero to a PC via a UDP transport. I am pleased with my progress! :)
But I'm struggling to create a TCP transport...
This pipeline works perfectly over UDP (note: simplified using a test video source and JPEG encoding):
Server UDP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! udpsink host=192.168.2.13 port=7001
Client UDP (192.168.2.13):
gst-launch-1.0 udpsrc port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
...but when I use a TCP sink/source with exactly the same elements I receive nothing but errors.
The modified pipeline using tcpserversink and tcpclientsrc:
Server TCP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! tcpserversink port=7001
Client TCP (192.168.2.13):
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
Attempt 1: tcpserversink port=7001
ERROR: Failed to connect to host '192.168.2.1:7001': No connection could be made because the target machine actively refused it.
Attempt 2: tcpserversink host=localhost port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 3: tcpserversink host=127.0.0.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 4: tcpserversink host=192.168.2.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
Attempt 5: tcpserversink host=0.0.0.0 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
I figured I should be able to replace the src and sink elements without the pipeline breaking, so I must just be missing something.
I would be grateful for any light you could shed on this.
You can solve it in (at least) one of two ways. The underlying problem is that TCP is a byte stream with no packet boundaries, while RTP assumes a packet-oriented transport, so raw RTP packets get split and merged in transit and the depayloader can no longer find them. The first fix is to add the rtpstreampay element after the RTP payloader for your media type; it frames each RTP packet so the boundaries survive a stream transport.
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-rtpstreampay.html
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! rtpjpegpay \
! rtpstreampay \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! application/x-rtp-stream,encoding-name=JPEG \
! rtpstreamdepay \
! rtpjpegdepay \
! jpegdec \
! autovideosink
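Under the hood, rtpstreampay frames each RTP packet with a length prefix (RFC 4571) so the packet boundaries survive the TCP byte stream; rtpstreamdepay on the client strips the framing again before the normal RTP depayloading happens, which is why both elements must appear as a pair.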
The second way is to use a muxer rather than an RTP payloader, something like matroskamux, which is pretty generic.
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! matroskamux \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 \
! matroskademux \
! jpegdec \
! autovideosink
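This works because Matroska is a self-describing container: the codec and stream metadata travel in-band, so the client can typefind the raw TCP byte stream without any caps filter. The trade-off (worth measuring on your setup) is that a muxed container usually adds a bit more buffering and latency than the RTP framing route.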
You might also want to look into GstRtspServer if you want to do client/server RTP connections. A simple Python script like this will act as the server.
rtspserver.py
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GLib, GstRtspServer

Gst.init(None)
mainloop = GLib.MainLoop()

server = GstRtspServer.RTSPServer()

# the factory builds the pipeline for each client; the payloader
# must be named pay0 so the server can find it
factory = GstRtspServer.RTSPMediaFactory()
factory.set_launch((
    'videotestsrc is-live=true '
    '! jpegenc '
    '! rtpjpegpay name=pay0 pt=26'
))
# allow multiple connections
factory.set_shared(True)

mounts = server.get_mount_points()
mounts.add_factory('/live', factory)

# attach to the default main context; the server listens on port 8554
server.attach(None)
mainloop.run()
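To run it, you'll need the Python GObject bindings and the RTSP server introspection data; the package names below are for Debian/Ubuntu and may differ on other distros:
sudo apt install python3-gi gir1.2-gst-rtsp-server-1.0
python3 rtspserver.py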
And you can use a pipeline like this to view the output.
gst-launch-1.0 \
rtspsrc location=rtsp://localhost:8554/live latency=100 \
! rtpjpegdepay \
! jpegdec \
! autovideosink
Related
I'm trying to send a video stream over IPv6 with GStreamer (version 1.18.4) using udpsink and gst-launch-1.0, but it doesn't work: nothing is received on the client side.
My commands are:
client:
gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! xvimagesink sync=0
server:
gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! videoconvert ! 'video/x-raw,width=1280,height=720' ! x264enc ! h264parse ! rtph264pay ! udpsink host=fe80::2b37:54e4:1812:9169 port=5000 sync=0
Putting the IPv6 address into quotes ("") doesn't help either.
With IPv4 everything works:
gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! videoconvert ! 'video/x-raw,width=1280,height=720' ! x264enc ! h264parse ! rtph264pay ! udpsink host=192.168.0.176 port=5000 sync=0
Does anyone know how to solve this?
I need it for a VPN provider which supports only IPv6.
When using IPv6 link-local addressing, you must include the zone ID (see RFC 6874). Every IPv6 interface uses the same link-local network, so you must use the zone ID to distinguish the specific interface on the host.
Also, remember that packets with link-local addresses cannot leave the link; they cannot be routed or cross a layer-3 device.
You should really use IPv6 global addresses, or you could use IPv6 ULA addresses if you do not want the traffic to be usable on the public Internet. Remember that IPv6 can have addresses of each type (even multiple addresses of each type) on the same interface.
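As a sketch, on Linux the zone ID is the interface name appended to the address with a percent sign. Note that eth0 below is a placeholder for your actual interface (check with "ip -6 addr"), and whether udpsink accepts a zoned address depends on the GLib version underneath, so a global or ULA address remains the safer choice:
gst-launch-1.0 videotestsrc is-live=1 ! x264enc ! h264parse ! rtph264pay ! udpsink host=fe80::2b37:54e4:1812:9169%eth0 port=5000 sync=0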
Following a tutorial in Portuguese on YouTube, the commands I executed were:
Source:
gst-launch-1.0 videotestsrc \
! decodebin \
! x264enc \
! rtph264pay \
! udpsink host=localhost port=7001
Sink:
gst-launch-1.0 \
udpsrc port=7001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" \
! rtph264depay \
! decodebin \
! videoconvert \
! autovideosink
And it worked for him: a video was displayed over the UDP connection. In my case it doesn't show anything.
Results:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Got context from element 'autovideosink0': gst.gl.GLDisplay=context, gst.gl.GLDisplay=(GstGLDisplay)"\(GstGLDisplayX11\)\ gldisplayx11-0";
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Running on a fresh Ubuntu 20.04 install.
Fixed by changing host=localhost to host=127.0.0.1. Most likely localhost was resolving to the IPv6 loopback ::1, so udpsink was sending to an address the IPv4-bound udpsrc never saw.
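If you want to confirm what localhost resolves to on your machine, getent will enumerate it (standard on Linux):
getent ahosts localhost
If ::1 is listed first, the resolver prefers the IPv6 loopback, which explains why the explicit 127.0.0.1 fixes it.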
Janus has a feature to forward RTP, so I've made a listener with GStreamer using this command:
gst-launch-1.0 --gst-debug=4 rtpbin name=rtpbin -v udpsrc port=5104 caps="application/x-rtp, clock-rate=(int)90000, encoding-name=H264, payload=100" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
The question is: how can I show it in an HTML5 browser?
I need to live-stream audio over the Internet. After a quick search I decided to use GStreamer. I have streamed successfully using udpsink, but it only works on the LAN. I tested with tcpserversink but it is not working:
Server (IP: 113.160.166.87)
gst-launch-1.0 filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! decodebin ! mulawenc ! tcpserversink port=7001 host=0.0.0.0
Client:
gst-launch-1.0 tcpclientsrc host=113.160.166.87 port=7001 ! "audio/x-mulaw, channels=1, depth=16, width=16, rate=44100" ! mulawdec ! autoaudiosink
Can someone help me? Thank you.
Why are you encoding in the sender again? Can you try the following pipelines:
Sender:
gst-launch-1.0 -v filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! mpegaudioparse ! tcpserversink port=7001 host=0.0.0.0
Receiver:
gst-launch-1.0 tcpclientsrc port=7001 host=113.160.166.87 ! decodebin ! autoaudiosink
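This works because a TCP connection carries a plain byte stream with no caps attached: decodebin on the receiving end typefinds the MP3 data and builds a matching decoder chain automatically, so there is no need to transcode to mu-law on the sender.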
I am trying to stream ADPCM (G.726) 32 kbps audio from a host to a client through RTP.
I have tried the following commands from the client (receiver) and the host (sender). Both boards are connected over IP.
I am getting an "internal data flow error" at the receiver once I run the commands on the sender:
RECEIVER:
gst-launch -v udpsrc port=3004 caps="application/x-rtp" ! rtpg726depay ! ffdec_g726 ! alsasink
SENDER:
gst-launch -v autoaudiosrc ! audioconvert ! audioresample ! ffenc_g726 bitrate=32000 ! rtpg726pay ! udpsink host=192.168.1.104 port=3004
If I try the same with PCM (A-law encoder and decoder), then streaming works fine: I can hear the live audio (when I speak into the sender's microphone) on the receiver's speaker.
The commands I am running in this case:
RECEIVER:
gst-launch udpsrc port=3004 caps="application/x-rtp" ! rtppcmadepay ! alawdec ! alsasink
SENDER:
gst-launch autoaudiosrc ! audioconvert ! audioresample ! alawenc ! rtppcmapay ! udpsink host=192.168.1.104 port=3004
Does this seem to be an issue in the pipeline?
Here is the full error I am getting at the receiver:
root@am335x-evm:~/EVM4# gst-launch udpsrc port=3004 caps="application/x-rtp" ! rtpg726depay ! ffdec_g726 ! alsasink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2625): gst_base_src_loop ():
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 20828810227 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
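Not a definitive answer, but the "reason not-negotiated (-4)" in that log usually means the udpsrc caps are too sparse for the depayloader: PCMA has a static RTP payload type (8), so rtppcmadepay can negotiate from bare application/x-rtp caps, while G.726 uses a dynamic payload type and rtpg726depay needs the media, clock-rate and encoding-name fields spelled out. A sketch worth trying (the payload number 96 is an assumption; match whatever rtpg726pay reports with -v on the sender):
gst-launch -v udpsrc port=3004 caps="application/x-rtp, media=(string)audio, clock-rate=(int)8000, encoding-name=(string)G726, payload=(int)96" ! rtpg726depay ! ffdec_g726 ! alsasink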