GStreamer udpsink with IPv6 host doesn't work

I'm trying to send a video stream over IPv6 with GStreamer (version 1.18.4) using udpsink and gst-launch-1.0, but it doesn't work: nothing is received on the client side.
My commands are:
client:
gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! rtph264depay ! avdec_h264 ! xvimagesink sync=0
server:
gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! videoconvert ! 'video/x-raw,width=1280,height=720' ! x264enc ! h264parse ! rtph264pay ! udpsink host=fe80::2b37:54e4:1812:9169 port=5000 sync=0
Putting the IPv6 address in double quotes doesn't help either.
With IPv4 everything works:
gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! videoconvert ! 'video/x-raw,width=1280,height=720' ! x264enc ! h264parse ! rtph264pay ! udpsink host=192.168.0.176 port=5000 sync=0
Does anyone know how to solve this?
I need it for a VPN provider that supports only IPv6.

When using IPv6 Link-Local addressing, you must include the Zone ID (see RFC 6874). Every IPv6 interface uses the same Link-Local network, so the Zone ID is what identifies the specific interface on the host.
Also, remember that packets with Link-Local addresses cannot leave the link; they cannot be routed or cross a layer-3 device.
You should really use IPv6 Global addresses, or IPv6 ULA addresses if you do not want the traffic to be routable on the public Internet. Remember that an interface can have addresses of each type (even multiple addresses of each type) at the same time.
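For example (a sketch only; eth0 is an assumed interface name, and whether udpsink's host property accepts the %zone suffix depends on your GStreamer/GLib version), the Zone ID is appended to the link-local address after a % sign:
gst-launch-1.0 videotestsrc is-live=1 ! video/x-raw,width=1280,height=720 ! timeoverlay valignment=4 halignment=1 ! videoconvert ! 'video/x-raw,width=1280,height=720' ! x264enc ! h264parse ! rtph264pay ! udpsink host="fe80::2b37:54e4:1812:9169%eth0" port=5000 sync=0
If that syntax is rejected, switching to a Global or ULA address as described above avoids the Zone ID entirely.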

Related

how to make rtsp server with gstreamer for h.265

I want to build a server with GStreamer that receives pushed RTSP streams, with video encoded in H.265 and audio in AAC, from multiple cameras. Clients should also be able to pull RTSP streams from this server.
Could you add something to this code, or check whether it is correct?
gst-launch-1.0 udpsrc uri=udp://127.0.0.1:5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96, seqnum-offset=(uint)27727, timestamp-offset=(uint)1713951204, ssrc=(uint)2573237941, a-framerate=(string)30" ! rtph265depay ! h265parse ! vaapidecode ! vaapipostproc ! vaapisink
You can use the Live555 RTSP server:
http://www.live555.com/mediaServer/
You can also create a GStreamer plugin for the same purpose.
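If you take the GStreamer route, a minimal sketch using the test-launch example binary that ships with gst-rtsp-server (assuming you have built it and have an H.265 encoder such as x265enc available; replace videotestsrc with your real source) could be:
./test-launch "( videotestsrc ! x265enc ! rtph265pay name=pay0 pt=96 )"
Clients can then pull the stream from rtsp://<server-ip>:8554/test.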

Janus RTP_Forward, Gstreamer and streaming it into html5 browser

Janus has a feature to forward RTP, so I've made a listener with GStreamer using this command:
gst-launch-1.0 --gst-debug=4 rtpbin name=rtpbin -v udpsrc port=5104 caps="application/x-rtp, clock-rate=(int)90000, encoding-name=H264, payload=100" ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
The question is: how can I show it in an HTML5 browser?

Gstreamer: udpsink/udpsrc versus tcpserversink/tcpclientsrc

I'm very new to GStreamer, but after a lot of research I've now managed to create my own working pipeline streaming a webcam over a network from a Raspberry Pi Zero to a PC via a UDP transport. I am pleased with my progress! :)
But I'm struggling to create a TCP transport...
This pipeline works perfectly over UDP:
(note: simplified using a test video source and JPEG encoding):
Server UDP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! udpsink host=192.168.2.13 port=7001
Client UDP (192.168.2.13):
gst-launch-1.0 udpsrc port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
...but when I use a TCP sink/source with exactly the same elements I receive nothing but errors.
The modified pipeline using tcpserversink and tcpclientsrc:
Server TCP (192.168.2.1):
gst-launch-1.0 videotestsrc is-live=true ! jpegenc ! rtpjpegpay ! tcpserversink port=7001
Client TCP (192.168.2.13):
gst-launch-1.0 tcpclientsrc host=192.168.2.1 port=7001 ! application/x-rtp,encoding-name=JPEG ! rtpjpegdepay ! jpegdec ! autovideosink
Attempt 1: tcpserversink port=7001
ERROR: Failed to connect to host '192.168.2.1:7001': No connection could be made because the target machine actively refused it.
Attempt 2: tcpserversink host=localhost port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 3: tcpserversink host=127.0.0.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Could not open resource for reading.
Attempt 4: tcpserversink host=192.168.2.1 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
Attempt 5: tcpserversink host=0.0.0.0 port=7001
ERROR: from element /GstPipeline:pipeline0/GstTCPClientSrc:tcpclientsrc0: Internal data stream error.
I figured I should be able to replace src & sink elements without the pipeline breaking so I must just be missing something.
I would be grateful for any light you could shed on this.
You can solve it one of two ways (at least). The first is to add the rtpstreampay element after the RTP payloader for your media type.
https://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good/html/gst-plugins-good-plugins-rtpstreampay.html
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! rtpjpegpay \
! rtpstreampay \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc port=7001 \
! application/x-rtp-stream,encoding-name=JPEG \
! rtpstreamdepay \
! rtpjpegdepay \
! jpegdec \
! autovideosink
The second way is to use a muxer rather than an rtp payloader, something like matroskamux which is pretty generic.
server:
gst-launch-1.0 videotestsrc is-live=true \
! jpegenc \
! matroskamux \
! tcpserversink port=7001
client:
gst-launch-1.0 tcpclientsrc port=7001 \
! matroskademux \
! jpegdec \
! autovideosink
You might also want to look into GstRtspServer if you want to do client/server RTP connections. A simple Python script like this will act as the server.
rtspserver.py
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GLib, GstRtspServer

Gst.init(None)
mainloop = GLib.MainLoop()

server = GstRtspServer.RTSPServer()
factory = GstRtspServer.RTSPMediaFactory()
# the launch string must end in a payloader named pay0
factory.set_launch((
    'videotestsrc is-live=true '
    '! jpegenc '
    '! rtpjpegpay name=pay0 pt=26'
))
# allow multiple clients to share the same pipeline
factory.set_shared(True)
# serve the stream at rtsp://<host>:8554/live
mounts = server.get_mount_points()
mounts.add_factory('/live', factory)
server.attach(None)
mainloop.run()
And you can use a pipeline like this to view the output.
gst-launch-1.0 \
rtspsrc location=rtsp://localhost:8554/live latency=100 \
! rtpjpegdepay \
! jpegdec \
! autovideosink

how to stream audio with tcpserversink using gstreamer

I need to live-stream audio over the Internet. After a quick search I decided to use GStreamer. I have streamed successfully using udpsink, but it only works on a LAN. I tested with tcpserversink but it is not working:
Server (IP: 113.160.166.87)
gst-launch-1.0 filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! decodebin ! mulawenc ! tcpserversink port=7001 host=0.0.0.0
Client:
gst-launch-1.0 tcpclientsrc host=113.160.166.87 port=7001 ! "audio/x-mulaw, channels=1, depth=16, width=16, rate=44100" ! mulawdec ! autoaudiosink
Can someone help me? Thank you.
Why are you encoding in the sender again? Can you try the following pipelines?
Sender:
gst-launch-1.0 -v filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! audioparse ! tcpserversink port=7001 host=0.0.0.0
Receiver:
gst-launch-1.0 tcpclientsrc port=7001 host=113.160.166.87 ! decodebin ! autoaudiosink
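If that doesn't negotiate, another option (a sketch borrowing the matroskamux approach shown earlier on this page, not tested against this exact setup) is to mux the MP3 stream on the sender and let decodebin handle demuxing and decoding on the receiver:
Sender:
gst-launch-1.0 filesrc location="G:/Project/Gstreamer/TestContent/Em-Gai-Mua-Huong-Tram.mp3" ! mpegaudioparse ! matroskamux ! tcpserversink port=7001 host=0.0.0.0
Receiver:
gst-launch-1.0 tcpclientsrc host=113.160.166.87 port=7001 ! decodebin ! autoaudiosink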

GStreamer Launch RTSP Server for ReStreaming IP Camera H264

I am going to use multiple clients on different computers to view the video of an IP camera stream URL. Because the IP camera limits the number of connected clients, I want to set up a streamer for this purpose. I googled and tried GStreamer with different command-line options, but have not been successful yet.
Here is a test command line:
gst-launch-1.0 rtspsrc location="rtsp://root:root@192.168.1.1/axis-media/media.amp?videocodec=h264&resolution=320x240&fps=10&compression=50" latency=10 ! rtph264depay ! h264parse ! tcpserversink host=127.0.0.1 port=5100 -e
But when I test it with VLC, nothing plays. Is it related to the SDP? Can GStreamer restream the SDP from the source?
After finding the correct command line, I want to integrate it into a C# application to automate this process.
Any help is welcome.
You need gst-rtsp-server. To use it you have to write a small C/C++ application (there are examples in the gst-rtsp-server source tree).
Update: if your RTSP source provides an H.264 video stream, you can use the following pipeline to restream it without transcoding:
rtspsrc location=rtsp://example.com ! rtph264depay ! h264parse ! rtph264pay name=pay0 pt=96
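As a concrete sketch (assuming the test-launch example binary from gst-rtsp-server; rtsp://example.com stands in for your camera URL), that launch string is used like this:
./test-launch "( rtspsrc location=rtsp://example.com ! rtph264depay ! h264parse ! rtph264pay name=pay0 pt=96 )"
Clients, including VLC, can then connect to rtsp://<server-ip>:8554/test.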
To re-stream H.264 video from an IP camera, below is the GStreamer pipeline (this works for me):
rtspsrc location=rtsp://IP_CAMERA_URL ! rtph264depay ! video/x-h264, stream-format=byte-stream ! h264parse ! rtph264pay ! application/x-rtp,media=video,encoding-name=H264,payload=96 ! yoursink
Output of gst-launch-1.0 --version:
gst-launch-1.0 version 1.14.5
GStreamer 1.14.5