Encode/decode VP8 or VP9 with GStreamer? - gstreamer

I'm trying to find a way to use VP8 or VP9 compressed video, part of Google's WebM project, with GStreamer.
Is there already a module that can handle VP8? If so, can I get a simple example of how to use it to broadcast/receive over RTP?
So far there is nothing in the official GStreamer documentation. They have Matroska support, but that seems to be only for demuxing the container.
Edit
There obviously are ways:
Server:
gst-launch-0.10 -v v4l2src ! video/x-raw-yuv,width=640,height=480 ! vp8enc ! rtpvp8pay ! udpsink host=127.0.0.1 port=9001
Client:
gst-launch-0.10 udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)VP8-DRAFT-IETF-01, payload=(int)96, ssrc=(uint)2990747501, clock-base=(uint)275641083, seqnum-base=(uint)34810" ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! autovideosink
But the latency is higher than I expected.

Yes, VP8 is already supported.
VP9 was missing an RTP payloader/depayloader; more on that below (no longer true: GStreamer 1.8 added support, details at the bottom).
Both are contained in the vpx plugin.
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good-plugins/html/gst-plugins-good-plugins-plugin-vpx.html
VP8:
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good-plugins/html/gst-plugins-good-plugins-vp8enc.html
http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-plugins-good-plugins/html/gst-plugins-good-plugins-vp8dec.html
Check with gst-inspect-1.0 vp8enc whether you have it.
For RTP you can use rtpvp8pay/rtpvp8depay together with vp8enc/vp8dec; for the WebM container there is webmmux, etc.
However, as Burak Arslan stated, the VP9 RTP payloader/depayloader was not ready at the time (not even in 1.6.1, I checked).
As for examples: post a pipeline using these elements and we can check it if it's not working :)
EDIT
GStreamer 1.8 was released with support for VP9: new elements rtpvp9pay/rtpvp9depay were added.
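For example, on 1.8+ a VP9 variant of the VP8 pipelines above should look roughly like this (a sketch, untested here; as with VP8, copy the exact caps from the server's -v output if the receiver refuses to negotiate):
Server:
gst-launch-1.0 -v v4l2src ! video/x-raw,width=640,height=480 ! videoconvert ! vp9enc deadline=1 ! rtpvp9pay ! udpsink host=127.0.0.1 port=9001
Client:
gst-launch-1.0 udpsrc port=9001 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)VP9, payload=(int)96" ! rtpvp9depay ! vp9dec ! videoconvert ! autovideosink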

Related

Using GStreamer, I can't find a solution to send AV1 video through udpsink in RTP packets

I'm currently working with GStreamer, and my goal is to take video from a camera (natively coded in h264), decode it, then encode it in AV1 and send it over UDP to another computer on the network.
My current pipelines are:
Server:
gst-launch-1.0 -v rtspsrc location= rtsp://192.168.33.104:8554/vis.0 latency=1 is-live=TRUE ! decodebin ! autovideoconvert ! x265enc tune=zerolatency bitrate=300 speed-preset=3 ! rtph265pay ! udpsink host=192.168.33.39 port=8123
Client:
gst-launch-1.0 udpsrc address=192.168.33.39 port=8123 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H265,payload=96 ! rtph265depay ! avdec_h265 ! autovideosink
So with h265 it works, but I cannot find out how to do it with AV1, because I can't find an rtpav1pay (and depay).
Thanks in advance.
I tried to search for rtpav1pay but found nothing. I tried rtpgstpay (and depay); it didn't work. The main goal is to use the network as little as possible without lag, so maybe it's not the best solution. If you have any other idea, please share it.
There are rtpav1pay and rtpav1depay elements provided by gst-plugins-rs; they can be built along with GStreamer if you enable the Rust plugins option, but you could also build them separately from their own repo (instructions in the README).
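For example, a sketch along the lines of your H.265 pipelines (untested; assumes av1enc and av1parse from gst-plugins-bad plus the Rust rtpav1pay/rtpav1depay are installed, and av1enc will likely need its speed/realtime settings tuned to keep up):
Server:
gst-launch-1.0 rtspsrc location=rtsp://192.168.33.104:8554/vis.0 latency=1 ! decodebin ! autovideoconvert ! av1enc ! av1parse ! rtpav1pay ! udpsink host=192.168.33.39 port=8123
Client:
gst-launch-1.0 udpsrc address=192.168.33.39 port=8123 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=AV1 ! rtpav1depay ! av1parse ! avdec_av1 ! autovideosink
(If your build has no avdec_av1, dav1ddec from gst-plugins-rs is an alternative decoder.)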

gstreamer: pass frame PTS in command line API

Currently I have a setup like this.
my-app | gst-launch-1.0 -e fdsrc ! \
videoparse format=GST_VIDEO_FORMAT_BGR width=640 height=480 ! \
videoconvert ! 'video/x-raw, format=I420' ! x265enc ! h265parse ! \
matroskamux ! filesink location=my.mkv
From my-app I am streaming raw BGR frame buffers to gst. How can I also pass presentation timestamps (PTSs) for those frames? I have somewhat full control over my-app. I can open other pipes to gst from it.
I know I have the option to use gstreamer C/C++ API or write a gstreamer plugin, but I was trying to avoid this.
I guess you can set a framerate on the videoparse element. You can also try do-timestamp=true on the fdsrc; maybe it requires a combination of both. For example:
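Something like this, based on your original pipeline (a sketch, untested; videoparse should also accept the short format nick bgr):
my-app | gst-launch-1.0 -e fdsrc do-timestamp=true ! \
videoparse format=bgr width=640 height=480 framerate=30/1 ! \
videoconvert ! 'video/x-raw, format=I420' ! x265enc ! h265parse ! \
matroskamux ! filesink location=my.mkv
Note that this only synthesizes timestamps from the framerate and/or arrival time; it does not carry the real PTS from my-app.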
If you have the PTS in my-app, you would probably need to wrap the buffers and PTS into real GstBuffers and use gdppay and gdpdepay as the payload format over the link.
For example if your my-app would dump the images in the following format:
https://github.com/GStreamer/gstreamer/blob/master/docs/random/gdp
(not sure how recent this info document is)
You could receive the data with the following pipeline:
fdsrc ! gdpdepay ! videoconvert ! ..
No need for resolution and format either, as they are part of the protocol too. And you will have the PTS as well, if it was set.
If you can use the GStreamer library in my-app, you could use a pipeline like this:
appsrc ! gdppay ! fakesink dump=true
And you would push your image buffers with their PTS to the appsrc.
See https://github.com/GStreamer/gst-plugins-bad/tree/master/gst/gdp for some examples how gdp is used as a protocol.
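To make that concrete, here is a minimal C sketch of the sending side (assumptions: 640x480 BGR frames and a receiver running the fdsrc ! gdpdepay pipeline above; error handling omitted; link against gstreamer-1.0 and gstreamer-app-1.0):

#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

/* Push one BGR frame with an explicit presentation timestamp. */
static void
push_frame (GstAppSrc *src, const guint8 *bgr, gsize size, GstClockTime pts)
{
  GstBuffer *buf = gst_buffer_new_allocate (NULL, size, NULL);
  gst_buffer_fill (buf, 0, bgr, size);
  GST_BUFFER_PTS (buf) = pts;          /* the PTS you want preserved */
  gst_app_src_push_buffer (src, buf);  /* takes ownership of buf */
}

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);
  /* gdppay serializes each buffer together with its caps and timestamps;
   * fdsink fd=1 writes the stream to stdout for the receiving process. */
  GstElement *pipeline = gst_parse_launch (
      "appsrc name=src is-live=true format=time "
      "caps=\"video/x-raw,format=BGR,width=640,height=480,framerate=0/1\" "
      "! gdppay ! fdsink fd=1", NULL);
  GstAppSrc *src = GST_APP_SRC (gst_bin_get_by_name (GST_BIN (pipeline), "src"));
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* ... for every frame: push_frame (src, data, 640 * 480 * 3, pts); ... */

  gst_app_src_end_of_stream (src);     /* signal EOS when done; in a real app,
                                        * wait for the EOS bus message here */
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (src);
  gst_object_unref (pipeline);
  return 0;
}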

gstreamer shmsrc and shmsink with h264 data

I am trying to share h264-encoded data from GStreamer with two other processes (both based on GStreamer). After some research, the only way I found is to use the shm plugin.
This is what I am trying to do:
gstreamer--->h264 encoder--->shmsink
shmsrc--->process1
shmsrc--->process2
I was able to get raw data from videotestsrc and a webcam working, but for h264-encoded data it doesn't work.
This is my test pipeline:
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,format=YUY2 ! x264enc ! shmsink socket-path=/tmp/foo sync=true wait-for-connection=false shm-size=10000000
gst-launch-1.0 shmsrc socket-path=/tmp/foo ! avdec_h264 ! video/x-raw,width=640,height=480,framerate=25/1,format=YUY2 ! autovideosink
Has anyone tried the shm plugins with h264-encoded data? Please help.
I am not aware of the capabilities of the sink picked by autovideosink, but to my knowledge you either need videoconvert if the format supported by the sink (like kmssink or ximagesink) differs from the one provided by the source (in your case YUY2), or videoparse if the camera format is supported by the sink. You can check the supported formats with gst-inspect-1.0.
Anyway, I am able to run your pipeline in my setup with some modifications, using videoconvert:
./gst-launch-1.0 videotestsrc ! x264enc ! shmsink socket-path=/tmp/foo sync=true wait-for-connection=false shm-size=10000000
./gst-launch-1.0 shmsrc socket-path=/tmp/foo ! h264parse ! avdec_h264 ! videoconvert ! ximagesink
You may modify it for the resolutions you want.
Kindly let me know if you face any issues with the above.
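Since the original goal was feeding two processes: shmsink with wait-for-connection=false accepts multiple clients, so (assuming that holds on your version) you can simply start the same receiving pipeline once per process:
./gst-launch-1.0 shmsrc socket-path=/tmp/foo ! h264parse ! avdec_h264 ! videoconvert ! ximagesink
./gst-launch-1.0 shmsrc socket-path=/tmp/foo ! h264parse ! avdec_h264 ! videoconvert ! ximagesink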

GStreamer: Could not switch codebooks: rtpvorbisdepay

I am trying to stream audio with the following GStreamer pipeline:
Server:
gst-launch-1.0 -v audiotestsrc ! audioconvert ! vorbisenc ! rtpvorbispay ! udpsink host=127.0.0.1 port=5000
Client:
gst-launch-1.0 udpsrc port=5000 ! "application/x-rtp, media=audio, clock-rate=44100, encoding-name=VORBIS, encoding-params=1, payload=96" ! rtpvorbisdepay ! vorbisdec ! audioconvert ! autoaudiosink
I get the following message from GStreamer:
WARNING: from element /GstPipeline:pipeline0/GstRtpVorbisDepay:rtpvorbisdepay0: Could not decode stream.
Additional debug info: gstrtpvorbisdepay.c(614): gst_rtp_vorbis_depay_process (): /GstPipeline:pipeline0/GstRtpVorbisDepay:rtpvorbisdepay0: Could not switch codebooks
And I don't get any sound on the client. Can anyone help?
[EDIT:]
When I copy-paste the caps from the server side... it works! But among those caps there is a configuration parameter which looks really ugly (link here). I noticed that if I just delete this parameter, it doesn't work anymore. Moreover, I used gst-inspect on the udpsrc and rtpvorbisdepay elements and there is nothing about this parameter. Can someone explain to me what this parameter corresponds to? Is there a way to avoid it?
I think this is a Theora/Vorbis thing: those are configuration parameters for the initialization of the decoder, if I understand it properly.
Theora makes the same controversial design decision that Vorbis made to include the entire probability model for the DCT coefficients and all the quantization parameters in the bitstream headers. This is often several hundred fields. It is therefore impossible to decode any frame in the stream without having previously fetched the codec info and codec setup headers.
~ from here
some similar question
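If you want to avoid hand-copying the caps altogether, one option is to let a session protocol carry them: RTSP transports the configuration in the SDP. A sketch using the test-launch example from gst-rtsp-server (assuming you have it built):
./test-launch "( audiotestsrc ! audioconvert ! vorbisenc ! rtpvorbispay name=pay0 pt=96 )"
gst-launch-1.0 playbin uri=rtsp://127.0.0.1:8554/test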

How to stream in h265 using gstreamer?

I am trying to use the latest GStreamer and x265enc together. I saw that someone has already posted some commits in http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/log/ext/x265/gstx265enc.c
Can anyone please give an example pipeline where it is known to work (a gst-launch-1.0 pipeline example would be very helpful)?
1)
What is the current status of the x265enc plugin for GStreamer? Does it really work?
Which branch of GStreamer do I need to use to build x265enc? I want to build the whole GStreamer source tree so that it is compatible with the x265enc plugin.
What are the system requirements for x265enc, and how do I build it? Any wiki/basic instructions would be very helpful.
My goal is to broadcast my IP cameras (h264 streams) as an h265 stream on vaughnlive.tv.
Currently, I am using the following pipeline to broadcast in h264 format:
GST_DEBUG=2 gst-launch-1.0 flvmux name=mux streamable=true ! \
rtmpsink sync=true location="rtmp://xxxxxxxxxxxx" \
rtspsrc location="rtsp://xxxxxxx" caps="application/x-rtp, media=(string)audio, clock-rate=(int)90000, encoding-name=(string)MPA, payload=(int)96" ! \
rtpmpadepay ! mpegaudioparse ! queue ! mad ! audioconvert ! queue ! \
voaacenc bitrate=128000 ! aacparse ! audio/mpeg,mpegversion=4,stream-format=raw ! mux. \
rtspsrc location="rtsp://xxxxxxx" caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" ! \
rtph264depay ! video/x-h264,stream-format=avc,alignment=au,byte-stream=false ! queue ! \
decodebin ! queue ! videorate ! "video/x-raw,framerate=30/1" ! queue ! \
x264enc threads=4 speed-preset=ultrafast bitrate=3072 ! mux.
2)
Can anyone please suggest how I should change this pipeline to broadcast in h265 format using the x265enc element?
A little late, but maybe some people will find this question when looking for info about H.265 support in GStreamer nowadays. This is with GStreamer 1.6.1 compiled from source on Ubuntu 15.10, which has packages ready for libx265.
1,
Encoder
There is x265enc, which will be enabled when the libx265-dev library is present.
The encoder is inside gst-plugins-bad, so after running autogen.sh you should see x265enc enabled.
You may also need h265parse and rtph265pay/rtph265depay; see the example pipelines right below.
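For example, a minimal RTP loopback sketch (untested here; assumes your build provides x265enc, h265parse and rtph265pay/rtph265depay):
Server:
gst-launch-1.0 videotestsrc ! x265enc ! h265parse ! rtph265pay ! udpsink host=127.0.0.1 port=5000
Client:
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H265, payload=(int)96" ! rtph265depay ! h265parse ! avdec_h265 ! autovideosink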
Decoder
I see two decoders; I don't know which one works. I guess libde265dec, and there is also avdec_h265.
mux
For muxing with x264 I was using mpegtsmux, but this does not support video/x-h265; some work has to be done. matroskamux should be working when using filesink etc.
[16:39] hi, which container is suitable for x265enc, for x264enc I was using mpegtsmux?
[16:54] otopolsky: mpegts would work if you add support for h265 there, not very difficult
[16:55] slomo_: so we need to just add the caps compatibility?
[16:55] otopolsky: otherwise, matroskamux supports it. mp4mux/qtmux could get support for it relatively easily too
[16:55] otopolsky: a bit more than that. look at what tsdemux does for h265
[16:56] otopolsky: and check the gst_mpegts_descriptor_from_registration related code in tsmux
[17:00] slomo_: thanks
2,
The flvmux in question also does not support h265, only h264.
matroskamux cannot be used for streaming, so the only way is to patch mpegtsmux or flvmux etc.
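EDIT: later GStreamer releases did add H.265 support to mpegtsmux, so on a recent version a sketch like
gst-launch-1.0 videotestsrc ! x265enc ! h265parse ! mpegtsmux ! udpsink host=127.0.0.1 port=5000
should stream without patching anything; with 1.6.1 the statement above still applies.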