How to stream in h265 using gstreamer?

I am trying to use the latest gstreamer and x265enc together. I saw that someone has already posted some commits in http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/log/ext/x265/gstx265enc.c
Can anyone please give an example pipeline where it is known to work? (A gst-launch-1.0 pipeline example would be very helpful.)
1)
What is the current status of the x265enc plugin for gstreamer? Does it really work?
Which branch of gstreamer do I need to use to build x265enc? I want to build the whole gstreamer source tree so that it is compatible with the x265enc plugin.
What are the system requirements for x265enc, and how do I build it? Any wiki/basic instructions would be very helpful.
My goal is to broadcast my ip cameras (h264 streams) as an h265 stream on vaughnlive.tv
Currently, I am using the following pipeline to broadcast in h264 format:
GST_DEBUG=2 gst-launch-1.0 flvmux name=mux streamable=true ! \
  rtmpsink sync=true location="rtmp://xxxxxxxxxxxx" \
  rtspsrc location="rtsp://xxxxxxx" caps="application/x-rtp, media=(string)audio, clock-rate=(int)90000, encoding-name=(string)MPA, payload=(int)96" ! \
  rtpmpadepay ! mpegaudioparse ! queue ! mad ! audioconvert ! queue ! \
  voaacenc bitrate=128000 ! aacparse ! audio/mpeg,mpegversion=4,stream-format=raw ! mux. \
  rtspsrc location="rtsp://xxxxxxx" caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264" ! \
  rtph264depay ! video/x-h264,stream-format=avc,alignment=au,byte-stream=false ! queue ! \
  decodebin ! queue ! videorate ! "video/x-raw,framerate=30/1" ! queue ! \
  x264enc threads=4 speed-preset=ultrafast bitrate=3072 ! mux.
2)
Can anyone please suggest how I should change this pipeline to broadcast in h265 format using the x265enc element?

A little late, but maybe some people will find this question when looking for info about H.265 support in gstreamer nowadays. This is with gstreamer 1.6.1 compiled from source on Ubuntu 15.10, which has packages ready for libx265.
1,
Encoder
There is x265enc, which will be enabled when the libx265-dev library is installed.
The encoder is inside gst-plugins-bad, so after running autogen.sh you should see x265enc listed as enabled.
You may also need h265parse and rtph265pay/rtph265depay.
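For instance, a minimal RTP/UDP sender sketch (a sketch only, assuming x265enc, h265parse and rtph265pay were all built; host and port are placeholders):
gst-launch-1.0 videotestsrc ! videoconvert ! x265enc bitrate=1024 ! h265parse ! rtph265pay ! udpsink host=127.0.0.1 port=5000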
Decoder
I see two decoders but don't know which one is working; I guess libde265dec, and there is also avdec_h265.
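A matching receiver sketch for the sender above (avdec_h265 assumed here; libde265dec could be swapped in to test the other decoder):
gst-launch-1.0 udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H265,payload=96" ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! autovideosink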
Mux
For muxing x264 I was using mpegtsmux, but it does not support video/x-h265; some work has to be done. matroskamux should be working when using filesink etc.
[16:39] hi, which container is suitable for x265enc? for x264enc I was using mpegtsmux
[16:54] otopolsky: mpegts would work if you add support for h265 there, not very difficult
[16:55] slomo_: so we need to just add the caps compatibility?
[16:55] otopolsky: otherwise, matroskamux supports it. mp4mux/qtmux could get support for it relatively easily too
[16:55] otopolsky: a bit more than that. look at what tsdemux does for h265
[16:56] otopolsky: and check the gst_mpegts_descriptor_from_registration related code in tsmux
[17:00] slomo_: thanks
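Meanwhile, writing H.265 to a file with matroskamux works; a minimal sketch (the file name is a placeholder):
gst-launch-1.0 -e videotestsrc num-buffers=300 ! videoconvert ! x265enc ! h265parse ! matroskamux ! filesink location=test.mkv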
2,
The flvmux from the question also does not support h265, only h264.
matroskamux cannot be used for streaming, so the only way is to patch mpegtsmux or flvmux etc.
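Once mpegtsmux gains H.265 support (newer GStreamer releases have since added it), the video branch of the original pipeline could be reworked roughly like this sketch, streaming over UDP instead of RTMP since flvmux cannot carry h265:
gst-launch-1.0 rtspsrc location="rtsp://xxxxxxx" ! rtph264depay ! decodebin ! queue ! videorate ! "video/x-raw,framerate=30/1" ! queue ! \
  x265enc speed-preset=ultrafast bitrate=3072 ! h265parse ! mpegtsmux ! udpsink host=127.0.0.1 port=5000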

Related

Using Gstreamer, I can't find a solution to send AV1 video through udpsink in RTP packets

I'm currently working on Gstreamer and my goal is to take video from a camera (coded natively in h264), decode it, then encode it in AV1 and send it over UDP to another computer on the network.
My current pipelines are:
Server :
gst-launch-1.0 -v rtspsrc location=rtsp://192.168.33.104:8554/vis.0 latency=1 is-live=TRUE ! decodebin ! autovideoconvert ! \
  x265enc tune=zerolatency bitrate=300 speed-preset=3 ! rtph265pay ! udpsink host=192.168.33.39 port=8123
Client :
gst-launch-1.0 udpsrc address=192.168.33.39 port=8123 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H265,payload=96 ! rtph265depay ! avdec_h265 ! autovideosink
So with h265 it works, but I cannot figure out how to do it with AV1 because I can't find an rtpav1pay (and depay).
Thanks in advance.
I tried to search for rtpav1pay but found nothing. I tried rtpgstpay (and depay), but that didn't work. The main goal is to use as little network bandwidth as possible without lag, so maybe this is not the best solution. If you have any other ideas, please share them.
There are rtpav1pay and rtpav1depay plugins provided by gst-plugins-rs; they can be built along with GStreamer if you enable the Rust plugins option, but you could also build them separately from their own repo (instructions on the README).
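For example, a loopback sketch with those payloaders (assuming av1enc from the AOM plugin and av1parse in gst-plugins-bad, plus avdec_av1 from libav, are also available; dav1ddec from gst-plugins-rs would work as the decoder too):
gst-launch-1.0 videotestsrc ! videoconvert ! av1enc ! av1parse ! rtpav1pay ! udpsink host=192.168.33.39 port=8123
gst-launch-1.0 udpsrc port=8123 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=AV1,payload=96 ! rtpav1depay ! av1parse ! avdec_av1 ! videoconvert ! autovideosink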

gstreamer playbin3 to kinesis pipeline: audio stream missing

Firstly, big thanks to the gstreamer community for your excellent software.
I'm trying to use gstreamer to consume a DASH/HLS/MSS stream (using playbin3) and restream it to AWS Kinesis Video:
gst-launch-1.0 -v -e \
playbin3 uri=https://dash.akamaized.net/dash264/TestCasesUHD/2b/2/MultiRate.mpd \
video-sink="videoconvert ! x264enc bframes=0 key-int-max=45 bitrate=2048 ! queue ! kvssink name=kvss stream-name=\"test_stream\" access-key=${AWS_ACCESS_KEY_ID} secret-key=${AWS_SECRET_ACCESS_KEY}" \
audio-sink="audioconvert ! audioresample ! avenc_aac ! kvss."
After much experimentation I decided against using uridecodebin3, as it does not handle the incoming stream as completely as playbin3.
The above command results in a video stream on KVS, but the audio is missing. I tried moving the kvssink out of the video-sink description and referencing it as kvss. in both sinks, but that fails to link.
I can create separate KVS streams for the audio and video, but would prefer them to be muxed.
Does anyone know if this is even possible? I'm open to other stacks for this.
SOLVED
Just posting back here in case anyone else comes across this problem.
I've got this working using streamlink to restream locally over http:
streamlink <streamUrl> best --player-external-http --player-external-http-port <httpport>
Then using the java JNI bindings for gstreamer to run this pipeline:
kvssink name=kvs stream-name=<streamname> access-key=<awskey> secret-key=<awssecret> aws-region=<awsregion> \
  uridecodebin3 uri=http://localhost:<port> name=d \
  d. ! queue2 ! videoconvert ! videorate ! x264enc bframes=0 key-int-max=45 bitrate=2048 tune=zerolatency ! queue2 ! kvs. \
  d. ! queue2 ! audioconvert ! audioresample ! avenc_aac ! queue2 ! kvs.
I needed to use Java to pause and restart the stream on buffering discontinuities so as not to break the stream.
Files arrive in KVS complete with audio.
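For a quick shell-only smoke test of the same graph (placeholders as above; the Java pause/restart handling is lost, so expect breaks on buffering discontinuities):
gst-launch-1.0 kvssink name=kvs stream-name=<streamname> access-key=<awskey> secret-key=<awssecret> aws-region=<awsregion> \
  uridecodebin3 uri=http://localhost:<port> name=d \
  d. ! queue2 ! videoconvert ! videorate ! x264enc bframes=0 key-int-max=45 bitrate=2048 tune=zerolatency ! queue2 ! kvs. \
  d. ! queue2 ! audioconvert ! audioresample ! avenc_aac ! queue2 ! kvs.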

gstreamer shmsrc and shmsink with h264 data

I am trying to share h264 encoded data from gstreamer with two other processes (both also based on gstreamer). After some research, the only way I found is to use the shm plugin.
This is what I am trying to do:
gstreamer ---> h264 encoder ---> shmsink
shmsrc ---> process1
shmsrc ---> process2
I was able to get raw data from videotestsrc and a webcam working, but for h264 encoded data it doesn't work.
this is my test pipeline
gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480,format=YUY2 ! x264enc ! \
  shmsink socket-path=/tmp/foo sync=true wait-for-connection=false shm-size=10000000
gst-launch-1.0 shmsrc socket-path=/tmp/foo ! avdec_h264 ! \
  video/x-raw,width=640,height=480,framerate=25/1,format=YUY2 ! autovideosink
Has anyone tried the shm plugins with h264 encoded data? Please help.
I am not aware of the capabilities of the sink that autovideosink selects, but as far as I know you either need videoconvert if the formats supported by the sink (like kmssink or ximagesink) differ from the one provided by the source (in your case YUY2), or videoparse if the camera format is supported by the sink directly. You can check the supported formats using gst-inspect-1.0.
Anyway, I am able to run your pipeline with some modifications, using videoconvert, in my setup:
./gst-launch-1.0 videotestsrc ! x264enc ! shmsink socket-path=/tmp/foo sync=true wait-for-connection=false shm-size=10000000
./gst-launch-1.0 shmsrc socket-path=/tmp/foo ! h264parse ! avdec_h264 ! videoconvert ! ximagesink
You may modify it for the resolution you want.
Kindly let me know if you face any issues with the above.
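Since the shmsink above was started with wait-for-connection=false and can serve several shmsrc clients, the two consumer processes from the question should each be able to run their own copy of the receive pipeline (a sketch; untested here with more than one client):
./gst-launch-1.0 shmsrc socket-path=/tmp/foo ! h264parse ! avdec_h264 ! videoconvert ! ximagesink
./gst-launch-1.0 shmsrc socket-path=/tmp/foo ! h264parse ! avdec_h264 ! videoconvert ! ximagesink
(run these as process 1 and process 2)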

In GStreamer, how do I simultaneously play back and record an h264 AVI file of a v4l2src?

Files recorded with gstreamer-0.10 at 25 fps in FourCIF format play back in fast-forward mode. Any solution would be appreciated. The recorded files sometimes skip 3-4 seconds.
The pipeline I'm attempting to use is:
gst-launch v4l2src device=/dev/video2 ! 'video/x-raw-yuv,width=704,height=576,framerate=25/1' ! \
  tee name=liveTee ! queue ! mfw_isink \
  liveTee. ! queue ! vpuenc ! avimux ! filesink location=/home/Recording.avi
I'm gonna take a rough stab at it and re-format your question a bit. This is mostly a GStreamer and Freescale question, not so much QT.
gst-launch-1.0 -e videotestsrc pattern=ball do-timestamp=true is-live=true ! timeoverlay ! \
  'video/x-raw,width=704,height=576,framerate=25/1' ! tee name=liveTee ! \
  queue leaky=downstream ! videoconvert ! ximagesink async=false \
  liveTee. ! queue leaky=downstream ! videoconvert ! queue ! x264enc ! avimux ! filesink location=/tmp/test.avi
The thing to keep in mind is that your encoder has to keep up with the live playback, so your pipeline needs to handle the case where the encoder falls out of sync. On the queue elements behind the tee, use the leaky property.
Then you also want to be careful about your video source and what it supplies. It looks like in your case you want live video, but if your source were an existing video file the pipeline would probably need some more tweaking.
NOTE: It may be even simpler than that; just adding async=false to the videosink appears to be very important.
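Adapted back to the v4l2src case from the question, a sketch (assuming the camera can deliver raw 704x576 video at 25/1; device and output path taken from the question):
gst-launch-1.0 -e v4l2src device=/dev/video2 ! 'video/x-raw,width=704,height=576,framerate=25/1' ! tee name=liveTee ! \
  queue leaky=downstream ! videoconvert ! ximagesink async=false \
  liveTee. ! queue leaky=downstream ! videoconvert ! queue ! x264enc ! avimux ! filesink location=/home/Recording.avi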

Recording audio+video from webcam with gstreamer

I'm having a problem trying to record audio+video from my webcam to a file. If I use videotestsrc and autoaudiosrc I get everything right (that is, I get a file with audio recorded from the webcam's mic and the test video image), but as soon as I replace videotestsrc with v4l2src (or autovideosrc) I get Error starting streaming on device '/dev/video0'.
The command I'm using:
gst-launch-0.10 videotestsrc ! queue ! ffmpegcolorspace ! theoraenc ! queue ! oggmux name=mux \
  autoaudiosrc ! queue ! audioconvert ! vorbisenc ! queue ! mux. \
  mux. ! queue ! filesink location=test.ogg
Why is that happening? What am I doing wrong?
EDIT:
In fact, something as simple as
gst-launch-0.10 autovideosrc ! autovideosink autoaudiosrc ! autoaudiosink
is failing with the same error (Error starting streaming on device '/dev/video0')
Replacing autovideosrc with videotestsrc gives me the test image + real audio.
Replacing autoaudiosrc with audiotestsrc gives me the real image + test audio.
I'm starting to think that this is some kind of limitation of my webcam. Is that possible?
EDIT:
GST_DEBUG=2 log here: http://pastie.org/4755009
EDIT 2:
GST_DEBUG="v4l2*:5" (gstreamer 0.10): http://pastie.org/4810519
GST_DEBUG="v4l2*:5" (gstreamer 1.0): http://pastie.org/4810502
Please do a
gst-launch-1.0 v4l2src ! videoscale ! videoconvert ! autovideosink
Does that run? If not, repeat as
GST_DEBUG="v4l2*:5" GST_DEBUG_NO_COLOR=1 gst-launch 2>debug.log ...
and check the log for errors. You also might want to run v4l-info (install v4l-conf under debian/ubuntu) and report what formats your camera supports.
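Combining the two commands above into one full invocation (debug.log is an arbitrary path):
GST_DEBUG="v4l2*:5" GST_DEBUG_NO_COLOR=1 gst-launch-1.0 v4l2src ! videoscale ! videoconvert ! autovideosink 2>debug.log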