I'm working on a GStreamer application that uses the x264enc element. According to the documentation below, there is no property for specifying a minimum key interval, while there is key-int-max.
https://thiblahute.github.io/GStreamer-doc/x264-1.0/index.html?gi-language=c
On the other hand, x264 has a --min-keyint option, listed in the x264/FFmpeg option mapping below.
https://sites.google.com/site/linuxencoding/x264-ffmpeg-mapping
How can I set a minimum key interval on GStreamer's x264enc element?
Thanks in advance.
You may try the option-string property in x264enc:
option-string : String of x264 options (overridden by element properties)
flags: readable, writable
String. Default: ""
This basically hands options off to libx264. Unfortunately I forget the exact syntax you need to use here; it could be option-string=min-keyint=x, but double-checking the x264enc element's code should give some more hints.
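If it helps, here is a minimal sketch of setting it from C. I'm assuming the option string uses libx264's colon-separated key=value form, so double-check this against your x264enc version:

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *enc = gst_element_factory_make ("x264enc", NULL);

  /* option-string hands options straight to libx264; note that element
   * properties such as key-int-max override whatever is set here. */
  g_object_set (enc,
      "key-int-max", 60,                  /* maximum keyframe interval */
      "option-string", "min-keyint=25",   /* assumed libx264 syntax    */
      NULL);

  gst_object_unref (enc);
  return 0;
}

On a gst-launch line the equivalent would be x264enc option-string="min-keyint=25".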
The goal here is to be able to send some metadata (timestamps, objects found per frame) along with a stream, either within a single pipeline or across multiple pipelines over a network (e.g. RTP over UDP).
Within the pipeline:
This is straightforward: define a new GstMeta API, register and implement it, then attach it to GstBuffers via buffer probes or an element designed solely for this purpose. A sketch follows below.
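A minimal sketch of that approach (the meta name, its fields, and the constant object count are made up for illustration; the registration calls are the standard GstMeta API):

#include <gst/gst.h>

/* Hypothetical per-frame metadata: a capture time plus an object count. */
typedef struct {
  GstMeta meta;
  GstClockTime capture_time;
  guint objects_found;
} MyMeta;

static gboolean
my_meta_init (GstMeta *meta, gpointer params, GstBuffer *buffer)
{
  MyMeta *m = (MyMeta *) meta;
  m->capture_time = GST_CLOCK_TIME_NONE;
  m->objects_found = 0;
  return TRUE;
}

/* Register the meta API type once. */
static GType
my_meta_api_get_type (void)
{
  static GType type = 0;
  static const gchar *tags[] = { NULL };
  if (type == 0)   /* real code should use g_once_init_enter/leave */
    type = gst_meta_api_type_register ("MyMetaAPI", tags);
  return type;
}

/* Register the meta implementation once. */
static const GstMetaInfo *
my_meta_get_info (void)
{
  static const GstMetaInfo *info = NULL;
  if (info == NULL)
    info = gst_meta_register (my_meta_api_get_type (), "MyMeta",
        sizeof (MyMeta), my_meta_init, NULL, NULL);
  return info;
}

/* Buffer probe that tags every buffer passing the pad with a MyMeta. */
static GstPadProbeReturn
tag_buffer_cb (GstPad *pad, GstPadProbeInfo *probe_info, gpointer user_data)
{
  GstBuffer *buf =
      gst_buffer_make_writable (GST_PAD_PROBE_INFO_BUFFER (probe_info));
  MyMeta *m = (MyMeta *) gst_buffer_add_meta (buf, my_meta_get_info (), NULL);

  m->capture_time = GST_BUFFER_PTS (buf);
  m->objects_found = 42;   /* placeholder: your detector's result */

  GST_PAD_PROBE_INFO_DATA (probe_info) = buf;
  return GST_PAD_PROBE_OK;
}

The probe would be attached with gst_pad_add_probe (pad, GST_PAD_PROBE_TYPE_BUFFER, tag_buffer_cb, NULL, NULL) on whichever pad carries the frames.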
Multiple pipelines over a network:
The only solution I have heard about is to add one or more RTP header extensions to the RTP packets coming out of a payloader. Some people have in fact decided to transform their custom GstMeta into such header extensions. But is this really a good use case for them?
There's more...
Another idea is to create your own custom media type, video/x-mytype. For that you would need to write a typefinding function and an autoplugger on top of it, not to mention a couple of elements to deal with this new type and convert it to other existing media types. That new type would then be the natural place to carry the metadata I was talking about.
That summarizes my research. Are there any other methods that I am not aware of?
I would definitely appreciate your input on this!
Thanks in advance!
I decided to implement a new media type inside GStreamer (e.g. video/x-h265-with-meta). This new media type has inherent metadata-handling capabilities; you can design how it encodes/decodes the metadata as you see fit. You also need to write 2 more GStreamer elements: one to convert regular video/x-h265 to video/x-h265-with-meta (say muxmymeta) and one to reverse that conversion (say demuxmymeta). Essentially, muxmymeta reads the metadata attached to the GstBuffer* via the GstMeta API implementation and encodes it into the outgoing stream, while demuxmymeta decodes that metadata from the stream and adds it back to the GstBuffer*.

To ensure that every encoded h265 frame has a corresponding meta, muxmymeta must have its sink caps in this format:
video/x-h265
stream-format: byte-stream
alignment: au
The byte-stream format is mandatory for network transmission, and the au alignment forces a whole encoded frame as input to muxmymeta.
I also used the rtpgstpay and rtpgstdepay elements to carry this new media type over RTP/RTSP. You will NOT need to write a typefinding function at this point, since rtpgstpay/rtpgstdepay can figure the type out from the caps. This works out of the box for h265 streams in GStreamer v1.6.2, but I had to make a fix inside GStreamer itself to make it work with h264 as well. Finally, to make autopluggers such as decodebin/uridecodebin decode your stream, you must set the klass for demuxmymeta to Codec/Decoder/Demuxer and give it a rank other than none; this basically tells GStreamer to use the element for decoding. A sketch of that registration follows.
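For the klass/rank part, the registration would look roughly like this (DemuxMyMeta and its type macro are hypothetical, and the usual GObject boilerplate is omitted):

#include <gst/gst.h>

/* In the element's class_init: the classification string is what
 * autopluggers inspect when hunting for decoders/demuxers. */
static void
demux_my_meta_class_init (DemuxMyMetaClass *klass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (klass);

  gst_element_class_set_static_metadata (element_class,
      "MyMeta demuxer",
      "Codec/Decoder/Demuxer",
      "Splits video/x-h265-with-meta into video/x-h265 plus GstMeta",
      "you <you@example.com>");
}

/* In plugin_init: a rank other than GST_RANK_NONE is what makes
 * decodebin/uridecodebin actually select the element. The type
 * macro below is hypothetical. */
static gboolean
plugin_init (GstPlugin *plugin)
{
  return gst_element_register (plugin, "demuxmymeta",
      GST_RANK_PRIMARY, GST_TYPE_DEMUX_MY_META);
}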
Here are 2 pipelines for illustration:
Sending H265 RTSP with metadata pipeline (set as the launch line for GstRTSPMediaFactory)
( rtspsrc location=rtsp://localhost:8554/test latency=0 ! rtph265depay ! h265parse ! muxmymeta ! rtpgstpay name=pay0 pt=96 )
Here, the rtsp://localhost:8554/test stream is a regular h265 stream (video/x-h265); assume the RTSP server also produces rtsp://localhost:8553/test-meta, which has the type video/x-h265-with-meta.
Receiver
rtspsrc location=rtsp://localhost:8553/test-meta ! rtpgstdepay ! demuxmymeta ! h265parse ! avdec_h265 ! videoconvert ! autovideosink
With an autoplugger it will look as neat as:
uridecodebin uri=rtsp://localhost:8553/test-meta ! autovideosink
Though this might seem like a long-shot solution, it provides a way to transport self-synchronizing metadata between pipelines over the network.
I have this working, but I have been unable to get video from my Magewell to integrate, and I could use help with the correct pipeline.
gst-launch-1.0 videotestsrc ! video/x-raw,width=848,height=480,framerate=25/1 ! x264enc bitrate=700 ! video/x-h264,width=848,height=480,framerate=25/1,stream-format=byte-stream,profile=baseline ! tee name=t \
t. ! queue ! tcpclientsink host=172.18.0.3 port=8000 \
t. ! queue ! tcpclientsink host=172.18.0.4 port=8000
I do not see the receiver-side pipeline in the question description. It is required to verify that there are no issues at the receiver side. Based on your current pipeline, I have the following suggestions:
You don't need to set the caps again after the x264enc element, because its output is already of type video/x-h264. What you do need is to add h264parse after x264enc. You also need to add h264parse before the decoder you are using at the receiver side.
The bitrate set for x264enc is also very low. The units are kbit/s, and 700 is quite low for video at this resolution. It is best to leave it at the default if you do not have strict resource constraints; otherwise try a higher value.
Also, is there any reason why you are using TCP? Using UDP might be a better idea for video, in case video data/packet loss is not an issue. A corrected sender pipeline is sketched below.
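Putting the first two suggestions together, the sender would look something like this, shown via gst_parse_launch so it drops straight into a C program (the bitrate of 2000 is just an example value):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* Caps filter after x264enc dropped; h264parse inserted instead. */
  GstElement *pipeline = gst_parse_launch (
      "videotestsrc ! video/x-raw,width=848,height=480,framerate=25/1 "
      "! x264enc bitrate=2000 ! h264parse ! tee name=t "
      "t. ! queue ! tcpclientsink host=172.18.0.3 port=8000 "
      "t. ! queue ! tcpclientsink host=172.18.0.4 port=8000",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Parse error: %s\n", error->message);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until an error or EOS arrives on the bus. */
  GstBus *bus = gst_element_get_bus (pipeline);
  gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}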
I'm sure I've had this pipeline working on an earlier Ubuntu system I had set up (formatted for readability):
playbin
uri=rtspt://user:pswd@192.168.xxx.yyy/ch1/main
video-sink='videoconvert
! videoflip method=counterclockwise
! fpsdisplaysink'
Yet, when I try to use it within my program, I get:
Missing element: H.264 (Main Profile) decoder
WARNING: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0:
No decoder available for type 'video/x-h264,
stream-format=(string)avc, alignment=(string)au,
codec_data=(buffer)014d001fffe10017674d001f9a6602802dff35010101400000fa000030d40101000468ee3c80,
level=(string)3.1, profile=(string)main, width=(int)1280,
height=(int)720, framerate=(fraction)0/1, parsed=(boolean)true'.
Additional debug info:
gsturidecodebin.c(938): unknown_type_cb ():
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0
Now I'm pretty certain I have an H264 decoder installed, and indeed the GStreamer plugins autogen.sh/configure correctly recognised that fact. The installed packages are h264enc, libx264-142, libx264-dev and x264.
It does exactly the same thing if I use the more "acceptable" autovideosink in place of fpsdisplaysink, or if I try to play the RTSP stream with gst-play-1.0. However, it works if I use the test pattern source videotestsrc.
What am I doing wrong?
It looks like GStreamer cannot find a suitable plugin for decoding H264: either you do not have an H264 decoder element installed, or GStreamer is looking in the wrong path for your elements.
First, try running gst-inspect-1.0. This should output a long list of all the elements GStreamer has detected.
If this doesn't return any elements, you probably need to set the GST_PLUGIN_PATH environment variable to point to the directory where your plugins are installed. Running GStreamer - this link should help.
If it DOES return many elements, run gst-inspect-1.0 avdec_h264 to verify that you have the H264 decoder element. Note that the packages you listed (x264, libx264-*) only provide an encoder; H.264 decoding is typically handled by avdec_h264, which ships with the gst-libav plugins.
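Since this happens from inside your program, you can also ask the registry directly whether the decoder exists. A small sketch, checking for avdec_h264 by name (substitute whatever decoder you expect):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* Look the factory up in the registry, much like the autoplugger
   * does. NULL means the element is not installed or not found on
   * GST_PLUGIN_PATH. */
  GstElementFactory *factory = gst_element_factory_find ("avdec_h264");
  if (factory == NULL) {
    g_printerr ("avdec_h264 not found in the registry\n");
    return 1;
  }

  g_print ("found: %s\n",
      gst_element_factory_get_metadata (factory,
          GST_ELEMENT_METADATA_LONGNAME));
  gst_object_unref (factory);
  return 0;
}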
I have this pipeline:
gst-launch-1.0 rtspsrc location=rtsp://ip/cam ! rtph264depay ! h264parse ! mp4mux fragment-duration=10000 streamable=1 ! multifilesink next-file=2 location=file-%03d.mp4
The first segment plays well; the others do not. When I inspect the structure of a damaged mp4, I see an interesting bug:
MOOV
Some data
MOOF
MDAT
MOOF
MDAT
The most interesting thing is the "Some data" block. It has no header; it is simply there. Judging by the block size, I think it is an MDAT. If I work out the size of the block and prepend an MDAT header to it, the file immediately becomes valid and plays, but that formerly unknown piece still cannot be played, because there is no MOOF header before it.
The problem occurs with both mp4mux and qtmux. Tested on GStreamer 1.1.0 and 1.2.2; the results are identical.
Am I using multifilesink incorrectly?
If you take a look at the documentation for multifilesink you will find the answer:
It is not possible to use this element to create independently playable mp4 files, use the splitmuxsink element for that instead. ...
So use splitmuxsink, and don't forget to send EOS when you are done, so the last file is finished correctly. A sketch follows below.
Update
It looks like at the time the question was asked there was no such element as splitmuxsink.
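For reference, a sketch of the recording pipeline rebuilt around splitmuxsink, embedded via gst_parse_launch (max-size-time is in nanoseconds, so 10000000000 gives roughly the 10-second fragments of the original command; the camera URL is from the question):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* splitmuxsink muxes and splits in one element and produces
   * independently playable mp4 files. */
  GstElement *pipeline = gst_parse_launch (
      "rtspsrc location=rtsp://ip/cam ! rtph264depay ! h264parse "
      "! splitmuxsink location=file-%03d.mp4 max-size-time=10000000000",
      &error);
  if (pipeline == NULL) {
    g_printerr ("Parse error: %s\n", error->message);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* ... record for as long as you need, then send EOS and wait for it
   * to drain, so splitmuxsink can finalize the last file ... */
  gst_element_send_event (pipeline, gst_event_new_eos ());

  GstBus *bus = gst_element_get_bus (pipeline);
  gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}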
Can this be reproduced using videotestsrc instead of rtsp?
Try replacing your h264 receiving and depayloading with "videotestsrc num-buffers= ! x264enc ! mp4mux ..."
This might be a bug, please file it at https://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer so it gets proper attention from maintainers.
Also, how are you trying to play it?
Thanks
I'm looking for the correct technique, if one exists, for dynamically replacing an element in a running GStreamer pipeline. I have a GStreamer-based C++ app, and the pipeline it creates looks like this (using gst-launch syntax):
souphttpsrc location="http://localhost/local.ts" ! mpegtsdemux name=d ! queue ! mpeg2dec ! xvimagesink d. ! queue ! a52dec ! pulsesink
During the middle of playback (i.e. the pipeline is in GST_STATE_PLAYING and the user is happily watching video), I need to remove souphttpsrc from the pipeline, create a new souphttpsrc (or even a new neonhttpsrc), immediately add it back into the pipeline, and continue playback of the same URI source stream at the same position where playback was before we performed this operation. The user might see a small delay, and that is fine.
We've barely figured out how to remove and replace the source, and we need more understanding. Here's our best attempt thus far:
gst_element_unlink(source, demuxer);
gst_element_set_state(source, GST_STATE_NULL);
gst_bin_remove(GST_BIN(pipeline), source);
source = gst_element_factory_make("souphttpsrc", "src");
g_object_set(G_OBJECT(source), "location", url, NULL);
gst_bin_add(GST_BIN(pipeline), source);
gst_element_link(source, demuxer);
gst_element_sync_state_with_parent(source);
This doesn't work perfectly: the new source plays back from the beginning, and the rest of the pipeline waits for buffers with the correct timestamps (I assume), because after several seconds playback picks back up. I have tried seeking the source in multiple ways, but nothing has worked.
I need to know the correct way to do this. It would also be nice to know a general technique, if one exists, in case we wanted to dynamically replace the decoder or some other element.
thanks
I think this may be what you are looking for:
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt
(starting at line 115)
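In short, that document says: block the pad that feeds the part you want to replace, do the surgery once the block callback fires, then unblock. With today's 1.0 API the same idea is expressed with blocking pad probes. A rough sketch for a mid-pipeline swap (for a source element there is no upstream pad to block, and you would still need a seek or a resumed HTTP range request to restore the position, which this does not cover):

#include <gst/gst.h>

/* All of these pointers are assumed to come from your own setup code. */
typedef struct {
  GstElement *pipeline;
  GstElement *old_element;   /* element being swapped out     */
  GstElement *new_element;   /* its ready-made replacement    */
  GstElement *downstream;    /* element the replacement feeds */
} SwapCtx;

/* Runs once dataflow on the upstream src pad is blocked, so the
 * rewiring happens with no buffers in flight. */
static GstPadProbeReturn
blocked_cb (GstPad *upstream_srcpad, GstPadProbeInfo *info, gpointer user_data)
{
  SwapCtx *ctx = user_data;
  GstPad *sinkpad;

  gst_element_set_state (ctx->old_element, GST_STATE_NULL);
  gst_bin_remove (GST_BIN (ctx->pipeline), ctx->old_element);

  gst_bin_add (GST_BIN (ctx->pipeline), ctx->new_element);
  sinkpad = gst_element_get_static_pad (ctx->new_element, "sink");
  gst_pad_link (upstream_srcpad, sinkpad);
  gst_object_unref (sinkpad);
  gst_element_link (ctx->new_element, ctx->downstream);
  gst_element_sync_state_with_parent (ctx->new_element);

  /* Returning REMOVE drops the probe, which unblocks the pad. */
  return GST_PAD_PROBE_REMOVE;
}

/* Kick off the swap by blocking the src pad that feeds the element
 * being replaced; blocked_cb fires as soon as the pad is blocked. */
static void
swap_element (GstPad *upstream_srcpad, SwapCtx *ctx)
{
  gst_pad_add_probe (upstream_srcpad, GST_PAD_PROBE_TYPE_BLOCK_DOWNSTREAM,
      blocked_cb, ctx, NULL);
}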