I am using GStreamer (gst-python) to write a pipeline that samples images out of a video and creates a slideshow from those images. For that I am using a valve element to sample the images and imagefreeze to create the slideshow.
My pipeline looks something like this:
source --- tee --- queue1 --- valve --- imagefreeze --- sink
             \--- queue2 --- sink
Currently, I am attaching a callback to the GObject main loop with timeout_add_seconds.
In this callback I toggle the drop property of the valve to let one image buffer pass through, and that works. The problem is that when I insert imagefreeze it does not refresh automatically, so I tried to change its state to READY and then back to PLAYING. This works for the first call of the function, but the entire pipeline freezes at the second call and I could not figure out why. Here's my callback snippet:
def toggle(pipeline):
    # Drop imagefreeze back to READY so it forgets the buffer it has cached
    imagefreeze = pipeline.get_by_name("freeze")
    imagefreeze.set_state(Gst.State.PAUSED)
    imagefreeze.set_state(Gst.State.READY)

    # Open the valve briefly so one image buffer can pass through
    valve = pipeline.get_by_name("valve")
    valve.set_property("drop", 0)
    time.sleep(0.1)

    # Restart imagefreeze and close the valve again
    imagefreeze.set_state(Gst.State.PAUSED)
    imagefreeze.set_state(Gst.State.PLAYING)
    valve.set_property("drop", 1)

    # Returning True keeps the timeout callback installed
    return True
Is this a correct approach? If not, how can I force the imagefreeze element to refresh its output when the input buffer changes?
I am running the GStreamer command below for live streaming:
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 fps-n=30 imx-capture-mode=0 ! textoverlay name=overlay text="Overlay text here" valignment=top halignment=left font-desc="Sans, 22" ! gdkpixbufoverlay name=imageoverlay location=/home/user/LZ_50/CamOverlay.png ! imxg2dvideotransform ! imxg2dvideosink framebuffer=/dev/fb1 use-vsync=true sync=false
I want to change the text overlay dynamically in the GStreamer pipeline.
How can I get the pipeline object pointer so that I can change the text overlay dynamically?
Thanks.
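For what it's worth, here is a minimal C sketch of one common approach, assuming the pipeline is built in code with gst_parse_launch (which returns the pipeline pointer) and that the textoverlay keeps the name "overlay" from the launch line above; everything else is illustrative:

#include <gst/gst.h>

/* Look up the named textoverlay in the pipeline and update its "text"
 * property; this can be done while the pipeline is PLAYING. */
static void update_overlay_text (GstElement *pipeline, const gchar *new_text)
{
  /* "overlay" is the name assigned to textoverlay in the launch line */
  GstElement *overlay = gst_bin_get_by_name (GST_BIN (pipeline), "overlay");

  if (overlay != NULL) {
    g_object_set (overlay, "text", new_text, NULL);
    gst_object_unref (overlay);
  }
}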
I had written an application, but I had an issue with the application crashing when using a GStreamer pipeline with both image and text overlays.
I finally found the cause of the crash; more detail can be found at: imx Gstreamer plugins.
This also helps to reduce CPU usage (almost 20% of one core).
How can I connect qtdemux and ffdec_h264 using C code with GStreamer?
demux = gst_element_factory_make ("qtdemux", "demux");
dec = gst_element_factory_make ("ffdec_h264", "dec");
I believe you are using 0.10.x? GStreamer 1.x doesn't have the ffmpeg elements.
Here are the steps:
You will have to attach a callback to the "pad-added" signal of the qtdemux.
The callback will be invoked by qtdemux for every stream
(e.g. audio, video) present in the source.
The pad connection will have to be done inside the callback, as in the sketch below.
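A rough sketch of such a callback (0.10-style API, to match ffdec_h264; variable names are illustrative):

/* Invoked by qtdemux each time it exposes a new stream pad */
static void on_pad_added (GstElement *demux, GstPad *new_pad, gpointer user_data)
{
  GstElement *dec = GST_ELEMENT (user_data);
  GstCaps *caps = gst_pad_get_caps (new_pad);
  const gchar *media = gst_structure_get_name (gst_caps_get_structure (caps, 0));

  /* Only link the video stream to the H.264 decoder */
  if (g_str_has_prefix (media, "video/")) {
    GstPad *sinkpad = gst_element_get_static_pad (dec, "sink");
    if (!gst_pad_is_linked (sinkpad))
      gst_pad_link (new_pad, sinkpad);
    gst_object_unref (sinkpad);
  }
  gst_caps_unref (caps);
}

/* After creating the elements: */
g_signal_connect (demux, "pad-added", G_CALLBACK (on_pad_added), dec);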
Go to page 32 of https://lzy-s-prct.googlecode.com/files/gstreamer-manual.pdf for the example.
I am new to GStreamer, and I am trying to use the emit-stats property of tsdemux.
How can I use this in my pipeline?
I'm trying to get the program clock reference (PCR) value of a transport stream, but I have found no way to get it.
Properties in GStreamer are normally accessed using the standard GLib APIs: g_object_set and g_object_get. Doing g_object_set (v1_demux, "emit-stats", TRUE, NULL);, supposing that v1_demux is a GstTSDemux*, will make it start emitting messages containing the PTS and DTS of the packets that flow into the demuxer.
Element messages in GStreamer are emitted with gst_element_post_message. In order to receive them, your application needs to set up a bus watch on the main pipeline's GstBus.
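For example, a minimal bus watch might look like this (the structure name "tsdemux" and the "pid"/"pts" fields match the messages shown below; everything else is illustrative):

/* Called for every message posted on the pipeline's bus */
static gboolean on_bus_message (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_ELEMENT) {
    const GstStructure *s = gst_message_get_structure (msg);
    guint pid;
    guint64 pts;

    /* The stats messages carry a structure named "tsdemux" (see below) */
    if (gst_structure_has_name (s, "tsdemux") &&
        gst_structure_get_uint (s, "pid", &pid) &&
        gst_structure_get_uint64 (s, "pts", &pts))
      g_print ("pid %u: pts %" G_GUINT64_FORMAT "\n", pid, pts);
  }
  return TRUE; /* keep the watch installed */
}

/* After building the pipeline: */
GstBus *bus = gst_element_get_bus (pipeline);
gst_bus_add_watch (bus, on_bus_message, NULL);
gst_object_unref (bus);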
Just for the record, you can test how the property works and see the content of the messages by running this example pipeline in gst-launch:
gst-launch-1.0 -m filesrc location="$YOUR_TRANSPORT_STREAM" ! tsdemux emit-stats=1 ! fakesink
Running this with one of the transport streams on my HDD, I can see messages with the PTS and DTS being emitted from the demuxer element:
Got message #77 from element "tsdemux0" (element): tsdemux, pid=(uint)1803, offset=(guint64)266020, pts=(guint64)8429319339;
Got message #78 from element "tsdemux0" (element): tsdemux, pid=(uint)1805, offset=(guint64)273540, pts=(guint64)8429311261;
Got message #79 from element "tsdemux0" (element): tsdemux, pid=(uint)1802, offset=(guint64)282564, dts=(guint64)8429444461;
However, it doesn't look like PCR and OPCR values are emitted. You'll have to add this functionality yourself.
Thanks for the info.
I tested the commands, looked at the script and checked the values, but I am struggling to add the emit-stats messages to my pipeline.
I created a bus watch on the main pipeline's GstBus to see the video duration and play time in my pipeline, but I cannot see the stats messages and the video information simultaneously. I am still investigating how to do it, i.e. how to store the PTS and DTS information in some way.
My idea is to get the timestamps of two videos and subtract them to calculate an automatic offset for one of the videos.
I'm looking for the correct technique, if one exists, for dynamically replacing an element in a running GStreamer pipeline. I have a GStreamer-based C++ app and the pipeline it creates looks like this (using gst-launch syntax):
souphttpsrc location="http://localhost/local.ts" ! mpegtsdemux name=d ! queue ! mpeg2dec ! xvimagesink d. ! queue ! a52dec ! pulsesink
During the middle of playback (i.e. GST_STATE_PLAYING is the pipeline state and the user is happily watching video), I need to remove souphttpsrc from the pipeline and create a new souphttpsrc, or even a new neonhttpsrc, and then immediately add that back into the pipeline and continue playback of the same URI source stream at the same time position where playback was before we performed this operation. The user might see a small delay and that is fine.
We've barely figured out how to remove and replace the source, and we need more understanding. Here's our best attempt thus far:
/* Tear down and remove the old source */
gst_element_unlink(source, demuxer);
gst_element_set_state(source, GST_STATE_NULL);
gst_bin_remove(GST_BIN(pipeline), source);

/* Create a replacement source and splice it back into the pipeline */
source = gst_element_factory_make("souphttpsrc", "src");
g_object_set(G_OBJECT(source), "location", url, NULL);
gst_bin_add(GST_BIN(pipeline), source);
gst_element_link(source, demuxer);
gst_element_sync_state_with_parent(source);
This doesn't work perfectly: the new source plays back from the beginning, and the rest of the pipeline (I assume) waits for the correctly timestamped buffers, because playback only picks back up after several seconds. I tried seeking the source in multiple ways but nothing has worked.
I need to know the correct way to do this. It would be nice to know a general technique, if one exists, as well, in case we wanted to dynamically replace the decoder or some other element.
Thanks.
I think this may be what you are looking for:
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt
(starting at line 115)
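Roughly, the technique described there comes down to blocking the data flow before modifying the pipeline. An untested sketch (0.10 API, reusing the pipeline/source/demuxer/url pointers from the snippet above; this assumes blocking the source's src pad is acceptable and is not a verified solution):

static void on_src_blocked (GstPad *pad, gboolean blocked, gpointer user_data)
{
  if (!blocked)
    return; /* also invoked on unblock; nothing to do then */

  /* Data flow is stopped here, so the swap is now safe */
  gst_element_unlink (source, demuxer);
  gst_element_set_state (source, GST_STATE_NULL);
  gst_bin_remove (GST_BIN (pipeline), source); /* the blocked pad goes away with it */

  source = gst_element_factory_make ("souphttpsrc", "src");
  g_object_set (source, "location", url, NULL);
  gst_bin_add (GST_BIN (pipeline), source);
  gst_element_link (source, demuxer);
  gst_element_sync_state_with_parent (source); /* the new pad starts unblocked */
}

/* Trigger the swap: */
GstPad *srcpad = gst_element_get_static_pad (source, "src");
gst_pad_set_blocked_async (srcpad, TRUE, on_src_blocked, NULL);
gst_object_unref (srcpad);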
I'm constructing a GStreamer pipeline that receives two RTP streams from a networked source:
iLBC audio stream + corresponding RTCP stream
H.263 video stream + corresponding RTCP stream
Everything is put into one GStreamer pipeline so it will use the RTCP from both streams to synchronize audio/video. So far I've come up with this (using gst-launch for prototyping):
gst-launch -vvv gstrtpbin name=rtpbin
udpsrc caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H263-2000" port=40000 ! rtpbin.recv_rtp_sink_0
rtpbin. ! rtph263pdepay ! ffdec_h263 ! xvimagesink
udpsrc port=40001 ! rtpbin.recv_rtcp_sink_0
rtpbin.send_rtcp_src_0 ! udpsink port=40002 sync=false async=false
udpsrc caps="application/x-rtp,media=(string)audio,clock-rate=(int)8000,encoding-name=(string)PCMU,encoding-params=(string)1,octet-align=(string)1" port=60000 rtpbin.recv_rtp_sink_1
rtpbin. ! rtppcmudepay ! autoaudiosink
udpsrc port=60001 ! rtpbin.recv_rtcp_sink_1
rtpbin.send_rtcp_src_1 ! udpsink port=60002 sync=false async=false
This pipeline works well if the networked source starts out sending both video and audio. If the video stream is paused later on, GStreamer will still play back the audio and will even start playing back the video again when the networked source resumes the video stream.
My problem, however, is that if the networked source starts out with only an audio stream (video might be added later on), the pipeline seems to pause/freeze until the video stream starts as well.
Since video is optional in my application (and can be added/removed at will by the user), is there any way I can hook up, for instance, a 'videotestsrc' that will provide some kind of fallback video data to keep the pipeline running when there is no networked video data?
I've tried experimenting with 'videotestsrc' and a thing called 'videomixer', but I think that mixer still requires both streams to be alive. Any feedback is greatly appreciated!
I present a simple function for pause/resume by changing bins. In the following example I provide the logic to change the destination bin on the fly, dynamically. This does not completely stop the pipeline, which I believe is what you seek. Similar logic could be used for source bins: here you may remove your network source bin and the related demuxer/decoder bins and add videotestsrc bins.
private static void dynamic_bin_replacement(Pipeline pipe, Element src_bin,
        Element dst_bin_new, Element dst_bin_old) {
    pipe.pause();
    src_bin.unlink(dst_bin_old);
    pipe.remove(dst_bin_old);
    pipe.add(dst_bin_new);
    dst_bin_new.syncStateWithParent();
    src_bin.link(dst_bin_new);
    pipe.ready();
    pipe.play();
}
The other logic you may want to try is pad blocking. Please take a look at the following posts:
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-block.txt
and
http://web.archiveorange.com/archive/v/8yxpz7FmOlGqxVYtkPb4
and
Adding and removing audio sources to/from GStreamer pipeline on-the-go
UPDATE
Try the output-selector and input-selector elements, as they seem to be a better alternative. I found them most reliable and have had immense luck with them. I use fakesink or fakesrc respectively as the other end of the selector (see the sketch below).
The valve element is another alternative; I found that it doesn't even need fakesink or fakesrc elements. It is also extremely reliable.
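A sketch of the input-selector approach (0.10-style request pad names; the wiring and names are illustrative, not a verified solution):

/* While building the pipeline: request one selector sink pad per input */
GstElement *selector = gst_element_factory_make ("input-selector", "sel");
GstPad *live_pad     = gst_element_get_request_pad (selector, "sink%d");
GstPad *fallback_pad = gst_element_get_request_pad (selector, "sink%d");
/* ...link the networked video branch to live_pad and a videotestsrc to fallback_pad... */

/* When the networked video goes away, switch to the fallback input */
g_object_set (selector, "active-pad", fallback_pad, NULL);

/* When it comes back, switch to the live input again */
g_object_set (selector, "active-pad", live_pad, NULL);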
Also, the correct state transition order for a media file source:
NULL -> READY -> PAUSED -> PLAYING (Upwards)
PLAYING -> PAUSED -> READY -> NULL (Downwards)
The order in my example above should be corrected: ready() should come before pause(). Also, I would tend to think that unlinking should be performed after the null() state and not after pause(). I haven't tried these changes, but theoretically they should work.
See the following link for detailed info
http://cgit.freedesktop.org/gstreamer/gstreamer/tree/docs/design/part-states.txt?h=BRANCH-RELEASE-0_10_19