Not able to save gstreamer videosink - gstreamer

I'm just trying to save a dummy video to my directory. I end up with the error below, so I know something is wrong in the pipeline.
Am I missing any parameters here?
gst-launch -v videotestsrc ! ximagesink ! filesink location=~/cupcake.mp4
WARNING: erroneous pipeline: could not link ximagesink0 to filesink0
I only want to record the video.

ximagesink is a sink element and as such doesn't have an output (source pad).
This command will tell you about the details of an element:
gst-inspect-1.0 ximagesink
Notice that ximagesink has only a sink pad and no source pads, so it doesn't generate any output.
You can dump the video directly to file by using:
gst-launch-1.0 videotestsrc ! filesink location=~/cupcake.raw
Unfortunately, this is still not what you want, as videotestsrc generates raw video that is neither encoded nor muxed into MP4. If you want MP4 you need to feed the data into mp4mux, which will organize what it receives into the MP4 container. It is also recommended to encode the video to reduce its size. Let's assume you want H.264 as your codec; you can use the x264enc element to encode to H.264:
gst-launch-1.0 -e videotestsrc ! x264enc ! mp4mux ! filesink location=~/cupcake.mp4
Notice that I also added the "-e" parameter, which makes gst-launch-1.0 send an EOS event and wait for the EOS message indicating the elements have finished working. Without the flag the pipeline is simply interrupted and aborted.
In any case I'd recommend going back to the manuals for application development: http://gstreamer.freedesktop.org/documentation/
The manpage for gst-launch-1.0 is also useful.
Disclaimer: you are using GStreamer 0.10, which has been unmaintained for 3 years and is obsolete; please upgrade to 1.0. (This answer targets 1.0, but it can easily be applied to 0.10 by changing the commands to their 0.10 versions.)

Related

Record RTSP camera H.264 stream to MP4 with Gstreamer

I'm developing an app that receives an H.264 video stream from an RTSP camera, displays it, and stores it to MP4 without transcoding. For the purpose of my current test, I record for 5 sec only.
My problem is that the MP4 is not playable. The resulting file varies in size from one run of the app to another, showing that something is very wrong (unexpected, since the recording time is fixed).
Here are my pipelines:
rtspsrc location = rtsp://192.168.0.61:8554/quality_h264 latency=0 ! rtph264depay ! h264parse ! video/x-h264,stream-format=avc ! queue ! interpipesink name=cam1
interpipesrc allow-renegotiation=true name=src listen-to=cam1 is-live=true ! h264parse ! queue ! decodebin ! autovideoconvert ! d3dvideosink sync=false
interpipesrc allow-renegotiation=true name=src listen-to=cam1 is-live=true ! h264parse ! queue ! mp4mux ! filesink location=test.mp4
As a next step I will add more cameras, and I will need to be able to change on the fly which camera gets recorded to MP4, as well as pause/resume the recording. For this reason I've opted to use interpipesink/src, a set of GStreamer elements that allow communication between two independent pipelines. https://github.com/RidgeRun/gst-interpipe
A thread waits for 10 sec, then sends EOS on the 3rd pipeline (the recording one). Then, when the bus receives GST_MESSAGE_EOS, it sets the pipeline state to NULL. I have checked with a pad probe that the EOS event is indeed received on the sink pad of the filesink.
I send EOS using this code: gst_element_send_event(m_pipeline, gst_event_new_eos()); where m_pipeline is the 3rd pipeline.
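For illustration, here is the shutdown sequence described above as a minimal sketch in Python with PyGObject (my real app is C++; the pipeline string is the recording pipeline from above and requires the gst-interpipe plugin):

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
# The 3rd (recording) pipeline.
rec = Gst.parse_launch(
    "interpipesrc allow-renegotiation=true name=src listen-to=cam1 "
    "is-live=true ! h264parse ! queue ! mp4mux ! filesink location=test.mp4")
rec.set_state(Gst.State.PLAYING)
# ... a thread waits ~10 sec while recording ...
rec.send_event(Gst.Event.new_eos())
# Wait for the EOS message so mp4mux can finalize the file (write the
# moov index) before tearing the pipeline down.
bus = rec.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
rec.set_state(Gst.State.NULL)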
Those exact pipelines produce a playable MP4 when run with gst-launch adding -e at the end.
If I replace mp4mux with matroskamux in my app, the MKV is playable and has the expected size. However, there's something wrong with the timestamps, as the player shows it starting at time 10 sec instead of 0. Do I need to edit the timestamps before passing the buffers to the mux (mp4mux or matroskamux)?
It looks to me as if the MP4 is not fully written, but I can't see what else I can do apart from sending EOS.
I'm open to suggestions to restructure the app, in case the use of the interpipe elements causes a problem (although I can't see why it would at the moment).
I'm using Gstreamer 1.18.2 on Windows 10 (x64).

gstreamer dropping frames: ARM processor

I am running a 9 sec video on my NVIDIA Tegra Jetson TK1 board using gstreamer as:
gst-launch-0.10 playbin uri=file:///home/ubuntu/widescreen.avi
I notice this drops a lot of frames and gstreamer prints these messages:
WARNING: from element /GstPlayBin:playbin0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage: A lot of buffers are being dropped.
Additional debug info:
gstbasesink.c(2875): gst_base_sink_is_too_late (): /GstPlayBin:playbin0/GstBin:vbin/GstAutoVideoSink:videosink/GstXvImageSink:videosink-actual-sink-xvimage:
There may be a timestamping problem, or this computer is too slow.
I ran top while executing this and indeed gstreamer is taking 95% of the CPU.
Now, when I play this video through the default media player, it plays completely fine and without any lag. I was wondering if anyone knows why gstreamer is unable to play it properly. I am new to gstreamer and wondering if I can do something to alleviate this.
Bins are many elements combined into a single element-like structure (i.e. with the appropriate number of sink and source pads). playbin is a standard bin with an "autovideosink" element that automatically detects video sinks.
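For illustration only, your launch line amounts to roughly this programmatically; a minimal PyGObject sketch, with the file path taken from your question:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
# playbin is one element that internally builds the whole decode/display
# pipeline, ending in an autovideosink.
player = Gst.ElementFactory.make("playbin", "player")
player.set_property("uri", "file:///home/ubuntu/widescreen.avi")
player.set_state(Gst.State.PLAYING)
bus = player.get_bus()
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
player.set_state(Gst.State.NULL)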
Firstly, I would suggest upgrading to gstreamer-1.0, in which many bugs were fixed.
Secondly, the problem seems to be with xvimagesink, so try using ximagesink by specifying it explicitly; your autovideosink is selecting xvimagesink by default.
Try this pipeline (for hardware decoding):
gst-launch-1.0 filesrc location="location of h264 video/file.avi" ! avidemux ! h264parse ! omxh264dec ! videoconvert ! ximagesink
Or, for CPU decoding, use avdec_h264 instead of omxh264dec.
The default media player is likely using hardware video decoding if the media file is H.264, which allows it to decode with fewer CPU resources. Try creating a gstreamer pipeline that explicitly uses the OMX H.264 decoding element, as discussed here: http://elinux.org/Jetson/H264_Codec
e.g. gst-launch-0.10 filesrc location=/home/ubuntu/widescreen.avi ! avidemux ! h264parse ! nv_omx_h264dec ! autovideosink

Gstreamer rtsp stream to appsink to openCV

I need a bit of your help: I'm trying to receive an RTSP stream with GStreamer and then feed it into OpenCV to process the video. What's worse, I will also need it back from OpenCV, but first things first. I'm quite new to this and don't know GStreamer well, so I'm counting on you. Simple examples would be best, but I'll use what I can get. ;)
Thanks in advance
You can use something like this:
uridecodebin uri=rtsp:// name=uridec ! queue ! tee name=t ! queue ! <some encoder and muxer> ! filesink t. ! queue ! videoconvert ! "video/x-raw, format=BGR" ! appsink t. ! queue ! <restream>
In this possible solution you receive and decode at uridecodebin, which means that for re-streaming you need to encode again, as well as encode for storing to a file. If that's not what you want, you can replace uridecodebin with rtspsrc, which will give you RTP streams instead of decoded raw streams. Something like:
rtspsrc ! rtpXdepay ! tee name=t ! ...
Replace X with the format you are receiving (this can be done dynamically from your application). Now the output is an encoded stream that you can use in a similar way to the sample pipeline above.
Note that these suggestions assume your RTSP input is a single stream (likely video); if you want video and audio you need to add two branches out of uridecodebin or rtspsrc. I also assumed that by 'rtspStream' you mean some sort of external library/application that you are going to use to retransmit, instead of using GStreamer itself. In any case, this should give you an idea.
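If the OpenCV side is the missing piece, here is a minimal Python sketch of the appsink branch (assuming an OpenCV build with GStreamer support; the URL and the H.264 caps are placeholders, not known from your setup):

import cv2

# Depay, parse and decode to raw BGR inside GStreamer, then hand frames
# to OpenCV through appsink. URL and codec are placeholder assumptions.
pipeline = ("rtspsrc location=rtsp://example.com/stream latency=0 ! "
            "rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! "
            "video/x-raw,format=BGR ! appsink drop=true")
cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # ... process the BGR frame with OpenCV here ...
cap.release()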

Gst-launch: Saving every image of a video stream while watching it

I'm currently trying to save a video stream to files using gst-launch while simultaneously watching the video itself (using v4l2src). So far I have a workaround: I save the images to files using ! multifilesink, and a Tcl script automatically shows the newest file in the folder in an X window.
This works, but of course it has a bit of delay that I would like to reduce.
Is there a way to do this using only gst-launch? Unfortunately I'm not very experienced with gstreamer. Could it be done by saving the files with multifilesink while showing them using multifilesrc? Or is that impossible with gst-launch alone?
It is possible: the 'tee' element replicates the stream on its source pads.
So, for example:
gst-launch-1.0 v4l2src ! tee name=t ! queue ! videoconvert ! autovideosink t. ! queue ! videoconvert ! jpegenc ! multifilesink location=image_%06d.jpg
This should have it displaying and saving to jpg with multifilesink.
Also, it seems that you are using gstreamer 0.10, which is (2 years?) obsolete and unmaintained. Please move to 1.x.

How to get Pipeline created by playbin in textual format in Gstreamer?

I'm playing a transport stream file (*.ts) using the following pipeline:
gst-launch-0.10 playbin2 uri=file:///c:/bbb.ts
But I need to convert that into a pipeline myself. I'm not sure how to achieve this.
So far I have tried this (works fine):
gst-launch-0.10 -v filesrc location=c:/bbb.ts ! tsdemux ! audio/x-ac3 ! fakesink
But if I replace fakesink with autoaudiosink it fails with a not-linked error.
And even the fakesink doesn't work for video:
gst-launch-0.10 -v filesrc location=c:/bbb.ts ! tsdemux ! video/x-mpeg2 ! fakesink
So I have two questions:
How to find out the pipeline created by the playbin element.
How to play an mpeg2-ts file using a gstreamer pipeline.
Answer to question 1:
There is a way to get graphs of the created pipeline, as mentioned in the documentation of Basic tutorial 11.
A brief excerpt from the page:
Getting pipeline graphs
For those cases where your pipeline starts to grow too large and you lose track of what is connected with what, GStreamer has the capability to output graph files. These are .dot files, readable with free programs like GraphViz, that describe the topology of your pipeline, along with the caps negotiated in each link.
This is also very handy when using all-in-one elements like playbin2 or uridecodebin, which instantiate several elements inside them.
I hope this resolves what you want.
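For reference, producing the graphs only requires setting the GST_DEBUG_DUMP_DOT_DIR environment variable before the pipeline runs; with gst-launch, .dot files are then dumped automatically on every state change. From application code you trigger the dump yourself; a minimal PyGObject sketch (using 1.0's playbin, with the file URI from the question):

import os
os.environ["GST_DEBUG_DUMP_DOT_DIR"] = "/tmp"  # must be set before Gst.init
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)
pipeline = Gst.parse_launch("playbin uri=file:///c:/bbb.ts")
pipeline.set_state(Gst.State.PLAYING)
# Block until PLAYING so playbin has built its internal elements.
pipeline.get_state(Gst.CLOCK_TIME_NONE)
Gst.debug_bin_to_dot_file(pipeline, Gst.DebugGraphDetails.ALL, "playbin-graph")
# Render with GraphViz: dot -Tpng /tmp/playbin-graph.dot -o playbin-graph.png
pipeline.set_state(Gst.State.NULL)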