GStreamer not flushing to the filesink

I have this GStreamer pipeline, which works from the command line as:
gst-launch-1.0 autovideosrc ! tee name=t ! queue ! omxh264enc !
'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux
! filesink name=fileSink location=test.mp4 t. ! queue ! videoscale !
video/x-raw, width=480,height=270 ! xvimagesink name=displaySink -e
Now, I am replicating this on the C++ side as follows:
GstElement * pipeline = gst_parse_launch("autovideosrc ! tee name=t ! "
"queue ! omxh264enc ! video/x-h264, "
"stream-format=(string)byte-stream ! h264parse ! "
"qtmux ! filesink name=fileSink location=test.mp4 t. "
"! queue ! videoscale ! video/x-raw, width=480,height=270 ! "
"xvimagesink name=displaySink", &error);</raw>
I connect this to a Qt window and play it as follows:
GstElement * displaySink = gst_bin_get_by_name (GST_BIN (pipeline), "displaySink");
qDebug() << displaySink;
// prepare the ui
QWidget window;
window.resize(480, 270);
window.show();
WId xwinid = window.winId();
gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY(displaySink), xwinid);
// run the pipeline
qDebug() << "Calling run...";
GstStateChangeReturn sret = gst_element_set_state (pipeline,
GST_STATE_PLAYING);
if (sret == GST_STATE_CHANGE_FAILURE) {
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
// Exit application
QTimer::singleShot(0, QApplication::activeWindow(), SLOT(quit()));
}
int ret = app.exec();
window.hide();
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);
This starts displaying the video stream in my Qt window, and the file test.mp4 gets created and starts to grow in size. However, when I quit the application, the file is not playable. I have a feeling this is because the last bits, or some header information, are not written due to my calling:
gst_element_set_state (pipeline, GST_STATE_NULL);
I am speculating that this closes the pipeline without ensuring that the file is correctly created and finalized. Is there a way to ensure that EOF or EOS is sent through the pipeline before closing, so that the file is written properly? This is only speculation on my part; something else could be wrong...

Yes, sending EOS is necessary.
So before setting the pipeline to NULL, do:
gst_element_send_event(pipeline, gst_event_new_eos());
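In context, a minimal sketch of the full shutdown sequence (blocking variant; the 5-second timeout is an arbitrary choice of mine):
// Hedged sketch: send EOS, then block until the sinks confirm it
// (or an error arrives) before tearing the pipeline down.
gst_element_send_event (pipeline, gst_event_new_eos ());
GstBus *bus = gst_element_get_bus (pipeline);
GstMessage *msg = gst_bus_timed_pop_filtered (bus, 5 * GST_SECOND,
    (GstMessageType) (GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
if (msg)
  gst_message_unref (msg);
gst_object_unref (bus);
// Only now is the mp4 header finalized and the file safe to close.
gst_element_set_state (pipeline, GST_STATE_NULL);
gst_object_unref (pipeline);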
Edit, on checking whether the EOS passed:
According to documentation:
The EOS event will travel down to the sink elements in the pipeline which will then post the GST_MESSAGE_EOS on the bus after they have finished playing any buffered data.
This means that to check whether the EOS event successfully passed through the pipeline, you can add a bus watch callback with gst_bus_add_watch and check there for GST_MESSAGE_EOS.
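A sketch of that callback approach (quitting via QCoreApplication::quit() is my assumption for this Qt app; note that gst_bus_add_watch needs a running GLib main context, which Qt on Linux normally provides through its GLib event dispatcher):
// Hedged sketch: watch the bus and quit once EOS reaches the sinks.
static gboolean
on_bus_message (GstBus *bus, GstMessage *msg, gpointer user_data)
{
  if (GST_MESSAGE_TYPE (msg) == GST_MESSAGE_EOS) {
    // The sinks have flushed everything; test.mp4 is finalized.
    QCoreApplication::quit ();
  }
  return TRUE;  // keep the watch installed
}

// Installed once, before sending the EOS event:
GstBus *bus = gst_element_get_bus (pipeline);
gst_bus_add_watch (bus, on_bus_message, NULL);
gst_object_unref (bus);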

Related

How to display video in a portion of window using d3dvideosink in Windows

I have written the following GStreamer function to display the videotestsrc video on a Win32 window (HWND). This function works perfectly and displays videotestsrc in the entire window for the given "win" window handle.
void runGstPipe(HWND win)
{
GstElement *pipeline =
gst_parse_launch
("rtspsrc location=\"...\" ! decodebin ! d3dvideosink name=sink", NULL);
GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), (guintptr)win);
GstStateChangeReturn sret = gst_element_set_state(pipeline,
GST_STATE_PLAYING);
}
Next I tried to enhance the above function to display the videotestsrc in a portion of the window "win" using the following options.
a) By using glimagesink with the render-rectangle property, as follows:
"rtspsrc location=\"...\" ! decodebin ! glimagesink render-rectangle=\"<50, 50, 200, 150>\" name=sink"
b) By using gst_video_overlay_set_render_rectangle as follows
gst_video_overlay_set_render_rectangle(GST_VIDEO_OVERLAY(sink), 50, 50, 200, 150);
Neither of the above options changed the rendering area; i.e., videotestsrc still occupied the whole window instead of the given coordinates. I would appreciate any suggestions.
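Not a confirmed fix, but an untested sketch of the overlay calls in the order the GstVideoOverlay docs describe: set the window handle first, then the rectangle, then force a repaint. gst_video_overlay_set_render_rectangle also returns FALSE when the sink does not support it, which is worth checking for d3dvideosink:
GstElement *sink = gst_bin_get_by_name (GST_BIN (pipeline), "sink");
GstVideoOverlay *overlay = GST_VIDEO_OVERLAY (sink);
gst_video_overlay_set_window_handle (overlay, (guintptr) win);
// Rectangle is given in window coordinates: x, y, width, height.
if (!gst_video_overlay_set_render_rectangle (overlay, 50, 50, 200, 150))
  g_printerr ("this sink does not support render rectangles\n");
gst_video_overlay_expose (overlay);  // ask the sink to repaint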

gstreamer mux raw h264 to mp4?

I create an example stream with ffmpeg:
ffmpeg -f lavfi -i testsrc2=r=30:size=800x800:duration=10 -c:v libx264 -g 60 -bf 0 -f h264 test.264
Then I try to remux the stream without re-encoding:
$ gst-launch-1.0 -v --gst-debug=trace filesrc location=test.264 ! h264parse ! queue ! qtmux ! filesink location=rawh264tomp4.mp4
However, it fails to multiplex the stream:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstQTMuxPad:video_0: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstQTMux:qtmux0: Could not multiplex stream.
Additional debug info:
gstqtmux.c(4559): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstQTMux:qtmux0:
Buffer has no PTS.
Execution ended after 0:00:00.000111337
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
It says the buffer has no PTS.
I also tried mp4mux; same error.
The ffprobe result is the following:
{
Input #0, h264, from 'test.264':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: h264 (High), yuv420p(progressive), 800x800 [SAR 1:1 DAR 1:1], 30 fps, 30 tbr, 1200k tbn, 60 tbc
"programs": [
],
"streams": [
{
"index": 0,
"codec_name": "h264",
"codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
"profile": "High",
"codec_type": "video",
"codec_time_base": "1/60",
"codec_tag_string": "[0][0][0][0]",
"codec_tag": "0x0000",
"width": 800,
"height": 800,
"coded_width": 800,
"coded_height": 800,
"has_b_frames": 0,
"sample_aspect_ratio": "1:1",
"display_aspect_ratio": "1:1",
"pix_fmt": "yuv420p",
"level": 31,
"chroma_location": "left",
"field_order": "progressive",
"refs": 1,
"is_avc": "false",
"nal_length_size": "0",
"r_frame_rate": "30/1",
"avg_frame_rate": "30/1",
"time_base": "1/1200000",
"bits_per_raw_sample": "8",
"disposition": {
"default": 0,
"dub": 0,
"original": 0,
"comment": 0,
"lyrics": 0,
"karaoke": 0,
"forced": 0,
"hearing_impaired": 0,
"visual_impaired": 0,
"clean_effects": 0,
"attached_pic": 0,
"timed_thumbnails": 0
}
}
],
"format": {
"filename": "test.264",
"nb_streams": 1,
"nb_programs": 0,
"format_name": "h264",
"format_long_name": "raw H.264 video",
"size": "2516641",
"probe_score": 51
}
}
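The error itself points at the cause: a raw .264 elementary stream carries no container timestamps, so qtmux receives buffers without PTS and refuses to mux them. One workaround often suggested for GStreamer 1.20+ (a sketch, untested here) is the h264timestamper element, which reconstructs timestamps from the stream's SPS timing information before the muxer:
gst-launch-1.0 filesrc location=test.264 ! h264parse ! h264timestamper ! qtmux ! filesink location=rawh264tomp4.mp4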

How to fix "GStreamer-CRITICAL **: gst_sample_get_buffer: assertion 'GST_IS_SAMPLE (sample)' failed"

I wanted to grab video frames via GStreamer and display them in my app (using Qt), but I ran into an issue:
When I try to use GstAppSink (gst_app_sink_pull_sample), it keeps returning NULL, which I don't understand. I can stream the video from a terminal (gst-launch-1.0) perfectly.
Below is my code:
void gstreamer::openStream()
{
pipeline = gst_parse_launch ("rtspsrc location=rtsp://192.168.10.123 ! rtph264depay ! h264parse ! queue ! avdec_h264 ! xvimagesink sync=false async=false appsink name=mysink", NULL);
GstElement* sink = gst_bin_get_by_name(GST_BIN(pipeline), "mysink");
GstAppSink* appsink = GST_APP_SINK(sink);
if(!appsink)
{
qDebug() << "get app sink failed";
}
else
{
qDebug() << "app sink pass";
mAppSink = appsink;
openSample();
}
}
void gstreamer::openSample()
{
if(!mAppSink)
{
qDebug() << "appsink failed";
}
GstSample* gstSample = gst_app_sink_pull_sample(mAppSink);
if(gstSample == NULL)
{
qDebug() << "sample failed ";
}
else{
qDebug() << "sample pass";
}
GstBuffer* buffer = gst_sample_get_buffer(gstSample);
if(!buffer)
{
qDebug() << "buffer fail";
}
GstMapInfo map;
gst_buffer_map(buffer, &map, GST_MAP_READ);
QImage image = QImage((map.data), 320, 240, QImage::Format_RGB888);
emit sendFrame(image);
}
I tried searching the web, but there are hardly any links about this issue.
Try changing the pipeline to
"rtspsrc location=rtsp://192.168.10.123 ! rtph264depay ! h264parse ! tee name=my_tee ! queue ! avdec_h264 ! xvimagesink sync=false my_tee. ! queue! appsink async=false name=mysink"

How to add a GstVideoOrientationInterface to a Gst pipeline?

I'm trying to rotate/flip the video played by a playbin element (in C++). What I'm trying to do is similar to what is asked in the question Rotate a Video in gstreamer, but I prefer not to rely on the videoflip element. Instead I'd like to use the GstVideoOrientation interface (https://thiblahute.github.io/GStreamer-doc/gst-plugins-base-video-1.0/videoorientation.html?gi-language=c#interfaces) from the gst video library (https://thiblahute.github.io/GStreamer-doc/gst-plugins-base-video-1.0/index.html?gi-language=c).
The documentation of the interface itself, and of how to use it, is pretty clear, but I can't understand how to add such an interface to a GstElement.
There is some documentation in https://gstreamer.freedesktop.org/documentation/application-development/advanced/interfaces.html and in https://gstreamer.freedesktop.org/documentation/plugin-development/advanced/interfaces.html , but still I can't figure out how this works.
Below is the code sample I'm working with:
#include <gst/video/video.h>
#include <gst/gst.h>
gint
main (gint argc, gchar * argv[])
{
//...
GstElement *pipeline;
pipeline = NULL;
gst_init (NULL,NULL);
pipeline = gst_element_factory_make("playbin", "playbin");
g_object_set (pipeline, "uri", "an_uri", NULL);
gst_element_set_state (pipeline, GST_STATE_PLAYING);
//...
return 0;
}
Any help is appreciated. Many thanks.
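A key point from those pages is that an interface is implemented by individual elements, not attached to the pipeline, so you query a concrete element for it. An untested sketch under that assumption (playbin exposes its sink through the "video-sink" property, and that sink may well not implement GstVideoOrientation at all):
// Hedged sketch: check whether the configured video sink implements
// the GstVideoOrientation interface, and flip through it if so.
GstElement *videosink = NULL;
g_object_get (pipeline, "video-sink", &videosink, NULL);
if (videosink && GST_IS_VIDEO_ORIENTATION (videosink)) {
  GstVideoOrientation *orientation = GST_VIDEO_ORIENTATION (videosink);
  if (!gst_video_orientation_set_hflip (orientation, TRUE))
    g_printerr ("hflip not supported by this element\n");
} else {
  g_printerr ("video sink does not implement GstVideoOrientation\n");
}
if (videosink)
  gst_object_unref (videosink);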

GStreamer basic pipeline running but not displaying on Windows 7 VirtualBox

I am currently working with GStreamer on Windows 7 (x86_64) in a VM (VirtualBox), and I wanted to run a basic pipeline:
gst-launch-1.0 -v videotestsrc pattern=snow ! autovideosink
When I run this pipeline I get:
Setting pipeline to PAUSED...
Pipeline is PREROLLING
And then an error occurs:
Pipeline doesn't want to preroll
I worked around this error by adding async-handling=true at the end of the pipeline, but still nothing is displayed...
I tried to run the same pipeline in C++ code. Here is a simple main you can run. When I run it, I get no error, but nothing is displayed.
#include <gst/gst.h>
#include <glib.h>
#include <stdio.h>
int main(int argc, char* argv[]) {
GMainLoop *loop;
GstElement *pipeline, *source, *sink;
g_print("Starting...");
/* Initialisation */
gst_init(&argc, &argv);
g_print("Loop is created...");
loop = g_main_loop_new(NULL, FALSE);
/* Create gstreamer elements */
pipeline = gst_pipeline_new("gst-app-sink");
source = gst_element_factory_make("videotestsrc", "src");
sink = gst_element_factory_make("autovideosink", "sink");
if (!pipeline || !source || !sink) {
g_printerr("One element could not be created. Exiting.\n");
return -1;
}
/* Set up the pipeline */
/* we add all elements into the pipeline */
/* source | sink */
gst_bin_add_many(GST_BIN(pipeline), source, sink, NULL);
/* we link the elements together */
/* src -> sink */
gst_element_link(source, sink);
/* Set the pipeline to "playing" state*/
gst_element_set_state(pipeline, GST_STATE_PLAYING);
/* Iterate */
g_print("Running...\n");
g_main_loop_run(loop);
/* Out of the main loop, clean up nicely */
g_print("Returned, stopping playback\n");
gst_element_set_state(pipeline, GST_STATE_NULL);
g_print("Deleting pipeline\n");
gst_object_unref(GST_OBJECT(pipeline));
g_main_loop_unref(loop);
return 0;
}
I really don't know where it could come from. Any ideas?
By default, the VM doesn't enable the 2D and 3D video acceleration that is necessary to display this kind of stream. Just right-click on your VM -> Settings -> Display and check "Enable 3D Acceleration" and "Enable 2D Video Acceleration".
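Independently of the VM setting, note that the C++ example reports no error because nothing watches the pipeline bus, so failures such as "doesn't want to preroll" are dropped silently. A minimal sketch of adding a watch (the handler name is mine):
/* Hedged sketch: surface errors and EOS from the bus, then quit. */
static gboolean
bus_call (GstBus *bus, GstMessage *msg, gpointer data)
{
  GMainLoop *loop = (GMainLoop *) data;
  switch (GST_MESSAGE_TYPE (msg)) {
    case GST_MESSAGE_ERROR: {
      GError *err = NULL;
      gchar *dbg = NULL;
      gst_message_parse_error (msg, &err, &dbg);
      g_printerr ("Error: %s (%s)\n", err->message, dbg ? dbg : "none");
      g_error_free (err);
      g_free (dbg);
      g_main_loop_quit (loop);
      break;
    }
    case GST_MESSAGE_EOS:
      g_main_loop_quit (loop);
      break;
    default:
      break;
  }
  return TRUE;
}

/* in main(), before g_main_loop_run(loop): */
GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
gst_bus_add_watch (bus, bus_call, loop);
gst_object_unref (bus);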