gstreamer mux raw h264 to mp4?

I create an example stream with ffmpeg:
ffmpeg -f lavfi -i testsrc2=r=30:size=800x800:duration=10 -c:v libx264 -g 60 -bf 0 -f h264 test.264
Then I try to remux the stream without re-encoding:
$ gst-launch-1.0 -v --gst-debug=trace filesrc location=test.264 ! h264parse ! queue ! qtmux ! filesink location=rawh264tomp4.mp4
However, it fails to multiplex the stream:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstQTMuxPad:video_0: caps = video/x-h264, pixel-aspect-ratio=(fraction)1/1, width=(int)800, height=(int)800, framerate=(fraction)30/1, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, stream-format=(string)avc, alignment=(string)au, profile=(string)high, level=(string)3.1, codec_data=(buffer)0164001fffe100186764001facb2019032d80880000003008000001e078c192401000668ebc3cb22c0
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstQTMux:qtmux0: Could not multiplex stream.
Additional debug info:
gstqtmux.c(4559): gst_qt_mux_add_buffer (): /GstPipeline:pipeline0/GstQTMux:qtmux0:
Buffer has no PTS.
Execution ended after 0:00:00.000111337
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
It said the buffer has no PTS.
I also tried mp4mux; same error.
The ffprobe output is as follows:
Input #0, h264, from 'test.264':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 800x800 [SAR 1:1 DAR 1:1], 30 fps, 30 tbr, 1200k tbn, 60 tbc
{
    "programs": [

    ],
    "streams": [
        {
            "index": 0,
            "codec_name": "h264",
            "codec_long_name": "H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10",
            "profile": "High",
            "codec_type": "video",
            "codec_time_base": "1/60",
            "codec_tag_string": "[0][0][0][0]",
            "codec_tag": "0x0000",
            "width": 800,
            "height": 800,
            "coded_width": 800,
            "coded_height": 800,
            "has_b_frames": 0,
            "sample_aspect_ratio": "1:1",
            "display_aspect_ratio": "1:1",
            "pix_fmt": "yuv420p",
            "level": 31,
            "chroma_location": "left",
            "field_order": "progressive",
            "refs": 1,
            "is_avc": "false",
            "nal_length_size": "0",
            "r_frame_rate": "30/1",
            "avg_frame_rate": "30/1",
            "time_base": "1/1200000",
            "bits_per_raw_sample": "8",
            "disposition": {
                "default": 0,
                "dub": 0,
                "original": 0,
                "comment": 0,
                "lyrics": 0,
                "karaoke": 0,
                "forced": 0,
                "hearing_impaired": 0,
                "visual_impaired": 0,
                "clean_effects": 0,
                "attached_pic": 0,
                "timed_thumbnails": 0
            }
        }
    ],
    "format": {
        "filename": "test.264",
        "nb_streams": 1,
        "nb_programs": 0,
        "format_name": "h264",
        "format_long_name": "raw H.264 video",
        "size": "2516641",
        "probe_score": 51
    }
}
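A raw .264 byte-stream carries no container timestamps, so filesrc pushes untimestamped buffers and qtmux aborts on the first "Buffer has no PTS". Since the stream was encoded with -bf 0 (no B-frames) at a constant 30 fps, one workaround is to stamp a synthetic PTS onto each access unit with a buffer probe on h264parse's src pad. Below is a minimal C sketch of that idea, not a verified fix; the 30 fps constant is taken from the ffprobe output above, and on GStreamer 1.20+ the h264timestamper element from gst-plugins-bad may also be worth trying between h264parse and qtmux.
#include <gst/gst.h>

/* With alignment=au (see the caps in the log above), each buffer is one
   access unit, so a simple frame counter gives monotonic timestamps.
   With -bf 0 there is no frame reordering, so PTS == DTS is valid. */
static GstPadProbeReturn
stamp_pts (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  static guint64 frame = 0;
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);

  buf = gst_buffer_make_writable (buf);
  GST_BUFFER_PTS (buf) = gst_util_uint64_scale (frame, GST_SECOND, 30);
  GST_BUFFER_DTS (buf) = GST_BUFFER_PTS (buf);
  GST_BUFFER_DURATION (buf) = gst_util_uint64_scale (1, GST_SECOND, 30);
  frame++;

  GST_PAD_PROBE_INFO_DATA (info) = buf;
  return GST_PAD_PROBE_OK;
}

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_parse_launch (
      "filesrc location=test.264 ! h264parse name=parse ! queue "
      "! qtmux ! filesink location=rawh264tomp4.mp4", NULL);
  GstElement *parse = gst_bin_get_by_name (GST_BIN (pipeline), "parse");
  GstPad *srcpad = gst_element_get_static_pad (parse, "src");

  gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BUFFER, stamp_pts, NULL, NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait for EOS so qtmux can write the moov header before shutdown. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      (GstMessageType) (GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
  if (msg)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (srcpad);
  gst_object_unref (parse);
  gst_object_unref (pipeline);
  return 0;
}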

Related

Crash in GStreamer qmlglsink pipeline when dynamically rebinding to a different GstGLVideoItem

I've used one of the existing qmlglsink examples to stream video feeds from 4 IP cameras.
Four pipelines are created before the QML engine loads:
for (int i = 0; i < maxCameras; ++i) {
    GstElement* pipeline = gst_pipeline_new(NULL);
    GstElement* src = gst_element_factory_make("udpsrc", NULL);
    GstElement* parse = gst_element_factory_make("jpegparse", NULL);
    GstElement* decoder = gst_element_factory_make("jpegdec", NULL);
    GstElement* glcolorconvert = gst_element_factory_make("glcolorconvert", NULL);
    GstElement* glupload = gst_element_factory_make("glupload", NULL);
    GstElement* sink = gst_element_factory_make("qmlglsink", NULL);
    g_assert(src && parse && decoder && glupload && glcolorconvert && sink);

    g_object_set(G_OBJECT(src), "port", startingPort + i, NULL);
    g_object_set(G_OBJECT(sink), "sync", FALSE, NULL);

    gst_bin_add_many(GST_BIN(pipeline), src, parse, decoder, glupload, glcolorconvert, sink, NULL);
    if (!gst_element_link_many(src, parse, decoder, glupload, glcolorconvert, sink, NULL)) {
        qDebug() << "Linking GStreamer pipeline elements failed";
    }

    sinks.insert(std::make_pair(QString::number(startingPort + i), sink));
    pipelines.insert(std::make_pair(QString::number(startingPort + i), pipeline));
}
In QML the sink is connected and processed with:
import QtQuick 2.15
import QtQuick.Layouts 1.15
import CustomProject 1.0
import org.freedesktop.gstreamer.GLVideoItem 1.0

Item {
    id: root
    signal clicked()
    required property int udpPort
    property var camConnect: undefined

    onUdpPortChanged: { setupConnection(); }
    onVisibleChanged: {
        if (visible) {
            setupConnection();
        } else {
            camConnect = undefined
        }
    }

    GstGLVideoItem {
        id: videoItem
        anchors.fill: parent
        function connect() {
            CameraSinksFactory.connectSink(this, udpPort)
        }
    }

    MouseArea {
        anchors.fill: parent
        onClicked: {
            CameraSinksFactory.stopPipeline(udpPort)
            root.clicked()
        }
    }

    function setupConnection() {
        if (udpPort <= 0 || !root.visible) return;
        videoItem.connect()
        root.camConnect = CameraSinksFactory.getCamConnection(udpPort);
        root.camConnect.resolutionX = root.width // - 15
        root.camConnect.resolutionY = root.height
        root.camConnect.bitrate = 15000000
        root.camConnect.streaming = root.visible
        CameraSinksFactory.startPipeline(udpPort)
    }
}
Problem: the main screen displays 4 items in a 2x2 grid using a Model (which provides udpPort as a unique ID). When the user clicks on one item, the feed from that camera should fill the whole screen. The examples create a GridLayout with 4/6 explicit items and just manipulate their visibility (in effect the clicked item is the only one remaining and takes the whole screen).
In my case I'm using a separate Item for the full-screen view. So I disable streaming (through the CamConnection class, which communicates with the cameras and sends commands) and hide the GridView. A new GstGLVideoItem then binds to the qmlglsink in the pipeline.
Everything is OK until I repeat the click sequence (back to the GridView and to the full view). Every time it ends with:
Bail out! ERROR:../ext/qt/gstqsgtexture.cc:134:virtual void GstQSGTexture::bind(): code should not be reached
** (KMS:20495): CRITICAL **: 15:47:36.937: gst_video_frame_map_id: assertion 'info->width <= meta->width' failed
** ERROR:../ext/qt/gstqsgtexture.cc:134:virtual void GstQSGTexture::bind(): code should not be reached
From analysis of the plugin code, this happens when INFO (the image size read from the caps) is bigger than the size in the metadata of the provided buffer. Which is understandable: the buffer is too small.
I used GST_DEBUG=4/5/6/7 and the logs confirm that the autodetected caps match what was requested in the commands sent to the camera.
I could follow the examples' approach, but the project calls for another panel with those cameras, so the above problem would hit me again in the near future.
How do I make this whole setup work? How do I safely rebind the pipeline's qmlglsink to a new QML VideoItem?
Two possible solutions:
1. call gst_element_set_state (pipeline, GST_STATE_NULL);, change the sink's widget to the new item, and restart the pipeline with gst_element_set_state (pipeline, GST_STATE_PLAYING);
2. use the Qt 5 MediaPlayer with gst-pipeline as the source. When visible, set the source and call start(). When not visible, reset the source to empty (important) and call stop().
In general, the possible benefits of NOT recreating the pipeline each time are not worth the hassle when dynamically assigning a new QML Item to the pipeline's sink. A sketch of the first option follows.
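A minimal sketch of the first option in C++, assuming CameraSinksFactory.connectSink() ultimately sets the sink's "widget" property (the property the upstream qmlglsink examples use); rebindSink is a hypothetical helper, not an existing API:
#include <gst/gst.h>
#include <QQuickItem>

// Hypothetical helper: fully stop the pipeline so the old GL resources are
// released, point qmlglsink at the new GstGLVideoItem, then restart so the
// sink renegotiates against the new item's GL context.
static void rebindSink(GstElement* pipeline, GstElement* sink, QQuickItem* newVideoItem)
{
    gst_element_set_state(pipeline, GST_STATE_NULL);
    g_object_set(sink, "widget", newVideoItem, NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}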

How to display video in a portion of a window using d3dvideosink on Windows

I have written the following GStreamer function to display the videotestsrc video on a Win32 window (HWND) on Windows. This function works perfectly and displays videotestsrc in the entire window for the given "win" window handle.
void runGstPipe(HWND win)
{
    GstElement *pipeline = gst_parse_launch(
        "rtspsrc location=\"...\" ! decodebin ! d3dvideosink name=sink", NULL);
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), (guintptr)win);
    GstStateChangeReturn sret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
}
Next I tried to enhance the above function to display the videotestsrc in a portion of the window "win", using the following options:
a) using glimagesink with the render-rectangle option, as follows:
"rtspsrc location=\"...\" ! decodebin ! glimagesink render-rectangle=\"<50, 50, 200, 150>\" name=sink"
b) using gst_video_overlay_set_render_rectangle, as follows:
gst_video_overlay_set_render_rectangle(GST_VIDEO_OVERLAY(sink), 50, 50, 200, 150);
Neither option changed the rendering area, i.e. videotestsrc still occupied the whole window instead of the given rectangle. I would appreciate any suggestions.
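One thing worth checking, as a hedged sketch rather than a confirmed fix: the render rectangle must be set on the element that actually implements GstVideoOverlay, after the window handle has been attached, and a repaint can then be forced with gst_video_overlay_expose() (this assumes d3dvideosink honors set_render_rectangle at all):
GstVideoOverlay *overlay = GST_VIDEO_OVERLAY(sink);
gst_video_overlay_set_window_handle(overlay, (guintptr)win);
/* request a sub-rectangle of the window, then force a redraw */
gst_video_overlay_set_render_rectangle(overlay, 50, 50, 200, 150);
gst_video_overlay_expose(overlay);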

how to fix "GStreamer-CRITICAL **: gst_sample_get_buffer: assertion 'GST_IS_SAMPLE (sample)' failed"

I wanted to grab video frames via GStreamer and display them in my app (using Qt), but I've encountered an issue:
When I try to use GstAppSink (gst_app_sink_pull_sample) it keeps returning NULL, which I don't understand. I can stream the video from the terminal (gst-launch-1.0) perfectly.
Below is my code:
void gstreamer::openStream()
{
    pipeline = gst_parse_launch("rtspsrc location=rtsp://192.168.10.123 ! rtph264depay ! h264parse ! queue ! avdec_h264 ! xvimagesink sync=false async=false appsink name=mysink", NULL);
    GstElement* sink = gst_bin_get_by_name(GST_BIN(pipeline), "mysink");
    GstAppSink* appsink = GST_APP_SINK(sink);
    if (!appsink)
    {
        qDebug() << "get app sink failed";
    }
    else
    {
        qDebug() << "app sink pass";
        mAppSink = appsink;
        openSample();
    }
}

void gstreamer::openSample()
{
    if (!mAppSink)
    {
        qDebug() << "appsink failed";
    }
    GstSample* gstSample = gst_app_sink_pull_sample(mAppSink);
    if (gstSample == NULL)
    {
        qDebug() << "sample failed";
    }
    else
    {
        qDebug() << "sample pass";
    }
    GstBuffer* buffer = gst_sample_get_buffer(gstSample);
    if (!buffer)
    {
        qDebug() << "buffer fail";
    }
    GstMapInfo map;
    gst_buffer_map(buffer, &map, GST_MAP_READ);
    QImage image = QImage((map.data), 320, 240, QImage::Format_RGB888);
    emit sendFrame(image);
}
I tried to search the web but there are hardly any links about this issue.
Try changing the pipeline to:
"rtspsrc location=rtsp://192.168.10.123 ! rtph264depay ! h264parse ! tee name=my_tee ! queue ! avdec_h264 ! xvimagesink sync=false my_tee. ! queue ! appsink async=false name=mysink"

gstreamer not flushing to the filesink

I have this GStreamer pipeline, which works from the command line as:
gst-launch-1.0 autovideosrc ! tee name=t ! queue ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! h264parse ! qtmux ! filesink name=fileSink location=test.mp4 t. ! queue ! videoscale ! video/x-raw, width=480,height=270 ! xvimagesink name=displaySink -e
Now, I am replicating this on the C++ side as follows:
GstElement * pipeline = gst_parse_launch("autovideosrc ! tee name=t ! "
    "queue ! omxh264enc ! video/x-h264, "
    "stream-format=(string)byte-stream ! h264parse ! "
    "qtmux ! filesink name=fileSink location=test.mp4 t. "
    "! queue ! videoscale ! video/x-raw, width=480,height=270 ! "
    "xvimagesink name=displaySink", &error);
I connect this to a Qt window and play as follows:
GstElement * displaySink = gst_bin_get_by_name(GST_BIN(pipeline), "displaySink");
qDebug() << displaySink;

// prepare the ui
QWidget window;
window.resize(480, 270);
window.show();

WId xwinid = window.winId();
gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(displaySink), xwinid);

// run the pipeline
qDebug() << "Calling run...";
GstStateChangeReturn sret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
if (sret == GST_STATE_CHANGE_FAILURE) {
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    // Exit application
    QTimer::singleShot(0, QApplication::activeWindow(), SLOT(quit()));
}

int ret = app.exec();

window.hide();
gst_element_set_state(pipeline, GST_STATE_NULL);
gst_object_unref(pipeline);
This starts displaying the video stream in my Qt window, and the file test.mp4 gets created and starts to grow in size. However, when I quit the application, the file is not playable. I have a feeling this is because the last bits or some header information are not written, because I call:
gst_element_set_state (pipeline, GST_STATE_NULL);
I suspect this closes the pipeline without ensuring that the file is correctly created and finalized. Is there a way to ensure that EOF or EOS is sent through the pipeline before closing, so that the file is written properly? This is still speculation on my part; something else could be wrong...
Yes, sending EOS is necessary.
So before setting the pipeline to NULL, do:
gst_element_send_event(pipeline, gst_event_new_eos());
Edit: how to check whether the EOS passed:
According to documentation:
The EOS event will travel down to the sink elements in the pipeline which will then post the GST_MESSAGE_EOS on the bus after they have finished playing any buffered data.
This means that, to check whether the EOS event successfully passed through the pipeline, you could add a bus watch callback with gst_bus_add_watch and check there for GST_MESSAGE_EOS.
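A blocking alternative to the gst_bus_add_watch approach, sketched for the shutdown path of the code above: send EOS, wait on the bus until the muxer has finished, and only then drop to NULL.
gst_element_send_event(pipeline, gst_event_new_eos());

GstBus* bus = gst_element_get_bus(pipeline);
GstMessage* msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
    (GstMessageType)(GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
if (msg)
    gst_message_unref(msg);
gst_object_unref(bus);

// by now qtmux has written the moov header and the file is finalized
gst_element_set_state(pipeline, GST_STATE_NULL);
gst_object_unref(pipeline);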

GStreamer basic pipeline running but not displaying on Windows 7 in VirtualBox

I am currently working with GStreamer on Windows 7 (x86_64) in a VM (VirtualBox) and I wanted to run a basic pipeline:
gst-launch-1.0 -v videotestsrc pattern=snow ! autovideosink
When I run this pipeline I get:
Setting pipeline to PAUSED...
Pipeline is PREROLLING
And then an error occurs:
Pipeline doesn't want to preroll
I solved this error by adding async-handling=true at the end of the pipeline, but still nothing is displayed...
I tried to run the same pipeline from C++ code. Here is a simple main you can run. When I run this code, I get no errors but nothing is displayed.
#include <gst/gst.h>
#include <glib.h>
#include <stdio.h>

int main(int argc, char* argv[]) {
    GMainLoop *loop;
    GstElement *pipeline, *source, *sink;

    g_print("Starting...");

    /* Initialisation */
    gst_init(&argc, &argv);
    g_print("Loop is created...");
    loop = g_main_loop_new(NULL, FALSE);

    /* Create gstreamer elements */
    pipeline = gst_pipeline_new("gst-app-sink");
    source = gst_element_factory_make("videotestsrc", "src");
    sink = gst_element_factory_make("autovideosink", "sink");
    if (!pipeline || !source || !sink) {
        g_printerr("One element could not be created. Exiting.\n");
        return -1;
    }

    /* Set up the pipeline */
    /* we add all elements into the pipeline */
    /* source | sink */
    gst_bin_add_many(GST_BIN(pipeline), source, sink, NULL);

    /* we link the elements together */
    /* src -> sink */
    gst_element_link(source, sink);

    /* Set the pipeline to "playing" state */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Iterate */
    g_print("Running...\n");
    g_main_loop_run(loop);

    /* Out of the main loop, clean up nicely */
    g_print("Returned, stopping playback\n");
    gst_element_set_state(pipeline, GST_STATE_NULL);
    g_print("Deleting pipeline\n");
    gst_object_unref(GST_OBJECT(pipeline));
    g_main_loop_unref(loop);
    return 0;
}
I really don't know where it could come from. Any ideas?
By default, the VM doesn't enable the 2D and 3D video acceleration that is necessary to display this kind of stream. Just right-click on your VM -> Settings -> Display and check "Enable 3D Acceleration" and "Enable 2D Video Acceleration".
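The same settings can also be toggled from the host's command line with VBoxManage while the VM is powered off; the VM name "win7" below is a placeholder:
VBoxManage modifyvm "win7" --accelerate3d on --accelerate2dvideo on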