How to add a GstVideoOrientationInterface to a Gst pipeline? - c++

I'm trying to rotate/flip the video played by a playbin element (in C++). What I'm trying to do is similar to what is asked in the question Rotate a Video in gstreamer, but I prefer not to rely on the videoflip element. Instead, I'd like to use the GstVideoOrientation interface (https://thiblahute.github.io/GStreamer-doc/gst-plugins-base-video-1.0/videoorientation.html?gi-language=c#interfaces) from the gst video library (https://thiblahute.github.io/GStreamer-doc/gst-plugins-base-video-1.0/index.html?gi-language=c).
The documentation of the interface itself and of how to use it is pretty clear, but I can't understand how to add such an interface to a GstElement.
There is some documentation in https://gstreamer.freedesktop.org/documentation/application-development/advanced/interfaces.html and in https://gstreamer.freedesktop.org/documentation/plugin-development/advanced/interfaces.html, but I still can't figure out how this works.
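For reference, here is a minimal sketch (based on the linked docs, untested) of how I understand the interface is driven once you have an element that actually implements it:
#include <gst/video/videoorientation.h>
/* "element" is assumed to be an element that implements the interface
 * (e.g. a v4l2src); GST_VIDEO_ORIENTATION() is just the interface cast. */
if (GST_IS_VIDEO_ORIENTATION (element)) {
  GstVideoOrientation *orientation = GST_VIDEO_ORIENTATION (element);
  /* Returns FALSE if the element cannot perform the flip. */
  if (!gst_video_orientation_set_hflip (orientation, TRUE))
    g_printerr ("Horizontal flip not supported by this element\n");
}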
Below is the code sample I'm working with:
#include <gst/video/video.h>
#include <gst/gst.h>

gint
main (gint argc, gchar * argv[])
{
  //...
  GstElement *pipeline = NULL;

  gst_init (NULL, NULL);
  pipeline = gst_element_factory_make ("playbin", "playbin");
  g_object_set (pipeline, "uri", "an_uri", NULL);
  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  //...
  return 0;
}
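One direction I'm considering (a hedged, untested sketch): instead of adding the interface to playbin itself, search the bin at runtime for an element that already implements GstVideoOrientation once the pipeline is at least PAUSED:
#include <gst/video/videoorientation.h>
/* Sketch: returns the first element in the bin that implements
 * GstVideoOrientation (caller owns a ref), or NULL if none does.
 * GST_ITERATOR_RESYNC handling is omitted for brevity. */
static GstElement *
find_orientation_element (GstBin * bin)
{
  GstIterator *it = gst_bin_iterate_recurse (bin);
  GValue item = G_VALUE_INIT;
  GstElement *found = NULL;

  while (!found && gst_iterator_next (it, &item) == GST_ITERATOR_OK) {
    GstElement *element = GST_ELEMENT (g_value_get_object (&item));
    if (GST_IS_VIDEO_ORIENTATION (element))
      found = GST_ELEMENT (gst_object_ref (element));
    g_value_unset (&item);
  }
  gst_iterator_free (it);
  return found;
}
If this returns NULL for the pipeline above, then nothing inside playbin implements the interface, which may be exactly why I'm stuck.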
Any help is appreciated.
Many thanks.

Related

GStreamer qmlglsink vs gst_parse_launch()

I'm new to Qt and GStreamer, but I need to embed a simple player for a QuickTime/H.264 video file in a Qt 5.15.2 application (running on Linux Ubuntu 20.04 (Focal Fossa)).
I managed to play a standard videotestsrc (bouncing ball pattern) inside my application, and this is the code (main.cpp):
#include "mainwindow.h"
#include <QApplication>
#include <QQuickView>
#include <QWidget>
#include <QQuickItem>
#include <gst/gst.h>
int main(int argc, char *argv[])
{
GstElement* mPipeline = nullptr;
GstElement* mSource = nullptr;
GstElement* mGLUpload = nullptr;
GstElement* mSink = nullptr;
QQuickView* mView = nullptr;
QWidget* mWidget = nullptr;
QQuickItem* mItem = nullptr;
gst_init(argc, argv);
QApplication app(argc, argv);
MainWindow* window = new MainWindow;
mPipeline = gst_pipeline_new(NULL);
mSource = gst_element_factory_make("videotestsrc", NULL);
mGLUpload = gst_element_factory_make("glupload", NULL);
mSink = gst_element_factory_make("qmlglsink", NULL);
gst_bin_add_many(GST_BIN (mPipeline), mSource, mGLUpload, mSink, NULL);
gst_element_link_many(mSource, mGLUpload, mSink, NULL);
g_object_set(mSource, "pattern", 18, NULL);
mView = new QQuickView;
mView->scheduleRenderJob(new SetPlaying (mPipeline),
QQuickView::BeforeSynchronizingStage);
mView->setSource(QUrl(QStringLiteral("qrc:/video.qml")));
mWidget = QWidget::createWindowContainer(mView, parent);
mItem = mView->findChild<QQuickItem*>("videoItem");
window->setCentralWidget(mWidget);
window->show();
ret = app.exec();
g_object_set(mSink, "widget", mItem, NULL);
gst_deinit();
}
SetPlaying class...
#include <QRunnable>
#include <gst/gst.h>

class SetPlaying : public QRunnable
{
public:
    SetPlaying(GstElement *pipeline) {
        this->pipeline_ = pipeline ? static_cast<GstElement *>(gst_object_ref(pipeline)) : NULL;
    }

    ~SetPlaying() {
        if (this->pipeline_)
            gst_object_unref(this->pipeline_);
    }

    // Runs on the scene graph (render) thread.
    void run() {
        if (this->pipeline_)
            gst_element_set_state(this->pipeline_, GST_STATE_PLAYING);
    }

private:
    GstElement *pipeline_;
};
The MainWindow code should not be relevant to the issue (it's a standard empty window).
This is the source code of the only .qml item that's needed to provide an acceptable widget surface to qmlglsink:
import QtQuick 2.15
import QtQuick.Controls 1.1
import QtQuick.Controls.Styles 1.3
import QtQuick.Dialogs 1.2
import QtQuick.Window 2.1
import org.freedesktop.gstreamer.GLVideoItem 1.0

Item {
    anchors.fill: parent

    GstGLVideoItem {
        id: video
        objectName: "videoItem"
        anchors.centerIn: parent
        width: parent.width
        height: parent.height
    }
}
Now, since the actual pipeline that plays the file is quite long and complex to manage in code, I opted for a gst_parse_launch() approach.
To proceed step by step, I tried to use that method to create a videotestsrc pipeline, i.e.:
mPipeline = gst_parse_launch( "videotestsrc ! glupload ! qmlglsink", NULL);
mSink = gst_bin_get_by_name(GST_BIN(mPipeline), "sink");
mSource = gst_bin_get_by_name(GST_BIN(mPipeline), "source");
If I run the code this is the result:
(videotest:14930): GLib-GObject-CRITICAL **: 16:33:08.868: g_object_set: assertion 'G_IS_OBJECT (object)' failed
(videotest:14930): GLib-GObject-CRITICAL **: 16:33:09.342: g_object_set: assertion 'G_IS_OBJECT (object)' failed
Of course, the application window displays nothing.
You should give the elements a name property. Elements do get default names, but those include a numerical suffix that is incremented whenever you rebuild the pipeline, so it is better not to rely on them.
To make your existing code work, try this:
mPipeline = gst_parse_launch( "videotestsrc name=source ! glupload ! qmlglsink name=sink", NULL);
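For comparison, without explicit names the elements get defaults along the lines of videotestsrc0 and qmlglsink0 (with the numeric suffix incremented on each rebuild), so the original lookups returned NULL and the later g_object_set() calls triggered the G_IS_OBJECT assertions shown above:
/* No element in the parsed pipeline is actually named "sink", so: */
mSink = gst_bin_get_by_name(GST_BIN(mPipeline), "sink"); // returns NULL
g_object_set(mSink, "widget", mItem, NULL);              // CRITICAL: G_IS_OBJECT failed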

Crash in GStreamer qmlglsink pipeline dynamically rebind to different GstGLVideoItem

I've used one of the existing qmlglsink examples to stream video feeds from 4 IP cameras.
Four pipelines are created before the engine loads:
for (int i = 0; i < maxCameras; ++i) {
    GstElement* pipeline = gst_pipeline_new(NULL);
    GstElement* src = gst_element_factory_make("udpsrc", NULL);
    GstElement* parse = gst_element_factory_make("jpegparse", NULL);
    GstElement* decoder = gst_element_factory_make("jpegdec", NULL);
    GstElement* glcolorconvert = gst_element_factory_make("glcolorconvert", NULL);
    GstElement* glupload = gst_element_factory_make("glupload", NULL);
    GstElement* sink = gst_element_factory_make("qmlglsink", NULL);
    g_assert(src && parse && decoder && glupload && glcolorconvert && sink);

    g_object_set(G_OBJECT(src), "port", startingPort + i, NULL);
    g_object_set(G_OBJECT(sink), "sync", FALSE, NULL);

    gst_bin_add_many(GST_BIN(pipeline), src, parse, decoder, glupload, glcolorconvert, sink, NULL);
    if (!gst_element_link_many(src, parse, decoder, glupload, glcolorconvert, sink, NULL)) {
        qDebug() << "Linking GStreamer pipeline elements failed";
    }

    // sinks and pipelines are maps keyed by the UDP port, declared elsewhere.
    sinks.insert(std::make_pair(QString::number(startingPort + i), sink));
    pipelines.insert(std::make_pair(QString::number(startingPort + i), pipeline));
}
In QML the sink is connected and processed with:
import QtQuick 2.15
import QtQuick.Layouts 1.15
import CustomProject 1.0
import org.freedesktop.gstreamer.GLVideoItem 1.0

Item {
    id: root
    signal clicked()
    required property int udpPort
    property var camConnect: undefined

    onUdpPortChanged: { setupConnection(); }
    onVisibleChanged: {
        if (visible) {
            setupConnection();
        } else {
            camConnect = undefined
        }
    }

    GstGLVideoItem {
        id: videoItem
        anchors.fill: parent
        function connect() {
            CameraSinksFactory.connectSink(this, udpPort)
        }
    }

    MouseArea {
        anchors.fill: parent
        onClicked: {
            CameraSinksFactory.stopPipeline(udpPort)
            root.clicked()
        }
    }

    function setupConnection() {
        if (udpPort <= 0 || !root.visible) return;
        videoItem.connect()
        root.camConnect = CameraSinksFactory.getCamConnection(udpPort);
        root.camConnect.resolutionX = root.width // - 15
        root.camConnect.resolutionY = root.height
        root.camConnect.bitrate = 15000000
        root.camConnect.streaming = root.visible
        CameraSinksFactory.startPipeline(udpPort)
    }
}
Problem: the main screen displays 4 items (in a 2x2 grid) using a Model (which provides udpPort as a unique ID). When the user clicks on one item, the feed from that camera should fill the whole screen. The examples create a GridLayout with 4-6 explicit items and just manipulate their visibility (in effect, the clicked item is the only one remaining and takes the whole screen).
In my case I'm using a separate Item for the full-screen view. So I disable streaming (a CamConnection class communicates with the cameras and sends commands), hide the GridView, and bind a new GstGLVideoItem to the qmlglsink in the pipeline.
Everything is OK until I repeat the click sequence (back to the GridView and then to the full view again). Every time it ends with:
Bail out! ERROR:../ext/qt/gstqsgtexture.cc:134:virtual void
GstQSGTexture::bind(): code should not be reached
** (KMS:20495): CRITICAL **: 15:47:36.937: gst_video_frame_map_id: assertion 'info->width <= meta->width' failed
** ERROR:../ext/qt/gstqsgtexture.cc:134:virtual void GstQSGTexture::bind(): code should not be reached
From analysis of the plugin's code, this happens when the info (the image size read from the caps) is bigger than the size recorded in the metadata of the provided buffer, which is understandable: the buffer is too small.
I ran with GST_DEBUG=4/5/6/7, and the logs confirm that the autodetected caps match what was requested in the commands sent to the camera.
I could keep following the examples, but the project calls for another panel with those cameras, so the above problem would hit me again in the near future.
How to make this whole setup working? How to rebind pipeline qmlglsink to new QML VideoItem safely?
Two possible solutions:
1. Set gst_element_set_state(pipeline, GST_STATE_NULL);, change the sink's widget to the new item, and start the pipeline again with gst_element_set_state(pipeline, GST_STATE_PLAYING); (see the sketch below).
2. Use the Qt 5 MediaPlayer with a gst-pipeline source. When visible, set the source and call start(); when not visible, reset the source to empty (important) and call stop().
In general, the possible benefits of NOT re-creating the pipeline each time are not worth the hassle when dynamically assigning a new QML Item to the pipeline's sink.
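A minimal sketch of the first option (the names are illustrative; pipeline and sink would come from the maps built above, and newVideoItem from the currently visible GstGLVideoItem):
// Sketch: rebind a qmlglsink to a different GstGLVideoItem.
// Going through NULL first releases the GL resources tied to the old item.
void rebindSink(GstElement *pipeline, GstElement *sink, QQuickItem *newVideoItem)
{
    gst_element_set_state(pipeline, GST_STATE_NULL);
    g_object_set(sink, "widget", newVideoItem, NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
}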

How to display video in a portion of window using d3dvideosink in Windows

I have written the following GStreamer function to display video on a Win32 window (HWND) in Windows. This function works perfectly and displays the video in the entire window for the given "win" window handle.
void runGstPipe(HWND win)
{
    GstElement *pipeline = gst_parse_launch(
        "rtspsrc location=\"...\" ! decodebin ! d3dvideosink name=sink", NULL);
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");

    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), (guintptr)win);
    GstStateChangeReturn sret = gst_element_set_state(pipeline, GST_STATE_PLAYING);
}
Next, I tried to enhance the above function to display the video in a portion of the window "win", using the following options.
a) Using glimagesink with its render-rectangle property, as follows:
"rtspsrc location=\"...\" ! decodebin ! glimagesink render-rectangle=\"<50, 50, 200, 150>\" name=sink"
b) Using gst_video_overlay_set_render_rectangle, as follows:
gst_video_overlay_set_render_rectangle(GST_VIDEO_OVERLAY(sink), 50, 50, 200, 150);
Neither of the above options changed the rendering area; i.e., the video still occupied the whole window instead of the given coordinates. I'd appreciate any suggestions.
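For reference, a hedged sketch of how the GstVideoOverlay calls are usually sequenced (window handle first, then the rectangle, then a forced repaint via gst_video_overlay_expose()); whether d3dvideosink honors the render rectangle may depend on the GStreamer version:
void runGstPipeInRect(HWND win)
{
    GstElement *pipeline = gst_parse_launch(
        "rtspsrc location=\"...\" ! decodebin ! d3dvideosink name=sink", NULL);
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");

    /* The window handle must be set before the render rectangle. */
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(sink), (guintptr)win);
    /* x, y, width, height, relative to the parent window's client area. */
    gst_video_overlay_set_render_rectangle(GST_VIDEO_OVERLAY(sink), 50, 50, 200, 150);
    /* Ask the sink to redraw with the new geometry. */
    gst_video_overlay_expose(GST_VIDEO_OVERLAY(sink));

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    gst_object_unref(sink);
}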

Getting raw H.264 and AAC data from QML Camera

According to this code:
import QtQuick 2.5
import QtMultimedia 5.5

Item {
    id: root

    Camera {
        objectName: "camera"
        id: camera
        captureMode: Camera.CaptureVideo
        videoRecorder.videoCodec: "h264"
        videoRecorder.audioCodec: "aac"
    }
}
is it possible to get the raw H.264 and AAC data (for example, as unsigned char *) without writing it to the disk drive? Can I access those streams from the C++ side?
In fact, this data will later be sent to an nginx server using librtmp.
I will use GStreamer.
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));

    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;

    putenv("GST_DEBUG=6");
    putenv("GST_PLUGIN_PATH_1_0=E:\\sdk\\gstreamer\\1.0\\x86_64\\lib\\gstreamer-1.0\\");
    putenv("GST_PLUGIN_PATH=E:\\sdk\\gstreamer\\1.0\\x86_64\\lib\\gstreamer-1.0\\");

    /* Initialize GStreamer */
    gst_init(&argc, &argv);

    /* Build the pipeline */
    //pipeline = gst_parse_launch("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
    pipeline = gst_parse_launch("ksvideosrc device-index=0 ! autovideosink", NULL); // Windows OS specific

    /* Start playing */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Wait until error or EOS; note that this blocks before app.exec() runs */
    bus = gst_element_get_bus(pipeline);
    msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

    /* Free resources */
    if (msg != NULL)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return app.exec();
}
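To actually get at the encoded bytes without touching the disk, one hedged approach (a sketch; it assumes the gstreamer-app library is linked in and an encoder such as x264enc is available) is to end the pipeline in an appsink and pull buffers from C++:
#include <gst/gst.h>
#include <gst/app/gstappsink.h>

/* Sketch: the pipeline string and element names are assumptions. */
static void pull_encoded_buffers(void)
{
    GstElement *pipeline = gst_parse_launch(
        "ksvideosrc ! videoconvert ! x264enc tune=zerolatency "
        "! h264parse ! appsink name=out", NULL);
    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "out");
    GstSample *sample;

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Blocks until a sample is available; returns NULL at EOS. */
    while ((sample = gst_app_sink_pull_sample(GST_APP_SINK(sink)))) {
        GstBuffer *buf = gst_sample_get_buffer(sample);
        GstMapInfo map;
        if (gst_buffer_map(buf, &map, GST_MAP_READ)) {
            /* map.data (guint8 *) and map.size hold the raw H.264 bytes;
             * hand them to librtmp here instead of writing them to disk. */
            gst_buffer_unmap(buf, &map);
        }
        gst_sample_unref(sample);
    }

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(sink);
    gst_object_unref(pipeline);
}
The same pattern with an audio branch (e.g. an AAC encoder into a second appsink) would cover the AAC side.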
You can write your own plugin for QML, using this lib.
Thanks to the Qt forum for pointing me down the right path.

Gstreamer basic pipeline running but not displaying on windows 7 virtualbox

I am currently working with GStreamer on Windows 7 (x86_64) in a VM (VirtualBox), and I wanted to run a basic pipeline:
gst-launch-1.0 -v videotestsrc pattern=snow ! autovideosink
When I run this pipeline I get:
Setting pipeline to PAUSED...
Pipeline is PREROLLING
And then an error occurs:
Pipeline doesn't want to preroll
I worked around this error by adding async-handling=true at the end of the pipeline, but still nothing is displayed...
I tried to run the same pipeline from C++ code. Here is a simple main you can run. When I run this code, I get no error, but nothing is displayed.
#include <gst/gst.h>
#include <glib.h>
#include <stdio.h>

int main(int argc, char* argv[]) {
    GMainLoop *loop;
    GstElement *pipeline, *source, *sink;

    g_print("Starting...");

    /* Initialisation */
    gst_init(&argc, &argv);
    g_print("Loop is created...");
    loop = g_main_loop_new(NULL, FALSE);

    /* Create gstreamer elements */
    pipeline = gst_pipeline_new("gst-app-sink");
    source = gst_element_factory_make("videotestsrc", "src");
    sink = gst_element_factory_make("autovideosink", "sink");
    if (!pipeline || !source || !sink) {
        g_printerr("One element could not be created. Exiting.\n");
        return -1;
    }

    /* Set up the pipeline: we add all elements into the pipeline.
     * source | sink */
    gst_bin_add_many(GST_BIN(pipeline), source, sink, NULL);

    /* We link the elements together: src -> sink */
    gst_element_link(source, sink);

    /* Set the pipeline to "playing" state */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Iterate */
    g_print("Running...\n");
    g_main_loop_run(loop);

    /* Out of the main loop, clean up nicely */
    g_print("Returned, stopping playback\n");
    gst_element_set_state(pipeline, GST_STATE_NULL);
    g_print("Deleting pipeline\n");
    gst_object_unref(GST_OBJECT(pipeline));
    g_main_loop_unref(loop);
    return 0;
}
I really don't know where it could come from. Any ideas?
By default, the VM doesn't enable 2D and 3D video acceleration, which is necessary to display this kind of stream. Just right-click on your VM -> Settings -> Display and check "Enable 3D Acceleration" and "Enable 2D Video Acceleration".