Getting raw H.264 and AAC data from QML Camera - c++

According to this code:
import QtQuick 2.5
import QtMultimedia 5.5
Item {
    id: root

    Camera {
        objectName: "camera"
        id: camera
        captureMode: Camera.CaptureVideo
        videoRecorder.videoCodec: "h264"
        videoRecorder.audioCodec: "aac"
    }
}
Is it possible to get the raw H.264 and AAC data (for example, as unsigned char *) without writing it to the disk drive? Can I access those streams from the C++ side?
In fact, this data will later be sent to an nginx server using librtmp.

I will use GStreamer.
#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <cstdlib>   // putenv
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);
    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));

    GstElement *pipeline;
    GstBus *bus;
    GstMessage *msg;

    putenv("GST_DEBUG=6");
    putenv("GST_PLUGIN_PATH_1_0=E:\\sdk\\gstreamer\\1.0\\x86_64\\lib\\gstreamer-1.0\\");
    putenv("GST_PLUGIN_PATH=E:\\sdk\\gstreamer\\1.0\\x86_64\\lib\\gstreamer-1.0\\");

    /* Initialize GStreamer */
    gst_init(&argc, &argv);

    /* Build the pipeline */
    //pipeline = gst_parse_launch("playbin uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);
    pipeline = gst_parse_launch("ksvideosrc device-index=0 ! autovideosink", NULL); // Windows OS specific

    /* Start playing */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Wait until error or EOS */
    bus = gst_element_get_bus(pipeline);
    msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
            (GstMessageType)(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));

    /* Free resources */
    if (msg != NULL)
        gst_message_unref(msg);
    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);

    return app.exec();
}
You can write your own plugin for QML using this library.
Thanks to the Qt forum for pointing me down the right path.

Related

GStreamer qmlglsink vs gst_parse_launch()

I'm new to Qt and GStreamer, but I need to create a simple player for a QuickTime/H.264 video file in a Qt 5.15.2 application (running on Linux Ubuntu 20.04 (Focal Fossa)).
I managed to play a standard videotestsrc (bouncing ball pattern) inside my application, and this is the code (main.cpp):
#include "mainwindow.h"
#include <QApplication>
#include <QQuickView>
#include <QWidget>
#include <QQuickItem>
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    GstElement *mPipeline = nullptr;
    GstElement *mSource = nullptr;
    GstElement *mGLUpload = nullptr;
    GstElement *mSink = nullptr;
    QQuickView *mView = nullptr;
    QWidget *mWidget = nullptr;
    QQuickItem *mItem = nullptr;

    gst_init(&argc, &argv);
    QApplication app(argc, argv);
    MainWindow *window = new MainWindow;

    mPipeline = gst_pipeline_new(NULL);
    mSource = gst_element_factory_make("videotestsrc", NULL);
    mGLUpload = gst_element_factory_make("glupload", NULL);
    mSink = gst_element_factory_make("qmlglsink", NULL);
    gst_bin_add_many(GST_BIN(mPipeline), mSource, mGLUpload, mSink, NULL);
    gst_element_link_many(mSource, mGLUpload, mSink, NULL);
    g_object_set(mSource, "pattern", 18, NULL);

    mView = new QQuickView;
    mView->scheduleRenderJob(new SetPlaying(mPipeline),
                             QQuickView::BeforeSynchronizingStage);
    mView->setSource(QUrl(QStringLiteral("qrc:/video.qml")));
    mWidget = QWidget::createWindowContainer(mView, window);
    mItem = mView->findChild<QQuickItem *>("videoItem");
    g_object_set(mSink, "widget", mItem, NULL);
    window->setCentralWidget(mWidget);
    window->show();

    int ret = app.exec();
    gst_deinit();
    return ret;
}
SetPlaying class...
#include <QRunnable>
#include <gst/gst.h>

class SetPlaying : public QRunnable
{
public:
    SetPlaying(GstElement *pipeline) {
        this->pipeline_ = pipeline ? static_cast<GstElement *>(gst_object_ref(pipeline)) : NULL;
    }
    ~SetPlaying() {
        if (this->pipeline_)
            gst_object_unref(this->pipeline_);
    }
    void run() {
        if (this->pipeline_)
            gst_element_set_state(this->pipeline_, GST_STATE_PLAYING);
    }
private:
    GstElement *pipeline_;
};
The MainWindow code should not be relevant to the issue (it's a standard empty window).
This is the source code of the only .qml item that's needed to provide an acceptable widget surface to qmlglsink:
import QtQuick 2.15
import QtQuick.Controls 1.1
import QtQuick.Controls.Styles 1.3
import QtQuick.Dialogs 1.2
import QtQuick.Window 2.1
import org.freedesktop.gstreamer.GLVideoItem 1.0

Item {
    anchors.fill: parent
    GstGLVideoItem {
        id: video
        objectName: "videoItem"
        anchors.centerIn: parent
        width: parent.width
        height: parent.height
    }
}
Now, since the actual pipeline to play the file is quite long and complex to manage in code, I opted for a gst_parse_launch() approach.
To proceed step by step, I tried to use such a method to create a videotestsrc pipeline, i.e.:
mPipeline = gst_parse_launch( "videotestsrc ! glupload ! qmlglsink", NULL);
mSink = gst_bin_get_by_name(GST_BIN(mPipeline), "sink");
mSource = gst_bin_get_by_name(GST_BIN(mPipeline), "source");
If I run the code this is the result:
(videotest:14930): GLib-GObject-CRITICAL **: 16:33:08.868: g_object_set: assertion 'G_IS_OBJECT (object)' failed
(videotest:14930): GLib-GObject-CRITICAL **: 16:33:09.342: g_object_set: assertion 'G_IS_OBJECT (object)' failed
Of course, the application window displays nothing.
You should give the elements a name property. They get default names otherwise, but those include a numerical suffix that is incremented whenever you rebuild the pipeline, so it is better not to rely on them.
To make your existing code work, try this:
mPipeline = gst_parse_launch( "videotestsrc name=source ! glupload ! qmlglsink name=sink", NULL);

How to add a GstVideoOrientationInterface to a Gst pipeline?

I'm trying to rotate/flip the video played by a playbin element (in C++). What I'm trying to do is similar to what is asked in the question Rotate a Video in gstreamer, but I prefer not to rely on the videoflip element. Instead I'd like to use the GstVideoOrientation interface (https://thiblahute.github.io/GStreamer-doc/gst-plugins-base-video-1.0/videoorientation.html?gi-language=c#interfaces) from the gst video library (https://thiblahute.github.io/GStreamer-doc/gst-plugins-base-video-1.0/index.html?gi-language=c).
The documentation of the interface itself and how to use it is pretty clear, but I can't understand how to add such an interface to a GstElement.
There is some documentation in https://gstreamer.freedesktop.org/documentation/application-development/advanced/interfaces.html and in https://gstreamer.freedesktop.org/documentation/plugin-development/advanced/interfaces.html , but still I can't figure out how this works.
Below is the code sample I'm working with:
#include <gst/video/video.h>
#include <gst/gst.h>

gint
main (gint argc, gchar *argv[])
{
    //...
    GstElement *pipeline = NULL;

    gst_init(NULL, NULL);
    pipeline = gst_element_factory_make("playbin", "playbin");
    g_object_set(pipeline, "uri", "an_uri", NULL);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    //...
    return 0;
}
Any help is appreciated. Many thanks!

Gstreamer basic pipeline running but not displaying on windows 7 virtualbox

I am currently working with GStreamer on Windows 7 (x86_64) in a VM (VirtualBox), and I wanted to run a basic pipeline:
gst-launch-1.0 -v videotestsrc pattern=snow ! autovideosink
When I run this pipeline I get:
Setting pipeline to PAUSED...
Pipeline is PREROLLING
And then an error occurs:
Pipeline doesn't want to preroll
I worked around this error by adding async-handling=true at the end of the pipeline, but still nothing is displayed...
I tried to run the same pipeline from C++ code. Here is a simple main you can run. When I run this code, I get no error, but nothing is displayed.
#include <gst/gst.h>
#include <glib.h>
#include <stdio.h>

int main(int argc, char *argv[]) {
    GMainLoop *loop;
    GstElement *pipeline, *source, *sink;

    g_print("Starting...");

    /* Initialisation */
    gst_init(&argc, &argv);

    g_print("Loop is created...");
    loop = g_main_loop_new(NULL, FALSE);

    /* Create gstreamer elements */
    pipeline = gst_pipeline_new("gst-app-sink");
    source = gst_element_factory_make("videotestsrc", "src");
    sink = gst_element_factory_make("autovideosink", "sink");
    if (!pipeline || !source || !sink) {
        g_printerr("One element could not be created. Exiting.\n");
        return -1;
    }

    /* Set up the pipeline: add all elements into it (source | sink) */
    gst_bin_add_many(GST_BIN(pipeline), source, sink, NULL);

    /* Link the elements together: src -> sink */
    gst_element_link(source, sink);

    /* Set the pipeline to "playing" state */
    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Iterate */
    g_print("Running...\n");
    g_main_loop_run(loop);

    /* Out of the main loop, clean up nicely */
    g_print("Returned, stopping playback\n");
    gst_element_set_state(pipeline, GST_STATE_NULL);
    g_print("Deleting pipeline\n");
    gst_object_unref(GST_OBJECT(pipeline));
    g_main_loop_unref(loop);

    return 0;
}
I really don't know where this could come from. Any ideas?
By default, the VM doesn't enable 2D and 3D video acceleration, which is necessary to display this kind of stream. Just right-click on your VM -> Settings -> Display and check "Enable 3D Acceleration" and "Enable 2D Video Acceleration".

Playing Audio files with GTK and c++

I'm currently building a very simple music player with GTK+ and C++, but I am unable to figure out how to open and play an audio file from C++ code.
#include <gtk/gtk.h>

// simple music player to practice gtk and c++
int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);

    GtkWidget *window;
    GtkWidget *playButton;
    GtkWidget *fileButton;
    GtkWidget *frame;
    GtkWidget *Dialog;

    window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_window_set_position(GTK_WINDOW(window), GTK_WIN_POS_CENTER);
    gtk_window_set_default_size(GTK_WINDOW(window), 400, 400);

    frame = gtk_fixed_new();
    gtk_container_add(GTK_CONTAINER(window), frame);

    playButton = gtk_button_new_with_label("Play");
    gtk_widget_set_size_request(playButton, 80, 40);
    gtk_fixed_put(GTK_FIXED(frame), playButton, 40, 330);

    fileButton = gtk_button_new_with_label("Open");
    gtk_widget_set_size_request(fileButton, 80, 40);
    gtk_fixed_put(GTK_FIXED(frame), fileButton, 40, 260);

    gtk_widget_show_all(window);
    gtk_main();
    return 0;
}
So, as you can see, I have created the Open button to select files, and I know the dialog code:
GtkWidget *dialog;
dialog = gtk_file_chooser_dialog_new("OpenFile",
                                     parent_window,
                                     GTK_FILE_CHOOSER_ACTION_OPEN,
                                     GTK_STOCK_CANCEL, GTK_RESPONSE_CANCEL,
                                     GTK_STOCK_OPEN, GTK_RESPONSE_ACCEPT,
                                     NULL);
if (gtk_dialog_run(GTK_DIALOG(dialog)) == GTK_RESPONSE_ACCEPT)
{
    char *filename;
    filename = gtk_file_chooser_get_filename(GTK_FILE_CHOOSER(dialog));
    open_file(filename);
    g_free(filename);
}
gtk_widget_destroy(dialog);
But my problem is that I do not know where to place this code. I should probably create a function and set a callback to that function for when the Open button is clicked, right? Then comes the second problem: no matter how hard I search, I can't seem to find how to actually play the audio file. Thanks so much in advance!
If you don't mind using external libraries, Allegro makes it incredibly easy to play audio files in a variety of formats. Here is an example of how to play a .wav audio file.
#include <stdio.h>
#include <allegro5/allegro.h>
#include <allegro5/allegro_audio.h>
#include <allegro5/allegro_acodec.h>

int main(int argc, char **argv) {
    ALLEGRO_DISPLAY *display = NULL;
    ALLEGRO_SAMPLE *sample = NULL;

    if (!al_init()) {
        fprintf(stderr, "failed to initialize allegro!\n");
        return -1;
    }
    if (!al_install_audio()) {
        fprintf(stderr, "failed to initialize audio!\n");
        return -1;
    }
    if (!al_init_acodec_addon()) {
        fprintf(stderr, "failed to initialize audio codecs!\n");
        return -1;
    }
    if (!al_reserve_samples(1)) {
        fprintf(stderr, "failed to reserve samples!\n");
        return -1;
    }

    sample = al_load_sample("footstep.wav");
    if (!sample) {
        printf("Audio clip sample not loaded!\n");
        return -1;
    }

    display = al_create_display(640, 480);
    if (!display) {
        fprintf(stderr, "failed to create display!\n");
        return -1;
    }

    /* Loop the sample until the display closes. */
    al_play_sample(sample, 1.0, 0.0, 1.0, ALLEGRO_PLAYMODE_LOOP, NULL);
    al_rest(10.0);

    al_destroy_display(display);
    al_destroy_sample(sample);
    return 0;
}
I know I'm incredibly late, but here's my proposal: use the Gtk4 MediaStream API with Gtk::MediaFile.
Here's a minimal example:
#include <gtkmm.h>

int main(int argc, char **argv)
{
    auto app = Gtk::Application::create();
    Glib::RefPtr<Gtk::MediaFile> mediafile = Gtk::MediaFile::create_for_filename("example.ogg");
    mediafile->play();
    return app->make_window_and_run<Gtk::Window>(argc, argv);
}
Compile with:
g++ example.cpp -o example `pkg-config gtkmm-4.0 --libs --cflags`

How to embed video in GTK+ application window using GStreamer & XOverlay?

I am trying to write a small media player using GTK+ and GStreamer, and I am currently using the XOverlay interface to embed the video in a GtkDrawingArea INSIDE the main window.
The program was compiled using this command:
g++ /home/phongcao/cacao.cc -o /home/phongcao/cacao `pkg-config --cflags --libs gtk+-2.0 gstreamer-0.10 gstreamer-plugins-base-0.10 gstreamer-interfaces-0.10`
The problem is that the video is displayed in a SEPARATE window (instead of under the toolbar of the main window):
Here is the source code of the program:
#include <gst/interfaces/xoverlay.h>
#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gdk/gdkx.h>

GstElement *play;
GtkAdjustment *progress;
GtkWidget *mainwindow, *drawingarea;

class TopWin
{
public:
    TopWin();
    ~TopWin();
    int Initialize(int argc, char *argv[]);
    int Execute();
    static void FileChooser(GtkButton *button, GtkWindow *mainwindow);
    static int Play(gchar *addr);
    static gboolean print_position(GstElement *element);
private:
};
TopWin::TopWin() {
}

TopWin::~TopWin() {
}

gboolean TopWin::print_position(GstElement *play) {
    GstFormat fmt = GST_FORMAT_TIME;
    gint64 pos, len;
    if (gst_element_query_position(play, &fmt, &pos) &&
        gst_element_query_duration(play, &fmt, &len)) {
        g_print("Time: %" GST_TIME_FORMAT " / %" GST_TIME_FORMAT "\r",
                GST_TIME_ARGS(pos), GST_TIME_ARGS(len));
        gtk_adjustment_set_value(GTK_ADJUSTMENT(progress), (pos * 100) / len);
    }
    return TRUE;
}
int TopWin::Play(gchar *addr) {
    GMainLoop *loop;
    GstBus *bus;

    loop = g_main_loop_new(NULL, FALSE);
    play = gst_element_factory_make("playbin", "play");
    g_object_set(G_OBJECT(play), "uri", addr, NULL);
    bus = gst_pipeline_get_bus(GST_PIPELINE(play));
    gst_object_unref(bus);

    GstElement *x_overlay = gst_element_factory_make("xvimagesink", "videosink");
    g_object_set(G_OBJECT(play), "video-sink", x_overlay, NULL);
    gst_x_overlay_set_window_handle(GST_X_OVERLAY(x_overlay),
                                    GDK_WINDOW_XID(drawingarea->window));

    gst_element_set_state(play, GST_STATE_NULL);
    g_timeout_add(1000, (GSourceFunc) print_position, play);
    gtk_adjustment_set_value(GTK_ADJUSTMENT(progress), 0);
    gst_element_set_state(play, GST_STATE_PLAYING);
    g_main_loop_run(loop);
    gst_element_set_state(play, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(play));
    gtk_widget_show_all(mainwindow);
    gtk_widget_realize(drawingarea);
    return 0;
}

void TopWin::FileChooser(GtkButton *button, GtkWindow *mainwindow) {
    GtkWidget *filechooser;
    gchar *uri;

    filechooser = gtk_file_chooser_dialog_new("Open File...", mainwindow,
            GTK_FILE_CHOOSER_ACTION_OPEN,
            GTK_STOCK_CANCEL, GTK_RESPONSE_CANCEL,
            GTK_STOCK_OK, GTK_RESPONSE_OK, NULL);
    gtk_file_chooser_set_select_multiple(GTK_FILE_CHOOSER(filechooser), FALSE);
    gint response = gtk_dialog_run(GTK_DIALOG(filechooser));
    if (response == GTK_RESPONSE_OK) {
        uri = gtk_file_chooser_get_uri(GTK_FILE_CHOOSER(filechooser));
        gtk_widget_destroy(filechooser);
        Play(uri);
        g_free(uri);
    }
    else if (response == GTK_RESPONSE_CANCEL) {
        gtk_widget_destroy(filechooser);
    }
}
int TopWin::Initialize(int argc, char *argv[]) {
    GtkWidget *playbutton, *openbutton, *volumebutton;
    GtkWidget *prefbutton, *notebook;
    GtkWidget *vbox, *hbox;
    GtkWidget *entry, *hscale;

    gtk_init(&argc, &argv);
    gst_init(&argc, &argv);

    mainwindow = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    gtk_container_set_border_width(GTK_CONTAINER(mainwindow), 0);
    g_signal_connect(G_OBJECT(mainwindow), "destroy", G_CALLBACK(gtk_main_quit), NULL);

    playbutton = gtk_button_new();
    gtk_button_set_image(GTK_BUTTON(playbutton),
            gtk_image_new_from_stock(GTK_STOCK_MEDIA_PLAY, GTK_ICON_SIZE_SMALL_TOOLBAR));
    openbutton = gtk_button_new();
    gtk_button_set_image(GTK_BUTTON(openbutton),
            gtk_image_new_from_stock(GTK_STOCK_OPEN, GTK_ICON_SIZE_SMALL_TOOLBAR));
    g_signal_connect(G_OBJECT(openbutton), "clicked",
            G_CALLBACK(TopWin::FileChooser), (gpointer) mainwindow);
    volumebutton = gtk_button_new();
    gtk_button_set_image(GTK_BUTTON(volumebutton), gtk_image_new_from_file("volume.png"));
    prefbutton = gtk_button_new();
    gtk_button_set_image(GTK_BUTTON(prefbutton),
            gtk_image_new_from_stock(GTK_STOCK_EXECUTE, GTK_ICON_SIZE_SMALL_TOOLBAR));

    entry = gtk_entry_new();
    progress = GTK_ADJUSTMENT(gtk_adjustment_new(0.00, 0.00, 100.00, 1.00, 0.00, 0.00));
    hscale = gtk_hscale_new(progress);
    gtk_scale_set_draw_value(GTK_SCALE(hscale), FALSE);
    gtk_widget_set_size_request(hscale, 200, NULL);

    hbox = gtk_hbox_new(FALSE, 0);
    drawingarea = gtk_drawing_area_new();
    vbox = gtk_vbox_new(FALSE, 0);

    gtk_box_pack_start(GTK_BOX(hbox), openbutton, FALSE, FALSE, 2);
    gtk_box_pack_start(GTK_BOX(hbox), playbutton, FALSE, FALSE, 2);
    gtk_box_pack_start(GTK_BOX(hbox), hscale, FALSE, FALSE, 2);
    gtk_box_pack_start(GTK_BOX(hbox), volumebutton, FALSE, FALSE, 2);
    gtk_box_pack_start(GTK_BOX(hbox), entry, TRUE, TRUE, 2);
    gtk_box_pack_start(GTK_BOX(hbox), prefbutton, FALSE, FALSE, 2);

    gtk_button_set_relief(GTK_BUTTON(playbutton), GTK_RELIEF_NONE);
    gtk_button_set_relief(GTK_BUTTON(openbutton), GTK_RELIEF_NONE);
    gtk_button_set_relief(GTK_BUTTON(volumebutton), GTK_RELIEF_NONE);
    gtk_button_set_relief(GTK_BUTTON(prefbutton), GTK_RELIEF_NONE);

    gtk_box_pack_start(GTK_BOX(vbox), hbox, FALSE, FALSE, 0);
    gtk_box_pack_start(GTK_BOX(vbox), drawingarea, FALSE, FALSE, 0);
    gtk_container_add(GTK_CONTAINER(mainwindow), vbox);

    gtk_widget_show_all(mainwindow);
    gtk_widget_realize(drawingarea);
    return 0;
}
int main(int argc, char *argv[])
{
int result = 0;
TopWin* topwin = new TopWin();
if (0 == topwin->Initialize(argc, argv)) {
result = topwin->Execute();
}
delete topwin;
return result;
}
Thank you for helping me with this problem! I have spent almost 3 days scratching my head over this. The XOverlay reference on the GStreamer website is so confusing... :(
Please tell me if you need any additional information... Thank you!!
You need to do something like this:
GstElement *x_overlay = gst_element_factory_make("xvimagesink", "videosink");
g_object_set(G_OBJECT(play), "video-sink", x_overlay, NULL);
gst_x_overlay_set_window_handle(GST_X_OVERLAY(x_overlay),
                                GDK_WINDOW_XID(drawingarea->window));
Create a new XV video sink. Set it as the video sink of your playbin. Attach the XV video sink to your drawingarea's window ID. You also need to add the drawingarea to some container before that.
Your program produces warnings and GTK errors; they may be the source of some of your problems, so you had better fix them.
The gst_x_overlay_set_window_handle interface is deprecated in recent GStreamer releases. The new interface is gst_video_overlay_set_window_handle. A simple example can be found at https://web.archive.org/web/20190628124320/http://wikistack.com/how-to-make-your-own-media-player-in-linux-using-gtk-and-gstreamer/