GStreamer (gst-launch) pipeline in QMediaPlayer - C++

I am using Qt Creator 4.5.2 (Qt 5.9.5, GCC 7.3.0 64-bit) running on Ubuntu 18.04, and I am trying to get a live video stream from an IP camera. I used the 'QGraphicsView', 'QGraphicsScene', 'QGraphicsVideoItem' and 'QMediaPlayer' classes.
Right now, the video streaming source is an IP camera, and I am using 'QMediaPlayer' with an 'RTSP' URL to get the live video, which works. However, for performance and other reasons, I need to switch to a GStreamer-style command, like 'gst-launch-1.0', to get the live video. I am having trouble working out the correct 'gst pipeline' string. Any help is appreciated.
In the documentation for 'QMediaPlayer', it states: Since Qt 5.12.2, the url scheme gst-pipeline provides custom pipelines for the GStreamer backend.
My version is 5.9.5, so I assumed a GStreamer-style command would still work.
Related Code and comments:
// Setup GraphicsScene
mpView = ui->gvCam;
mpView->setVisible(true);
mpScene = new QGraphicsScene;
mpView->setScene(mpScene);
mpScene->setSceneRect(0, 0, mpView->width(), mpView->height());
mpView->setSceneRect(QRectF());
// Setup IP camera
mpPlayer1 = new QMediaPlayer;
mpVideoItem1 = new QGraphicsVideoItem;
mpPlayer1->setVideoOutput(mpVideoItem1);
// The following line works, and I get the live stream.
mpPlayer1->setMedia(QUrl("rtsp://20.0.2.118:8554/0"));
// However, I need to use a GStreamer-style command, like:
// gst-launch-1.0 rtspsrc location=rtsp://20.0.2.118:8554/0 ! decodebin ! videoscale \
//   ! 'video/x-raw, width=480, height=270, format=I420' \
//   ! xvimagesink sync=false force-aspect-ratio=false
// The above gst command works when issued from a terminal, and I get the live stream.
// But I don't know how to pass it as a 'gst pipeline' string to the 'setMedia' call.
mpScene->addItem(mpVideoItem1);
QSizeF qf1(mpView->width(), mpView->height());
mpVideoItem1->setSize(qf1);
mpVideoItem1->setAspectRatioMode(Qt::IgnoreAspectRatio);
mpPlayer1->play();

If your Qt version is prior to 5.12.2, a custom pipeline won't work with QMediaPlayer, because playbin is used instead.
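For completeness, on Qt 5.12.2 or newer the pipeline is passed to setMedia() as a URL with the gst-pipeline scheme. A sketch based on the question's code; qtvideosink is the sink element the Qt documentation's example uses for rendering into a video surface, and whether it pairs with QGraphicsVideoItem exactly like this is an assumption to verify:

```cpp
// Qt >= 5.12.2 only: the gst-pipeline scheme hands a raw GStreamer
// pipeline to the backend. qtvideosink (from the Qt docs' example)
// delivers frames to the surface set with setVideoOutput().
mpPlayer1->setVideoOutput(mpVideoItem1);
mpPlayer1->setMedia(QUrl("gst-pipeline: rtspsrc location=rtsp://20.0.2.118:8554/0 "
                         "! decodebin ! videoconvert ! qtvideosink"));
mpPlayer1->play();
```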

Related

Get gstreamer pipeline object pointer in source code

I am running below gstreamer command for live streaming:
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 fps-n=30 imx-capture-mode=0 ! textoverlay name=overlay text="Overlay text here" valignment=top halignment=left font-desc="Sans, 22" ! gdkpixbufoverlay name=imageoverlay location=/home/user/LZ_50/CamOverlay.png ! imxg2dvideotransform ! imxg2dvideosink framebuffer=/dev/fb1 use-vsync=true sync=false
I want to change the overlay text dynamically while the GStreamer pipeline is running.
How can I get the pipeline object pointer so I can change the text overlay dynamically?
Thanks
I wrote an application, but it crashed when using the GStreamer pipeline with the image & text overlay.
I finally found the cause of the crash; more detail can be found at: imx GStreamer plugins.
This also helps reduce CPU usage (by almost 20% of one core).
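Since gst-launch-1.0 runs the pipeline in its own process, there is no pipeline object pointer to grab from outside; the usual approach is to build the same pipeline inside the application with gst_parse_launch() and then look the overlay up by its name= property. A minimal sketch (videotestsrc and autovideosink stand in for the imx-specific elements so it runs on any machine):

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    // Same structure as the gst-launch line in the question, built in code.
    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "videotestsrc ! textoverlay name=overlay text=\"Overlay text here\" "
        "valignment=top halignment=left font-desc=\"Sans, 22\" ! autovideosink",
        &err);
    if (!pipeline) {
        g_printerr("Parse error: %s\n", err->message);
        g_error_free(err);
        return 1;
    }

    // name=overlay in the launch string is the lookup key.
    GstElement *overlay = gst_bin_get_by_name(GST_BIN(pipeline), "overlay");

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // The overlay text can now be changed at any time while playing:
    g_object_set(overlay, "text", "New overlay text", NULL);

    // ... run a GMainLoop here in a real application ...

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(overlay);
    gst_object_unref(pipeline);
    return 0;
}
```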

GStreamer preview RTMP using xvimage

I want to preview an RTMP stream using the GStreamer xvimagesink. I can see the output if I use autovideosink like this:
gst-launch-1.0 -v rtmpsrc location='rtmp://127.0.0.1:1935/live/stream' ! decodebin3 ! autovideosink
but if I replace "autovideosink" with "xvimagesink" I get this:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0: Could not initialise Xv output
Additional debug info:
xvimagesink.c(1773): gst_xv_image_sink_open (): /GstPipeline:pipeline0/GstXvImageSink:xvimagesink0:
Could not open display (null)
Setting pipeline to NULL ...
Freeing pipeline ...
Both decodebin3 and autovideosink are auto-plugging GStreamer elements: each automatically selects the most appropriate available GStreamer plugins to demux/decode (decodebin3) and render video (autovideosink) from, in this case, a live RTMP stream.
So it is quite possible that, for example:
decodebin3 decodes video into a format that xvimagesink cannot display on your platform/hardware and/or with your GStreamer version, or
xvimagesink is not set up properly on your platform, independent of any attached display/monitor.
To find out more about
the video format decoded by decodebin3, and
the video sink element "chosen" by autovideosink,
you can raise GStreamer's debug level (for example with export GST_DEBUG=3), rerun the pipeline, and inspect the output.
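If the pipeline is embedded in an application rather than run through gst-launch-1.0, the same verbosity can be requested programmatically. A sketch, assuming GStreamer 1.x development headers (GST_LEVEL_FIXME is debug level 3, i.e. the equivalent of GST_DEBUG=3):

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);
    gst_debug_set_default_threshold(GST_LEVEL_FIXME);  // like `export GST_DEBUG=3`

    // ... build and run the rtmpsrc ! decodebin3 ! xvimagesink pipeline here,
    // then inspect the log for the negotiated caps and the chosen sink ...
    return 0;
}
```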

Using glcolorscale with gstreamer GPU video scaling

I am trying to find a way to do video scaling on the GPU, and the only thing I could find was the glcolorscale filter. I am running GStreamer 1.8.0 on my ARM device, and I tried to execute the following:
gst-launch-1.0 -v videotestsrc ! "video/x-raw-yuv" | glcolorscale ! ximagesink
This is an example that I found in the documentation for glcolorscale, but it returns an error:
"Could not return videotestsrc0 to glcolorscale0"

What's wrong with this GStreamer pipeline?

I'm sure I've had this pipeline working on an earlier Ubuntu system I had set up (formatted for readability):
playbin
uri=rtspt://user:pswd#192.168.xxx.yyy/ch1/main
video-sink='videoconvert
! videoflip method=counterclockwise
! fpsdisplaysink'
Yet, when I try to use it within my program, I get:
Missing element: H.264 (Main Profile) decoder
WARNING: from element /GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0:
No decoder available for type 'video/x-h264,
stream-format=(string)avc, alignment=(string)au,
codec_data=(buffer)014d001fffe10017674d001f9a6602802dff35010101400000fa000030d40101000468ee3c80,
level=(string)3.1, profile=(string)main, width=(int)1280,
height=(int)720, framerate=(fraction)0/1, parsed=(boolean)true'.
Additional debug info:
gsturidecodebin.c(938): unknown_type_cb ():
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0
Now, I'm pretty certain I have an H.264 decoder installed, and indeed the GStreamer plugins' autogen.sh/configure correctly recognised the fact. Installed packages are h264enc, libx264-142, libx264-dev and x264.
It does exactly the same thing if I use the more "acceptable" autovideosink in place of fpsdisplaysink, or if I try to play the RTSP stream with gst-play-1.0. However, it works if I use the test pattern source videotestsrc.
What am I doing wrong?
It looks like GStreamer cannot find a suitable plugin for decoding H.264. Either you do not have an H.264 decoder element installed, or GStreamer is looking in the wrong path for your elements.
First, try running gst-inspect-1.0. This should output a long list of all the elements GStreamer has detected.
If this doesn't return any elements, you probably need to set the GST_PLUGIN_PATH environment variable to point to the directory where your plugins are installed. Running GStreamer - this link should help.
If it DOES return many elements, run gst-inspect-1.0 avdec_h264 to verify that you have the H.264 decoder element.
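The gst-inspect-1.0 check can also be done programmatically: gst_element_factory_find() consults the same registry (including anything on GST_PLUGIN_PATH) and returns NULL when the element is not installed. A sketch, assuming GStreamer 1.x development headers:

```cpp
#include <gst/gst.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    // Non-NULL only if the registry knows the element, i.e. the same
    // result as `gst-inspect-1.0 avdec_h264` succeeding.
    GstElementFactory *factory = gst_element_factory_find("avdec_h264");
    if (factory) {
        g_print("avdec_h264 found - H.264 decoding should work\n");
        gst_object_unref(factory);
    } else {
        g_print("avdec_h264 missing - install gstreamer1.0-libav "
                "or check GST_PLUGIN_PATH\n");
    }
    return 0;
}
```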

Stream Icecast using Gstreamer

I'm designing a program to stream from an Icecast server (radio.clarkson.edu). Ultimately it will be written in Python 3, but for now I'm using gst-launch to test the pipeline. I've been working on Debian Jessie with gstreamer-1.0. Using a file on Wikimedia, I was able to play it pretty easily using:
url=https://upload.wikimedia.org/wikipedia/commons/0/0c/Muriel-Nguyen-Xuan-Korsakov-Flight-of-the-bumblebee.flac.oga
gst-launch-1.0 -v souphttpsrc location=$url ! decodebin ! audioconvert ! audioresample ! alsasink
Running the same commands with my stream, I get the output:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = text/uri-list
Missing element: text/uri-list decoder
ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: Your GStreamer installation is missing a plug-in.
Additional debug info:
gstdecodebin2.c(3977): gst_decode_bin_expose (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
no suitable plugins found
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = "NULL"
Freeing pipeline ...
I have tried too many other pipelines to put in one post, but I can answer any other questions.
Thank you
By now you have probably solved the problem, but still, here's an idea: text/uri-list indicates that you didn't hand an actual stream to GStreamer, but rather a (textual) playlist that contains stream addresses. GStreamer can't handle those directly, so you need to parse the playlist beforehand and then hand an actual audio stream address to it.
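The parsing step suggested above is plain string handling for M3U-style playlists (one address per line, comment lines start with '#'). A sketch with a hypothetical helper name, assuming the playlist text has already been downloaded:

```cpp
#include <sstream>
#include <string>

// Return the first non-comment, non-empty line of an M3U-style playlist;
// that line is the actual stream address to hand to GStreamer
// (e.g. via souphttpsrc location=...).
std::string firstStreamUrl(const std::string &playlistText) {
    std::istringstream in(playlistText);
    std::string line;
    while (std::getline(in, line)) {
        if (!line.empty() && line.back() == '\r')  // tolerate CRLF playlists
            line.pop_back();
        if (line.empty() || line[0] == '#')        // '#EXTM3U', '#EXTINF', ...
            continue;
        return line;
    }
    return "";  // no stream address found
}
```

With the extracted address, the question's working pipeline applies unchanged: souphttpsrc location=<address> ! decodebin ! audioconvert ! audioresample ! alsasink.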