I want to create a Qt widget that can play incoming RTP streams in which the video is encoded as H264 and there is no audio.
My basic plan for implementation is this (a rough sketch follows the list):
Create a Phonon MediaSource object (Stream type).
Connect it to a QIODevice subclass that provides the data.
Obtain the video data using either:
The JRTPLIB client library
The GStreamer gstrtpbin plugin. This plugin takes care of depayloading the packets and decoding the video, which may improve the chances that Phonon will recognize the data.
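For reference, a minimal sketch of what steps 1 and 2 could look like (this is an assumption of mine, not tested against real RTP data; the class name and buffering strategy are illustrative):

#include <Phonon/MediaObject>
#include <Phonon/MediaSource>
#include <QIODevice>
#include <QByteArray>
#include <cstring>

// Illustrative sequential device that hands buffered H264 data to Phonon.
class RtpStreamDevice : public QIODevice
{
public:
    explicit RtpStreamDevice(QObject *parent = 0) : QIODevice(parent)
    {
        open(QIODevice::ReadOnly);
    }

    bool isSequential() const { return true; } // a live stream cannot seek

    // Called by the RTP receiver whenever a depayloaded chunk arrives.
    void feed(const QByteArray &chunk)
    {
        m_buffer.append(chunk);
        emit readyRead();
    }

protected:
    qint64 readData(char *data, qint64 maxSize)
    {
        const qint64 n = qMin(maxSize, qint64(m_buffer.size()));
        memcpy(data, m_buffer.constData(), n);
        m_buffer.remove(0, int(n));
        return n;
    }

    qint64 writeData(const char *, qint64) { return -1; } // read-only device

private:
    QByteArray m_buffer;
};

// Usage: Phonon::MediaSource source(new RtpStreamDevice); // Stream type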
My environment:
Ubuntu 9.10
Qt 4.6
My questions:
Is my approach a good one? Perhaps I'm overlooking a more obvious or simpler solution?
I'm currently experiencing this issue: when trying to play the video stream, the state of the MediaObject turns to ErrorState with errorType FatalError. Can anyone tell me what I'm doing wrong?
Edit
One solution I found is using libVLC in combination with Qt, which I learned about in this thread. Here's a code sample for the interested.
I'm still looking for a Phonon-based solution.
Ideally I would only need to provide an SDP file and the job would be done.
I was able to get it to work using the libVLC solution. I can't guarantee that this is the best solution, though, as I simply stopped looking after that.
Here's a link to the libVLC sample.
The way I understand Phonon works, at least on Windows, is that Qt provides a Phonon backend plugin for DirectShow (\plugins\phonon_backend\phonon_ds9.dll), or for GStreamer in your case. Then you would either obtain or write your own DirectShow filter which can accept RTP streams as a source. DirectShow takes care of the decoding, and Phonon will take care of the rendering.
So if the backend works, the application code is as simple as:
Phonon::MediaObject *media = new Phonon::MediaObject();
Phonon::VideoWidget *video = new Phonon::VideoWidget();
Phonon::createPath(media, video);  // route the media object's video to the widget
media->setCurrentSource(source);   // source: a Phonon::MediaSource (URL, file, or stream)
media->play();
It seems that the problem lies with the GStreamer backend accepting RTP as a source. Can you play back that source in standalone GStreamer without any problems?
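Something along these lines (GStreamer 0.10 era; the caps, port, and payload details are guesses you would adapt to your sender) should show the video if standalone GStreamer can handle it:

gst-launch udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264" ! gstrtpbin ! rtph264depay ! ffdec_h264 ! xvimagesink

If that fails too, the problem is in the stream or the pipeline rather than in Phonon.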
Related
I'm making a call application using Qt and QML, and I need to get the microphone feed from QML running as WebGL to the C++ side, or ideally straight to GStreamer acting as the server.
I already have a Qt program as the client that uses GStreamer to push the audio stream to the server. GStreamer, of course, doesn't reach the WebGL client side, though. I've found that you can get permission to use the mic/camera from QML, but I haven't found any example actually grabbing the stream from there. I've also looked into WebRTC. It seems like it could work with QML, and I have found some examples using it with GStreamer, but I haven't been able to get the combination of WebRTC and GStreamer working, even with the examples.
So the full question:
How can I get the audio from QML running as WebGL? Is there a way within Qt, or do I have to go through WebRTC? If so, is there some simpler or more beginner-friendly example than Nirbheek's gstwebrtc demos for connecting WebRTC to GStreamer?
Not the answer I wanted, but this ended up working in my case:
As the C++ side is also running another Qt GUI (QML, to be specific), I can use a WebEngineView with HTML and JavaScript and never bother the C++ implementation with GStreamer for WebRTC. So currently I'm running PeerJS on both sides of the connection, with PeerJS's signaling server in between.
I'd have preferred to use C++ with GStreamer to connect to WebRTC, but I can't find another easy way to connect the browser's audio to Qt.
Edit: I apologise, this answer doesn't work after all. I had been testing the programs on a single computer, so I didn't realize that WebGL-hosted QML doesn't run WebEngineView's JavaScript on the frontend, but on the backend instead.
The project I'm currently working on requires an addition to the already existing VoIP capabilities. The core for speech processing is in C; the remainder is in C++ with Qt, and the audio is handled via portaudio. The connection between users is currently established via UDP, which I think has to be changed for the planned video connection. The development platform is Windows with VS2012; however, the system is cross-platform.
In a nutshell, what I want to do is: grab the video signal from a webcam, synchronize the audio coming from the C core with the video from the webcam, use a library and codecs for encoding/decoding/muxing the signals on the respective sides, and send them via RTP. The system should be capable of multicast transmission.
I did some research on possible libraries and stumbled upon ffmpeg and libVLC. For the codec, I thought about using x264. If I'm correct, ffmpeg and libVLC should both be capable of what I'm looking for?
However, I'm not sure which one to pick, and from their documentation I can't really tell which library is the better fit. Has anybody had similar problems and can help me out? I'm quite a newbie when it comes to video processing and encoding.
Extra question: Do you have any hints or approaches on syncing the video and audio signals?
If anyone is interested, this is what I ended up doing:
I am using the WebM container format, currently VP8 with Vorbis (but going to change to VP9 with Opus once it's out of beta), handled by the ffmpeg/libav libraries for encoding/decoding/muxing etc., and SDL for displaying and threading. ffmpeg/libav was cross-compiled on Unix with LGPL support to keep our project closed source.
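On the extra question about syncing: the common approach (used, for example, by ffmpeg/SDL tutorial players) is to treat the audio clock as the master and schedule video frames against it. A rough sketch, where every name is hypothetical:

#include <chrono>
#include <thread>

struct VideoFrame { double pts; /* presentation time in seconds, plus pixels */ };

struct FrameQueue {                 // hypothetical thread-safe queue of decoded frames
    VideoFrame *pop();              // blocks until a frame is available, NULL at EOF
    void release(VideoFrame *frame);
};

double audioClockSeconds();         // hypothetical: seconds of audio actually played
void displayFrame(const VideoFrame *frame); // hypothetical render call

void videoLoop(FrameQueue &queue)
{
    while (VideoFrame *frame = queue.pop()) {
        const double early = frame->pts - audioClockSeconds();
        if (early > 0.0) {
            // Frame is ahead of the audio: wait until its presentation time.
            std::this_thread::sleep_for(std::chrono::duration<double>(early));
        } else if (early < -0.1) {
            // Hopelessly late: drop the frame to catch up with the audio.
            queue.release(frame);
            continue;
        }
        displayFrame(frame);
        queue.release(frame);
    }
}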
My application displays video and audio and I want to add a recording feature.
I've considered FFmpeg, but I have to compile my application with VS, so I can't use it. I'm therefore trying to do it with GStreamer, but I can't find any example or guide on how to create a video file. Any help?
(I can also consider using any other alternatives, but they must be cross-platform).
The GStreamer Application Development Manual explains very well how to use GStreamer from your code. Try reading it first.
Then you can experiment with the gst-launch tool to build a pipeline, and execute it from your application using the gst_parse_launch function.
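For example, the skeleton of such a program could look like this (the pipeline string is a placeholder; replace it with whatever worked for you in gst-launch):

#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GError *error = NULL;
    /* Placeholder pipeline: record a test video to an Ogg/Theora file. */
    GstElement *pipeline = gst_parse_launch(
        "videotestsrc num-buffers=250 ! theoraenc ! oggmux ! "
        "filesink location=recording.ogg", &error);
    if (!pipeline) {
        g_printerr("Parse error: %s\n", error->message);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    /* Block until the stream finishes or an error occurs. */
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(bus, GST_CLOCK_TIME_NONE,
        (GstMessageType)(GST_MESSAGE_EOS | GST_MESSAGE_ERROR));
    if (msg)
        gst_message_unref(msg);

    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(bus);
    gst_object_unref(pipeline);
    return 0;
}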
If you expose more details of your problem, you can get a more helpful answer.
I'm trying to get the mpg123 audio decoder to work with Qt on Windows. How do I play the decoded audio data at the right speed with the QtMultimedia module in push mode? Currently I'm using a simple timer to play the audio, but it's not a very efficient way to do it; if I do anything else at the same time, the audio gets all distorted. Is there a better way to send the decoded data to the audio output? It would be nice if anyone could point me to good examples using the QtMultimedia module and the QAudioOutput class. I've tried to figure out the Qt example project "audiooutput", but it seems that it's also using a timer to send audio to the output in push mode. Hope that I'm not too confusing.
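To show what I mean by push mode, my refill step boils down to something like this (a sketch; decodeNextChunk stands in for my mpg123 wrapper, and the open question is what should drive this loop instead of a plain timer):

#include <QAudioOutput>
#include <QAudioFormat>
#include <QIODevice>

// Hypothetical wrapper around mpg123: returns up to `bytes` of decoded PCM.
QByteArray decodeNextChunk(qint64 bytes);

// 16-bit stereo PCM at 44.1 kHz, matching what mpg123 produces here.
QAudioFormat format;
format.setFrequency(44100);
format.setChannels(2);
format.setSampleSize(16);
format.setCodec("audio/pcm");
format.setByteOrder(QAudioFormat::LittleEndian);
format.setSampleType(QAudioFormat::SignedInt);

QAudioOutput *output = new QAudioOutput(format);
QIODevice *device = output->start(); // push mode: we write into this device

// Refill step: only write as much as the output can take right now.
while (output->bytesFree() >= output->periodSize()) {
    QByteArray pcm = decodeNextChunk(output->periodSize());
    if (pcm.isEmpty())
        break;
    device->write(pcm);
}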
I also had to figure that out and I would also suggest using the Phonon framework to do this.
It uses Windows Media Player as host on Windows, QuickTime on Mac and some KDE stuff on Linux.
So it's pretty platform independent.
If you need more low-level functionality, you should take a look at an open-source project called portaudio. It's very easy to use, and you can manipulate or even fill buffers from code.
I used it to build an oscillator.
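It's callback-driven: portaudio calls you whenever it needs more samples, so the timing problem largely goes away. A minimal sketch, assuming 16-bit stereo at 44.1 kHz (fillFromDecoder is a placeholder for your mpg123 read):

#include <portaudio.h>
#include <string.h>

/* Placeholder: pull `frames` frames of 16-bit stereo PCM from the decoder. */
extern int fillFromDecoder(short *dst, unsigned long frames);

static int callback(const void *input, void *output, unsigned long frames,
                    const PaStreamCallbackTimeInfo *timeInfo,
                    PaStreamCallbackFlags statusFlags, void *userData)
{
    if (!fillFromDecoder((short *)output, frames))
        memset(output, 0, frames * 2 * sizeof(short)); /* underrun: silence */
    return paContinue;
}

int main(void)
{
    PaStream *stream;
    Pa_Initialize();
    /* 0 input channels, 2 output channels, 16-bit samples, 44.1 kHz. */
    Pa_OpenDefaultStream(&stream, 0, 2, paInt16, 44100, 256, callback, NULL);
    Pa_StartStream(stream);
    Pa_Sleep(10000);               /* play for ten seconds, then shut down */
    Pa_StopStream(stream);
    Pa_CloseStream(stream);
    Pa_Terminate();
    return 0;
}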
Hope that helps!
Best,
guitarflow
I want to use Qt to create a simple GUI application that can play a local video file. I could use Phonon, which does all the work behind the scenes, but I need a little more control. I have already succeeded in implementing a GStreamer pipeline using the decodebin and autovideosink elements. Now I want to channel the output to a Qt widget.
Has anyone ever succeeded in doing this? (I suppose so since there are Qt-based video players that build upon GStreamer.) Can someone point me in the right direction on how to do it?
Note: This question is similar to my previous posted question on how to connect Qt with an incoming RTP stream. This seemed to be quite challenging. This question will be easier to answer I think.
Update 1
Patrice's suggestion to use libVLC is very helpful already. Here's a somewhat cleaner version of the code found on VLC's website:
Sample for Qt + libVLC.
However, my original question remains: How do I connect GStreamer to a Qt widget?
Update 2
After some experimentation I ended up with this working sample. It depends on GstWidget.h and GstWidget.cpp from my own little GstSupport library. However, take note that it is currently only tested on the Mac version of Qt.
To connect GStreamer to your QWidget, you need to get the window handle using QWidget::winId() and pass it to gst_x_overlay_set_xwindow_id().
Rough sample code:
// Create an X11 video sink (GStreamer 0.10).
sink = gst_element_factory_make("xvimagesink", "sink");
gst_element_set_state(sink, GST_STATE_READY);
// Make sure the native window exists before handing its id to the sink.
QApplication::syncX();
gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(sink), widget->winId());
Also, you will want your widget to be backed by a native window, which is achieved by setting the Qt::AA_NativeWindows attribute at the application level or the Qt::WA_NativeWindow attribute at the widget level.
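For example:

// Either per widget:
widget->setAttribute(Qt::WA_NativeWindow);
// or once for the whole application:
QApplication::setAttribute(Qt::AA_NativeWindows);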
Since Phonon's Linux backend is based on GStreamer, the place to look for details is the Phonon source tree (available here: http://gitorious.org/phonon/import/trees/master). For a video player you are most likely going to need a video display widget, such as gstreamer/videowidget.h (.cpp), which in turn uses the X11 renderer (gstreamer/x11renderer.h, .cpp). The sink used is xvimagesink, falling back to ximagesink if the first cannot be created.
The basic trick is to overlay the VideoWidget with the video output. The X11 handle needed to do this is retrieved using the QWidget::winId method, which is platform specific (as are the sinks, so no biggie).
Also, if overlay is unavailable, a QWidgetVideoSink is used, which converts the video frames into individual frames for the WidgetRenderer class. This class, in turn, makes the current frame available as a QImage object, ready for any type of processing.
So to answer your question - use either overlays (as X11Renderer) or extract individual QImages from the video stream (as QWidgetVideoSink).
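For the QImage route, the core is wrapping the raw frame data in a QImage. A rough sketch against the GStreamer 0.10 API (using a fakesink handoff for illustration, and assuming RGB caps with tightly packed rows):

#include <gst/gst.h>
#include <QImage>

// Handoff callback for a fakesink created with signal-handoffs=true.
static void onHandoff(GstElement *sink, GstBuffer *buffer, GstPad *pad,
                      gpointer userData)
{
    GstCaps *caps = gst_buffer_get_caps(buffer);
    GstStructure *s = gst_caps_get_structure(caps, 0);
    int width = 0, height = 0;
    gst_structure_get_int(s, "width", &width);
    gst_structure_get_int(s, "height", &height);

    // Wraps the buffer data without copying; take a deep copy before the
    // buffer is recycled, then hand it to the GUI thread for painting.
    QImage frame(GST_BUFFER_DATA(buffer), width, height,
                 QImage::Format_RGB888);
    QImage safeCopy = frame.copy();
    Q_UNUSED(safeCopy);

    gst_caps_unref(caps);
}

// Wire-up: g_signal_connect(fakesink, "handoff", G_CALLBACK(onHandoff), NULL);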
VLC itself is a Qt-based video player (since version 0.99). It also lets you stream or read a stream. You can find all the information you need here: http://wiki.videolan.org/Developers_Corner. You only have to create an instance of the player and associate it with a widget. Then you have full control over the player.
I have already tested it (on Linux and Windows) playing local music and video files and it works fine.
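The basic wiring, as I understand the libVLC C API, looks like this (a sketch against the 1.1-era API; error handling omitted and the media path is a placeholder):

#include <vlc/vlc.h>

libvlc_instance_t *vlc = libvlc_new(0, NULL);
libvlc_media_t *media = libvlc_media_new_path(vlc, "/path/to/video.avi");
libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
libvlc_media_release(media);

// Render into an existing QWidget via its native window handle.
#if defined(Q_WS_WIN)
libvlc_media_player_set_hwnd(player, (void *)widget->winId());
#elif defined(Q_WS_X11)
libvlc_media_player_set_xwindow(player, widget->winId());
#endif

libvlc_media_player_play(player);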
Give it a try and see by yourself.
Hope that helps.
Edit:
It seems that if you want to use VLC, you need to write or find (I do not know if one exists) a GStreamer codec, as explained on the VideoLAN wiki. I think that is what I would do.