Force QMediaPlayer to update position accurately for video scrubbing application?

I am writing an application which will allow the user to scrub through an open video. Developing on Windows 7/8 with Qt 5.3, I have been using QMediaPlayer and QVideoWidget following the qvideowidget example project. The result has been pretty good, except that the QVideoWidget seems only to update during idle time. Still, it's a good start and it's usable.
However, when I build on Mac OS 10.10 (again with Qt 5.3), scrubbing behaves as though there were only one frame per second in the video. As I drag the "position" slider, the video jumps from one frame to the frame one second later, then one second after that, even though I am calling QMediaPlayer::setPosition several times with positions between those two frames.
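For reference, the scrubbing wiring is essentially the following (a from-memory sketch in the spirit of the videowidget example, not its exact code; it assumes the connections are made inside a QWidget subclass's constructor):

#include <QMediaPlayer>
#include <QSlider>
#include <QVideoWidget>

// player renders into the video widget
QMediaPlayer *player = new QMediaPlayer(this);
QVideoWidget *videoWidget = new QVideoWidget(this);
player->setVideoOutput(videoWidget);

// keep the slider range in sync with the media duration (milliseconds)
QSlider *slider = new QSlider(Qt::Horizontal, this);
connect(player, &QMediaPlayer::durationChanged,
        this, [slider](qint64 d) { slider->setRange(0, int(d)); });

// dragging the slider seeks; this is the call that gets rounded on the Mac
connect(slider, &QSlider::sliderMoved,
        this, [player](int ms) { player->setPosition(ms); });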
The problem can be reproduced using the videowidget example that ships with Qt 5.3 here: Qt\Examples\Qt-5.3\multimediawidgets\videowidget. When the slider is dragged on a Windows machine, the QVideoWidget moves between frames that are spaced fairly close together. When the slider is dragged on a Mac (at least on mine), the QVideoWidget jumps between frames spaced about one second apart. No matter how long I wait for an "in between" frame to render, it won't happen unless I hit the "play" button.
I've tried calling QMediaPlayer::play() and QMediaPlayer::pause() one after the other to force an update, but this doesn't seem to work: QMediaPlayer operates asynchronously, so the update doesn't have time to take effect.
If I check the value of QMediaPlayer::position, I find that it actually doesn't change between these jumps. It appears that when I call QMediaPlayer::setPosition, the position is being rounded to one-second increments on a Mac and to much finer increments on a Windows machine.
Ideally, I would like to jump to a particular position in the video and render that frame immediately on the QVideoWidget. Is there any way to force QMediaPlayer to set the position accurately and update the associated QVideoWidget? Is there a better way to implement smooth scrubbing in a video?
Thanks for your help!

In case anyone else has a similar problem...
My best guess is that the issue stems from limitations in the codec used by QMediaPlayer, since this seems to be the main difference between the two platforms. Rather than deal with the codec issues directly, I looked around for other options.
MLT (http://www.mltframework.org/) seemed promising, but it is a major pain to compile and the primary author seems to have settled on offering SDK support to commercial users only.
libVLC (https://wiki.videolan.org/LibVLC/) looks a lot better. In particular, I’ve been using vlc-qt (https://github.com/ntadej/vlc-qt). The latter has an interface that will look quite familiar to users of QMediaPlayer and QVideoWidget. It was an easy replacement in my own application, and the result was much smoother video scrubbing on both Windows and Mac.
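For anyone attempting the same swap, the setup looked roughly like this in my case (a from-memory sketch against the vlc-qt API of that era - check the headers for exact class and method names; "clip.mp4" and parent are placeholders):

#include <VLCQtCore/Common.h>
#include <VLCQtCore/Instance.h>
#include <VLCQtCore/Media.h>
#include <VLCQtCore/MediaPlayer.h>
#include <VLCQtWidgets/WidgetVideo.h>

// one libVLC instance for the whole application
VlcInstance *instance = new VlcInstance(VlcCommon::args(), parent);

// the player renders into a widget, much like QMediaPlayer + QVideoWidget
VlcMediaPlayer *player = new VlcMediaPlayer(instance);
VlcWidgetVideo *video = new VlcWidgetVideo(parent);
player->setVideoWidget(video);
video->setMediaPlayer(player);

// open a local file and seek by millisecond, analogous to setPosition
VlcMedia *media = new VlcMedia("clip.mp4", true /* local file */, instance);
player->open(media);
player->setTime(1500); // jump to 1.5 s into the video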
Hope this helps someone else!

Related

Qt6 widget->showMaximized freezes on first show

After upgrading from Qt 5 to Qt 6.2 I faced a problem with starting the app in maximized mode. The widget is first painted at its normal size and only then repainted correctly. The intermediate state is shown in the GIF below:
The app shown in the GIF ships with Qt (my version is Qt 6.2.3) and is called "styles" (the path is Qt/Examples/Qt-6.2.3/widgets/widgets/styles); it can also be found on qt.io. I have just changed widget.show() to widget.showMaximized() in main.cpp.
I figured out that sometimes the widget starts fine, but this is rather rare. Also, the more complicated the widget is, the more time it takes to paint it correctly on the first show.
In the example from the GIF the start is so quick that it is almost invisible, but in the heavy app I'm working on this slow startup really catches the eye.
If this cannot be easily fixed, does anybody have any idea how to mask this behaviour? I thought about painting the widget on a virtual framebuffer and moving it back to the main display right after the first show, but I did not find any easy means to create one in C++.
Any help is appreciated.
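For reference, the change boils down to this (QWidget stands in here for the example's actual widget class; the commented-out setWindowState variant is an untested alternative to try, not a confirmed fix):

#include <QApplication>
#include <QWidget>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);
    QWidget widget;
    // widget.show();        // the original example
    widget.showMaximized();  // my change, which triggers the repaint

    // untested variant: request the maximized state first, then show once
    // widget.setWindowState(widget.windowState() | Qt::WindowMaximized);
    // widget.show();
    return app.exec();
}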
EDIT:
I finally reported a bug to the Qt developers.
See QTBUG-104357

Qt video playback with libmpv makes QtWebEngine jquery content unsmooth - Ubuntu

I have a project that uses libmpv with the OpenGL widget (as per the examples that come with libmpv) along with a QtWebEngine widget that displays information, graphics and animations (a scrolling ticker, for example).
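For context, the widget in question is boiled down from libmpv's qt_opengl example and looks roughly like this (a from-memory sketch; MpvWidget is the example's name, not a library class, and error handling is stripped). Note that mpv's render-update callback fires on an mpv-internal thread, and the repaint has to be marshalled back to the GUI thread:

#include <QOpenGLContext>
#include <QOpenGLWidget>
#include <mpv/client.h>
#include <mpv/render_gl.h>

static void *get_proc_address(void *ctx, const char *name) {
    Q_UNUSED(ctx);
    QOpenGLContext *glctx = QOpenGLContext::currentContext();
    return glctx ? reinterpret_cast<void *>(glctx->getProcAddress(name)) : nullptr;
}

class MpvWidget : public QOpenGLWidget {
public:
    explicit MpvWidget(QWidget *parent = nullptr) : QOpenGLWidget(parent) {
        mpv = mpv_create();
        mpv_initialize(mpv);
    }
    ~MpvWidget() override {
        makeCurrent();
        if (mpv_gl)
            mpv_render_context_free(mpv_gl);
        mpv_terminate_destroy(mpv);
    }

protected:
    void initializeGL() override {
        mpv_opengl_init_params gl_init{get_proc_address, nullptr};
        mpv_render_param params[]{
            {MPV_RENDER_PARAM_API_TYPE, const_cast<char *>(MPV_RENDER_API_TYPE_OPENGL)},
            {MPV_RENDER_PARAM_OPENGL_INIT_PARAMS, &gl_init},
            {MPV_RENDER_PARAM_INVALID, nullptr}};
        mpv_render_context_create(&mpv_gl, mpv, params);
        // The real example also installs mpv_render_context_set_update_callback
        // here; that callback arrives on an mpv thread and must queue a
        // QOpenGLWidget::update() back on the GUI thread.
    }
    void paintGL() override {
        mpv_opengl_fbo fbo{int(defaultFramebufferObject()), width(), height(), 0};
        int flip_y = 1; // Qt's FBO is vertically flipped relative to mpv
        mpv_render_param params[]{
            {MPV_RENDER_PARAM_OPENGL_FBO, &fbo},
            {MPV_RENDER_PARAM_FLIP_Y, &flip_y},
            {MPV_RENDER_PARAM_INVALID, nullptr}};
        mpv_render_context_render(mpv_gl, params);
    }

private:
    mpv_handle *mpv = nullptr;
    mpv_render_context *mpv_gl = nullptr;
};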
I found that of the video playback options in Qt, mpv was the smoothest and most reliable. It will play back perfectly smoothly any video up to 1080p.
However while video is playing, any animations in QtWebEngine are unsmooth and jittery. Video is also slightly less smooth when something is moving in the webpage.
The system I'm testing with is not struggling for resources (CPU use is around 45%). There is also no video decoding involved, as it's playing back raw video (though the effect is the same with encoded video, regardless of whether hardware acceleration is enabled).
I figure that the mpv widget is interrupting the MainWindow thread while it processes frames and causing it to freeze every few milliseconds.
As far as I know there is no way to separate the mpv thread from the MainWindow thread though.
I don't know if it'll be possible to make mpv and webengine work together smoothly. I feel like there must be some way to run two widgets at once in one window and not have them mess with each other.
I'm testing with Ubuntu 18.04, Qt 5.11 and the latest mpv from git.
Does anyone have any advice or pointers for what to try first? I realise this is somewhat of a broad question but my knowledge of graphics is limited. If anyone has advice on a conceptual level (I don't need someone to code me a fix) I can investigate myself.
Thank you.

Can a Windows Time Limit App bump a game out of Full Screen (DirectX?) mode when time is up?

I work on software that keeps track of time (C++/MFC), and when time is up (after a handful of warnings as the time limit approaches), we need to bump the person off of the computer.
This works great with ordinary Windows apps. However, it seems that a fair number of games, typically when they are in full-screen mode, can still be played even after we've done our work to hide other windows and/or swap to another desktop.
I know nothing about DirectX, and since I know nothing about it, I'm eager to blame it. :-)
My assumption is that when in some kind of "DirectX" mode, the game is interacting directly with the hardware, and whatever the Windows API is doing, the game and the video hardware couldn't care less.
The problem is that I have unhappy parents who thought our software was going to be effective at getting little Jimmy out in the sunlight to play, and it's not.
Is there a way that my Windows app can give the game "the boot" when time is up, forcing the Windows desktop to be displayed, pausing the game, or at least detecting that we're in a hopeless situation because the display is in a full-screen DirectX mode which can't be programmatically switched out of?
Sure, this isn't exceptionally hard. The most obvious thing to do would be to send the game a few messages. There are quite a few games which will respond to WM_QUIT. A bit more drastic is LockWorkStation(). If that fails, TerminateProcess works at the core OS level and ignores details like DirectX.
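To make that concrete, here is a minimal Win32 sketch of the escalation path (my own illustration, not production code: a real app would pause and re-check between steps; note that WM_CLOSE is the conventional close request sent to a window, while WM_QUIT would need PostThreadMessage instead):

#include <windows.h>

void bootForegroundApp() {
    HWND hwnd = GetForegroundWindow();
    if (!hwnd)
        return;

    // Step 1, polite: ask the game to close.
    PostMessage(hwnd, WM_CLOSE, 0, 0);

    // Step 2, more drastic: lock the workstation, forcing a desktop switch.
    // LockWorkStation();

    // Step 3, last resort: kill the process; this works at the core OS
    // level and ignores details like DirectX full-screen mode.
    DWORD pid = 0;
    GetWindowThreadProcessId(hwnd, &pid);
    if (HANDLE proc = OpenProcess(PROCESS_TERMINATE, FALSE, pid)) {
        TerminateProcess(proc, 1);
        CloseHandle(proc);
    }
}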

OpenGL Window Overlay

What I need to do is create a program that overlays the whole screen, and every 30 seconds the screen needs to flash black once.
The program just needs to be on top of everything; it doesn't have to work over the top of games, but I wouldn't say no if it did!
But I've got no idea where to start. Ideally the solution would be cross-platform for both Windows and OS X.
Does anybody have any ideas about where I should start or could whip up a quick demo?
OpenGL (you tagged it as such) will not help you with this.
Create a program that overlays the whole screen,
The canonical way to do this is by creating a decorationless, borderless top level window with some stay-on-top property being set.
and every 30 seconds the screen needs to flash black once.
How do you define "flash black once"? Do you mean you want the display to go black for one single vertical retrace period, or for a given amount of time? Being the electronics tinkerer I am, honestly, I'd do this using a handful of transistors, resistors and capacitors, blanking the analog VGA signal.
Anyway, if you want to do this in software, it is going to be hard work. If you did this using the aforementioned stay-on-top window, then when you "flash" it away, all the programs with visible output would receive redraw events, which take some time to process. In the best-case scenario the system uses a compositing window manager, which can show the desktop again practically immediately. Without a compositor it's going to be impossible to "flash" the screen.
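To make the window-based approach concrete anyway, here is a minimal sketch of my own, using Qt as the toolkit (any toolkit with frameless stay-on-top windows would do; the 200 ms flash duration is an arbitrary choice, and all the redraw caveats above still apply):

#include <QApplication>
#include <QTimer>
#include <QWidget>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    // decorationless, borderless top-level window with stay-on-top set
    QWidget overlay;
    overlay.setWindowFlags(Qt::FramelessWindowHint | Qt::WindowStaysOnTopHint);
    QPalette pal = overlay.palette();
    pal.setColor(QPalette::Window, Qt::black);
    overlay.setPalette(pal);
    overlay.setAutoFillBackground(true);

    // every 30 seconds, cover the screen in black for ~200 ms
    QTimer flashTimer;
    flashTimer.setInterval(30000);
    QObject::connect(&flashTimer, &QTimer::timeout, [&overlay]() {
        overlay.showFullScreen();
        QTimer::singleShot(200, &overlay, &QWidget::hide);
    });
    flashTimer.start();

    return app.exec();
}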
Ideally the solution would be cross-platform for both Windows and OS X
A task like this cannot be solved cross-platform. There's too much OS-dependent work involved.
I presume this is for some kind of neurological or psychological experiment. I think doing this with some VGA-intercepting circuitry would actually be the easier, quicker-to-implement solution. I can help you with that, but I think there's another StackExchange better suited for it. Unfortunately, digital display interfaces (DVI, HDMI and DisplayPort) use a complex line-code scheme which cannot be blanked as easily as VGA, so you would need a computer capable of analog (=VGA) output and a display with a VGA input.

How to sync page-flips with vertical retrace in a windowed SDL application?

I'm currently writing a game of immense sophistication and cunning, that will fill you with awe and won- oh, OK, it's the 15 puzzle, and I'm just familiarising myself with SDL.
I'm running in windowed mode, and using SDL_Flip as the general-case page update, since it maps automatically to an SDL_UpdateRect of the full window in windowed mode. Not the optimum approach, but given that this is just the 15 puzzle...
Anyway, the tile moves are happening at ludicrous speed. IOW, SDL_Flip in windowed mode doesn't include any synchronisation with vertical retraces. I'm working in Windows XP ATM, but I assume this is correct behaviour for SDL and will occur on other platforms too.
Switching to using SDL_UpdateRect obviously won't change anything. Presumably, I need to implement the delay logic in my own code. But a simple clock-based timer could result in updates occurring when the window is half-drawn, causing visible distortions (I forget the technical name).
EDIT This problem is known as "tearing".
So - in a windowed mode game in SDL, how do I synchronise my page-flips with the vertical retrace?
EDIT I have seen several claims, while searching for a solution, that it is impossible to synchronise page-flips with the vertical retrace in a windowed application. On Windows, at least, this is simply false - I have written games (by which I mean things on a similar level to the 15 puzzle) that do this. I once wasted some time playing with Dark Basic and the Dark GDK - both DirectX-based, and both synchronising page-flips with the vertical retrace in windowed mode.
Major Edit
It turns out I should have spent more time looking before asking. From the SDL FAQ...
http://sdl.beuc.net/sdl.wiki/FAQ_Double_Buffering_is_Tearing
That seems to imply quite strongly that synchronising with the vertical retrace isn't supported in SDL windowed-mode apps.
But...
The basic technique is possible on Windows, and I'm beginning to think SDL does it, in a sense. I'm just not quite certain yet.
As I said before, on Windows, synchronising page-flips with the vertical sync in windowed mode has been possible all the way back to the 16-bit days using WinG. It turns out that's not exactly wrong, but it is misleading. I dug out some old source code using WinG, and there was a timer triggering the page-blits. WinG will run at ludicrous speed, just as I was surprised to see SDL doing - the blit-to-screen page-flip operations don't wait for a vertical retrace.
On further investigation - when you do a blit to the screen in WinG, the blit is queued for later and the call exits. The blit is executed at the next vertical retrace, so hopefully no tearing. If you do further blits to the screen (dirty rectangles) before that retrace, they are combined. If you do loads of full-screen blits before the vertical retrace, you are rendering frames that are never displayed.
This blit-to-screen in WinG is obviously similar to the SDL_UpdateRect. SDL_UpdateRects is just an optimised way to manually combine some dirty rectangles (and be sure, perhaps, they are applied to the same frame). So maybe (on platforms where vertical retrace stuff is possible) it is being done in SDL, similarly to in WinG - no waiting, but no tearing either.
Well, I tested using a timer to trigger the frame updates, and the result (on Windows XP) is uncertain. I could get very slight and occasional tearing on my ancient laptop, but that may be no fault of SDL's - it could be that the "raster" is outrunning the blit. This is probably my fault for using SDL_Flip instead of a direct call to SDL_UpdateRect with a minimal dirty rectangle - though I was trying to get tearing in this case, to see if I could.
So I'm still uncertain, but it may be that windowed-mode SDL is as immune to tearing as it can be on those platforms that allow it. Results don't seem as bad as I imagined, even on my ancient laptop.
But - can anyone offer a definitive answer?
You can use the framerate control of SDL_gfx.
Looking at the library's docs, the flow of your application will be something like this:

#include <SDL/SDL_framerate.h>

// initialization code
FPSmanager fpsManager;               // note: a value, not an uninitialised pointer
SDL_initFramerate(&fpsManager);
SDL_setFramerate(&fpsManager, 60);   // desired FPS

// in the render loop
SDL_framerateDelay(&fpsManager);
Also, you may look at the source code to create your own framerate control.
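And if you do roll your own, the core of it is just ticks-and-delay pacing (a sketch; screen is the surface from SDL_SetVideoMode, and note this paces the loop to ~60 Hz rather than locking to the real vertical retrace):

#include <SDL/SDL.h>

const Uint32 FRAME_MS = 1000 / 60;   // target frame duration (~16 ms)

Uint32 frameStart = SDL_GetTicks();
for (;;) {
    // ... handle events, move tiles, draw to the back surface ...
    SDL_Flip(screen);

    // sleep off whatever remains of the frame budget
    Uint32 elapsed = SDL_GetTicks() - frameStart;
    if (elapsed < FRAME_MS)
        SDL_Delay(FRAME_MS - elapsed);
    frameStart = SDL_GetTicks();
}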