I am using gstreamer-1.0 with ximagesrc for capturing the screen of my Linux box...
Unfortunately I see some "tearing" on the captured images, and I guess it is due to the lack of v-sync while grabbing the desktop...
Is v-sync supported somehow by ximagesrc? Does it depend on the system configuration and/or the driver? Looking at the ximagesrc documentation I can't find any useful details about that... but maybe there is some "black magic" for making this work... :)
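For context, the kind of capture pipeline I mean looks roughly like this minimal sketch built with gst_parse_launch (the encoder, muxer, output file and the use-damage setting are placeholders, not my exact pipeline):

// Minimal ximagesrc capture sketch; encoder/muxer/filename are placeholders.
// Build: g++ capture.cpp $(pkg-config --cflags --libs gstreamer-1.0)
#include <gst/gst.h>
#include <cstdio>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);

    GError *error = nullptr;
    GstElement *pipeline = gst_parse_launch(
        "ximagesrc use-damage=false "          // grab the root window
        "! videoconvert ! queue "
        "! x264enc tune=zerolatency "
        "! mp4mux ! filesink location=capture.mp4",
        &error);
    if (!pipeline) {
        std::fprintf(stderr, "Failed to build pipeline: %s\n", error->message);
        g_error_free(error);
        return 1;
    }

    gst_element_set_state(pipeline, GST_STATE_PLAYING);

    // Block until an error or end-of-stream message is posted on the bus.
    GstBus *bus = gst_element_get_bus(pipeline);
    GstMessage *msg = gst_bus_timed_pop_filtered(
        bus, GST_CLOCK_TIME_NONE,
        static_cast<GstMessageType>(GST_MESSAGE_ERROR | GST_MESSAGE_EOS));
    if (msg)
        gst_message_unref(msg);

    gst_object_unref(bus);
    gst_element_set_state(pipeline, GST_STATE_NULL);
    gst_object_unref(pipeline);
    return 0;
}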
I have a project that uses libmpv with the OpenGL widget (as per the examples that come with libmpv), along with a QtWebEngine widget that displays information, graphics and animations (a scrolling ticker, for example).
I found that of the video playback options in Qt, mpv was the smoothest and most reliable. It will play back any video up to 1080p perfectly smoothly.
However, while video is playing, any animations in QtWebEngine are choppy and jittery. The video is also slightly less smooth when something is moving in the webpage.
The system I'm testing with is not struggling for resources (CPU use is around 45%). There is also no video decoding involved, as it's playing back raw video (and when it plays back encoded video the effect is the same, regardless of whether hardware acceleration is enabled or not).
I figure that the mpv widget is interrupting the MainWindow thread while it processes frames, causing the rest of the window to freeze every few milliseconds.
As far as I know, though, there is no way to separate the mpv thread from the MainWindow thread.
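For reference, the wiring in the Qt OpenGL example that ships with libmpv looks roughly like the condensed sketch below (from memory of that example, not my actual code; class and member names are illustrative). The point is that every new video frame queues a QWidget::update() on the GUI thread, which then does the GL render in paintGL():

// Condensed sketch of the libmpv-in-QOpenGLWidget wiring (illustrative names,
// modelled loosely on the qt_opengl example shipped with libmpv).
#include <QOpenGLContext>
#include <QOpenGLWidget>
#include <mpv/client.h>
#include <mpv/render_gl.h>

class MpvWidget : public QOpenGLWidget {
public:
    explicit MpvWidget(QWidget *parent = nullptr) : QOpenGLWidget(parent) {
        mpv = mpv_create();
        mpv_initialize(mpv);
    }

    ~MpvWidget() override {
        makeCurrent();
        if (mpv_gl)
            mpv_render_context_free(mpv_gl);
        doneCurrent();
        if (mpv)
            mpv_terminate_destroy(mpv);
    }

protected:
    void initializeGL() override {
        mpv_opengl_init_params gl_params{getProcAddress, nullptr};
        mpv_render_param params[]{
            {MPV_RENDER_PARAM_API_TYPE, const_cast<char *>(MPV_RENDER_API_TYPE_OPENGL)},
            {MPV_RENDER_PARAM_OPENGL_INIT_PARAMS, &gl_params},
            {MPV_RENDER_PARAM_INVALID, nullptr},
        };
        mpv_render_context_create(&mpv_gl, mpv, params);
        // mpv calls this from one of its own threads whenever a new frame is
        // ready; the callback queues QWidget::update() on the GUI thread, so
        // every video frame causes a repaint (and a GUI-thread wakeup).
        mpv_render_context_set_update_callback(mpv_gl, onMpvUpdate, this);
    }

    void paintGL() override {
        mpv_opengl_fbo fbo{static_cast<int>(defaultFramebufferObject()),
                           width(), height(), 0};
        int flip_y = 1;
        mpv_render_param params[]{
            {MPV_RENDER_PARAM_OPENGL_FBO, &fbo},
            {MPV_RENDER_PARAM_FLIP_Y, &flip_y},
            {MPV_RENDER_PARAM_INVALID, nullptr},
        };
        mpv_render_context_render(mpv_gl, params);  // renders on the GUI thread
    }

private:
    static void onMpvUpdate(void *ctx) {
        // Called from an mpv thread; hop back to the GUI thread.
        QMetaObject::invokeMethod(static_cast<MpvWidget *>(ctx), "update",
                                  Qt::QueuedConnection);
    }
    static void *getProcAddress(void *, const char *name) {
        QOpenGLContext *glctx = QOpenGLContext::currentContext();
        return glctx ? reinterpret_cast<void *>(glctx->getProcAddress(name)) : nullptr;
    }

    mpv_handle *mpv = nullptr;
    mpv_render_context *mpv_gl = nullptr;
};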
I don't know if it'll be possible to make mpv and webengine work together smoothly. I feel like there must be some way to run two widgets at once in one window and not have them mess with each other.
I'm testing with Ubuntu 18.04, Qt 5.11 and the latest mpv from git.
Does anyone have any advice or pointers for what to try first? I realise this is somewhat of a broad question, but my knowledge of graphics is limited. If anyone has advice on a conceptual level (I don't need someone to code me a fix), I can investigate it myself.
Thank you.
I am writing an application which will allow the user to scrub through an open video. Developing on Windows 7/8 with Qt 5.3, I have been using QMediaPlayer and QVideoWidget following the qvideowidget example project. The result has been pretty good, except that the QVideoWidget seems only to update during idle time. Still, it's a good start and it's usable.
However, when I build on Mac OS 10.10 (again with Qt 5.3), scrubbing behaves as though there were only one frame per second in the video. As I drag the "position" slider, the video jumps from one frame to the frame one second later, then one second after that, even though I am calling QMediaPlayer::setPosition several times with positions between those two frames.
The problem can be reproduced using the videowidget example that ships with Qt 5.3 here: Qt\Examples\Qt-5.3\multimediawidgets\videowidget. When the slider is dragged on a Windows machine, the QVideoWidget moves between frames that are spaced fairly close together. When the slider is dragged on a Mac (at least on mine), the QVideoWidget jumps between frames spaced about one second apart. No matter how long I wait for an "in between" frame to render, it won't happen unless I hit the "play" button.
I've tried calling QMediaPlayer::play() and QMediaPlayer::pause() one after the other to force an update, but this doesn't seem to work--QMediaPlayer works asynchronously, so the update doesn't have time to take effect.
If I check the value of QMediaPlayer::position, I find that it actually doesn't change between these jumps. It appears that when I call QMediaPlayer::setPosition, the position is rounded to one-second increments on a Mac but to much finer increments on a Windows machine.
Ideally, I would like to jump to a particular position in the video and render that frame immediately on the QVideoWidget. Is there any way to force QMediaPlayer to set the position accurately and update the associated QVideoWidget? Is there a better way to implement smooth scrubbing in a video?
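For reference, a stripped-down sketch of that setup (not my exact code; the file path is a placeholder, and the project needs QT += multimedia multimediawidgets):

// Stripped-down sketch of the slider-driven scrubbing setup (placeholder path).
#include <QApplication>
#include <QMediaPlayer>
#include <QSlider>
#include <QUrl>
#include <QVBoxLayout>
#include <QVideoWidget>
#include <QWidget>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    QWidget window;
    auto *videoWidget = new QVideoWidget;
    auto *slider = new QSlider(Qt::Horizontal);
    auto *layout = new QVBoxLayout(&window);
    layout->addWidget(videoWidget);
    layout->addWidget(slider);

    QMediaPlayer player;
    player.setVideoOutput(videoWidget);
    player.setMedia(QUrl::fromLocalFile("/path/to/video.mp4"));  // placeholder
    player.pause();  // load and show a frame without playing

    // Slider range follows the media duration (in ms).
    QObject::connect(&player, &QMediaPlayer::durationChanged, slider,
                     [slider](qint64 duration) {
                         slider->setMaximum(static_cast<int>(duration));
                     });

    // Dragging the slider calls setPosition() with a millisecond timestamp.
    // On Windows this seeks with fine granularity; on my Mac the resulting
    // position snaps to roughly one-second steps.
    QObject::connect(slider, &QSlider::sliderMoved,
                     &player, &QMediaPlayer::setPosition);

    window.show();
    return app.exec();
}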
Thanks for your help!
In case anyone else has a similar problem...
My best guess is that the issue stems from limitations in the codec used by QMediaPlayer, since this seems to be the main difference between the two platforms. Rather than deal with the codec issues directly, I looked around for other options.
MLT (http://www.mltframework.org/) seemed promising, but it is a major pain to compile and the primary author seems to have settled on offering SDK support to commercial users only.
libVLC (https://wiki.videolan.org/LibVLC/) looks a lot better. In particular, I’ve been using vlc-qt (https://github.com/ntadej/vlc-qt). The latter has an interface that will look quite familiar to users of QMediaPlayer and QVideoWidget. It was an easy replacement in my own application, and the result was much smoother video scrubbing on both Windows and Mac.
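For anyone curious what the switch involves: vlc-qt is a Qt wrapper around libVLC, and the underlying flow boils down to something like the plain-libVLC sketch below (libVLC 3.x C API; the file path and the one-shot seek are placeholders, and the vlc-qt classes wrap the same calls behind a QMediaPlayer-like interface):

// Minimal libVLC sketch (plain C API, libVLC 3.x): open a file, start playback,
// then seek by timestamp. The file path and the seek target are placeholders.
#include <vlc/vlc.h>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    libvlc_instance_t *vlc = libvlc_new(0, nullptr);
    if (!vlc) {
        std::fprintf(stderr, "failed to create libVLC instance\n");
        return 1;
    }

    libvlc_media_t *media = libvlc_media_new_path(vlc, "/path/to/video.mp4");
    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
    libvlc_media_release(media);  // the player keeps its own reference

    // In a GUI app you would hand the player a window to draw into, e.g.
    // libvlc_media_player_set_xwindow / set_hwnd / set_nsobject with the
    // native handle of a widget; without it, VLC opens its own video window.
    libvlc_media_player_play(player);
    std::this_thread::sleep_for(std::chrono::seconds(1));  // let playback start

    // Scrubbing boils down to this: jump to an absolute time in milliseconds.
    libvlc_media_player_set_time(player, 15000);  // seek to 0:15

    std::this_thread::sleep_for(std::chrono::seconds(5));
    libvlc_media_player_stop(player);
    libvlc_media_player_release(player);
    libvlc_release(vlc);
    return 0;
}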
Hope this helps someone else!
I'm using ofVideoPlayer and for some reason there is some flicker/tearing at the moment of the transition between videos. I've tried changing the video format and I added ofSetVerticalSync(true); but the problem still occurs.
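For context, the setup is along these lines (a trimmed sketch, not my exact code; the file names and the transition trigger are placeholders):

// Trimmed openFrameworks sketch; file names and the transition point are placeholders.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofVideoPlayer player;

    void setup() override {
        ofSetVerticalSync(true);   // request vsync'd buffer swaps
        ofSetFrameRate(60);
        player.load("movies/clip_a.mp4");   // load() is loadMovie() on older OF
        player.setLoopState(OF_LOOP_NONE);
        player.play();
    }

    void update() override {
        player.update();           // pull in the next decoded frame
        if (player.getIsMovieDone()) {
            // the flicker/tearing shows up right here, when switching clips
            player.load("movies/clip_b.mp4");
            player.play();
        }
    }

    void draw() override {
        player.draw(0, 0, ofGetWidth(), ofGetHeight());
    }
};

int main() {
    ofSetupOpenGL(1280, 720, OF_WINDOW);
    ofRunApp(new ofApp());
}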
How can I fix this?
I had a lot of struggle with openFrameworks and tearing.
Basically, it depends a lot on your hardware.
You may need to install drivers to force vsync on the graphics card.
This is especially true if you run OF on Linux.
If you are on Windows, look into the graphics card's settings panel and experiment with the different vertical sync settings.
If you are on OSX, there are some other video players you might want to test.
http://forum.openframeworks.cc/t/ofxavfvideoplayer/12770
Recently I have had the best results with the HAP codec and the HAP player. It loads super quickly and supports alpha channels. I would definitely try that.
https://github.com/bangnoise/ofxHapPlayer
What I need to do is create a program that overlays the whole screen and every 30 seconds the screen needs to flash black once.
The program just needs to be on top of everything; it doesn't have to work over the top of games, but I wouldn't say no if it did!
But I've got no idea where to start. Ideally the solution would be cross-platform for both Windows and OSX.
Does anybody have any ideas about where I should start or could whip up a quick demo?
OpenGL (you tagged it as such) will not help you with this.
"Create a program that overlays the whole screen"
The canonical way to do this is to create a decorationless, borderless top-level window with a stay-on-top property set.
"and every 30 seconds the screen needs to flash black once"
How do you define "flash black once"? Do you mean you want the display to go black for one single vertical retrace period, or for a given amount of time? Being the electronics tinkerer I am, honestly, I'd do this using a handful of transistors, resistors and capacitors, blanking the analog VGA signal.
Anyway, if you want to do this in software, it is going to be hard work. If you did this using the aforementioned stay-on-top window, then when you "flash" it away, all the programs with visible output would receive redraw events, which would take some time to process. In the best case the system uses a compositing window manager, which can show the desktop again practically immediately. Without a compositor it's going to be impossible to "flash" the screen.
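That said, if you go the stay-on-top-window route anyway, a minimal sketch looks like the one below. I'm using Qt here purely as an illustration (the question doesn't name a toolkit), and the 100 ms flash duration is my own arbitrary choice; the 30-second interval is from the question:

// Minimal stay-on-top "flash black" overlay sketch (Qt chosen for illustration;
// the 100 ms flash duration is an assumption, the 30 s interval is from the question).
#include <QApplication>
#include <QTimer>
#include <QWidget>

int main(int argc, char *argv[]) {
    QApplication app(argc, argv);

    // Decorationless, borderless, always-on-top, click-through overlay.
    QWidget overlay;
    overlay.setWindowFlags(Qt::FramelessWindowHint |
                           Qt::WindowStaysOnTopHint |
                           Qt::WindowTransparentForInput |
                           Qt::Tool);
    overlay.setStyleSheet("background-color: black;");

    // Every 30 seconds, show the black overlay full-screen for ~100 ms, then hide it.
    QTimer flashTimer;
    QObject::connect(&flashTimer, &QTimer::timeout, [&overlay]() {
        overlay.showFullScreen();
        QTimer::singleShot(100, &overlay, &QWidget::hide);
    });
    flashTimer.start(30 * 1000);

    return app.exec();
}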
"Ideally the solution would be cross-platform for both Windows and OSX"
A task like this cannot be solved cross-platform. There's too much OS-dependent work involved.
I presume this is for some kind of neurological or psychological experiment. I think doing this with some VGA-intercepting circuitry would actually be the easier and quicker-to-implement solution; I can help you with that, but I think there's another StackExchange site better suited for it. Unfortunately, digital display interfaces (DVI, HDMI and DisplayPort) use a complex line-code scheme which cannot be blanked as easily as VGA, so you would need a computer capable of analog (VGA) output and a display with a VGA input.
I'm looking for a solution to play HD videos in a multi-monitor OSX environment for a projector/desktop application. It could be one huge video, or a video split into parts.
So far I've been using Flash StageVideo successfully to play 1080p and 720p on single monitors. This works great with Flash projectors. The problem with Flash projectors is that you can't span multiple monitors, or multiple windows. I still haven't tried opening multiple projectors, because I wouldn't know how to position each projector on a different monitor consistently.
In Adobe AIR you can have multiple windows and control their position, but AFAIK you can't use StageVideo to decode videos with the GPU... and using the classic Video class is really out of the question.
With C++ there are multiple frameworks (Cinder/openFrameworks), but AFAIK opening multiple windows, or spanning multiple monitors, is not such a good idea because of bad performance. I still haven't figured out if it's possible, or even a good idea, to open one app per monitor and control its position.
Has anyone solved this problem using AS3 or any other language/framework, like C++ with openFrameworks?
With AIR you can have a single window span multiple monitors.
I have a 1920x1080 video scaled by 2 on a 3840x2160 stage. I haven't yet tried using StageVideo to up the resolution, but I am hopeful it will work.