Flutter video_player initialization error when initializing multiple videos - amazon-web-services

Problem I need help with
I need help optimizing latency in my app's "Short-Video Feed" and solving intermittent performance bugs. A central feature of the app is seamless playback of 15 to 60 second clips as users swipe up, similar to TikTok and Instagram Reels. Right now I have intermittent performance bugs such as black screens, delayed loading screens, and occasionally very long load times.
The bugs may be because Flutter is slower than native iOS; however, the "Short-Video Feed" misbehaves whether I use an M3U8-based approach (Mux) or an MP4-based approach with AWS S3.
With the Mux-based M3U8 approach, there is a noticeable black screen of a few milliseconds at the start of each short-video playback.
With the Amazon-based MP4 approach, the feed intermittently loads for a few seconds (sometimes minutes) under low bandwidth, and some videos stay stuck even after the user returns to an area with faster bandwidth.
Open issue on Flutter
https://github.com/flutter/flutter/issues/25558
Approaches I have tried with no success:
Native player. I tried a native video player on Android/iOS, with both MP4 and M3U8, but the UI was still very laggy (because of the latency of transferring data between the native layer and Flutter).
Flutter player. I tried a Flutter video player on Android/iOS, with both MP4 and M3U8, but the UI shows a black screen with M3U8, and heavy loading on poor internet connections with MP4.
Approaches I need help to try:
Optimize the M3U8 player to minimize the black-screen issue, or...
Create MP4 chunks to optimize for poor-reception areas (which is what I believe TikTok, Instagram Reels, and similar applications do, based on what I can observe).
Has anyone solved this issue?

How about isolating whether these lags are due to network buffering or due to Flutter (or even a device hardware limitation such as memory or GPU)?
Perhaps use a few local MP4 files with identical frame rates and encoding parameters (both video and audio) and see whether the UI lag is reproducible upon swipe-up scrolling?

Related

Multiple audio tracks for video playback in Qt

I'm developing a small video editor in Qt to make quick edits to videos that have multiple audio tracks. I'm a bit confused about whether it is possible to handle multiple audio tracks in playback and processing in Qt.
What I want to do with the video
list audio tracks
for each track, manage its volume, and optionally duplicate the channel of a mono track (to make it stereo).
play the video with the settings I chose for the audio tracks, ideally being able to change settings on the fly.
extract a part of the video with the settings I chose.
I'm not sure if Qt can handle this by itself, or if I need to rely on a library specialized in video processing. Therefore, my question is twofold:
If it is possible in Qt only: how?
I suppose I need to use the QMediaPlayer class, but it doesn't look like it can handle multiple tracks at once.
Maybe by splitting the media into several sub-media, but then how would I synchronize their playback?
Otherwise, with an external library: are there any caveats to avoid?
I also wonder what the best way is, if there is one, to display video frames (assuming audio will be handled by the external library).
Should I draw frames directly on a QWidget, or should I use OpenGL directly? Or another method?
Note: I'm not necessarily looking for a detailed answer; I'm fine with short ones and/or external resources.
After a bit of searching around, I found that this is impossible to do with Qt alone for now.
I found several libraries able to fulfill this purpose (the list is surely not exhaustive):
libvlc, which renders easily onto a QWidget surface but, at the time I looked, was not able to play multiple audio tracks at once. The next version should be able to do it, though.
libffmpeg and libav, which are a bit too low-level for what I want, and which I found fairly complicated to use for rendering onto a QWidget surface.
libopenshot, which has quite a steep learning curve and, unfortunately, not much documentation, as it is oriented toward nonlinear video editing.
libgstreamer. The one I chose.
I found libgstreamer to be the best for my purpose, being fairly high-level and well documented. It is also flexible to use, as it can load from and dump to almost any kind of file automatically.
It is able to render directly on a QWidget surface, which is super convenient.
On the other hand, it has an asynchronous, dynamic design that requires a bit more planning and error management, but the documentation is there to help.
So far I haven't encountered any major problems; I'll update this post if something new comes up.
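For anyone curious what the QWidget rendering mentioned above looks like, here is a minimal sketch, assuming GStreamer 1.x and its playbin element. The file URI is a placeholder, and production code would normally set the window handle from the "prepare-window-handle" bus message rather than up front.

// Minimal sketch: play a file with playbin and render onto a QWidget.
// The file URI is a placeholder; error handling is trimmed for brevity.
#include <QApplication>
#include <QWidget>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

int main(int argc, char *argv[]) {
    gst_init(&argc, &argv);
    QApplication app(argc, argv);

    QWidget window;
    window.resize(1280, 720);
    window.show();  // forces creation of a native window handle

    // playbin assembles the demux/decode/sink chain automatically.
    GstElement *playbin = gst_element_factory_make("playbin", "player");
    g_object_set(playbin, "uri", "file:///path/to/video.mp4", NULL);

    // playbin proxies the GstVideoOverlay interface of its video sink,
    // so we can point it at the widget's native window.
    gst_video_overlay_set_window_handle(GST_VIDEO_OVERLAY(playbin),
                                        (guintptr)window.winId());

    gst_element_set_state(playbin, GST_STATE_PLAYING);
    int ret = app.exec();

    gst_element_set_state(playbin, GST_STATE_NULL);
    gst_object_unref(playbin);
    return ret;
}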

Qt video playback with libmpv makes QtWebEngine jquery content unsmooth - Ubuntu

I have a project that uses libmpv with the opengl widget (as per the examples that come with libmpv) along with a QtWebEngine widget that displays information, graphics and animations (a scrolling ticker for example).
I found that of the video playback options in Qt, mpv was the smoothest and most reliable. It plays back any video up to 1080p perfectly smoothly.
However, while video is playing, any animations in QtWebEngine are unsmooth and jittery. The video is also slightly less smooth when something is moving in the webpage.
The system I'm testing with is not struggling for resources (CPU use is around 45%). There is also no video decoding involved, as it's playing back raw video (though when it plays back encoded video the effect is the same, whether or not hardware acceleration is enabled).
I figure the mpv widget is interrupting the MainWindow thread while it processes frames, causing the window to freeze every few milliseconds.
As far as I know there is no way to separate the mpv thread from the MainWindow thread though.
I don't know if it'll be possible to make mpv and webengine work together smoothly. I feel like there must be some way to run two widgets at once in one window and not have them mess with each other.
I'm testing with Ubuntu 18.04, Qt 5.11 and the latest mpv from git.
Does anyone have any advice or pointers for what to try first? I realise this is somewhat of a broad question but my knowledge of graphics is limited. If anyone has advice on a conceptual level (I don't need someone to code me a fix) I can investigate myself.
Thank you.
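One experiment that might be worth trying first (a sketch of an idea, not a tested fix): embed mpv through its "wid" option instead of the opengl widget, so mpv drives its own internal video-output thread rather than rendering through the Qt GL callback. If the jitter disappears, the contention really is on the shared GL/main thread. The widget and file path below are placeholders.

// Sketch: embed mpv via the "wid" option so it renders on its own
// internal thread instead of through the Qt opengl callback.
// "videoWidget" and the file path are placeholders.
#include <mpv/client.h>
#include <QWidget>

void embedMpv(QWidget *videoWidget) {
    mpv_handle *mpv = mpv_create();
    if (!mpv)
        return;

    // The window handle must be set before mpv_initialize().
    int64_t wid = (int64_t)videoWidget->winId();
    mpv_set_option(mpv, "wid", MPV_FORMAT_INT64, &wid);

    if (mpv_initialize(mpv) < 0)
        return;

    const char *cmd[] = {"loadfile", "/path/to/video.mkv", NULL};
    mpv_command(mpv, cmd);
}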

[C++|winapi] Can you access the video output of an application before it is displayed?

I want to capture the video output of an application using C++ and the winapi, and stream it over the network. At the moment, I am capturing this output using a DirectShow filter: the application displays its video output on the screen, and I just capture whatever is there. I want to optimize this process.
My question is: is there a way to capture the video/audio output of an application before it is displayed on the screen?
Thanks.
Capture video before it is shown?
It depends on how the application provides the video for you.
Real-time rendering - you can't access what doesn't exist yet. Video games, or any dynamic rendering, only display the actual current state, and may not know anything about future frames.
(There's also a related anomaly, called screen tearing, when frame updates are not synchronized with the screen's refresh.)
Static display - all the data is already available. For example, if it's a video player application playing a video on your local machine, your only task is to get the data and capture it at the appropriate position in time.
Last but not least, every piece of hardware has a reaction time, a small delay to process data.
Also, there is a similar question: Fastest method of screen capturing on Windows
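For reference, the classic GDI approach from that question looks roughly like the sketch below. Note that it only copies what has already been composited to the screen, so it cannot get ahead of the display, exactly as described above.

// Sketch: copy the current screen contents into a bitmap with GDI.
// The caller owns the returned HBITMAP (DeleteObject when done).
#include <windows.h>

HBITMAP CaptureScreen() {
    int w = GetSystemMetrics(SM_CXSCREEN);
    int h = GetSystemMetrics(SM_CYSCREEN);

    HDC screenDC = GetDC(NULL);                // DC for the entire screen
    HDC memDC = CreateCompatibleDC(screenDC);  // off-screen DC
    HBITMAP bmp = CreateCompatibleBitmap(screenDC, w, h);

    HGDIOBJ old = SelectObject(memDC, bmp);
    BitBlt(memDC, 0, 0, w, h, screenDC, 0, 0, SRCCOPY);  // copy the pixels
    SelectObject(memDC, old);

    DeleteDC(memDC);
    ReleaseDC(NULL, screenDC);
    return bmp;
}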

cocos2d-iphone games stop playing background music sometimes

cocos2d-iphone 1.0.1.
I have noticed this with other cocos2d-iphone games installed in my device, like Kingdom Rush.
Basically, most of the time the audio is fine (almost always). But suddenly, at an unexpected moment, the background music stops playing and only the sound effects work. Sometimes killing the application is not enough to fix it.
With my cocos2d-iphone game this happens as well, with no hint in the console. I use SimpleAudioEngine to play background music and sound effects.
Killing my application or restarting Xcode will not fix it. I usually just ignore the problem and, some time later, it is suddenly gone. I suspect that rebooting the device tends to fix it, but that's beside the point: I should know why it is happening.
I also tried preloading my background music. That doesn't change a thing.
I believe I have experienced this problem with both .mp3 and .wav formats.
Why might this be happening?
I have no idea about the exact reason, but I can think of a few possibilities:
memory warning causes audio stream to be interrupted
audio interruptions (calendar notification, incoming SMS/call) not handled properly by CocosDenshion
other streaming music is playing (e.g. a video player, or the iPod music player)
music isn't streamed but buffered, which means the music is fighting over audio buffers with all the other effects - eventually so many sound effects are playing that older buffers (which might include the music) have to cancel playback in order to allow a new effect to play
defective device (since it happens in other apps …)
bug in CocosDenshion (check the cocos2d issue tracker and forum for any unresolved audio bugs)
I think you can exclude the buffering issue if you're using the playBackgroundMusic API to stream music instead of buffering it.
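For reference, the streaming-versus-buffered distinction looks like this in the C++ (cocos2d-x) flavor of CocosDenshion; the cocos2d-iphone Objective-C SimpleAudioEngine has equivalent methods, and the file names here are placeholders.

// Sketch using the cocos2d-x (C++) flavor of CocosDenshion; the
// cocos2d-iphone Objective-C API is equivalent. File names are placeholders.
#include "SimpleAudioEngine.h"
using namespace CocosDenshion;

void startAudio() {
    SimpleAudioEngine *audio = SimpleAudioEngine::sharedEngine();

    audio->preloadEffect("explosion.wav");       // buffered into memory
    audio->playBackgroundMusic("bg.mp3", true);  // streamed from disk, looped
    audio->playEffect("explosion.wav");          // competes for effect buffers
}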

How to play HD videos on multiple monitors

I'm looking for a solution to play HD videos in a multi-monitor OSX environment for a projector/desktop application. It could be one huge video, or a video split into parts.
So far I've been using Flash StageVideo successfully to play 1080p and 720p on single monitors. This works great with Flash projectors. The problem with Flash projectors is that you can't span multiple monitors or multiple windows. I still haven't tried opening multiple projectors, because I wouldn't know how to position each projector on a different monitor consistently.
In Adobe AIR you can have multiple windows and control their position, but AFAIK you can't use StageVideo to decode videos with the GPU... and using the classic Video class is really out of the question.
With C++ there are multiple frameworks (Cinder/openFrameworks), but AFAIK opening multiple windows or spanning multiple monitors is not such a good idea because of bad performance. I still haven't figured out whether it's possible, or even a good idea, to open one app per monitor and control its position.
Has anyone succeeded in this problem using AS3 or any other language/framework like C++ with openFrameworks?
With AIR you can have a single window span multiple monitors.
I have a 1920x1080 video scaled by 2 on a stage of 3840x2160. I haven't yet tried using StageVideo to increase the resolution, but I am hopeful it will work.
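On the C++ side, positioning one borderless window per monitor is not guesswork; here is a minimal sketch using plain GLFW 3 (which modern openFrameworks uses for windowing), with the rendering loop left out.

// Sketch: open one undecorated window per attached monitor with GLFW 3,
// each positioned to cover its monitor exactly. Rendering is left out.
#include <GLFW/glfw3.h>
#include <vector>

int main() {
    if (!glfwInit())
        return 1;

    int count = 0;
    GLFWmonitor **monitors = glfwGetMonitors(&count);
    glfwWindowHint(GLFW_DECORATED, GLFW_FALSE);  // borderless windows

    std::vector<GLFWwindow *> windows;
    for (int i = 0; i < count; ++i) {
        int x = 0, y = 0;
        glfwGetMonitorPos(monitors[i], &x, &y);
        const GLFWvidmode *mode = glfwGetVideoMode(monitors[i]);

        GLFWwindow *w = glfwCreateWindow(mode->width, mode->height,
                                         "video", NULL, NULL);
        glfwSetWindowPos(w, x, y);
        windows.push_back(w);
    }

    // ... decode the video and draw each monitor's slice into its window ...

    glfwTerminate();
    return 0;
}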