OpenCV camera window showing delay? - C++

I am using OpenCV to write an app (in C++ on Windows 7) that uses the cv::CamShift() function to track an object on the screen. I noticed that my camera window (my application window showing what the camera sees) has a little delay with respect to very rapid motions. The delay seems to be about 0.1 seconds - very small, but noticeable. I am developing an application that is very sensitive to these delays. To rule out an error in my own code, I tried one of the OpenCV sample video apps that shows what the camera sees on the screen, and it had the same tiny delay. Interestingly, when I look at what my camera sees through Skype, there seems to be virtually no delay at all. Is there anything I can do to make OpenCV operate faster and get rid of this tiny delay?

CamShift detects motion using meanShift - the mean shift of the object's center. This has to be calculated over more than one frame: at a frame rate of 30 Hz, a history depth of 3 frames already amounts to 0.1 seconds.

Related

HoloLens 2 - VLC sensor frames have incorrect timestamps (frames out of order)

I'm using the following repo to access and save the device streams:
https://github.com/microsoft/HoloLens2ForCV
When recording using StreamRecorder, it seems that the timestamps returned by all of the visible light cameras are frequently incorrect, resulting in an out-of-order sequence of frames.
To confirm this, I made a recording while looking at a stopwatch with each visible light camera. There are many frames where the reading on the stopwatch is lower than in the previous frame (despite the frame's larger timestamp). Sometimes a disruption lasting more than 5 frames occurs before the timestamps seem to become correct again.
This happens often enough for it to be a serious inconvenience. For a rough idea, I counted 12 times where the stopwatch time decreased compared to the previous frame in a 10 second recording. The out of order frames are very noticeable in the resulting video playback.
I tried using timestamp.SensorTicks instead of timestamp.HostTicks in RMCameraReader.cpp but the issue persisted.
This does not happen with the PV frames or with either mode of the depth sensor frames.
I'm using the latest insider preview build: Windows Version 21H1, OS build 20346.1402
I may be wrong but I do not recall this issue occurring with the first few insider builds that supported research mode, however, I couldn't find the older insider builds online to try.
Is there any way to fix this issue?
Thanks a lot!

Why does a full-screen window in OpenCV (Banana Pi, Raspbian) slow down the camera footage and make it lag?

Currently I’m working on a project to mirror a camera for a blind spot.
The camera outputs a 640 x 480 NTSC signal.
The output screen is 854 x 480 NTSC.
I grab the camera with an EasyCAP video grabber.
On the Banana Pi I installed OpenCV 2.4.9.
The critical point of this project is that the video on the display needs to be real time.
Whenever I comment out the line that puts the window into fullscreen, a small window pops up and the footage runs without delay or lag.
But when I set the video to full screen, the footage becomes slow and lags.
Part of the code:
namedWindow("window", 0);
setWindowProperty("window", CV_WND_PROP_FULLSCREEN, CV_WINDOW_FULLSCREEN);
while (1) {
    cap >> image;
    flip(image, destination, 1);
    imshow("window", destination);
    waitKey(33); // delay 33 ms
}
How can I fill the screen with the camera footage without losing speed and frames?
Is it possible to output the footage directly to the composite output?
The problem is that the upscaling and drawing are done in software here. The Banana Pi's processor is not powerful enough to handle the required throughput at 30 frames per second.
This is an educated guess on my side, as even desktop systems can run into lag problems when processing and simultaneously displaying video.
A common solution in the computer vision community for this problem is to use OpenGL for display. Here, the upscaling and display is offloaded to the graphics processor. You can do the same thing on a Banana Pi.
If you compiled OpenCV with OpenGL support, you can try it like this:
namedWindow("window", WINDOW_OPENGL);
imshow("window", destination);
Note that if you use OpenGL, you can also save the flip operation by using an appropriate modelview matrix. For this, however, you will probably need to write GL code yourself instead of using imshow.
I fixed the whole problem by using:
namedWindow("window", 1);
where flag 1 stands for WINDOW_AUTOSIZE.
The footage is closer to real time now.
I'm using a small monitor, so the window size is nearly the same as the monitor's.

OpenCv C++ record video when motion detected from cam

I am attempting to use a straightforward motion detection code to detect movement from a camera. I'm using the OpenCV library and I have some code that takes the difference between two frames to detect a change.
I have the difference frame working just fine and it's black when no motion is present.
The problem is how I can now detect that blackness to stop recording, and its absence to begin recording frames.
Thank you all.
A very simple thing to do is to sum the entire diff image into a single integer. If that sum is above a threshold, you have movement. Then you can use a second, lower threshold: when the sum falls below that limit, the movement has stopped.
You can also make a threshold crossing change the program state only if enough time has elapsed since the last crossing, i.e. after movement is detected, don't check for lack of movement for 10 seconds.
Take a look at the code of the free-software project Motion for inspiring ideas.
There are quite a few things to keep in mind for reliable motion detection, for example tolerating the slow lighting changes caused by the sun's movement, or accepting momentary image glitches, which are especially common with the cheapest cameras.
From some experience I have had, I think that rather than just adding up all the differences, it works better to count the number of pixels whose variation exceeds a certain threshold.
Motion also offers masks, which let you, for example, ignore movements on a nearby road.
What about storing a black frame internally and using your existing comparison code? If your new frame differs (above a threshold) from the all-black frame, start recording.
This seems the most straightforward approach, since you already have the image-processing algorithms in place.

Cocos2d frame rate dropping

I am really close to completing my cocos2d project, and I have a problem: the frame rate keeps dropping in certain parts of my game. I have tried to look at the code being executed at the time the fps drops, but the problem with that is it doesn't always drop at the same point. So my question is: how can I analyze my code to see where and why the frame rate is dropping? I am using cocos2d v1.1.0-beta2b.

Constant lag in OpenGL application

I'm getting some repeating lags in my OpenGL application.
I'm using the Win32 API to create the window, and I'm also creating a 2.2 context.
So the main loop of the program is very simple:
Clearing the color buffer
Drawing a triangle
Swapping the buffers.
The triangle is rotating, which is how I can see the lag.
Also, my frame time isn't smooth, which may be the problem.
But I'm very sure the delta-time calculation is correct, because I've tried plenty of methods.
Do you think it could be a graphics driver problem?
A friend of mine runs almost exactly the same program, except that I do fewer calculations and I'm using the standard OpenGL shader.
Also, his program uses more CPU power than mine, and its CPU usage is smoother than mine.
I should also add:
On my laptop I get the same lag every ~1 second, so I can see some kind of pattern.
There are many reasons for a jittery frame rate. Off the top of my head:
Not calling glFlush() at the end of each frame
other running software interfering
doing things in your code that certain graphics drivers don't like
bugs in graphics drivers
Using the standard Windows time functions, with their terrible resolution
Try these:
kill as many running programs as you can get away with; use the Processes tab in Task Manager (Ctrl+Shift+Esc) for this.
bit by bit, reduce the amount of work your program is doing and see how that affects the frame rate and the smoothness of the display.
if you can, try enabling/disabling vertical sync (you may be able to do this in your graphics card's settings) to see if that helps
add some debug code to output the time taken to draw each frame, and see if there are anomalies in the numbers, e.g. every 20th frame taking an extra 20ms, or random frames taking 100ms.