Setting fixed FPS doesn't work - c++

I am working on a 2D game for a school project. The template my teacher gave us works well, but I wanted to replace one very, very clumsy thing in the code: he calls a heavy method 20 times just to slow the game down. Instead of doing that, I want to check whether the next frame should be handled yet.
The game is an object inside the template namespace. This namespace has an endless loop that calls the game's tick method and swaps the frame buffers.
Inside this template, I replaced the Game->Tick() call with a simple if statement:
if (game->Ready(lastframe)) {
    game->Tick();
}
lastframe is the time difference in seconds between the last time it's called and now. I know I could use this time to calculate movements inside the game tick, but that's not what I want to do right now!
This is the Ready method:
bool Game::Ready(float timedif) {
    // Add the elapsed time to the frame counter
    framecounter += timedif;
    // Check whether the counter has passed the per-frame interval
    if (framecounter > fps) {
        // If so, subtract the interval from the counter
        while (framecounter > fps) {
            framecounter -= fps;
        }
        m_Screen->Clear(0);
        Draw();
        return true;
    }
    // Frame is still inside the fps margin
    return false;
}
fps is calculated as follows: fps = 1000.0f/60
I have no idea why it doesn't run at 60 frames per second, and I'm 100% sure it's being called more often than that (tested with a printf). Any help would be appreciated.

What is timedif, exactly? You need some information about "real time" here, for example how many milliseconds ago the last frame was. It seems to me that you are assuming every iteration of your while loop takes one millisecond.
bool Game::Ready(float msSinceLastFrame)
{
    if (msSinceLastFrame > fps)
    {
        m_Screen->Clear(0);
        Draw();
        return true;
    }
    return false;
}

// Call this from some kind of loop in which you keep msSinceLastFrame updated:
if (game->Ready(msSinceLastFrame))
{
    msSinceLastFrame = 0;
    game->Tick();
}
But if you use this approach, you still need to call Game::Ready from some kind of while loop. I would suggest a different approach:
void Game::gameLoop()
{
    auto lastFrameTimeStamp = GetTickCount(); // if you are on Windows
    auto frameLength = 1000 / fps;
    while (true) // true, or some kind of exit condition
    {
        auto currentTimeStamp = GetTickCount();
        if (currentTimeStamp - lastFrameTimeStamp >= frameLength)
        {
            // do your per-frame work here
            lastFrameTimeStamp = currentTimeStamp;
        }
        Sleep(1); // you can sleep here if you don't want to heat up your CPU =)
    }
}

Why are you subtracting the fps from framecounter? You need to reset framecounter when it exceeds fps.
Try the following:
bool Game::Ready(float timedif) {
    // Add time to framecounter
    framecounter += timedif;
    // Check whether the counter has exceeded the per-frame interval
    if (framecounter > fps) {
        framecounter = 0;
        m_Screen->Clear(0);
        Draw();
        return true;
    }
    // Frame is still inside the fps margin
    return false;
}


QueryPerformanceCounter limiting/speeding up slide speed

I have a thread that waits on a std::condition_variable and then loops until it is done.
I'm trying to slide my rect that is drawn in OpenGL.
Everything works fine without using a delta, but I would like my rect to slide at the same speed no matter what computer it is run on.
At the moment it jumps about halfway and then slides really slowly.
If I don't use my delta, it does not run at the same speed on slower computers.
I'm not sure whether I should instead have an if statement that checks whether enough time has passed before doing the sliding, and not use a delta at all?
auto toolbarGL::Slide() -> void
{
    LARGE_INTEGER then, now, freq;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&then);

    while (true)
    {
        // Waits to be ready to slide
        // Keeps looping till stopped then starts to wait again
        SlideEvent.wait();

        QueryPerformanceCounter(&now);
        float delta_time_sec = (float)(now.QuadPart - then.QuadPart) / freq.QuadPart;

        if (slideDir == SlideFlag::Right)
        {
            if (this->x < 0)
            {
                this->x += 10 * delta_time_sec;
                this->controller->Paint();
            }
            else
                SlideEvent.stop();
        }
        else if (slideDir == SlideFlag::Left)
        {
            if (this->x > -90)
            {
                this->x -= 10 * delta_time_sec;
                this->controller->Paint();
            }
            else
                SlideEvent.stop();
        }
        else
            SlideEvent.stop();

        then = now;
    }
}
If you want your rectangle to move at a steady speed no matter what, I suggest a different approach -- instead of relying on your code executing at a particular time and causing a side effect (like x += 10) each time, come up with a function that will tell you what the rectangle's location should be at any given time. That way, no matter when your Paint() method is called, it will always draw the rectangle at the location that corresponds to that time.
For example:
// Returns the current time, in microseconds-since-some-arbitrary-time-zero
unsigned long long GetCurrentTimeMicroseconds()
{
    static unsigned long long _ticksPerSecond = 0;
    if (_ticksPerSecond == 0)
    {
        LARGE_INTEGER tps;
        _ticksPerSecond = QueryPerformanceFrequency(&tps) ? tps.QuadPart : 0;
    }

    LARGE_INTEGER curTicks;
    if ((_ticksPerSecond > 0)&&(QueryPerformanceCounter(&curTicks)))
    {
        return (curTicks.QuadPart*1000000)/_ticksPerSecond;
    }
    else
    {
        printf("GetCurrentTimeMicroseconds() failed, oh dear\n");
        return 0;
    }
}
[...]
// A particular location on the screen
int startPositionX = 0;

// A clock-value at which the rectangle was known to be at that location
unsigned long long timeStampAtStartPosition = GetCurrentTimeMicroseconds();

// The rectangle's current velocity, in pixels-per-second
int speedInPixelsPerSecond = 10;

// Given any clock-value (in microseconds), returns the expected position of the rectangle at that time
int GetXAtTime(unsigned long long currentTimeInMicroseconds)
{
    const long long timeSinceStartMicroseconds = currentTimeInMicroseconds - timeStampAtStartPosition;
    return startPositionX + ((speedInPixelsPerSecond*timeSinceStartMicroseconds)/1000000);
}

void PaintScene()
{
    const int rectX = GetXAtTime(GetCurrentTimeMicroseconds());
    // code to paint the rectangle at position (rectX) goes here...
}
Given the above, your program can call PaintScene() as seldom or as often as it wants, and your rectangle's on-screen speed will not change (although the animation will look more or less smooth, depending on how often you call it).
Then if you want the rectangle to change its direction of motion, you can just do something like this:
const unsigned long long now = GetCurrentTimeMicroseconds();
startPositionX = GetXAtTime(now);
speedInPixelsPerSecond = -speedInPixelsPerSecond; // reverse course!
The above example uses a simple y=mx+b-style equation that provides linear motion, but you can get many different types of motion by using different parametric equations that take a time-value argument and return a corresponding position-value.
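For instance, here is a minimal sketch of a non-linear (smoothstep-eased) position function in the same style; the from/to positions and duration are made-up example values, and it reuses the timeStampAtStartPosition variable from above:
// Sketch: eased motion between two fixed positions over a fixed duration,
// driven purely by the clock. All concrete numbers are illustrative.
int GetEasedXAtTime(unsigned long long currentTimeInMicroseconds)
{
    const int fromX = 0, toX = -90;                    // example start/end positions
    const unsigned long long durationMicros = 500000;  // example: a half-second slide

    unsigned long long elapsed = currentTimeInMicroseconds - timeStampAtStartPosition;
    if (elapsed > durationMicros) elapsed = durationMicros;

    // Normalized time t in [0,1], then smoothstep easing: 3t^2 - 2t^3
    const double t = (double)elapsed / (double)durationMicros;
    const double eased = t * t * (3.0 - 2.0 * t);

    return fromX + (int)((toX - fromX) * eased);
}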

c++ calculate FPS from hooking a function that is called each frame

OK, so I am making this little 'program' and would like to be able to calculate FPS. I had an idea that if I hook a function that is called each frame, I could possibly calculate the FPS?
Here's a complete fail; now that I look at this code again I see how naive I was to think this would work:
int FPS = 0;

void myHook()
{
    if (FPS < 60) FPS++;
    else FPS = 0;
}
Obviously this is an idiotic attempt, and I'm not sure why I even thought it might work in the first place...
But yeah, IS it possible to calculate FPS by hooking a function that is called each frame?
I sat down and thought about possible ways to do this but just couldn't come up with anything. Any info would be helpful, thanks for reading :)
This should do the trick:
int fps = 0;
int lastKnownFps = 0;

void myHook(){ //CALL THIS FUNCTION EVERY TIME A FRAME IS RENDERED
    fps++;
}

void fpsUpdater(){ //CALL THIS FUNCTION EVERY SECOND
    lastKnownFps = fps;
    fps = 0;
}

int getFps(){ //CALL THIS FUNCTION TO GET FPS
    return lastKnownFps;
}
You can call your hook function to do the FPS calculation, but before being able to do that you should:
1. Keep track of frames by incrementing a counter each time a redraw is performed.
2. Keep track of how much time has passed since the last update (get the current time in your hook function).
3. Calculate frames / time.
Use a high resolution timer. Use a reasonable update rate (1/4 sec or the like).
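For illustration, a minimal sketch of that idea using std::chrono (all names here are made up, not from the original post):
#include <chrono>

// Count frames and divide by the elapsed time, refreshing the reported value
// roughly every quarter second.
double g_reportedFps = 0.0;

void OnFrameRendered() // call this once per rendered frame (e.g. from the hook)
{
    using Clock = std::chrono::steady_clock;
    static Clock::time_point windowStart = Clock::now();
    static int framesInWindow = 0;

    ++framesInWindow;

    const double elapsed = std::chrono::duration<double>(Clock::now() - windowStart).count();
    if (elapsed >= 0.25) // update rate: 1/4 second
    {
        g_reportedFps = framesInWindow / elapsed;
        framesInWindow = 0;
        windowStart = Clock::now();
    }
}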
You can find the time difference between successive frames. The inverse of this time will give you the frame rate. You need to implement a function getTime_ms() which returns the current time in ms.
unsigned int prevTime_ms = 0;
unsigned char firstFrame = 1;
int FPS = 0;

void myHook()
{
    unsigned int timeDiff_ms = 0;
    unsigned int currTime_ms = getTime_ms(); //Get the current time.

    /* You need at least two frames to find the time difference. */
    if(0 == firstFrame)
    {
        //Find the time difference with respect to previous time.
        if(currTime_ms >= prevTime_ms)
        {
            timeDiff_ms = currTime_ms - prevTime_ms;
        }
        else
        {
            /* Clock wraparound. */
            timeDiff_ms = ((unsigned int) -1) - prevTime_ms;
            timeDiff_ms += (currTime_ms + 1);
        }

        //One frame took timeDiff_ms, so FPS = 1000 ms / timeDiff_ms.
        if(0 < timeDiff_ms) //timeDiff_ms should never be zero, but check anyway.
            FPS = 1000/timeDiff_ms;
    }
    else
    {
        firstFrame = 0;
    }

    //Save current time for next calculation.
    prevTime_ms = currTime_ms;
}
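For reference, one possible getTime_ms() using std::chrono; the answer above leaves the implementation up to you, so this is just a sketch:
#include <chrono>

// Milliseconds since an arbitrary epoch, wrapped into an unsigned int
// (which is why the code above handles clock wraparound).
unsigned int getTime_ms()
{
    using namespace std::chrono;
    const auto sinceEpoch = steady_clock::now().time_since_epoch();
    return (unsigned int)duration_cast<milliseconds>(sinceEpoch).count();
}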

FPS stutter - game running at double FPS specified?

I have managed to create a system that allows game objects to move according to the change in time (rather than the change in number of frames). I am now, for practise's sake, trying to impose a FPS limiter.
The problem I'm having is that double the amount of game loops are occurring than what I expect. For example, if I try to limit the number of frames to 1 FPS, 2 frames will pass in that one second - one very long frame (~1 second) and 1 extremely short frame (~15 milliseconds). This can be shown by outputting the difference in time (in milliseconds) between each game loop - an example output might be 993, 17, 993, 16...
I have tried changing the code such that delta time is calculated before the FPS is limited, but to no avail.
Timer class:
#include "Timer.h"
#include <SDL.h>
Timer::Timer(void)
{
_currentTicks = 0;
_previousTicks = 0;
_deltaTicks = 0;
_numberOfLoops = 0;
}
void Timer::LoopStart()
{
//Store the previous and current number of ticks and calculate the
//difference. If this is the first loop, then no time has passed (thus
//previous ticks == current ticks).
if (_numberOfLoops == 0)
_previousTicks = SDL_GetTicks();
else
_previousTicks = _currentTicks;
_currentTicks = SDL_GetTicks();
//Calculate the difference in time.
_deltaTicks = _currentTicks - _previousTicks;
//Increment the number of loops.
_numberOfLoops++;
}
void Timer::LimitFPS(int targetFps)
{
//If the framerate is too high, compute the amount of delay needed and
//delay.
if (_deltaTicks < (unsigned)(1000 / targetFps))
SDL_Delay((unsigned)(1000 / targetFps) - _deltaTicks);
}
(Part of the) game loop:
//The timer object.
Timer* _timer = new Timer();

//Start the game loop and continue.
while (true)
{
    //Restart the timer.
    _timer->LoopStart();

    //Game logic omitted

    //Limit the frames per second.
    _timer->LimitFPS(1);
}
Note: I am calculating the FPS using a SMA; this code has been omitted.
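(For context, a simple moving average just averages the last N per-frame deltas. The poster's actual SMA code is omitted, so the snippet below is only an illustrative sketch of the general idea, not their implementation.)
#include <cstddef>
#include <deque>

// Illustrative sketch only: smooth the reported FPS with a simple moving
// average of the last N frame durations (in milliseconds).
double SmoothedFps(unsigned int deltaTicks)
{
    static std::deque<unsigned int> samples;
    const std::size_t N = 60;

    samples.push_back(deltaTicks);
    if (samples.size() > N) samples.pop_front();

    double sum = 0.0;
    for (unsigned int s : samples) sum += s;
    const double averageMs = sum / samples.size();
    return averageMs > 0.0 ? 1000.0 / averageMs : 0.0;
}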
The problem is part of how your timer structure is set up.
_deltaTicks = _currentTicks - _previousTicks;
I looked through the code a few times, and based on how you have the timer set up, this seems to be the problem line.
Walking through it: at the start of your code _currentTicks will equal 0 and _previousTicks will equal 0, so for the first loop your _deltaTicks will be 0. This means that regardless of your game logic, LimitFPS will misbehave.
void Timer::LimitFPS(int targetFps)
{
    if (_deltaTicks < (unsigned)(1000 / targetFps))
        SDL_Delay((unsigned)(1000 / targetFps) - _deltaTicks);
}
We just calculated that _deltaTicks was 0 at the start of the loop, thus we wait the full 1000ms regardless of what happens in your game logic.
On the next frame, _previousTicks will equal 0 and _currentTicks will now equal approximately 1000 ms, so _deltaTicks will be equal to 1000.
Again we hit the LimitFPS function but this time with a _deltaTicks of 1000 which makes it so we wait for 0ms.
This behavior will constantly flip like this.
wait 1000ms
wait 0ms
wait 1000ms
wait 0ms
...
Those 0 ms waits essentially double your FPS.
You need to test for time at the start and end of your game logic.
//The timer object.
Timer* _timer = new Timer();

//Start the game loop and continue.
while (true)
{
    //Restart the timer.
    _timer->LoopStart();

    //Game logic omitted

    //Obtain the end time.
    _timer->LoopEnd();

    //Now you can calculate the delta based on your start and end times.
    _timer->CalculateDelta();

    //Limit the frames per second.
    _timer->LimitFPS(1);
}
Hope that helps.
Assuming your game loop is triggered only when SDL_Delay() has expired, wouldn't it be enough to write:
void Timer::LimitFPS(int targetFps)
{
    SDL_Delay((unsigned)(1000 / targetFps));
}
to delay your game loop by the right number of milliseconds for the requested target FPS?

C++ SDL framerate drops when holding key down

I'm trying to make a simple tile-based platformer in C++ and SDL2. My framerate stays at 59-60 fps, but when I start to hold down a key, it loses about 10 fps. This happens even when I don't call update or retrieve the keystates. This is the code inside my game loop:
//keys = (Uint8 *)SDL_GetKeyboardState(NULL);
elapsed = SDL_GetTicks() - current;
current += elapsed;
timeSinceSecond += elapsed;

//update(keys, elapsed / 1000.0);
draw();
frames++;

if (timeSinceSecond >= 1000) {
    timeSinceSecond = 0;
    cout << frames << endl;
    frames = 0;
}

next = SDL_GetTicks();
if (next - current < 1000.0 / framerate) {
    SDL_Delay(1000.0 / framerate - (next - current));
}
Any ideas on why this is happening? Could it be that it's a problem with SDL2? I haven't tried this with SDL 1.2.
SDL_Delay will not work the way you want: it is not precise enough (it has roughly 10 millisecond granularity), so it is impossible to hit an exact frame rate this way. Use vsync instead. Another thing is that printing to stderr/stdout is slow when a console is visible. If you're printing something when a key is pressed, or if pressing the key somehow increases the amount of text being printed, the game will slow down.
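For reference, a sketch of how vsync is commonly requested in SDL2, assuming an already-created SDL_Window*; use whichever path matches how you render:
#include <SDL.h>

// Sketch: two common ways to request vsync in SDL2. Assumes "window" is an
// already-created SDL_Window*.
void EnableVsync(SDL_Window* window)
{
    // 1) If you draw through the SDL_Renderer API, ask for vsync at creation time:
    SDL_Renderer* renderer = SDL_CreateRenderer(
        window, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
    (void)renderer; // use this renderer for all drawing

    // 2) If you instead draw through an OpenGL context (SDL_GL_CreateContext),
    //    request a swap interval of 1 so buffer swaps wait for the display:
    SDL_GL_SetSwapInterval(1);
}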

Start counting from zero on each keypress

I have a program in which I am drawing images on the screen. The draw function here is called once per frame, and inside it I have all my drawing code.
I have written an image sequencer that returns the respective image from an index of images.
void draw()
{
    sequence.getFrameForTime(getCurrentElapsedTime()).draw(0,0); // getCurrentElapsedTime() returns a float and starts counting at application start
}
On key press, I have to start the sequence from the first image [0] and then go on from there. So every time I press a key, it has to start from [0], unlike the code above, which basically uses currentTime % numImages to get the frame (which is not the starting [0] position of the sequence).
I was thinking of writing a timer of my own that is reset every time I press the key, so that the time always starts from 0. But before doing that, I wanted to ask if anybody has better/easier implementation ideas for this.
EDIT
Why didn't I just use a counter?
Because I have framerate adjustments in my ImageSequence as well.
Image getFrameAtPercent(float time)
{
    float totalTime = sequence.size() / frameRate;
    float percent = time / totalTime;
    return setFrameAtPercent(percent);
}

int getFrameIndexAtPercent(float percent)
{
    if (percent < 0.0 || percent > 1.0) percent -= floor(percent);
    return MIN((int)(percent * sequence.size()), sequence.size() - 1);
}
void draw()
{
    sequence.getFrameForTime(counter++).draw(0,0);
}

void OnKeyPress(){ counter = 0; }
Is there a reason this won't suffice?
What you should do is keep a "currentFrame" as a float, increase it each frame, and convert it to an int to index your frame:
void draw()
{
    currentFrame += deltaTime * framesPerSecond; // deltaTime being the time between the current frame and your last frame
    if (currentFrame >= numImages)
        currentFrame -= numImages;
    sequence.getFrameAt((int)currentFrame).draw(0,0);
}

void OnKeyPress() { currentFrame = 0; }
This should gracefully handle machines with different framerates and even changes of framerates on a single machine.
Also, you won't be skipping part of a frame when you wrap around, since the fractional remainder of the subtraction is kept.
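In case the framework doesn't already provide deltaTime, here is a minimal sketch of one way to measure it; the helper name is made up for illustration:
#include <chrono>

// Sketch: measures the time between successive calls to this helper, in seconds.
float GetDeltaTimeSeconds()
{
    using Clock = std::chrono::steady_clock;
    static Clock::time_point last = Clock::now();
    const Clock::time_point now = Clock::now();
    const float dt = std::chrono::duration<float>(now - last).count();
    last = now;
    return dt;
}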