Making a loop with sf::Time - c++

Can I make a loop using the sf::Time class? When I try to do this, my program crashes.
while(time.asSeconds() <= 10.0f)
This is the while loop I used with the sf::Time class.

EDIT:
sf::Clock clock; // Constructing a clock starts counting time immediately.
while (someCondition)
{
    sf::Time time = clock.getElapsedTime();

    // doSomething() keeps executing until more than 10 seconds have elapsed.
    while (time.asSeconds() <= 10.0f)
    {
        // Refresh the measurement so we don't fall into an infinite loop.
        time = clock.getElapsedTime();
        doSomething();
        // Optional: clock.restart(); if you want this code to run every x seconds.
    }
}
First you have to create an sf::Clock variable, then grab the time from the clock into an sf::Time variable.
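For reference, here is a minimal self-contained sketch of the same idea (doSomething() is just a hypothetical placeholder for your own work); the important part is that the elapsed time is re-read from the clock on every iteration instead of being compared against a stale sf::Time value:
#include <SFML/System.hpp>
#include <iostream>

// Hypothetical placeholder for the work done inside the loop.
void doSomething()
{
    std::cout << "working...\n";
}

int main()
{
    sf::Clock clock; // starts counting as soon as it is constructed

    // Re-query the clock every iteration so the condition can eventually fail.
    while (clock.getElapsedTime().asSeconds() <= 10.0f)
    {
        doSomething();
    }

    return 0;
}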

Related

Is my solution to fixed timestep with delta time and interpolation wrong?

I am trying to write a simple loop with a fixed delta time used for physics and interpolation before rendering the state. I am using the Gaffer On Games tutorial on fixed timesteps, and I tried to understand it and make it work.
sf::Clock clock;
float timeStep = 0.01;
float alpha = 1.0;
float deltaTime = 0.f;
float accumulator = 0.f;
while (isOpen()) {
    processInput();
    deltaTime = clock.restart().asSeconds(); // get elapsed time in seconds
    if (deltaTime > 0.25) { deltaTime = 0.25; } // drop frame guard
    accumulator += deltaTime;
    while (accumulator >= timeStep) {
        // spritePosBefore = sprite.getPosition();
        accumulator -= timeStep;
        // sprite.move(velocity * timeStep, 0);
        // spritePosAfter = sprite.getPosition();
    }
    if (accumulator > timeStep) { alpha = accumulator / timeStep; } else { alpha = 1.0; }
    // sprite.setPosition(Vector2f(spritePosBefore * (1 - alpha) + spritePosAfter * alpha));
    clear();
    draw(sprite);
    display();
}
Now, everything looks good. I have a fixed timestep for physics, I draw whenever I can after the physics are updated, and I interpolate between the two positions. It should work flawlessly, but I can still see the sprite stuttering or even going back by one pixel once in a while. Why does it happen? Is there any problem with my code? I spent the last two days trying to understand a game loop that would ensure flawless motion, but it seems it doesn't work as I thought it would. Any idea what could be improved?
You should remove the if statement and always calculate alpha; the branch will never be executed, because its condition is always false after the while loop exits!
After the loop, the accumulator will be between 0 and timeStep, so you just end up drawing the latest position instead of interpolating.
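In code (a sketch reusing the variable names from the question, assuming the commented-out spritePosBefore/spritePosAfter lines are enabled), the end of each frame would simply be:
// accumulator is now in [0, timeStep), so it always gives a valid blend factor.
alpha = accumulator / timeStep;
sprite.setPosition(spritePosBefore * (1.f - alpha) + spritePosAfter * alpha);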
I don't think the way you do it is necessarily wrong, but it looks a bit overcomplicated. I don't understand exactly what you're trying to do, so I'm just going to share the way I implement a "fixed time step" in my SFML applications.
The following is the simplest way and will be "good enough" for most applications. It's not the most precise, though (there can be a little error between the measured time and the real time):
sf::Clock clock;
sf::Event event;
while (window_.isOpen()) {
    while (window_.pollEvent(event)) {}
    if (clock.getElapsedTime().asSeconds() > FLT_FIXED_TIME_STEP) {
        clock.restart();
        update(FLT_FIXED_TIME_STEP);
    }
    render();
}
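Note that FLT_FIXED_TIME_STEP is not an SFML constant; it is presumably a value you define yourself, for example:
const float FLT_FIXED_TIME_STEP = 1.f / 60.f; // 60 updates per second (assumed value)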
And if you really need precision, you can add a float variable that will act as a "buffer":
sf::Clock clock;
sf::Event event;
float timeBeforeNextStep = 0.f; // "buffer"
float timeDilation = 1.f; // Useful if you want to slow or speed up time (<1 for slowmo, >1 for speedup)
while (window_.isOpen()) {
    while (window_.pollEvent(event)) {}
    timeBeforeNextStep -= clock.restart().asSeconds() * timeDilation;
    if (timeBeforeNextStep < FLT_FIXED_TIME_STEP) {
        timeBeforeNextStep += FLT_FIXED_TIME_STEP; // '+=', not '=', to make sure we don't lose any time.
        update(FLT_FIXED_TIME_STEP);
        // Rendering every time you update is not always the best solution, especially if you have a very small time step.
        render();
    }
}
You might want to use another buffer for rendering (if you want to run at exactly 60 FPS for example).
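A sketch of that idea (the second buffer, frameSeconds, and FLT_RENDER_TIME_STEP are assumptions added here, not part of the original answer; timeDilation is omitted for brevity):
sf::Clock clock;
sf::Event event;
float timeBeforeNextStep = 0.f;   // "buffer" for the fixed update
float timeBeforeNextRender = 0.f; // second "buffer" for rendering
const float FLT_RENDER_TIME_STEP = 1.f / 60.f; // assumed 60 FPS render target
while (window_.isOpen()) {
    while (window_.pollEvent(event)) {}
    // Measure the frame time once and feed it to both buffers.
    const float frameSeconds = clock.restart().asSeconds();
    timeBeforeNextStep -= frameSeconds;
    timeBeforeNextRender -= frameSeconds;
    if (timeBeforeNextStep < FLT_FIXED_TIME_STEP) {
        timeBeforeNextStep += FLT_FIXED_TIME_STEP;
        update(FLT_FIXED_TIME_STEP);
    }
    if (timeBeforeNextRender < FLT_RENDER_TIME_STEP) {
        timeBeforeNextRender += FLT_RENDER_TIME_STEP;
        render();
    }
}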

SFML seconds counter

I'm making a game using the SFML library. I want to implement a function that shows on screen the number of seconds since the program started, and keeps increasing it until the window is closed. I tried this:
sf::Clock clock;
while (window.isOpen())
{
    sf::Time elapsed = clock.restart();
    updateGame(elapsed);
}
But I have no idea how it works or even whether it's the right function.
Here is my code so far https://github.com/basmaashouur/GamesLib/blob/master/cards/main.cpp
There are multiple ways to get the number of seconds.
First of all, you can use a dedicated sf::Clock for this that is never reset:
sf::Clock clock;
const unsigned int seconds = static_cast<unsigned int>(clock.getElapsedTime().asSeconds());
As an alternative, you can use an sf::Time to accumulate the time between frames (e.g. inside your updateGame() function):
sf::Clock clock;
sf::Time time;
time += clock.restart();
const unsigned int seconds = static_cast<unsigned int>(time.asSeconds());
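If the goal is to actually draw that counter on screen, a minimal sketch could look like the following (the window size, font file, and text setup are assumptions, not part of the original answer):
#include <SFML/Graphics.hpp>
#include <string>

int main()
{
    sf::RenderWindow window(sf::VideoMode(640, 480), "Seconds counter");

    sf::Font font;
    font.loadFromFile("arial.ttf"); // hypothetical font file

    sf::Text secondsText("", font, 24);

    sf::Clock clock; // never reset, so it measures time since the program started

    while (window.isOpen())
    {
        sf::Event event;
        while (window.pollEvent(event))
        {
            if (event.type == sf::Event::Closed)
                window.close();
        }

        const unsigned int seconds =
            static_cast<unsigned int>(clock.getElapsedTime().asSeconds());
        secondsText.setString(std::to_string(seconds));

        window.clear();
        window.draw(secondsText);
        window.display();
    }

    return 0;
}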

SFML event loop performance issues

int main()
{
    RenderWindow window(VideoMode(1280, 720), "Game");
    window.setActive(false);
    std::thread gameThread(game, &window);
    while (window.isOpen())
    {
        Event event;
        while (window.pollEvent(event))
        {
            if (event.type == Event::Closed)
                window.close();
        }
    }
    gameThread.join();
    return 0;
}
I'm making a basic game loop, and I noticed that every second or so the frametime will spike, sometimes up to 50 ms for a frame or two. I realized that when I comment out while (window.pollEvent(event)), the stuttering stops. Could this be a bug with SFML, or am I just doing something wrong?
Platform - Windows 10
Compiler - MSVC 2015
SFML version - 2.4.0
I got the frametime measurements by writing
Time elapsedTime = clock.getElapsedTime();
Int64 frameTime = elapsedTime.asMicroseconds() - oldTime.asMicroseconds();
fps.setString(std::to_string(frameTime));
oldTime = elapsedTime;
into the loop
EDIT - Now the issue persists no matter whether the event loop is commented out or not. Which is strange, because that definitely wasn't the case yesterday.

Sprite jumps every few seconds when moving with time SFML C++

This code moves a sprite across the screen relative to time. However, the sprite appears to jump to the left every couple of seconds.
int ogreMaxCell = 9;
if (SpriteVector[i].getPosition().x > ogreDirectionX[i])
{
    sf::Vector2f ogreDirection = sf::Vector2f(-1, 0);
    float ogreSpeed = 1;
    sf::Vector2f ogreVelocity = ogreDirection * ogreSpeed * 250000.0f * dt.asSeconds();
    this->SpriteVector[i].move(ogreVelocity);
    // gets the spritesheet row
    orcSource.y = getCellYOrc(orcLeft);
}
if (ogreClock.getElapsedTime().asMilliseconds() > 250)
{
    orcxcell = (orcxcell + 1) % ogreMaxCell;
    ogreClock.restart();
}
SpriteVector[i].setTextureRect(sf::IntRect(orcSource.x + (orcxcell * 80), orcSource.y, 80, 80));
The time handling code is:
sf::Time dt; // delta time
sf::Time elapsedTime;
sf::Clock clock;
elapsedTime += dt;
dt = clock.restart();
Any insight as to why this is happening?
Regards
You didn't show how you implemented your time code, so there are two possibilities:
1. You declared the time variables outside of the loop. In that case the result is movement with some degree of variation, but judging from the if structure, the error most likely lies in possibility 2: 250000.0f is an insanely huge number to have to use when dealing with offsets, and the use of ogreClock tells me #2 is more likely.
2. Both the variable declarations and the time statements are inside the loop.
I threw that code into a compiler and had cout print both values as microseconds. The output is that elapsedTime is always 0, and dt is usually around 0-4 microseconds, except every so often, for whatever reason, it equals roughly 400-2000 microseconds.
The effect of this is that it forced you to use a second clock to control your animation so it didn't glitch, and your sprite will jump to the left every so often, because dt goes from being 4 microseconds to 1500 microseconds at random. It also explains why you have to multiply by such a huge constant: since you are working in milliseconds, you keep getting infinitesimally small values for dt.
There are a few problems in the time code.
dt = clock.restart(); is never exactly 0:
you will always get some small time value, because of the time it takes to reset the clock to 0 and to hand the clock's value to the sf::Time variable.
When the animation jumps, it's because in that particular cycle it took the computer a little longer to assign the value after the clock reset.
The fix is pretty simple: declare the variables outside the loop structure and adjust the code like this:
//declare before the loop; if you don't, elapsedTime constantly gets reset to 0
sf::Time dt; // delta time
sf::Time elapsedTime;
sf::Clock clock;
//start loop structure of your choice
elapsedTime += clock.getElapsedTime();
dt = clock.getElapsedTime();
clock.restart();
and modify the second if statement to:
if (elapsedTime.asMilliseconds() > 250)
{
    orcxcell = (orcxcell + 1) % ogreMaxCell;
    elapsedTime = sf::milliseconds(0);
}
sf::Time is just a variable, the clock has to do the counting.
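A quick illustration of that point (this snippet is just for demonstration and is not from the original answer):
#include <SFML/System.hpp>
#include <iostream>

int main()
{
    sf::Clock clock;
    sf::Time snapshot = clock.getElapsedTime(); // a frozen copy of the elapsed time

    sf::sleep(sf::seconds(1.f));

    // The snapshot hasn't changed, but the clock kept counting.
    std::cout << snapshot.asSeconds() << '\n';               // still ~0
    std::cout << clock.getElapsedTime().asSeconds() << '\n'; // ~1
    return 0;
}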
Hope this helps.
P.S. Always write declarations outside of your loop structures. It works okay most of the time, but from time to time it will cause strange errors like this one, or random crashes.

FPS stutter - game running at double FPS specified?

I have managed to create a system that allows game objects to move according to the change in time (rather than the change in the number of frames). I am now, for practice's sake, trying to impose an FPS limiter.
The problem I'm having is that double the number of game loops occur than what I expect. For example, if I try to limit the number of frames to 1 FPS, 2 frames will pass in that one second: one very long frame (~1 second) and one extremely short frame (~15 milliseconds). This can be shown by outputting the difference in time (in milliseconds) between each game loop; an example output might be 993, 17, 993, 16...
I have tried changing the code such that delta time is calculated before the FPS is limited, but to no avail.
Timer class:
#include "Timer.h"
#include <SDL.h>
Timer::Timer(void)
{
_currentTicks = 0;
_previousTicks = 0;
_deltaTicks = 0;
_numberOfLoops = 0;
}
void Timer::LoopStart()
{
//Store the previous and current number of ticks and calculate the
//difference. If this is the first loop, then no time has passed (thus
//previous ticks == current ticks).
if (_numberOfLoops == 0)
_previousTicks = SDL_GetTicks();
else
_previousTicks = _currentTicks;
_currentTicks = SDL_GetTicks();
//Calculate the difference in time.
_deltaTicks = _currentTicks - _previousTicks;
//Increment the number of loops.
_numberOfLoops++;
}
void Timer::LimitFPS(int targetFps)
{
//If the framerate is too high, compute the amount of delay needed and
//delay.
if (_deltaTicks < (unsigned)(1000 / targetFps))
SDL_Delay((unsigned)(1000 / targetFps) - _deltaTicks);
}
(Part of the) game loop:
//The timer object.
Timer* _timer = new Timer();

//Start the game loop and continue.
while (true)
{
    //Restart the timer.
    _timer->LoopStart();

    //Game logic omitted

    //Limit the frames per second.
    _timer->LimitFPS(1);
}
Note: I am calculating the FPS using a SMA; this code has been omitted.
The problem is part of how your timer structure is set up.
_deltaTicks = _currentTicks - _previousTicks;
I looked through the code a few times, and based on how you have the timer set up, this seems to be the problem line.
Walking through it: at the start of your code, _currentTicks will equal 0 and _previousTicks will equal 0, so for the first loop your _deltaTicks will be 0. This means that regardless of your game logic, LimitFPS will misbehave.
void Timer::LimitFPS(int targetFps)
{
    if (_deltaTicks < (unsigned)(1000 / targetFps))
        SDL_Delay((unsigned)(1000 / targetFps) - _deltaTicks);
}
We just calculated that _deltaTicks was 0 at the start of the loop, so we wait the full 1000 ms regardless of what happens in your game logic.
On the next frame, _previousTicks will equal 0 and _currentTicks will now equal approximately 1000 ms. This means _deltaTicks will be equal to 1000.
Again we hit the LimitFPS function, but this time with a _deltaTicks of 1000, which makes it so we wait for 0 ms.
This behavior will constantly flip like this:
wait 1000ms
wait 0ms
wait 1000ms
wait 0ms
...
Those 0 ms waits essentially double your FPS.
You need to test for time at the start and end of your game logic.
//The timer object.
Timer* _timer = new Timer();

//Start the game loop and continue.
while (true)
{
    //Restart the timer.
    _timer->LoopStart();

    //Game logic omitted

    //Obtain the end time.
    _timer->LoopEnd();

    //Now you can calculate the delta based on your start and end times.
    _timer->CalculateDelta();

    //Limit the frames per second.
    _timer->LimitFPS(1);
}
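LoopEnd() and CalculateDelta() aren't defined in the Timer class shown above; one hypothetical way to implement them, keeping the existing LoopStart() unchanged, might be:
void Timer::LoopEnd()
{
    //_currentTicks holds the time at which LoopStart() was called, so keep it
    //in _previousTicks and grab a fresh reading now that the game logic is done.
    _previousTicks = _currentTicks;
    _currentTicks = SDL_GetTicks();
}

void Timer::CalculateDelta()
{
    //Time spent in the game logic between LoopStart() and LoopEnd();
    //LimitFPS() then sleeps for the remainder of the frame budget.
    _deltaTicks = _currentTicks - _previousTicks;
}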
Hope that helps.
Assuming your game loop is only triggered when SDL_Delay() expires, wouldn't it be enough to write:
void Timer::LimitFPS(int targetFps)
{
    SDL_Delay((unsigned)(1000 / targetFps));
}
to delay your game loop by the right number of milliseconds depending on the requested target FPS?