I am trying to write a simple loop with a fixed delta time used for physics, plus interpolation before rendering the state. I am following the Gaffer On Games tutorial on fixed timesteps, and I have tried to understand it and make it work.
const float timeStep = 0.01f;
float alpha = 1.0f;
float accumulator = 0.0f;
sf::Vector2f spritePosBefore, spritePosAfter;
sf::Clock clock;
while (isOpen()) {
    processInput();
    float deltaTime = clock.restart().asSeconds(); // get elapsed time
    if (deltaTime > 0.25f) { deltaTime = 0.25f; }  // dropped-frame guard
    accumulator += deltaTime;
    while (accumulator >= timeStep) {
        spritePosBefore = sprite.getPosition();
        accumulator -= timeStep;
        sprite.move(velocity * timeStep, 0);
        spritePosAfter = sprite.getPosition();
    }
    if (accumulator > timeStep) { alpha = accumulator / timeStep; } else { alpha = 1.0f; }
    sprite.setPosition(spritePosBefore * (1 - alpha) + spritePosAfter * alpha);
    clear();
    draw(sprite);
    display();
}
Now, everything looks good: I have a fixed timestep for physics, I draw whenever I can after the physics are updated, and I interpolate between the two positions. It should work flawlessly, but I can still see the sprite stuttering, or even moving back by one pixel once in a while. Why does that happen? Is there a problem with my code? I've spent the last two days trying to understand a game loop that would give me smooth motion, but it doesn't seem to work the way I thought it would. Any idea what could be improved?
You should remove the if statement and always calculate alpha; the if statement's body will never execute, because its condition is always false once the while loop has exited.
After the inner loop, the accumulator is between 0 and timeStep, so alpha is always set to 1.0 and you just end up drawing the latest position instead of interpolating.
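Concretely, the end of your outer loop should always interpolate, something like:
alpha = accumulator / timeStep; // always in [0, 1) after the inner loop
sprite.setPosition(spritePosBefore * (1 - alpha) + spritePosAfter * alpha);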
I don't think the way you do it is necessarily wrong, but it looks a bit overcomplicated. I don't understand exactly what you're trying to do, so I'm just going to share the way I implement a "fixed time step" in my SFML applications.
The following is the simplest way and will be "good enough" for most applications. It's not the most precise, though (there can be a small error between the measured time and the real time):
const float FLT_FIXED_TIME_STEP = 1.f / 60.f; // for example
sf::Clock clock;
sf::Event event;
while (window_.isOpen()) {
    while (window_.pollEvent(event)) {}
    if (clock.getElapsedTime().asSeconds() > FLT_FIXED_TIME_STEP) {
        clock.restart();
        update(FLT_FIXED_TIME_STEP);
    }
    render();
}
And if you really need precision, you can add a float variable that acts as a "buffer":
sf::Clock clock;
sf::Event event;
float timeBeforeNextStep = 0.f; // the "buffer"
float timeDilation = 1.f;       // useful if you want to slow down or speed up time (<1 for slow motion, >1 for speed-up)
while (window_.isOpen()) {
    while (window_.pollEvent(event)) {}
    timeBeforeNextStep -= clock.restart().asSeconds() * timeDilation;
    if (timeBeforeNextStep < FLT_FIXED_TIME_STEP) {
        timeBeforeNextStep += FLT_FIXED_TIME_STEP; // '+=', not '=', to make sure we don't lose any time
        update(FLT_FIXED_TIME_STEP);
        // Rendering every time you update is not always the best solution,
        // especially if you have a very small time step.
        render();
    }
}
You might want to use another buffer for rendering (if you want to run at exactly 60 FPS for example).
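For example, a second buffer for rendering could look like this. This is just a sketch using the variables from the snippet above: frameSeconds, timeBeforeNextRender and FLT_RENDER_TIME_STEP are names I'm introducing here, and the elapsed time is read once so both buffers can consume it:
const float FLT_RENDER_TIME_STEP = 1.f / 60.f; // hypothetical render step (~60 FPS)
float timeBeforeNextRender = 0.f;              // second "buffer", for rendering
while (window_.isOpen()) {
    while (window_.pollEvent(event)) {}
    float frameSeconds = clock.restart().asSeconds(); // read the clock once
    timeBeforeNextStep   -= frameSeconds * timeDilation;
    timeBeforeNextRender -= frameSeconds;
    if (timeBeforeNextStep < FLT_FIXED_TIME_STEP) {
        timeBeforeNextStep += FLT_FIXED_TIME_STEP;
        update(FLT_FIXED_TIME_STEP);
    }
    if (timeBeforeNextRender < FLT_RENDER_TIME_STEP) {
        timeBeforeNextRender += FLT_RENDER_TIME_STEP;
        render();
    }
}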
Related
I calculate Newtonian physics based on gravitation in my 2D game. It works exactly how it's supposed to when vsync is turned on (60 fps), but once I turn it off and get about 3.5k fps, the character starts to fall incredibly fast. The answer seems obvious: I just need to multiply the character's velocity by deltaTime. But I already do that, and still no result; it slows the character down a bit, but it seems to be sort of not enough.
This is what the character's update function looks like:
void Update(float deltaTime) {
    if (!onGround) {
        acceleration += -Physics::g; // 9.81f
        /* EDIT: THIS IS WHAT IT ACTUALLY LOOKS LIKE, sorry */
        SetPosition(position + Vec2(0.0f, 1.0f) * deltaTime * acceleration);
        /* instead of this:
        SetPosition(position + Vec2(0.0f, 1.0f) * acceleration); */
        if (ceiling) {
            acceleration = 0;
            ceiling = false;
        }
    } else {
        acceleration = 0;
    }
}
And here's the calculation of deltaTime:
inline static void BeginFrame() {
    currentTime = static_cast<float>(glfwGetTime()); // time in seconds
    delta = (currentTime - lastTime);
    lastTime = currentTime;
}
What am I missing?
Thanks in advance.
Acceleration is the rate at which velocity increases per unit time, so you need to multiply the acceleration by deltaTime as well, not only the velocity.
In other words,
acceleration += -Physics::g; // 9.81f
should be:
acceleration += deltaTime * -Physics::g; // 9.81f
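With that change, acceleration effectively integrates into a velocity, and the whole update becomes a standard semi-implicit Euler step, which is frame-rate independent to first order. A minimal sketch of the corrected function, keeping your names:
void Update(float deltaTime) {
    if (!onGround) {
        // v += dt * a: 'acceleration' now accumulates g per second, not per frame,
        // so it behaves like a frame-rate-independent vertical velocity.
        acceleration += deltaTime * -Physics::g;
        // x += dt * v
        SetPosition(position + Vec2(0.0f, 1.0f) * deltaTime * acceleration);
        if (ceiling) {
            acceleration = 0;
            ceiling = false;
        }
    } else {
        acceleration = 0;
    }
}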
I am using SFML to make a 2D platformer. I have read so many timestep articles, but they don't work well for me. I am implementing it with something like a 2500 FPS timestep; on my desktop PC it's amazingly smooth, while on my laptop it gets about 300 FPS (I checked with Fraps). It's not as smooth on the laptop, but still playable.
Here are the code snippets:
sf::Clock clock;
const sf::Time TimePerFrame = sf::seconds(1.f/2500.f);
sf::Time TimeSinceLastUpdate = sf::Time::Zero;
sf::Time elapsedTime;
Those are the variables, and here is the game loop:
while (!quit) {
    elapsedTime = clock.restart();
    TimeSinceLastUpdate += elapsedTime;
    while (TimeSinceLastUpdate > TimePerFrame) {
        TimeSinceLastUpdate -= TimePerFrame;
        Player::instance()->handleAll();
    }
    Player::instance()->render();
}
In Player.h, I've got these movement constants:
const float GRAVITY       = 0.35f  / 2500.0f; // uses += every frame
const float JUMP_SPEED    = -400.0f / 2500.0f; // SPACE -> movementSpeed.y = JUMP_SPEED;
// when the character is touching the ground:
const float LAND_ACCEL    = 0.075f / 2500.0f; // these use +=
const float LAND_DECEL    = 1.5f   / 2500.0f;
const float LAND_FRICTION = 0.5f   / 2500.0f;
const float LAND_STARTING_SPEED = 0.075f; // this uses =, instead of +=
In the handleAll function of the Player class, there is:
cImage.move(movementSpeed);
checkCollision();
And lastly, the checkCollision function. It simply checks whether the character's master bounding box intersects the object's rectangle on each side, sets the x or y speed to 0, then fixes the overlap by setting the character's position to the edge.
// Collision
if (masterBB().intersects(objectsIntersecting[i]->GetAABB())) {
    // HORIZONTAL
    if (leftBB().intersects(objectsIntersecting[i]->GetAABB())) {
        if (movementSpeed.x < 0)
            movementSpeed.x = 0;
        cImage.setPosition(objectsIntersecting[i]->GetAABB().left + objectsIntersecting[i]->GetAABB().width + leftBB().width, cImage.getPosition().y);
    }
    else if (rightBB().intersects(objectsIntersecting[i]->GetAABB())) {
        if (movementSpeed.x > 0)
            movementSpeed.x = 0;
        cImage.setPosition(objectsIntersecting[i]->GetAABB().left - rightBB().width, cImage.getPosition().y);
    }
    // VERTICAL
    if (movementSpeed.y < 0 && topBB().intersects(objectsIntersecting[i]->GetAABB())) {
        movementSpeed.y = 0;
        cImage.setPosition(cImage.getPosition().x, objectsIntersecting[i]->GetAABB().top + objectsIntersecting[i]->GetAABB().height + masterBB().height / 2);
    }
    if (movementSpeed.y > 0 && bottomBB().intersects(objectsIntersecting[i]->GetAABB())) {
        movementSpeed.y = 0;
        cImage.setPosition(cImage.getPosition().x, objectsIntersecting[i]->GetAABB().top - masterBB().height / 2);
        // and some state updates
    }
}
I have tried to use a 60 FPS timestep like a million times, but then all the speed variables become very slow. I can't simply apply *2500.0f / 60.0f to all the constants; it doesn't feel the same. If I pick similar constants by hand it feels "ok", but then when a collision happens, the character's position gets reset constantly and it flies out of the map, I guess because of the large overlap with the object caused by the high speed constants being applied every frame.
I should add that, normally, the book I took the timestep code from uses
cImage.move(movementSpeed * TimePerFrame.asSeconds());
but as you saw, I just applied /2500.0f to every constant and don't use that.
So, is 1/2500 seconds per frame good? If not, how can I change all of these constants over to 1/60?
You're doing it wrong.
Your monitor most likely has a refresh rate of 60 Hz (= 60 FPS), so trying to render an image at 2500 FPS is a huge waste of resources. If the only reason for choosing 2500 FPS is that your movement doesn't work the same otherwise, haven't you considered that the problem might be in the movement code instead?
Ideally you'd implement a fixed timestep (the famous "Fix Your Timestep!" article), so your physics can run at whatever rate you want (2500 "FPS" would still be crazy, so don't do that) independently of your rendering rate. Then, even if you get a varying FPS, it won't influence your physics.
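If you do switch to the book's cImage.move(movementSpeed * TimePerFrame.asSeconds()) style, converting the constants is mechanical: express everything in per-second units and scale by the step on use. A sketch under that assumption (TIME_STEP is a name I'm introducing; the values are derived from your 2500 Hz constants):
const float TIME_STEP  = 1.f / 60.f;     // fixed physics step, in seconds
const float GRAVITY    = 0.35f * 2500.f; // pixels/s^2 (old per-frame value * old rate)
const float JUMP_SPEED = -400.f;         // pixels/s   (was -400/2500 per 1/2500 s frame)

// in the update:
movementSpeed.y += GRAVITY * TIME_STEP; // accelerations get one factor of dt
cImage.move(movementSpeed * TIME_STEP); // velocities get another via move()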
So I have been playing around with DirectX 11 lately, and I'm still pretty new at it. I'm trying to move something with a translation, and this is what I've got. I've been reading Frank D. Luna's book on DirectX 11, and he provides a GameTimer class, but I'm really not sure how to use delta time. This is the small snippet of code I was working with. Obviously it won't work as-is, because the time keeps increasing even when I'm not pressing the key; it's the total time.
// Button-down event.
if (GetAsyncKeyState('W') & 0x8000)
{
    XMMATRIX carTranslate;
    // Every quarter second, increment it.
    static float t_base = 0.0f;
    if ((mTimer.TotalTime() - t_base) >= 0.25f)
        t_base += 0.25f;
    carPos.x = mTimer.TotalTime();
    carPos.y = 1.0f;
    carPos.z = 0.0f;
    carTranslate = XMMatrixTranslation(carPos.x, carPos.y, carPos.z);
    XMStoreFloat4x4(&mCarWorld, XMMatrixMultiply(carScale, carTranslate));
}
Usually we render frames (redraw the screen) continuously in a while loop (the so-called "main loop"). To "move" an object, we just draw it in a different position than in the previous frame.
To move objects consistently, you need to know the time between frames, the "delta time" (dt). Between frames, time increases by dt. Given the velocity (speed) of an object, v, we can calculate its displacement as dx = dt * v. Then, to get the current position, we just add dx to the previous position: x += dx.
Note that it is a bad idea to calculate the delta inside your update or rendering code. To avoid spreading this functionality around, we usually localize the calculation in a timer/clock class.
Here is a simplified example:
// somewhere in the timer class
// `Time` and `Duration` are some time units
class Timer {
    Time m_previousTime;
    Duration m_delta;
public:
    Duration getDelta() const { return m_delta; }
    void tick() {
        Time now = currentTime();
        m_delta = now - m_previousTime; // just subtract
        m_previousTime = now;           // `current` becomes `previous` for the next frame
    }
};

// main loop
Timer m_Timer;
while (rendering) {
    m_Timer.tick();
    Frame(m_Timer.getDelta());
}

void Frame(Duration dt) {
    if (keyPressed) {
        object.position += dt * object.velocity;
    }
}
You can now even make your object move with acceleration (throttle, gravity, etc.):
object.velocity += dt * object.acceleration;
object.position += dt * object.velocity;
Hope you got the idea!
Happy coding!
This is driving me mad. Anyway, usual story: I'm attempting to guarantee the same speed in my very simple game across any Windows machine that runs it. I'm doing it by specifying a 1/60 value, then ensuring a frame can't run until that much time has passed since the last one. The problem I'm having is that 1/60 equates to 30 Hz for some reason; I have to set it to 1/120 to get 60 Hz. It's also not bang on 60 Hz; it's a little faster.
If I paste it out here, could someone tell me if they see anything wrong? Or maybe a more precise way to do it?
float controlFrameRate = 1.f / 60;
while (gameIsRunning)
{
    frameTime += (system->getElapsedTime() - lastTime);
    lastTime = system->getElapsedTime();
    if (frameTime > controlFrameRate)
    {
        gameIsRunning = system->update();
        // do stuff with the game
        frameTime = .0f;
    }
}
Don't call getElapsedTime twice; there may be a slight difference between the two calls. Instead, store its value and reuse it. Also, instead of setting frameTime to 0, subtract controlFrameRate from it; that way, if one frame takes a little longer, the next one will make up for it by being a little shorter.
while (gameIsRunning)
{
    float elapsedTime = system->getElapsedTime();
    frameTime += (elapsedTime - lastTime);
    lastTime = elapsedTime;
    if (frameTime > controlFrameRate)
    {
        gameIsRunning = system->update();
        // do stuff with the game
        frameTime -= controlFrameRate;
    }
}
I'm not sure about your problem with having to set the rate to 1/120; what timing API are you using here?
I have the following timer class:
class Timer
{
private:
    unsigned int curr, prev;
    float factor;
    float delta;
public:
    Timer(float FrameLockFactor)
    {
        factor = FrameLockFactor;
    }
    ~Timer()
    {
    }
    void Update()
    {
        curr = SDL_GetTicks();
        delta = (curr - prev) * (1000.f / factor);
        prev = curr;
    }
    float GetDelta()
    {
        return delta;
    }
};
And I use it like this:
// Create a timer and lock at 60 fps
Timer timer(60.0f);
while (running)
{
    float delta;
    float velocity = 4.0f;
    timer.Update();
    delta = timer.GetDelta();
    sprite.SetPosition(sprite.GetVector() + Vector2(0.0, velocity * delta));
    sprite.Draw();
}
But there is a big problem: my game runs way too slow for a program that is supposed to run at 60 frames per second, and the same test code runs smoothly when not using frame-independent movement, so there must be something wrong with my code.
Any help?
If delta is supposed to be a count of frames, shouldn't it be calculated as
delta = (curr - prev) * (factor / 1000.f);
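That is, assuming SDL_GetTicks() returns milliseconds (which it does, counted since SDL was initialized), (curr - prev) is a duration in ms and factor / 1000.f is frames per ms, so the product is the number of frames that have elapsed. In full, the corrected Update would read something like:
void Update()
{
    curr  = SDL_GetTicks();                    // milliseconds since SDL_Init
    delta = (curr - prev) * (factor / 1000.f); // ms * frames-per-ms = frames elapsed
    prev  = curr;
}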
I don't really get what you are trying to do with your code. In particular, the line delta = (curr - prev) * (1000.f / factor); does not make sense as far as I can see.
If I understand correctly, you are trying to calculate how much time has passed since the last update and translate that into milliseconds per frame. What units are you using?
I don't know what SDL_GetTicks() returns. Is it a number of processor ticks or of real clock ticks? If it returns real clock ticks, the (curr - prev) part will most often be zero, since you would have to do multiple updates per frame for this to work.
And if it does not return real clock ticks, why are you multiplying by 1000.f? Where does this factor come from?
With code like this it is often very important to take care of round-off errors, so I am guessing that your problem lies somewhere in this area, although without additional information I cannot tell what your actual problem may be.