Framerate independent update using SDL?

This is my main loop:
while (!done)
{
    oldTimeMS = timeMS;
    timeMS = SDL_GetTicks();
    frameTimeMS = timeMS - oldTimeMS;
    frameTime = ((float)frameTimeMS / 1000.0f) * FPS; // frames' worth of time that passed
    update();
    render();
}
Where FPS is 60. I can update a variable framerate-independently by doing
var = var + increment * frameTime;
But how can I make it so that update() is only called after a certain interval (in time, not in frames) has passed? I want render() to be called as often as possible, every frame. But I want update() to only be called at a time-based interval. How can I set this up using these variables? Thank you in advance.
Edit:
Wait... wouldn't I want update() to be called as often as possible until the next frame renders? How would I go about doing that, and would it be a better approach than my original idea?

What you are looking for is a "tick-based rendering loop":
http://www.flipcode.com/archives/Main_Loop_with_Fixed_Time_Steps.shtml
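To make that concrete with the variables from the question, here is a minimal sketch of such a tick-based loop (the accumulator pattern is the one the linked article describes; UPDATE_INTERVAL_MS and run_loop are illustrative names, not SDL API):
#include <SDL.h>

void update();
void render();

const Uint32 UPDATE_INTERVAL_MS = 1000 / 60; // fixed update rate, ~16 ms

void run_loop()
{
    bool done = false;
    Uint32 lastMS = SDL_GetTicks();
    Uint32 accumulatorMS = 0;

    while (!done)
    {
        Uint32 nowMS = SDL_GetTicks();
        accumulatorMS += nowMS - lastMS; // bank the real time that passed
        lastMS = nowMS;

        // run zero or more fixed-size updates to catch up with real time
        while (accumulatorMS >= UPDATE_INTERVAL_MS)
        {
            update(); // always advances the simulation by the same interval
            accumulatorMS -= UPDATE_INTERVAL_MS;
        }

        render(); // still called every frame, as often as possible
    }
}
This also addresses the edit: rather than calling update() as often as possible, you run however many fixed-size updates fit into the time the last frame took, which keeps the simulation deterministic while render() runs at full speed.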

Related

How to adjust a game loop frame rate?

I'm currently trying to program my first game in SFML (my first game overall, in fact), but I ran into the problem that I get a stutter about once a second. During such a stutter the frame time is 3 to 4 times higher than normal, which is really noticeable unless I run at really high FPS (300+).
That by itself is no problem (at least at the moment), as performance is not an issue, but when I do that, my movement method really freaks out and moves way slower than it's supposed to.
My movement method:
void Player::Update(float frametime) {
    mMovementSpeedTimefactor = frametime * 60 / 1000.0f;
    setMovementVector(sf::Vector2f(mMovementVector.x * mMovementSpeedTimefactor,
                                   mMovementVector.y * mMovementSpeedTimefactor));
    validateMovement();
    //std::cout << mTestClock->restart().asMilliseconds() << std::endl;
    moveObject();
    this->updateAnimation();
}
frametime is the frame time in milliseconds, and I multiply by 60 because my movement speed is set as a value in pixels per second, not per frame.
The movement speed is 5, so the character should move 5 px per second, whatever FPS (and therefore frame time) I have.
But that gives me really jumpy movement: the stutter frames result in a jump, and on non-stuttering frames the player moves a lot slower than it should.
My main loop is really simple, just
while (mMainWindow->isOpen()) {
    HandleEvents();
    Update();
    Render();
}
while using the built-in frame limiter set to 300 FPS (I tried writing my own, but I get the very same result as long as I use sf::sleep to regulate the FPS so the CPU core isn't running at 100% load).
So yeah, I could just set my standard speed to 1 instead of 5, but setFramerateLimit is not very accurate, so I get some variation in movement speed, which I really don't like.
Does anyone have an idea what I could best do? Maybe I can't see the forest for the trees (I actually have no idea if you say that in English :P), but as this is my first game I have no experience to look back on.
Similar question: Movement Without Framerate Limit C++ SFML.
What you really need is a fixed time step.
Take a look at the SFML Game Development book source code. Here's the interesting snippet from Application.cpp:
const sf::Time Game::TimePerFrame = sf::seconds(1.f / 60.f);
[...]
sf::Clock clock;
sf::Time timeSinceLastUpdate = sf::Time::Zero;
while (mWindow.isOpen())
{
    sf::Time elapsedTime = clock.restart();
    timeSinceLastUpdate += elapsedTime;
    while (timeSinceLastUpdate > TimePerFrame)
    {
        timeSinceLastUpdate -= TimePerFrame;
        processEvents();
        update(TimePerFrame);
    }
    updateStatistics(elapsedTime);
    render();
}
EDIT: If this is not really what you want, see "Fix Your Timestep!", which Laurent Gomila himself linked on the SFML forum.
When frametime is really high or really low, your calculations may not work correctly because of float precision issues. I suggest setting the standard speed to, say, 500, setting mMovementSpeedTimefactor to frametime * 60 / 10.0f, and checking whether the issue still happens.
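An alternative that avoids tiny scale factors altogether is to express speed in pixels per second and convert the frame time to seconds once. A rough sketch, reusing the member names from the question (the speed value is illustrative):
void Player::Update(float frametime)
{
    float dtSeconds = frametime / 1000.0f;   // frametime is in milliseconds
    const float speedPxPerSec = 300.0f;      // illustrative speed
    setMovementVector(sf::Vector2f(mMovementVector.x * speedPxPerSec * dtSeconds,
                                   mMovementVector.y * speedPxPerSec * dtSeconds));
    validateMovement();
    moveObject();
    this->updateAnimation();
}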

Insert a delay in Processing sketch

I am attempting to insert a delay in a Processing sketch. I tried Thread.sleep(), but I guess it will not work because, as in Java, it prevents rendering of the drawings.
Basically, I have to draw a triangle with delays between drawing the three sides.
How do I do that?
Processing programs can read the value of computer’s clock. The current second is read with the second() function, which returns values from 0 to 59. The current minute is read with the minute() function, which also returns values from 0 to 59. - Processing: A Programming Handbook
Other clock-related functions: millis(), day(), month(), year().
Those numbers can be used to trigger events and calculate the passage of time, as in the following Processing sketch quoted from the aforementioned book:
// Uses millis() to start a line in motion three seconds
// after the program starts
int x = 0;

void setup() {
  size(100, 100);
}

void draw() {
  if (millis() > 3000) {
    x++;
    line(x, 0, x, 100);
  }
}
Here's an example of a triangle whose sides are drawn one after another, three seconds apart (the triangle is reset every minute):
int i = second();

void draw() {
  background(255);
  beginShape();
  if (second() - i >= 3) {
    vertex(50, 0);
    vertex(99, 99);
  }
  if (second() - i >= 6) vertex(0, 99);
  if (second() - i >= 9) vertex(50, 0);
  endShape();
}
As @user2468700 suggests, use a time-keeping function. I like millis().
If you keep one value that tracks the time of the last update (updated manually) and compare it against the current time (continuously updated), you can check whether the manually updated timer has fallen behind the continuous one by more than a delay/wait value. If it has, update your data (the number of points to draw in this case) and finally the local stop-watch-like value.
Here's a basic commented example.
Rendering is separated from data updates to make it easier to understand.
//render related
PVector[] points = new PVector[]{new PVector(10, 10), //a list of points
                                 new PVector(90, 10),
                                 new PVector(90, 90)};
int pointsToDraw = 0; //the number of points to draw on the screen
//time keeping related
int now;              //keeps track of time only when we update, not continuously
int wait = 1000;      //a delay value to check against

void setup() {
  now = millis(); //update the 'stop-watch'
}

void draw() {
  //update
  if (millis() - now >= wait) { //if the time since the last 'stop-watch' update exceeds the wait time
    if (pointsToDraw < points.length) pointsToDraw++; //if there are points left to reveal, increment
    now = millis(); //update the 'stop-watch'
  }
  //render
  background(255);
  beginShape();
  for (int i = 0; i < pointsToDraw; i++) {
    vertex(points[i].x, points[i].y);
  }
  endShape(CLOSE);
}

How should I use glutPostRedisplay in a loop to call display multiple times when an event occurs?

I am new to OpenGL.
I want to draw my scene a few times when the user presses some key. I have called glutPostRedisplay in a for loop when the key is pressed, but it just redraws my scene one time. How should I handle this problem?
First off, glutPostRedisplay is a function that belongs to GLUT, which is not part of OpenGL but a third-party library/framework. There are other frameworks, or you can do everything from scratch (heck, I just remembered that when I began learning OpenGL some 14 years ago, GLUT wouldn't work properly for me, so I wrote my own framework from scratch).
I want to draw my scene a few times when the user presses some key
Never mix drawing and animation logic with event processing. If the user pressed a key that triggers some animation, set a flag (= some variable) indicating that the animation should be played, and then iterate through the animation in your render/animation loop.
I have called glutPostRedisplay in a for loop when the key is pressed but it just redraws my scene one time
glutPostRedisplay won't trigger a redraw immediately. It sets a flag so that the GLUT message loop issues a redraw instead of going idle after all message processing. Of course this flag doesn't accumulate.
So here's a layout using GLUT. Unfortunately GLUT is suboptimal for this kind of thing, because it doesn't give you control over the event loop, which makes precise timing cumbersome to achieve.
time_t rendertimer;
float stopwatch(time_t&); // some external helper function that reports the time between calls

typedef enum {stop, play} animstate;

struct animation {
    float time;
    float duration;
    animstate state;
};

animation animations[...];

void keyboard(key, x, y)
{
    if(key == ...) {
        animations[0].time = 0;
        animations[0].state = play;
    }
    glutPostRedisplay();
}

void idle()
{
    float deltaT = stopwatch(rendertimer);
    if( animations[...].state == play ) {
        animations[...].time += deltaT;
        if( animations[...].duration <= animations[...].time ) {
            animations[...].state = stop;
        }
    }
    glutPostRedisplay();
}

void display()
{
    draw_objects_according_to_animation();
}

Simulated time in a game loop using C++

I am building a 3D game from scratch in C++, using OpenGL and SDL on Linux, as a hobby and to learn more about this area of programming.
I'm wondering about the best way to simulate time while the game is running. Obviously I have a loop that looks something like:
void main_loop()
{
    while(!quit)
    {
        handle_events();
        DrawScene();
        ...
        SDL_Delay(time_left());
    }
}
I am using SDL_Delay and time_left() to maintain a framerate of about 33 fps.
I had thought that I would just need a few global variables like:
int current_hour = 0;
int current_mins = 0;
int num_days = 0;
Uint32 prev_ticks = 0;
Then a function like:
void handle_time()
{
    Uint32 current_ticks;
    Uint32 dticks;

    current_ticks = SDL_GetTicks();
    dticks = current_ticks - prev_ticks; // get difference since last time

    // if difference is greater than 30000 (half minute) increment game mins
    if(dticks >= 30000) {
        prev_ticks = current_ticks;
        current_mins++;
        if(current_mins >= 60) {
            current_mins = 0;
            current_hour++;
        }
        if(current_hour > 23) {
            current_hour = 0;
            num_days++;
        }
    }
}
and then call the handle_time() function in the main loop.
It compiles and runs (using printf to write the time to the console at the moment), but I am wondering if this is the best way to do it. Are there easier or more efficient ways?
I've mentioned this before in other game-related threads. As always, follow the suggestions by Glenn Fiedler in his Game Physics series.
What you want to do is use a constant timestep, which you get by accumulating time deltas. If you want 33 updates per second, then your constant timestep should be 1/33 of a second. You could also call this the update frequency. You should also decouple the game logic from the rendering, as they don't belong together. You want to be able to use a low update frequency while rendering as fast as the machine allows. Here is some sample code:
running = true;
unsigned int t = 0, t_accum = 0, lt = 0, ct = 0;
const unsigned int timestep = 1000 / 33; /* ~30 ms per update, 33 updates per second */
while(running){
    while(SDL_PollEvent(&event)){
        switch(event.type){
            ...
        }
    }
    ct = SDL_GetTicks();
    t_accum += ct - lt;
    lt = ct;
    while(t_accum >= timestep){
        t += timestep; /* this is our actual time, in milliseconds. */
        t_accum -= timestep;
        for(std::vector<Entity>::iterator en = entities.begin(); en != entities.end(); ++en){
            integrate(en, (float)t * 0.001f, timestep);
        }
    }
    /* This should really be in a separate thread, synchronized with a mutex */
    std::vector<Entity> tmpEntities(entities.size());
    for(int i = 0; i < entities.size(); ++i){
        float alpha = (float)t_accum / (float)timestep;
        tmpEntities[i] = interpolateState(entities[i].lastState, alpha, entities[i].currentState, 1.0f - alpha);
    }
    Render(tmpEntities);
}
This handles undersampling as well as oversampling. If you use integer arithmetic as done here, your game physics should be close to 100% deterministic, no matter how slow or fast the machine is. This is the advantage of advancing time in fixed intervals. The state used for rendering is calculated by interpolating between the previous and current states, where the leftover value inside the time accumulator is used as the interpolation factor. This ensures that the rendering is smooth, no matter how large the timestep is.
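interpolateState itself is not shown in the snippet; a possible implementation matching that call, assuming a simple State holding a position (the field names here are made up), is:
struct State { float x, y; };

// weighted blend of two states; the caller passes weights that sum to 1
State interpolateState(const State &a, float wa, const State &b, float wb)
{
    State s;
    s.x = a.x * wa + b.x * wb;
    s.y = a.y * wa + b.y * wb;
    return s;
}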
Other than the issues already pointed out (you should use a structure for the times and pass it to handle_time(), and your minute will get incremented every half minute), your solution is fine for keeping track of the time running in the game.
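As a sketch of that structure suggestion (and of the one-minute fix), with illustrative names:
struct GameTime {
    int mins;
    int hour;
    int days;
    Uint32 prev_ticks;
};

void handle_time(GameTime &gt)
{
    Uint32 now = SDL_GetTicks();
    if (now - gt.prev_ticks >= 60000) { // 60000 ms = one real minute per game minute
        gt.prev_ticks = now;
        if (++gt.mins >= 60) { gt.mins = 0; gt.hour++; }
        if (gt.hour > 23)    { gt.hour = 0; gt.days++; }
    }
}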
However, for most game events that need to happen every so often, you should probably base them on the main game loop instead of actual time, so they will happen in the same proportions at a different fps.
One of Glenn's posts you will really want to read is Fix Your Timestep!. After looking up this link I noticed that Mads directed you to the same general place in his answer.
I am not a Linux developer, but you might want to have a look at using timers instead of polling for ticks.
http://linux.die.net/man/2/timer_create
EDIT:
SDL seems to support timers: SDL_SetTimer
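For reference, a small sketch using SDL_AddTimer, the more general sibling of SDL_SetTimer. The callback fires on a separate thread, so it should only push an event for the main loop to handle (the 30000 ms interval mirrors the question; the names are illustrative):
#include <SDL.h>
#include <string.h>

Uint32 tick_callback(Uint32 interval, void *param)
{
    SDL_Event ev;
    memset(&ev, 0, sizeof ev);
    ev.type = SDL_USEREVENT;
    SDL_PushEvent(&ev); // let the main loop advance the game clock
    return interval;    // keep firing at the same interval
}

// during setup, after SDL_Init(SDL_INIT_TIMER | ...):
// SDL_TimerID id = SDL_AddTimer(30000, tick_callback, NULL);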

How to programmatically move a window slowly, as if the user were doing it?

I am aware of the MoveWindow() and SetWindowPos() functions, and I know how to use them correctly. However, what I am trying to accomplish is to move a window slowly and smoothly, as if a user were dragging it.
I have yet to get this to work correctly. What I tried was getting the current coordinates with GetWindowRect() and then using the SetWindowPos and MoveWindow functions, incrementing Right by 10 pixels each call.
Any ideas?
Here is what I have, besides all my definitions:
while(1)
{
    GetWindowRect(notepad, &window);
    Sleep(1000);
    SetWindowPos(notepad,
                 HWND_TOPMOST,
                 window.top - 10,
                 window.right,
                 400,
                 400,
                 TRUE);
}
If you want smooth animation, you'll need to make it time-based, and allow Windows to process messages in between movements. Set a timer, and respond to WM_TIMER notifications by moving the window a distance based on the elapsed time since your animation started. For natural-looking movement, don't use a linear function for determining the distance - instead, try something like Robert Harvey's suggested function.
Pseudocode:
//
// animate as a function of time - could use something else, but time is nice.
lengthInMS = 10*1000; // ten second animation length

StartAnimation(desiredPos)
{
    originalPos = GetWindowPos();
    startTime = GetTickCount();
    // omitted: hwnd, ID - you'll call SetTimer differently
    // based on whether or not you have a window of your own
    timerID = SetTimer(30, callback);
}

callback()
{
    elapsed = GetTickCount() - startTime;
    if ( elapsed >= lengthInMS )
    {
        // done - move to destination and stop animation timer.
        MoveWindow(desiredPos);
        KillTimer(timerID);
        return;
    }
    // convert elapsed time into a value between 0 and 1
    pos = elapsed / lengthInMS;
    // use Harvey's function to provide smooth movement between original
    // and desired position
    newPos.x = originalPos.x*(1-SmoothMoveELX(pos))
             + desiredPos.x*SmoothMoveELX(pos);
    newPos.y = originalPos.y*(1-SmoothMoveELX(pos))
             + desiredPos.y*SmoothMoveELX(pos);
    MoveWindow(newPos);
}
I found this code, which should do what you want. It's in C#, but you should be able to adapt it:
Increment a variable between 0 and 1 (let's call it "inc" and make it global) using small increments (.03?) and use the function below to give a smooth motion.
The math goes like this:
currentx = x1*(1 - SmoothMoveELX(inc)) + x2*SmoothMoveELX(inc)
currenty = y1*(1 - SmoothMoveELX(inc)) + y2*SmoothMoveELX(inc)
Code:
public double SmoothMoveELX(double x)
{
    double PI = Atn(1) * 4;
    return (Cos((1 - x) * PI) + 1) / 2;
}
http://www.vbforums.com/showthread.php?t=568889
A naturally-moving window would accelerate as it started moving, and decelerate as it stopped. The speed vs. time graph would look like a bell curve, or maybe the top of a triangle wave. The triangle wave would be easier to implement.
As you move the box, you need to steadily increase the number of pixels you move it by each time through the loop until you reach the halfway point between point A and point B, at which point you steadily decrease it. There is no special math involved; it is just addition and subtraction.
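A rough sketch of that triangle-wave stepping (hwnd, the start position, and the step counts are placeholders):
int x = startX, speed = 0;
const int steps = 40;
for (int i = 0; i < steps; i++) {
    if (i < steps / 2) speed += 2;  // accelerate until halfway
    else               speed -= 2;  // decelerate after halfway
    x += speed;
    MoveWindow(hwnd, x, y, width, height, TRUE);
    Sleep(15); // crude pacing; the timer approach above is smoother
}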
If you are bored enough, you can do loopback VNC and drag the mouse yourself.
Now, as for why you would want to, I don't know.