SDL Lag / Super slow (SDL 2.0.14) - C++

I have a game loop that measures the milliseconds each iteration takes.
Before entering the loop I call an "Init" function, which looks like this:
void screen::Init()
{
    if (SDL_Init(SDL_INIT_EVERYTHING) == 0)
    {
        // Note: the last argument of SDL_CreateWindow/SDL_CreateRenderer is a
        // Uint32 flags value, so the "no flags" value is 0, not NULL.
        m_window = SDL_CreateWindow("TITLE", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, WIDTH*SIZE, HEIGHT*SIZE, 0);
        m_renderer = SDL_CreateRenderer(m_window, -1, 0);
        if (m_renderer)
            SDL_SetRenderDrawColor(m_renderer, 0, 0, 0, 0);
        m_Texture = SDL_CreateTexture(m_renderer, SDL_PIXELFORMAT_ARGB8888, SDL_TEXTUREACCESS_STREAMING, WIDTH, HEIGHT);
    }
}
The thing is that when I call this function, one iteration of the loop takes 70 to 90 ms!
If I leave the Init() call out (so no window is shown), it's only 2-4 ms.
The loop does nothing other than call one function, which looks like the following:
void screen::handleEvents()
{
    SDL_PollEvent(&m_event);
    switch (m_event.type)
    {
    case SDL_QUIT:
        Clean();  // was screen->Clean(), which doesn't compile inside a member function
        break;
    default:
        break;
    }
}
Leaving the handleEvents() call out results in 0-2 ms, but then the window crashes the moment I click on it.
I'm not sure what I'm doing wrong, but I'm pretty sure it shouldn't slow down to ~10 FPS.
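For context: SDL queues events continuously, and if they are never polled the OS eventually flags the window as unresponsive, which looks like a crash. A minimal sketch of the usual pattern, draining the whole queue once per frame (the running flag here stands in for the Clean() call above):

#include <SDL.h>

// Sketch: drain every pending event each frame so the OS
// never marks the window as "not responding".
void handleEvents(bool& running)
{
    SDL_Event e;
    while (SDL_PollEvent(&e))   // loop until the queue is empty
    {
        if (e.type == SDL_QUIT)
            running = false;
    }
}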
Here is the rest of the code:
#define SIZE 5
#define WIDTH 150
#define HEIGHT 150

int main(int argc, char* argv[])
{
    screen* screen = new class screen;  // the variable shadows the class name
    screen->Init();
    while (screen->running)
    {
        screen->execute();
    }
    return 0;
}
void screen::execute()
{
    if (currline == 1 && Time == 0)
        start = SDL_GetTicks();
    if (Time > 0)
        Time--;
    else
        counter();
    handleEvents();
    if (currline == 151 && Time == 0)
    {
        end = SDL_GetTicks();
        printf("s: %8d e: %8d r: %d\n", start, end, end - start);
    }
}
void screen::counter()
{
    if (currline > 150)
        currline = 255;
    currline++;
    Time = 500;
}

Okay, I found out why it was lagging:
I was calling handleEvents() on every pass through the loop (which runs a couple of thousand times per second) instead of only inside this if statement:

if (currline == 151 && Time == 0)
{
    end = SDL_GetTicks();
    printf("s: %8d e: %8d r: %d\n", start, end, end - start);
}

The if statement is there to restrict the whole thing to a certain framerate. :)
Usually you would sleep at the end of the loop to get your 60 iterations per second, but since I did the limiting inside this if statement, I had to put the function call in there too.
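For reference, a minimal sketch of that sleep-at-the-end-of-the-loop approach, assuming a 60 FPS target (the runLoop() name and FRAME_MS constant are made up for illustration):

#include <SDL.h>

const Uint32 FRAME_MS = 1000 / 60;  // ~16 ms budget per frame

void runLoop(bool& running)
{
    while (running)
    {
        Uint32 frameStart = SDL_GetTicks();
        // ... handleEvents(), update, render ...
        Uint32 elapsed = SDL_GetTicks() - frameStart;
        if (elapsed < FRAME_MS)
            SDL_Delay(FRAME_MS - elapsed);  // sleep off the rest of the budget
    }
}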

Related

How to allocate a period of time for a thread to execute?

I have a class executing in a thread, but I only want to allow it to run for 10 seconds.
Note: I have no means of passing a boolean into the class to stop execution.
So, how can I set up a thread to terminate after 10 seconds?
The class I am testing has potential infinite recursion, and it is pointless to let it run longer than 10 seconds.
TEST_METHOD(TM_ClientServer_Threads)
{
    bool bDone = false;
    int ii = 0;
    std::thread tCounter([&bDone, &ii]()
    {
        // Black box: can't touch this; can't pass in a boolean
        while (true)
        {
            ii++;
        }
    });
    std::thread tTimer([&bDone, &tCounter]()
    {
        Sleep(1000);  // note: Sleep() takes milliseconds, so this is 1 second, not 10
        bDone = true;
        // kill the tCounter thread ?
    });
    tCounter.join();
    tTimer.join();
    ii = ii + 0;  // break point here
}
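For what it's worth, standard C++ deliberately provides no way to kill a running std::thread. Given the MSVC test environment implied by TEST_METHOD and Sleep(), one last-resort option is to terminate through the native handle. A sketch, assuming MSVC, where native_handle() yields a Windows HANDLE; TerminateThread() does not unwind the stack or release anything the thread holds, so treat this as test-only:

#include <windows.h>
#include <chrono>
#include <thread>

void runCounterForAtMostTenSeconds()
{
    int ii = 0;
    std::thread tCounter([&ii]()
    {
        while (true) { ii++; }  // the "black box" loop
    });
    std::this_thread::sleep_for(std::chrono::seconds(10));
    TerminateThread(tCounter.native_handle(), 0);  // forcible, unsafe stop
    tCounter.join();  // the handle is signaled once the thread is gone
}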

How to get persistent input in SDL2 (C++)

So I noticed that when getting input with SDL_GetKeyboardState(NULL) and holding a specific button, it first writes out, let's say, a, and only after about one second does it continue with aaaaaaaa. I want it to go aaaaaa immediately when I hold the a button.
Here is a video if my explanation isn't clear:
https://streamable.com/oub0w3
There is a delay of about one second between the first a and the repeated aaaaa. How can I change that? (I want there to be no delay.)
Here is my code:
while (gameRunning) {
    SDL_Event event;
    const Uint8* keystates = SDL_GetKeyboardState(NULL);
    while (SDL_PollEvent(&event)) {
        if (event.type == SDL_QUIT) {
            gameRunning = false;
        }
        if (keystates[SDL_SCANCODE_W]) {
            entities[0].setY(entities[0].getY() - 1);
        }
        if (keystates[SDL_SCANCODE_S]) {
            entities[0].setY(entities[0].getY() + 1);
        }
        if (keystates[SDL_SCANCODE_A]) {
            entities[0].setX(entities[0].getX() - 1);
        }
        if (keystates[SDL_SCANCODE_D]) {
            entities[0].setX(entities[0].getX() + 1);
        }
    }
}
You're misusing SDL_GetKeyboardState(nullptr).
It should be used in the main loop, not in the event loop:
while (gameRunning)
{
    SDL_Event event;
    while (SDL_PollEvent(&event))
    {
        if (event.type == SDL_QUIT)
            gameRunning = false;
    }
    const std::uint8_t *keystates = SDL_GetKeyboardState(nullptr);  // needs <cstdint>
    if (keystates[SDL_SCANCODE_W])
        entities[0].setY(entities[0].getY() - 1);
    if (keystates[SDL_SCANCODE_S])
        entities[0].setY(entities[0].getY() + 1);
    // And so on...
}
If you want the repetition to start immediately, you need to implement the repeating letters yourself.
The additional "a" characters you receive as events are (I assume) generated by the operating system, so unless your OS has a setting to make key repeat start immediately, your program has to do it.
(I am assuming SDL is not the one generating these characters, though that is a possibility.)
To do this, you would keep track of how long each key has been held, check the elapsed time every frame, and emit "key" events you generate yourself, much like the OS does.
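A sketch of that kind of self-made repeat, assuming SDL2; the KeyRepeater name and the 50 ms rate are made up for illustration:

#include <SDL.h>
#include <unordered_map>

struct KeyRepeater {
    Uint32 intervalMs = 50;                    // fire every 50 ms while held
    std::unordered_map<int, Uint32> nextFire;  // scancode -> next fire time

    // Returns true each time the held key should "fire".
    bool fires(SDL_Scancode sc) {
        const Uint8* ks = SDL_GetKeyboardState(nullptr);
        Uint32 now = SDL_GetTicks();
        if (!ks[sc]) { nextFire.erase(sc); return false; }  // key released
        auto it = nextFire.find(sc);
        if (it == nextFire.end()) {            // fresh press: fire at once, no delay
            nextFire[sc] = now + intervalMs;
            return true;
        }
        if (now >= it->second) {               // time for another repeat
            it->second = now + intervalMs;
            return true;
        }
        return false;
    }
};

// Usage, once per frame after the event loop has pumped the keyboard state:
// if (rep.fires(SDL_SCANCODE_A)) entities[0].setX(entities[0].getX() - 1);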

Play one sound after another with SDL_mixer?

I have 4 sounds. I need to play sound 1; when it finishes, automatically play sound 2; when sound 2 finishes, automatically play sound 3; when sound 3 finishes, play sound 4. I'm using SDL_mixer 2.0, not SDL_sound. Is there a way?
int main() {
    int frequencia = 22050;
    Uint16 formato = AUDIO_S16SYS;
    int canal = 2;  // 1 = mono; 2 = stereo
    int buffer = 4096;
    Mix_OpenAudio(frequencia, formato, canal, buffer);
    Mix_Chunk* sound_1;
    Mix_Chunk* sound_2;
    Mix_Chunk* sound_3;
    Mix_Chunk* sound_4;
    sound_1 = Mix_LoadWAV("D:\\sound1.wav");  // was som_1..som_4, which doesn't compile
    sound_2 = Mix_LoadWAV("D:\\sound2.wav");
    sound_3 = Mix_LoadWAV("D:\\sound3.wav");
    sound_4 = Mix_LoadWAV("D:\\sound4.wav");
    Mix_PlayChannel(-1, sound_1, 0);
    Mix_PlayChannel(1, sound_2, 0);
    Mix_PlayChannel(2, sound_3, 0);
    Mix_PlayChannel(3, sound_4, 0);
    return 0;
}
Check in a loop whether the channel is still playing using Mix_Playing(), and add a delay using SDL_Delay() to prevent the loop from consuming all available CPU time.
(In this example, I changed your first call to Mix_PlayChannel() from -1 to 1.)
Mix_PlayChannel(1, sound_1, 0);
while (Mix_Playing(1) != 0) {
    SDL_Delay(200); // wait 200 milliseconds
}
Mix_PlayChannel(2, sound_2, 0);
while (Mix_Playing(2) != 0) {
    SDL_Delay(200); // wait 200 milliseconds
}
// etc.
You should probably wrap that into a function instead so that you don't repeat what is basically the same code over and over again:
void PlayAndWait(int channel, Mix_Chunk* chunk, int loops)
{
    channel = Mix_PlayChannel(channel, chunk, loops);
    if (channel < 0) {
        return; // error
    }
    while (Mix_Playing(channel) != 0) {
        SDL_Delay(200);
    }
}
// ...
PlayAndWait(-1, sound_1, 0);
PlayAndWait(1, sound_2, 0);
PlayAndWait(2, sound_3, 0);
PlayAndWait(3, sound_4, 0);  // was sound_3 a second time
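If blocking in SDL_Delay() doesn't suit the program (for example, a window still needs to render), a non-blocking alternative is Mix_ChannelFinished(), which registers a callback that fires when a channel stops. A sketch, reusing the sound_1..sound_4 chunks from above; the callback only sets a flag, since the SDL_mixer documentation advises against calling mixer functions from inside it:

#include <SDL.h>
#include <SDL_mixer.h>
#include <atomic>

static std::atomic<bool> chunkDone{false};

static void onChannelFinished(int /*channel*/)
{
    chunkDone = true;  // just flag it; start the next chunk from the main loop
}

// In setup:
//   Mix_ChannelFinished(onChannelFinished);
//   Mix_PlayChannel(1, sound_1, 0);
// In the main loop:
//   Mix_Chunk* chunks[] = { sound_1, sound_2, sound_3, sound_4 };
//   static int next = 1;
//   if (chunkDone && next < 4) {
//       chunkDone = false;
//       Mix_PlayChannel(1, chunks[next++], 0);
//   }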

Pausing a function and continuing from a certain point in a robot movement

I have a question on how to program a certain sequence for my robot.
Let's say I program it to run from position a to b. It has a sensor attached such that, should it detect x, it performs an action y at the position where x was detected; action y doesn't change its position.
I would like the robot to continue toward b from where it left off after performing action y. However, I don't know how to pause the a-to-b sequence and resume it from where it left off after action y. I am controlling only the wheel motors and their timing, so all I can do is set the wheel speed for a certain time.
Is there a general pause function in C++ (not sleep) that lets the code continue from where it paused?
For now I only know how to reset the action, but that's not what I want.
Example: the robot moves from a to b in 10 seconds, detects object x at 3 seconds, performs action y at that position at t = 3 seconds, then continues the motion for the remaining 7 seconds after action y is done.
You can try to use an event- (message-) driven architecture, like the following pseudocode:

vector<action> action_sequence = {do_a, do_b, do_c};
int i = 0;
while (1)
{
    e = WaitForMessage();
    switch (e.code)
    {
    case do_action:
        action_sequence[i].run();  // perform an action
        // ...some code to have a scheduler thread send the next
        // do_action message to this thread in a certain number of seconds.
        i++;
    default:
        ...
    }
}
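A minimal sketch of what WaitForMessage() could look like with standard C++ threads; the Message type and PostMessageToLoop() are made up for illustration:

#include <condition_variable>
#include <mutex>
#include <queue>

struct Message { int code; };

static std::queue<Message> mailbox;
static std::mutex mailboxLock;
static std::condition_variable mailboxSignal;

Message WaitForMessage()
{
    std::unique_lock<std::mutex> lock(mailboxLock);
    mailboxSignal.wait(lock, [] { return !mailbox.empty(); });  // block until a message arrives
    Message msg = mailbox.front();
    mailbox.pop();
    return msg;
}

void PostMessageToLoop(Message msg)  // called by the scheduler thread
{
    {
        std::lock_guard<std::mutex> lock(mailboxLock);
        mailbox.push(msg);
    }
    mailboxSignal.notify_one();
}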
The answer depends on your setup: are you using Windows messaging, are you using threads, etc.? Assuming neither, just linear code, you could implement your own sleep function that is passed a caller-provided function used to decide whether the sleep should be preempted. If preempted, the function returns the time left so the action can be continued later.
This allows linear code to handle your situation. I knocked up a sample; I'll explain the pieces.
#include <windows.h>

typedef bool (*preempt)(void);

DWORD timedPreemptableAction(DWORD msTime, preempt fn)
{
    DWORD startTick = GetTickCount();
    DWORD endTick = startTick + msTime;
    DWORD currentTick;
    do
    {
        currentTick = GetTickCount();
    }
    while (fn() == false && currentTick < endTick);
    return currentTick > endTick ? 0 : endTick - currentTick;
}
The key function above obtains the start time in milliseconds and will not exit until the timeout expires or the user-provided function returns true.
This user-provided function could poll input devices such as the keyboard. For now, to match your question, I have added a user function that returns true after 3 seconds:
DWORD startToTurnTicks = 0;

bool startToTurn(void)
{
    bool preempt = false;
    // TODO Implement method of preemption. For now
    // just use a static timer, yep a little hacky
    //
    // 0  = uninitialized
    // 1  = complete, no more interruptions
    // >1 = starting tick count
    if (startToTurnTicks == 0)
    {
        startToTurnTicks = GetTickCount();
    }
    else
    {
        if (startToTurnTicks != 1)
        {
            if ((startToTurnTicks + 3000) < GetTickCount())
            {
                startToTurnTicks = 1;
                preempt = true;
            }
        }
    }
    return preempt;
}
Now we have a function that waits for a given time but can exit early, and a user function that returns true after 3 seconds. Now the main code:
bool neverPreempt(void)
{
    return false;
}

int main(void)
{
    int appResult = 0;
    DWORD moveForwardTime = 1000 * 10;
    DWORD turnTime = 1000 * 3;
    DWORD startTicks = GetTickCount();
    printf("forward : %d seconds in\n", (GetTickCount() - startTicks) / 1000);
    moveForwardTime = timedPreemptableAction(moveForwardTime, &startToTurn);
    printf("turn : %d seconds in\n", (GetTickCount() - startTicks) / 1000);
    turnTime = timedPreemptableAction(turnTime, &neverPreempt);
    printf("turn complete : %d seconds in\n", (GetTickCount() - startTicks) / 1000);
    if (moveForwardTime > 0)
    {
        printf("forward again : %d seconds in\n", (GetTickCount() - startTicks) / 1000);
        moveForwardTime = timedPreemptableAction(moveForwardTime, &neverPreempt);
        printf("forward complete : %d seconds in\n", (GetTickCount() - startTicks) / 1000);
    }
    return appResult;
}
In the main code you can see timedPreemptableAction is called three times. The first time we pass the user function that returns true after 3 seconds, so this call exits after three seconds, returning 7 seconds left. The output from the app:
f:\projects\cmake_test\build\Debug>main
forward : 0 seconds in
turn : 3 seconds in
turn complete : 6 seconds in
forward again : 6 seconds in
forward complete : 13 seconds in
It started moving forward at 0 seconds, "paused" at 3 seconds, resumed at 6, and finished at 13.
0->3 plus 6->13 gives the full 10 seconds of forward motion.
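The sample above is Windows-specific (DWORD, GetTickCount()). If the robot's toolchain isn't Windows, the same idea ports to standard <chrono>; a sketch that keeps the return-the-time-left contract:

#include <chrono>

using Clock = std::chrono::steady_clock;
using Ms = std::chrono::milliseconds;

typedef bool (*preempt)(void);

// Busy-waits for msTime unless fn() returns true first;
// returns how much time was left so the action can resume later.
Ms timedPreemptableAction(Ms msTime, preempt fn)
{
    const auto end = Clock::now() + msTime;
    while (!fn() && Clock::now() < end)
    {
        // poll sensors / motors here
    }
    const auto left = std::chrono::duration_cast<Ms>(end - Clock::now());
    return left.count() > 0 ? left : Ms(0);
}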

Gtk/C++ Chronometer

I'm making a game that uses a chronometer.
The game is written in C++, and I'm also using Glade and GTK 3.0.
My problem is that when I start the game, the chronometer doesn't work as it should.
I have created a file time.h with this code inside:
struct aTimer
{
    bool running = false;
    int hour_expired = 0;
    int min_expired = 59;
    int sec_expired = 50;
};

void start_time(aTimer *&t)
{
    t->running = true;
}

void reset_time(aTimer *&t)
{
    t->running = false;
    t->sec_expired = 0;
    t->min_expired = 0;
    t->hour_expired = 0;
}
In my main file, I include it and also declare a new chronometer like this:
void start_time(aTimer *&);
void reset_time(aTimer *&);
aTimer *tempo = new aTimer;
Now, in my game I have two windows. When I press Play in the first window, the second window becomes visible and I hide the first one. When the second one is closed, the first becomes visible and the second invisible.
When the first window is closed, the application closes.
In the struct, the bool running starts as false, because my idea was to make it true when you actually play the game (that is, when the second window is visible) and not at the start of the application.
So I've done this in the main file:
void start_game()
{
    start_time(tempo);
}

gboolean update_time()
{
    if (tempo->running)
    {
        if (tempo->sec_expired == 60)
        {
            tempo->sec_expired = 0;
            (tempo->min_expired)++;
            if (tempo->min_expired == 60)
            {
                tempo->min_expired = 0;
                (tempo->hour_expired)++;
            }
        }
        ostringstream oss;
        GtkLabel *time = GTK_LABEL(gtk_builder_get_object(builder, "lblSec"));
        oss << (tempo->sec_expired)++;
        gtk_label_set_text(time, oss.str().c_str());
        oss.str("");
        oss.clear();
        time = GTK_LABEL(gtk_builder_get_object(builder, "lblMin"));
        oss << tempo->min_expired << ":";
        gtk_label_set_text(time, oss.str().c_str());
        oss.str("");
        oss.clear();
        time = GTK_LABEL(gtk_builder_get_object(builder, "lblHour"));
        oss << tempo->hour_expired << ":";
        gtk_label_set_text(time, oss.str().c_str());
        oss.str("");
        oss.clear();
    }
    return tempo->running;
}
and in the main function of the main file I also have:
g_timeout_add_seconds(1, GSourceFunc(update_time), NULL);
If I start the application with running = false, it doesn't work at all.
If I start it with running = true, it works when the application starts, but when I come back to the "menu" and want to play another game, it won't start again; the new time is just the old time and doesn't increment anymore.
I don't understand why, though. Can someone help me?
Thank you
********************UPDATE************************
I tried the GTimer option as #José Fonte suggested, but I still can't get anywhere.
This is an example that I tried:
#include <iostream>
#include <glib.h>
#include <sstream>
#include <gtk/gtk.h>

static GtkBuilder *builder;
using namespace std;

GTimer *timing;
bool start = false;

extern "C" void btnStartPause_click(GtkButton *button)
{
    if (!start)
    {
        timing = g_timer_new();
        start = true;
    }
    else
    {
        g_timer_stop(timing);
        start = false;
    }
}

gboolean update_time()
{
    if (start)
    {
        double sec;
        // Was g_timer_elapsed(timing, micro) with an uninitialized gulong*;
        // passing NULL is fine when the microseconds are not needed.
        sec = g_timer_elapsed(timing, NULL);
        ostringstream oss;
        GtkLabel *time = GTK_LABEL(gtk_builder_get_object(builder, "lblSec"));
        oss << sec;
        gtk_label_set_text(time, oss.str().c_str());
        oss.str("");
        oss.clear();
    }
    return start;
}

int main(int argc, char *argv[])
{
    gtk_init(&argc, &argv);
    builder = gtk_builder_new();
    gtk_builder_add_from_file(builder, "glade.glade", NULL);
    gtk_builder_connect_signals(builder, NULL);
    // timing
    g_timeout_add_seconds(1, GSourceFunc(update_time), NULL);
    gtk_main();
    return 0;
}
The problem again is: I want to start the timing when I click the btnStartPause button (not when the application starts), so I want it to start in the button's click handler.
But it seems like the application immediately tries the gboolean update_time() function, and since the boolean start is false at application startup, it just doesn't run the code. Then, when I click the button so that start becomes true, update_time() isn't tried again like it was at startup, and is never tried anymore. I don't understand this.
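This behaviour is consistent with how GLib timeout sources work: when a GSourceFunc returns FALSE, the source is removed and never called again. Since update_time() returns start, the very first tick (while start is still false) returns FALSE and destroys the timeout, so clicking the button later cannot revive it. A minimal sketch of the usual fix, keeping the names from the example above:

gboolean update_time(gpointer data)  // GSourceFunc takes a gpointer argument
{
    (void)data;
    if (start)
    {
        // ...read g_timer_elapsed() and update the labels as above...
    }
    return G_SOURCE_CONTINUE;  // i.e. TRUE: keep the source installed even while paused
}

Alternatively, keep returning start but call g_timeout_add_seconds() again inside btnStartPause_click() every time timing starts, so a fresh source is installed for each run.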