I am trying to make my program check the system time 24/7 in a loop in an MFC dialog application.
A little background on what I have done so far.
My GUI has a few buttons (Start, Stop, Exit) and a few edit boxes to show values.
The program is meant to read a .txt file at a predetermined location 24/7, at an interval specified by the user. This could be 5 minutes to however long the user wants, but it has to be a multiple of 5: for example 5, 10, 15 or 20 minutes, and so on.
After reading the .txt file, it compares the strings within the .txt file and outputs the results to a .csv file.
That is the brief explanation of what I am trying to do. Now on to the question at hand.
Since I need the program to run 24/7, I am trying to make it check the system time continually and trigger a set of functions when the interval specified by the user has been reached.
For that, I set a variable whenever the Start button is pressed:
BOOL start_flag = TRUE;
The start_flag only returns to FALSE once the Stop button is pressed. I then run everything in a while loop:
while (start_flag)
{
    Timer();           // Add the user-entered interval to the current time
    Timer_Secondary(); // Compare the converted time against the current time
    Read_Log();        // Read the logs
}
///////////////////Timer function//////////////////////
{
    CTime curTime = CTime::GetCurrentTime();
    timeString_Hour = curTime.Format("%H");
    timeString_Minute = curTime.Format("%M");
    timeString_Second = curTime.Format("%S");
    Hour = atoi(timeString_Hour);
    Minute = atoi(timeString_Minute);
    Second = atoi(timeString_Second);
    if ((first_run == false) && (Int_Frequency < 60))
    {
        int Minute_Add = Minute + Int_Frequency;
        if (Minute_Add >= 60)
        {
            Minute_Add = Minute_Add - 60;
            Hour = Hour + 1;
        }
        Minute = Minute_Add;
    }
    if ((first_run == false) && (Int_Frequency >= 60))
    {
        int Local_Frequency = Int_Frequency;
        while (Local_Frequency >= 60)
        {
            Local_Frequency = Local_Frequency - 60;
            Hour = Hour + 1;
        }
    }
    if (first_run)
    {
        Hour = Hour + 1;
        Minute = 0;
        Second = 0;
        first_run = false;
    }
    timeString_Hour.Format("%d", Hour);
    timeString_Minute.Format("%d", Minute);
    timeString_Second.Format("%d", Second);
}
////////Timer_Secondary function//////////
{
    CTime curTime = CTime::GetCurrentTime();
    timeString_Hour_Secondary = curTime.Format("%H");
    timeString_Minute_Secondary = curTime.Format("%M");
    timeString_Second_Secondary = curTime.Format("%S");
    Hour_Secondary = atoi(timeString_Hour_Secondary);
    Minute_Secondary = atoi(timeString_Minute_Secondary);
    Second_Secondary = atoi(timeString_Second_Secondary);
}
Right, the problem I have so far is that because of the while loop, the program is stuck spinning, the GUI freezes, and the user cannot make it stop.
There are a few things I have thought of, but I am not sure whether they will work:
while (start_flag)
{
    if ((Hour_Secondary == Hour) && (Minute_Secondary == Minute) && (Second_Secondary == Second))
    {
        // Run the parsing function here (main bit of code)
        start_flag = false; // Set it to false so it will jump back out of this loop
    }
    else
    {
        // Some form of time function here to wait a minute, then loop back to the start of the while loop.
        // With that timer function, the GUI should be usable during the wait.
    }
}
Any advice would be much appreciated. I hope this post wasn't too confusing in its layout; I wanted to provide as much as I can to show that I am not just asking questions without trying to fix it myself first.
Whilst you are in your while loop, the Windows message pump is not processing; ergo, your user interface freezes up. You have two options, as I see it:
1) Use a background thread to do this.
2) Investigate CWnd::SetTimer and use that to perform the timings. It will post a message into the message queue at the intervals you specify (it's not a real-time solution, but I don't think you have that requirement), and therefore your interface will remain alive. A minimal sketch of this option follows.
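Here is a rough sketch of the SetTimer approach, assuming a dialog class named CMyDlg derived from CDialogEx, Start/Stop button handlers, and an Int_Frequency member holding the user's interval in minutes (the class and handler names are placeholders, not taken from your code):

static const UINT_PTR kLogTimerId = 1; // arbitrary timer identifier

void CMyDlg::OnBnClickedStart()
{
    // SetTimer takes the interval in milliseconds.
    SetTimer(kLogTimerId, Int_Frequency * 60 * 1000, nullptr);
}

void CMyDlg::OnBnClickedStop()
{
    KillTimer(kLogTimerId);
}

void CMyDlg::OnTimer(UINT_PTR nIDEvent) // requires ON_WM_TIMER() in the message map
{
    if (nIDEvent == kLogTimerId)
    {
        Read_Log(); // runs every interval; the message pump keeps the GUI responsive between ticks
    }
    CDialogEx::OnTimer(nIDEvent);
}

With this there is no polling loop at all: Windows delivers WM_TIMER at roughly the requested interval, and the Stop button stays clickable the whole time.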
Add a timer to your dialog and handle the WM_TIMER message:
http://msdn.microsoft.com/en-us/library/windows/desktop/ms644901(v=vs.85).aspx#creating_timer
Related
Our project is an MP3 player, and we're all beginners at coding. The project has a sort of 'main menu' from which the user can choose options such as: show track info, shuffle playlist, etc. Now we're trying to add a feature that tells the user how long they've been logged in. Our stopwatch code:
unsigned long timeListened = 0;
static int sknt1 = 0;
static int sknt10 = 0;
static int min1 = 0;
static int min10 = 0;
static int hour1 = 0;
static int hour10 = 0;
static int hour100 = 0;

void loop() {
    timeListened = millis();
    if (timeListened >= 1000) {
        sknt1 = sknt1 + 1;
        timeListened = 0;
    }
    if (sknt1 == 10) {
        sknt10 = sknt10 + 1;
        sknt1 = 0;
    }
    if (sknt10 == 6) {
        min1 = min1 + 1;
        sknt10 = 0;
    }
    if (min1 == 10) {
        min10 = min10 + 1;
        min1 = 0;
    }
    if (min10 == 6) {
        hour1 = hour1 + 1;
        min10 = 0;
    }
    if (hour1 == 10) {
        hour10 = hour10 + 1;
        hour1 = 0;
    }
    Serial.print(hour100);
    Serial.print(hour10); Serial.print(hour1); Serial.print(":");
    Serial.print(min10); Serial.print(min1); Serial.print(":");
    Serial.print(sknt10); Serial.println(sknt1);
    delay(1000);
}
The issue is, we can't find the right spot to place this code. The only results we've had are that the stopwatch either adds 1 second every time the user chooses the option from the menu, or the stopwatch time is just 000:00:00.
Is there a way to make the Arduino run the stopwatch in the background while the main program (playing music) is running?
I have written some code that prints the elapsed time since the Arduino was turned on for another project. It is not efficient, but it works. You could call this every second in your loop() to update it:
#define second (1000L)
#define minute (60L * 1000L)
#define hour (60L * 60L * 1000L)

unsigned long testTime = millis();
unsigned long hh = testTime / hour;
unsigned long mm = (testTime % hour) / minute;
unsigned long ss = ((testTime % hour) % minute) / second;
Serial.print(hh); Serial.print(':');
Serial.print(mm); Serial.print(':');
Serial.print(ss); Serial.print('\n');
You are already using millis() in your code, which gives you the time in milliseconds elapsed since the program started. So whenever you want to know how long the program has been running, just call millis(). This will overflow after approximately 50 days, but that's probably irrelevant in your use case.
The problem with your code is that you add 1 second every time millis() returns a value greater than 1000, which is always the case once your Arduino has been running for more than 1 second. This also means you only add 1 second each time you check, not once every second. There is also no need to delay your program to get an update every second.
Simply convert the return value of millis() to a time string: it gives you the milliseconds since program start, so calculate how many hours, minutes and seconds that is. I guess I don't have to tell you that 60000 ms make a minute...
If you want to know the time passed since a certain event, for example a user login, store the time of that event and calculate the difference to the current time.
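For instance, a minimal sketch of that last idea (loginMillis and printElapsed are made-up names, not from your project):

unsigned long loginMillis = 0; // record millis() here when the user logs in

void printElapsed() {
    unsigned long elapsed = millis() - loginMillis; // unsigned subtraction survives the rollover
    unsigned long ss = (elapsed / 1000UL) % 60;
    unsigned long mm = (elapsed / 60000UL) % 60;
    unsigned long hh = elapsed / 3600000UL;
    Serial.print(hh); Serial.print(':');
    Serial.print(mm); Serial.print(':');
    Serial.println(ss);
}

Because nothing here blocks, you can call printElapsed() from your menu code whenever you need it and the music keeps playing.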
I am using an STM32F769 Discovery board with the Mbed online compiler. My task is to change the PWM frequency and duty ratio according to an input.
I've created a simple algorithm for my needs and the program works well, but whenever the controller updates the PWM frequency there is strange behavior (overlapped, maybe; it is difficult for me to explain verbally): the frequency changes instantly and the output is incorrect at that moment. On other controllers (like Arduino) this never happens; the value is updated only after the current PWM period is over. But not in this case.
What can be wrong?
I thought of adding a small delay before the value is updated, but that will not work, as a different delay would be needed every time. I have attached the code and screenshots.
#include "mbed.h"
AnalogIn analog_value(A0);
PwmOut pulse(D11);
int main() {
double meas_v = 0;
double out_freq, out_duty, s_time;
while (1) {
meas_v = analog_value.read() * 3300;
if (meas_v < 1) {
out_freq = 50000;
out_duty = 40;
} else if (meas_v >= 1000) {
out_freq = 100000;
out_duty = 80;
} else {
out_freq = 50000 + (meas_v * 50);
out_duty = 40 + (meas_v * 0.04);
}
pulse.period(1.0 / out_freq);
pulse = out_duty / 100;
s_time = 0.0001;
wait(s_time);
}
}
The output should be updated after the current period is completed, not instantly.
(Screenshot: the erroneous output at the moment the frequency updates.)
The underlying HAL code probably resets the current count value of the timer when you set the new period. You'll have to read the current timer CNT value and wait for it to reach 0; you can set the new period when the counter reaches 0.
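A hedged sketch of that idea using the STM32 HAL underneath Mbed (the function name is made up, and you would have to locate the TIM handle that actually drives your PWM pin):

// Wait for the current PWM period to finish before changing it.
void set_period_when_done(TIM_HandleTypeDef *htim, uint32_t new_arr)
{
    // Busy-wait until the counter wraps to 0, i.e. the period has completed.
    // At high PWM frequencies a polling loop can miss the zero; the interrupt
    // approach in the next answer avoids that race.
    while (__HAL_TIM_GET_COUNTER(htim) != 0) { }
    __HAL_TIM_SET_AUTORELOAD(htim, new_arr);
}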
You need to update the value in the update interrupt, or use the timer's DMA burst mode.
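For the interrupt route, a rough sketch with the STM32 HAL (HAL_TIM_PeriodElapsedCallback is a real HAL hook; pending_arr and TIM1 are placeholders for your setup):

volatile uint32_t pending_arr = 0; // new period value, staged by the main loop

// Fires once per period if the timer's update interrupt is enabled.
void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
    if (htim->Instance == TIM1 && pending_arr != 0)
    {
        __HAL_TIM_SET_AUTORELOAD(htim, pending_arr); // applied exactly on a period boundary
        pending_arr = 0;
    }
}

Setting the auto-reload preload bit (TIM_CR1_ARPE) achieves much the same thing in hardware: a new ARR value written at any time is only latched at the next update event.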
I am trying to have my program detect when an event time is reached during the day.
For instance, I want to be able to create an event at 10:00:00 so that a task gets executed at that moment of the day (only once).
So I have this function that can tell whether the time has passed, but it will always return true after 10:00 (the time parameter):
bool Tools::passedTimeToday(std::time_t time)
{
    auto now = std::chrono::system_clock::now();
    std::time_t _now = std::chrono::system_clock::to_time_t(now);
    if (std::difftime(_now, time) < 0)
        return false;
    else
        return true;
}
How do I check that the time has passed only once?
Do I use some sort of epsilon around that time? What value should I use for that epsilon?
double delta = std::difftime(_now, time);
if ((delta > 0) && (delta < epsilon))
{
    ...
I mean it could work, but what if my program tests that condition too late (more than epsilon after the event)?
I thought about using a boolean flag instead (bool processed), but then every time I run the program it would also run all the tasks that happened around that time.
Any help appreciated. Thanks.
1) Create a list of the scheduled tasks and their next run times, sorted by next run time.
2) When the current time is after the next run time of the first task in the list, remove it from the list, execute the task, and then add it back with the run time incremented by a day (or whatever repetition period you need).
3) Go to step 2 unless the list is empty.
A rough sketch of this loop is below.
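Something like this, as a sketch only (ScheduledTask and the daily repeat are my assumptions, and the one-second poll could be replaced by sleeping until the front task's run time):

#include <chrono>
#include <functional>
#include <queue>
#include <thread>
#include <vector>

struct ScheduledTask {
    std::chrono::system_clock::time_point nextRun; // when to execute next
    std::function<void()> run;                     // the work to perform
    bool operator>(const ScheduledTask& other) const { return nextRun > other.nextRun; }
};

void runScheduler(const std::vector<ScheduledTask>& tasks) {
    // Min-heap ordered by nextRun, so the soonest task is always on top.
    std::priority_queue<ScheduledTask, std::vector<ScheduledTask>,
                        std::greater<ScheduledTask>>
        queue(tasks.begin(), tasks.end());
    while (!queue.empty()) {
        if (std::chrono::system_clock::now() >= queue.top().nextRun) {
            ScheduledTask task = queue.top();
            queue.pop();
            task.run();
            task.nextRun += std::chrono::hours(24); // fixed 24 h; ignores DST shifts
            queue.push(task);
        } else {
            std::this_thread::sleep_for(std::chrono::seconds(1)); // poll gently
        }
    }
}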
Here is my way of doing it. I only admit tasks within a 5-second window:
bool Tools::passedTimeToday(std::time_t time)
{
    auto now = std::chrono::system_clock::now();
    std::time_t _now = std::chrono::system_clock::to_time_t(now);
    double delta = std::difftime(_now, time);
    if ((delta > 0) && (delta < 5))
        return true;
    else
        return false;
}
I have a game with Bullet Physics as the physics engine. The game is online multiplayer, so I thought I would try the Source Engine approach to dealing with physics sync over the net. On the client I use GLFW, so the FPS limit works there by default (at least I think that is thanks to GLFW). But on the server side there are no graphics libraries, so I need to "lock" the loop that simulates the world and steps the physics engine to 60 ticks per second.
Is this the right way to lock a loop so it runs 60 times a second (a.k.a. 60 "FPS")?
void World::Run()
{
    m_IsRunning = true;
    long limit = (1 / 60.0f) * 1000;
    long previous = milliseconds_now();
    while (m_IsRunning)
    {
        long start = milliseconds_now();
        long deltaTime = start - previous;
        previous = start;
        std::cout << m_Objects[0]->GetObjectState().position[1] << std::endl;
        m_DynamicsWorld->stepSimulation(1 / 60.0f, 10);
        long end = milliseconds_now();
        long dt = end - start;
        if (dt < limit)
        {
            std::this_thread::sleep_for(std::chrono::milliseconds(limit - dt));
        }
    }
}
Is it OK to use std::thread for this task?
Is this approach efficient enough?
Will the physics simulation be stepped 60 times a second?
P.S. The milliseconds_now() function looks like this:
long long milliseconds_now()
{
    static LARGE_INTEGER s_frequency;
    static BOOL s_use_qpc = QueryPerformanceFrequency(&s_frequency);
    if (s_use_qpc) {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);
        return (1000LL * now.QuadPart) / s_frequency.QuadPart;
    }
    else {
        return GetTickCount();
    }
}
Taken from: https://gamedev.stackexchange.com/questions/26759/best-way-to-get-elapsed-time-in-miliseconds-in-windows
If you want to limit the rendering to a maximum FPS of 60, it is very simple: each frame, just check whether the game is running too fast and, if so, wait. For example:
while (timeLimitedLoop)
{
    float frameDelta = timeNow - timeLast;
    timeLast = timeNow;
    for (auto& objectOrCalculation : allItemsToProcess)
    {
        objectOrCalculation->processThisIn60thOfSecond(frameDelta);
    }
    render(); // if display is needed
}
Please note that if vertical sync is enabled, rendering will already be limited to the frequency of your vertical refresh (perhaps 50 or 60 Hz).
If, however, you wish to lock the logic at 60 FPS, that is a different matter: you will have to segregate your display and logic code so that the logic runs at a maximum of 60 FPS, with a fixed time-interval loop for the logic and a variable time-interval loop (as above) for the rest. Good sources to look at are "fixed timestep" and "variable timestep" (Link 1, Link 2, and the old trusty Google search). A compact sketch of the pattern follows.
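Here is a minimal sketch of that fixed-timestep pattern (running, stepSimulation and render are stand-ins for your own members):

#include <chrono>

bool running = true;                             // your loop condition
void stepSimulation(double dt) { /* physics */ } // stand-in for the Bullet step
void render() { /* drawing, if any */ }          // stand-in; a headless server would skip this

void runLoop()
{
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> step(1.0 / 60.0); // fixed logic step
    auto previous = clock::now();
    std::chrono::duration<double> accumulator(0);
    while (running)
    {
        auto now = clock::now();
        accumulator += now - previous; // bank the real time that has passed
        previous = now;
        while (accumulator >= step)    // catch up in fixed 1/60 s increments
        {
            stepSimulation(step.count());
            accumulator -= step;
        }
        render(); // variable rate; on a server, sleep a little here instead to avoid burning CPU
    }
}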
A note on your code: because you sleep for the whole remaining portion of the 1/60th of a second in one go, you can easily miss the correct timing. Change the sleep to a loop that re-checks the elapsed time:
Instead of
if (dt < limit)
{
    std::this_thread::sleep_for(std::chrono::milliseconds(limit - dt));
}
change to
while (dt < limit)
{
    std::this_thread::sleep_for(std::chrono::milliseconds(1)); // or whatever fine-grained step you desire
    dt = milliseconds_now() - start;                           // re-check how much time has actually elapsed
}
Hope this helps, but let me know if you need more info. :)
I thought I was jumping over a puddle, but have instead fallen into an ocean :/
I'm trying to implement a 5-second timer (I don't need more than millisecond resolution).
My goal:
// I start the program in gamestate 0...
{
    if (button_has_been_pressed == 1)
    {
        gamestate = 1;
    }
}
if (gamestate == 1)
{
    // wait for 5 seconds and go to gamestate 2
    gamestate = 2;
}
I've tried the following:
GLUT_ELAPSED_TIME measures time from the beginning of my program. I am unable to 'reset' GLUT_ELAPSED_TIME after entering gamestate 1; otherwise, it would work wonderfully.
gettimeofday gives me much more resolution than I need. At most, milliseconds would be applicable.
Regardless of my resolution needs, I've tried Song Ho's method:
gamestate1_elapsedTime = (t2.tv_sec - t1.tv_sec) * 1000.0; // sec to ms
gamestate1_elapsedTime += (t2.tv_usec - t1.tv_usec) / 1000.0; // us to ms
// add that elapsed time together, and keep track of its total
//r_gamestate1_elapsedTime_total = gamestate1_elapsedTime;
//if (r_gamestate1_elapsedTime_total > 5 seconds) ...
However, gamestate1_elapsedTime appears to have some variability to it, because the output is seldom consistent. I guess that's because gettimeofday employs CPU time(?), and I am artificially clamping this with my FPS clamp.
I've tried clock() as well, but that also appears to be CPU time, not wall time.
As mentioned above, GLUT_ELAPSED_TIME works well, except that I am unable to reset it midway through my program, so my 5 seconds is not dependent upon the initial button click.
I would deeply appreciate even a nudge in the right direction, if you could lend some advice. Thank you very much in advance.
-kropcke
You don't need to "reset" GLUT_ELAPSED_TIME, you just need to copy it somewhere that you can use as an offset. For example:
int timeAtReset = glutGet(GLUT_ELAPSED_TIME);

// I start the program in gamestate 0...
{
    if (button_has_been_pressed == 1)
    {
        gamestate = 1;
        timeAtReset = glutGet(GLUT_ELAPSED_TIME);
    }
}
if (gamestate == 1)
{
    // Use timeSinceReset, instead of glutGet(GLUT_ELAPSED_TIME), to
    // wait for 5 seconds before going to gamestate 2.
    int timeSinceReset = glutGet(GLUT_ELAPSED_TIME) - timeAtReset;
    if (timeSinceReset >= 5000) // GLUT_ELAPSED_TIME is in milliseconds
    {
        gamestate = 2;
    }
}