QTimer start at a specific time - C++

I am trying to start a timer at a specific time, such as 02:30, so that it fires at 02:30 every day.
Is this possible? Do you have any ideas?
Thanks a lot.

QTimer doesn't handle specific times of day natively, but you can use it together with QDateTime to get what you want. That is, use QDateTime objects to figure out how many milliseconds lie between now and 02:30 (QDateTime::msecsTo() looks particularly appropriate here), then set your QTimer to go off after that many milliseconds. Repeat as necessary.
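For example, a minimal sketch of that idea (the function name scheduleDailyTask is mine, and it assumes Qt 5.4+ for the lambda overload of QTimer::singleShot):

#include <QCoreApplication>
#include <QDateTime>
#include <QTimer>
#include <QDebug>

void scheduleDailyTask()
{
    const QDateTime now = QDateTime::currentDateTime();
    QDateTime target(now.date(), QTime(2, 30));
    if (target <= now)                        // 02:30 has already passed today
        target = target.addDays(1);           // so aim for tomorrow

    QTimer::singleShot(static_cast<int>(now.msecsTo(target)), [] {
        qDebug() << "It is 02:30 - do the daily work here";
        scheduleDailyTask();                  // re-arm for the next day
    });
}

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);
    scheduleDailyTask();
    return app.exec();
}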

Depending on the required resolution, you could use an ordinary QTimer that fires, let's say, every minute.
In the timer's slot (or timerEvent), check whether the target time has been reached (using QDateTime), and trigger the necessary event.
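A rough sketch of that polling approach (parentObject stands in for whatever QObject should own the timer; it is not from the answer):

#include <QTimer>
#include <QTime>
#include <QDebug>

// During application setup:
auto *poller = new QTimer(parentObject);
QObject::connect(poller, &QTimer::timeout, [] {
    const QTime now = QTime::currentTime();
    // Fires during the 02:30 minute; at one-minute resolution that is the daily trigger.
    if (now.hour() == 2 && now.minute() == 30)
        qDebug() << "02:30 reached - trigger the daily event";
});
poller->start(60 * 1000);    // check once a minute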

Jeremy's solution is indeed elegant, but it doesn't take daylight saving time into account.
To guard against that, you should fire a timer event every hour and check the wall clock.
Calculate the delta to the target, as Jeremy proposes, and if it falls within the coming hour, set a timer to fire then and disable the hourly timer.
If not, just wait for the hourly timer to fire again.
Pseudo code:
Get wall clock time
Calculate difference between target time and wall clock
If difference < 1 hour:
    Set timer to fire after difference secs
    If this is a repeating event, restart the hourly timer
Else:
    Start watch timer to do this calculation again after one hour
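A hedged C++ sketch of that pseudo code, assuming Qt and a hard-coded 02:30 target (the function name checkTarget and the setup comments are mine):

#include <QDateTime>
#include <QTimer>
#include <QDebug>

// hourlyTimer is a long-lived QTimer, owned elsewhere, that fires every hour.
void checkTarget(QTimer *hourlyTimer)
{
    const QDateTime now = QDateTime::currentDateTime();       // get wall clock time
    QDateTime target(now.date(), QTime(2, 30));
    if (target <= now)
        target = target.addDays(1);

    const qint64 msecs = now.msecsTo(target);                  // difference to the target
    if (msecs < 60 * 60 * 1000) {                              // within the coming hour
        hourlyTimer->stop();
        QTimer::singleShot(static_cast<int>(msecs), [hourlyTimer] {
            qDebug() << "02:30 reached - fire the event";
            hourlyTimer->start();                              // repeating event: re-arm the hourly check
        });
    }
    // else: just wait for the hourly timer to call us again
}

// Setup, e.g. in main() or a constructor:
//   auto *hourly = new QTimer(parentObject);
//   QObject::connect(hourly, &QTimer::timeout, [hourly] { checkTarget(hourly); });
//   hourly->start(60 * 60 * 1000);
//   checkTarget(hourly);    // also check once immediately at startup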

Related

Timer countdown even when program is not running QML

I am trying to have a 24-hour countdown in my user interface in a QML/Qt project. The time should update every second, like 23:59:59, then 23:59:58. Additionally, I need the time to keep counting down even when the application is not open. So if the time is 23:59:59 when I close the app and I open it two hours later, it should continue counting down from 21:59:59. If the timer has timed out while the app isn't running, it needs to reset to 24 hours and continue. Does anyone know how I could do this, either in QML or in connected C++? Any help would be greatly appreciated.
You need to store the timer's end time (according to the system clock), or equivalent information, somewhere persistent. Then at any moment you can compute the timer's value by taking the difference between the system clock's now() and the stored end time.
Just use std::this_thread::sleep_until to wait until the exact moment you need to update the time for the next second. Don't use sleep_for(1s), as that way you'll accumulate inaccuracies.
Note: the system clock has the issue that it can be adjusted. I don't fully know of a way around that: if your application was turned off, how do you tell how much time passed while the system clock was being adjusted? You can deal with clock adjustments during the application's run by using sleep_until with steady_clock. C++20 introduces utc_clock; perhaps you can access that somehow, which should solve the issue with daylight saving time adjustments. I don't think it is theoretically possible to deal with all types of clock adjustments unless you have access to a GPS clock.
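A minimal sketch of the store-the-end-time idea using QSettings (the organisation/application names and the settings key are mine, purely illustrative):

#include <QSettings>
#include <QDateTime>

// Returns the number of seconds left on the 24-hour countdown,
// resetting it to 24 hours if it has expired or was never started.
qint64 remainingSeconds()
{
    QSettings settings("MyOrg", "MyCountdownApp");     // hypothetical names
    const qint64 now = QDateTime::currentSecsSinceEpoch();
    qint64 end = settings.value("countdownEnd", 0).toLongLong();

    if (end <= now) {                                  // expired while closed, or first run
        end = now + 24 * 60 * 60;
        settings.setValue("countdownEnd", end);
    }
    return end - now;                                  // what the UI should display
}

Call this once per second (for example from a QML Timer or a QTimer) and format the result as hh:mm:ss; because only the end time is stored, the countdown keeps "running" while the app is closed.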

QTimer: are "multiple timeouts" possible?

I'm building a simple game that shows a countdown to the user, starting at 1:00 and counting down to zero. As 0:00 is reached, I want to display a message like "time's up!".
I currently have a QTimer and a QTime object (QTime starting at 00:01:00)
QTimer *timer=new QTimer();
QTime time{0,1,0};
In the constructor I'm setting the timer to timeout every 1 second, and it's connected to an event that updates the countdown on screen, which is initially displaying the timer at 1:00:
connect(timer, SIGNAL(timeout()), this, SLOT(updateCountDown()));
timer->start(1000);
ui->countdown->setText(time.toString("m:ss"));
This is the slot being called every 1 second:
void MainWindow::updateCountDown(){
    time = time.addSecs(-1);
    ui->countdown->setText(time.toString("m:ss"));
}
Now I need to be able to call a new method whenever the QTime reaches 0:00. I'm not very keen on adding an if to the updateCountdown method to check every second whether the QTime is at 0:00. I also thought maybe I could add a second QTimer that times out after 1 minute, but I'm not sure both QTimer objects will start at exactly the same time, so the 1-minute timeout might not happen exactly when the QTime object is at 0:00.
So, is there a way to add a second timeout to the same QTimer object (one timeout every second to update the countdown on screen, and a second timeout after 1 minute to end the game)? I suspect the answer will be "no", but in that case, what would be the best approach? (If none of my options are valid, is there a better way to do it?)
The answer to your first question is no -- QTimer supports a single timeout (or a single specified time-interval, if it's not running in single-shot mode).
The answer to your second question is -- the best approach is the simplest one that can possibly work. Use a single QTimer, and in your updateCountdown() method, include an if statement to do something different when the countdown finally reaches zero. (Btw you don't really need to use a QTime object to represent the countdown; you could just as easily use an int that starts at 60 and is decremented until it reaches 0)
Why is this better/simpler? You could use two QTimer objects instead, but then you have to worry about keeping them in sync -- perhaps that's not a big deal for now, but what happens when you want to add a "Pause" button to your game, or when you want to add a time-bonus that gives the player 10 extra seconds of play time? All of a sudden, your 60-second timer will no longer be doing the right thing, and it will be a pain to stop and restart it correctly in all cases.
If-statements are cheap, and easy to understand/control/debug; so you might as well use one.
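A short sketch of that single-timer approach, using an int counter as suggested (timer, ui and m_secondsLeft follow the question's code; endGame() is a placeholder for the "time's up!" handling):

// In the constructor: m_secondsLeft = 60; then connect and start the timer as before.
void MainWindow::updateCountDown()
{
    --m_secondsLeft;
    ui->countdown->setText(QTime(0, 0).addSecs(m_secondsLeft).toString("m:ss"));

    if (m_secondsLeft <= 0) {     // the cheap if-statement doing the second "timeout"
        timer->stop();            // no second QTimer needed
        endGame();                // placeholder: show "time's up!", etc.
    }
}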

QTimer long timeout

Qt 5.7 32-bit on Windows 10 64-bit
Long-period timer
The interval of a QTimer is given in milliseconds as a signed integer, so the maximum interval that can be set is a little more than 24 days (2^31 / (1000*3600*24) ≈ 24.85 days).
I need a timer with intervals going far beyond this limit.
So my question is: which alternative do you recommend? std::chrono (C++11) seems not to be suitable, as it does not have an event handler.
Alain
You could always create your own class which uses multiple QTimers, each running for a duration that stays within the limit, and just count how many of them have elapsed.
It's a pretty simple problem: if you can only count to 10 and you need to count to 100, just count to 10 ten times.
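A hedged sketch of that chaining idea (the class name LongTimer and its members are mine):

#include <QObject>
#include <QTimer>
#include <limits>

// Emits done() after an arbitrarily long delay by chaining QTimer intervals
// that each fit within the signed-int millisecond limit.
class LongTimer : public QObject
{
    Q_OBJECT
public:
    explicit LongTimer(QObject *parent = nullptr) : QObject(parent)
    {
        m_timer.setSingleShot(true);
        connect(&m_timer, &QTimer::timeout, this, &LongTimer::onChunkElapsed);
    }

    void start(qint64 totalMsecs)
    {
        m_remaining = totalMsecs;
        startNextChunk();
    }

signals:
    void done();

private slots:
    void onChunkElapsed()
    {
        if (m_remaining > 0)
            startNextChunk();    // "count to 10" again
        else
            emit done();         // we have "counted to 100"
    }

private:
    void startNextChunk()
    {
        const qint64 chunk = qMin<qint64>(m_remaining, std::numeric_limits<int>::max());
        m_remaining -= chunk;
        m_timer.start(static_cast<int>(chunk));
    }

    QTimer m_timer;
    qint64 m_remaining = 0;
};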
I would implement it in the following way:
Upon timer start, note the current time in milliseconds, like this:
m_timerStartTime = QDateTime::currentMSecsSinceEpoch();
Then I would start a timer with some large interval, such as 10 hours, and attach a handler function to the timer that simply compares the time elapsed since the start to see whether we are due:
if (QDateTime::currentMSecsSinceEpoch() - m_timerStartTime > WANTED_DELAY_TIME) {
    // Execute the timer payload
    // Stop the interval timer
}
This simple approach could be improved in several ways. For example, to keep the timer running even if the application is stopped and restarted, simply save the timer start time in a setting or other persistent storage and read it back at application start-up.
And to improve precision, simply change the interval from within the timer handler function on the last iteration so that it hits the intended end time exactly (instead of overshooting by up to the 10-hour check interval).
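Putting the answer's pieces together in a self-contained sketch (the 60-day delay and the variable names are mine; WANTED_DELAY_TIME from the fragment above becomes wantedDelayMs):

#include <QCoreApplication>
#include <QDateTime>
#include <QTimer>
#include <QDebug>

int main(int argc, char *argv[])
{
    QCoreApplication app(argc, argv);

    const qint64 wantedDelayMs  = 60LL * 24 * 60 * 60 * 1000;   // 60 days, beyond QTimer's ~24-day limit
    const qint64 timerStartTime = QDateTime::currentMSecsSinceEpoch();

    auto *checkTimer = new QTimer(&app);
    QObject::connect(checkTimer, &QTimer::timeout, [=] {
        if (QDateTime::currentMSecsSinceEpoch() - timerStartTime > wantedDelayMs) {
            checkTimer->stop();                                 // stop the interval timer
            qDebug() << "Long delay elapsed - execute the payload here";
        }
    });
    checkTimer->start(10 * 60 * 60 * 1000);                     // coarse 10-hour check

    return app.exec();
}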

Multitasking issue

I am creating a game. The game has a 100-second timer which counts down to zero: 99, 98, 97...
Now, when we lose the game and retry, the timer decrements by 2, like 98, 96, 94...
If we lose again and retry, this time the difference is 3...
I noticed that when we lose and retry, the timer function is called twice, so it decrements by 2; similarly, if we retry for the third time, the timer function is called three times, and so on.
What is causing this? Urgent help required, please.
Perhaps your timer is being started each time you retry. The first time, you have one timer running. Then the second time, you have two timers, and the third time three... You may need to stop the previous timer before retrying so that you only have one timer running.
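A hedged sketch of the likely fix, assuming a Qt-style game class with a member QTimer (the names retryGame, onTick and m_timer are mine):

void Game::retryGame()
{
    m_secondsLeft = 100;

    m_timer.stop();       // make sure the timer from the previous round isn't still running
    // Qt::UniqueConnection (Qt 5.6+ for member-function connects) prevents hooking up the
    // same slot again on every retry, which is what makes the countdown jump by 2, 3, ... ticks.
    connect(&m_timer, &QTimer::timeout, this, &Game::onTick, Qt::UniqueConnection);
    m_timer.start(1000);
}

void Game::onTick()
{
    --m_secondsLeft;      // now decrements exactly once per second
    if (m_secondsLeft <= 0)
        m_timer.stop();
}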

Modify time for simulation in C++

I am writing a program which simulates an activity, and I am wondering how to speed up time for the simulation, so that, say, 1 hour in the real world equals 1 month in the program.
Thank you.
The program is actually similar to a restaurant simulation where you don't really know when a customer will come. Let's say we pick a random number of customers (2-10) every hour.
It depends on how the program gets the current time.
For example, if it calls the Linux system call time(), just replace that with your own function (say, mytime) which returns a sped-up time. Perhaps mytime calls time() and multiplies the returned value by whatever factor makes sense; 1 hour = 1 month is a factor of 720. Treating the program's start-up moment as the origin should also be accounted for:
#include <time.h>

time_t t0;                          // wall-clock time at program start
time_t mytime (time_t *);           // forward declaration: drop-in replacement for time()

int main (void)
{
    t0 = time (NULL);               // record the origin at program initialization
    ....
    for (;;)
    {
        time_t sim_time = mytime (NULL);
        // yada yada yada
        ...
    }
}

time_t mytime (time_t *)
{
    return 720 * (time (NULL) - t0);    // account for time since the program started
                                        // and magnify by 720, so one hour is one month
}
You just do it. You decide how many events take place in an hour of simulation time (e.g., if an event takes place once a second, then after 3600 simulated events you've simulated an hour of time). There's no need for your simulation to run in real time; you can run it as fast as you can calculate the relevant numbers.
It sounds like you are implementing a Discrete Event Simulation. You don't even need to have a free-running timer (no matter what scaling you may use) in such a situation. It's all driven by the events. You have a priority queue containing events, ordered by the event time. You have a processing loop which takes the event at the head of the queue, and advances the simulation time to the event time. You process the event, which may involve scheduling more events. (For example, the customerArrived event may cause a customerOrdersDinner event to be generated 2 minutes later.) You can easily simulate customers arriving using random().
The other answers I've read thus far are still assuming you need a continuous timer, which is usually not the most efficient way of simulating an event-driven system. You don't need to scale real time to simulation time, or have ticks. Let the events drive time!
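A compact sketch of such a discrete event simulation, following the answer's restaurant example (the event names and timings are illustrative):

#include <cstdio>
#include <cstdlib>
#include <functional>
#include <queue>
#include <vector>

struct Event {
    double time;                              // simulated minutes since opening
    std::function<void()> action;
    bool operator>(const Event &other) const { return time > other.time; }
};

int main()
{
    std::priority_queue<Event, std::vector<Event>, std::greater<Event>> events;
    double now = 0.0;                         // simulation clock, advanced only by events

    auto schedule = [&](double at, std::function<void()> action) {
        events.push({at, std::move(action)});
    };

    // Each arrival schedules a dinner order 2 minutes later and the next arrival 1-30 minutes later.
    std::function<void()> customerArrived = [&] {
        std::printf("t=%6.1f min: customer arrived\n", now);
        schedule(now + 2.0, [&] { std::printf("t=%6.1f min: customerOrdersDinner\n", now); });
        schedule(now + (std::rand() % 30 + 1), customerArrived);
    };

    schedule(0.0, customerArrived);

    // Main loop: pop the earliest event, jump the clock to its time, process it.
    while (!events.empty() && now < 8 * 60) { // simulate an 8-hour day as fast as it can be computed
        Event e = events.top();
        events.pop();
        now = e.time;
        e.action();
    }
}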
If the simulation is data-dependent (like a stock market program), just speed up the rate at which the data is pumped. If it is something that depends on time() calls, you will have to do something like wallyk's answer (assuming you have the source code).
If time in your simulation is discrete, one option is to structure your program so that something happens "every tick".
Once you do that, time in your program is arbitrarily fast.
Is there really a reason for having a month of simulation time correspond exactly to an hour of time in the real world ? If yes, you can always process the number of ticks that correspond to a month, and then pause the appropriate amount of time to let an hour of "real time" finish.
Of course, a key variable here is the granularity of your simulation, i.e. how many ticks correspond to a second of simulated time.
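A small sketch of that tick-based structuring, assuming one tick per simulated minute (the scale factor and the optional pause are mine):

#include <chrono>
#include <cstdio>
#include <thread>

int main()
{
    const long long ticksPerMonth = 30LL * 24 * 60;            // one tick = one simulated minute
    const auto start = std::chrono::steady_clock::now();

    for (long long tick = 0; tick < ticksPerMonth; ++tick) {
        // advanceSimulationByOneMinute();                     // hypothetical: update the simulated state here
    }
    std::printf("Simulated one month of activity.\n");

    // Only if a simulated month really must take exactly one real hour:
    // sleep away whatever is left of that hour.
    std::this_thread::sleep_until(start + std::chrono::hours(1));
}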