Two independent timers with two different time periods running simultaneously.
Using SetWaitableTimer.
SetWaitableTimer(mhandle, &liDueTime, 0, onCallFunc, this, TRUE);
What does lPeriod indicate?
After the first expiry at lpDueTime, the timer goes off every lPeriod milliseconds, unless:
lPeriod is zero, in which case the timer goes off only once, or
lPeriod is less than zero, in which case SetWaitableTimer fails (negative periods are not allowed).
For more details, see the documentation.
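A minimal sketch of a periodic waitable timer (Win32; the 5-second due time and 1000 ms period chosen here are arbitrary). Unlike the call above, this variant waits on the timer handle instead of passing an APC completion routine:

#include <windows.h>
#include <cstdio>

int main()
{
    // Auto-reset waitable timer.
    HANDLE hTimer = CreateWaitableTimer(NULL, FALSE, NULL);
    if (!hTimer) return 1;

    LARGE_INTEGER liDueTime;
    liDueTime.QuadPart = -50000000LL;   // first expiry 5 s from now (relative time, 100-ns units)

    // lPeriod = 1000 -> after the first expiry the timer re-fires every 1000 ms.
    // lPeriod = 0 would make it one-shot; a negative lPeriod makes the call fail.
    if (!SetWaitableTimer(hTimer, &liDueTime, 1000, NULL, NULL, FALSE)) return 1;

    for (int i = 0; i < 5; ++i)
    {
        WaitForSingleObject(hTimer, INFINITE);   // blocks until the timer is signaled
        std::printf("timer fired\n");
    }

    CancelWaitableTimer(hTimer);
    CloseHandle(hTimer);
    return 0;
}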
Qt 5.7 32-bit on Windows 10 64-bit
Long-period timer
The interval of a QTimer is given in milliseconds as a signed 32-bit integer, so the maximum interval that can be set is a little more than 24 days (2^31 / (1000*3600*24) ≈ 24.85).
I need a timer with intervals going far beyond this limit.
So my question is: which alternative do you recommend? std::chrono (C++11) does not seem suitable, as it has no event handler.
Alain
You could always create your own class which uses multiple QTimers, each running for as long an interval as is valid, and just count how many have elapsed.
Pretty simple problem. If you can only count to 10 and you need to count to 100 - just count to 10 ten times.
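A rough sketch of that idea in Qt: chain single-shot timers whose individual intervals fit in QTimer's int range, and keep going until the whole delay has elapsed. The class name LongTimer and its members are invented here, and a plain std::function callback stands in for a signal to keep the sketch self-contained:

#include <QObject>
#include <QTimer>
#include <functional>
#include <limits>

class LongTimer : public QObject
{
public:
    explicit LongTimer(QObject *parent = nullptr) : QObject(parent) {}

    void start(qint64 totalMsecs, std::function<void()> onTimeout)
    {
        m_remaining = totalMsecs;
        m_onTimeout = std::move(onTimeout);
        startChunk();
    }

private:
    void startChunk()
    {
        // Never hand QTimer more than it can count to (a bit over 24 days).
        const qint64 chunk = qMin<qint64>(m_remaining, std::numeric_limits<int>::max());
        m_remaining -= chunk;
        QTimer::singleShot(static_cast<int>(chunk), this, [this] {
            if (m_remaining > 0)
                startChunk();      // "count to 10" again
            else
                m_onTimeout();     // the full interval has elapsed
        });
    }

    qint64 m_remaining = 0;
    std::function<void()> m_onTimeout;
};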
I would implement this in the following way:
Upon timer start, note the current time in milliseconds like this:
m_timerStartTime = QDateTime::currentMSecsSinceEpoch();
Then, I would start a timer at some large interval, such as 10 hours, and attach a handler function to the timer that simply compares the time since the start to see whether we are due:
if (QDateTime::currentMSecsSinceEpoch() - m_timerStartTime > WANTED_DELAY_TIME) {
    // Execute the timer payload
    // Stop interval timer
}
This simple approach could be improved in several ways. For example, to keep the timer running even if the application is stopped and restarted, simply save the timer start time in a setting or other persistent storage, and read it back in at application start-up.
And to improve precision, simply change the interval from within the timer handler function on the last iteration so that it hits the intended end time exactly (instead of overshooting by up to one full 10-hour interval).
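A compact sketch tying the pieces above together (the class name WallClockTimer and the callback parameter are mine): a coarse QTimer wakes up every 10 hours and simply compares the wall clock against the recorded start time:

#include <QDateTime>
#include <QObject>
#include <QTimer>
#include <functional>

class WallClockTimer : public QObject
{
public:
    void start(qint64 wantedDelayMs, std::function<void()> payload)
    {
        m_timerStartTime = QDateTime::currentMSecsSinceEpoch();   // note the start time
        connect(&m_timer, &QTimer::timeout, [this, wantedDelayMs, payload] {
            if (QDateTime::currentMSecsSinceEpoch() - m_timerStartTime > wantedDelayMs) {
                payload();        // execute the timer payload
                m_timer.stop();   // stop the interval timer
            }
        });
        m_timer.start(10 * 60 * 60 * 1000);   // coarse 10-hour wake-up
    }

private:
    QTimer m_timer;
    qint64 m_timerStartTime = 0;
};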
I am trying to start a timer at a specific time, like 02:30, so that it fires at 02:30 every day.
Is that possible? Do you have any idea?
Thanks a lot.
QTimer doesn't handle specific times of day natively, but you could use it in conjunction with QDateTime to get what you want. That is, use QDateTime objects to figure out how many milliseconds lie between right now and 2:30 (QDateTime::msecsTo() looks particularly appropriate here), then set your QTimer to go off after that many milliseconds. Repeat as necessary.
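A hedged sketch of that idea; the function name scheduleDailyAt and its parameters are invented here, not taken from the thread:

#include <QDateTime>
#include <QTime>
#include <QTimer>
#include <functional>

void scheduleDailyAt(QObject *context, const QTime &when, const std::function<void()> &task)
{
    const QDateTime now = QDateTime::currentDateTime();
    QDateTime target(now.date(), when);
    if (target <= now)
        target = target.addDays(1);                    // 02:30 already passed today -> use tomorrow

    const qint64 msecs = now.msecsTo(target);          // QDateTime::msecsTo() gives the delay
    QTimer::singleShot(static_cast<int>(msecs), context, [context, when, task] {
        task();                                        // do the daily work
        scheduleDailyAt(context, when, task);          // repeat as necessary
    });
}

// Hypothetical usage: scheduleDailyAt(parentObject, QTime(2, 30), [] { /* daily job */ });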
Depending on the required resolution, you could use an ordinary QTimer that fires, let's say, every minute.
In the timerEvent, you could check whether you have reached the right time (using QDateTime) and trigger the necessary event.
Jeremy's solution is indeed elegant, but it doesn't take daylight saving time into account.
To guard against that, you should fire a timer event every hour and check the wall clock.
Calculate the delta to the target, as Jeremy proposes, and if it falls within the coming hour, set a timer to fire at that point and disable the hourly timer.
If not, just wait for the hourly timer to fire again.
Pseudo code:
Get wall clock time
Calculate difference between target time and wall clock
If difference < 1 hour:
    Set timer to fire after difference secs
    If this is a repeating event, restart the hourly timer
Else:
    Start watch timer to do this calculation again after one hour
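A rough C++ rendering of that pseudocode, as a variant of the earlier daily-scheduling sketch (the name watchDailyAt is made up here): an hourly timer keeps re-reading the wall clock, and the precise timer is only armed once the target is less than an hour away:

#include <QDateTime>
#include <QTime>
#include <QTimer>
#include <functional>

void watchDailyAt(QObject *context, const QTime &target, const std::function<void()> &task)
{
    const QDateTime now = QDateTime::currentDateTime();       // get wall clock time
    QDateTime due(now.date(), target);
    if (due <= now)
        due = due.addDays(1);

    const qint64 msecs = now.msecsTo(due);                    // difference to the target
    if (msecs < 60 * 60 * 1000) {
        // Within the coming hour: fire at the target, then restart the hourly watch.
        QTimer::singleShot(static_cast<int>(msecs), context, [context, target, task] {
            task();
            watchDailyAt(context, target, task);
        });
    } else {
        // Not yet: look at the wall clock again in one hour.
        QTimer::singleShot(60 * 60 * 1000, context, [context, target, task] {
            watchDailyAt(context, target, task);
        });
    }
}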
I am creating a game. The game has a 100-second timer which counts down to zero: 99, 98, 97...
When we lose the game and retry, the timer decrements by 2 instead: 98, 96, 94...
If we lose again and retry, this time the difference is 3...
I noticed that when we lose and retry, the timer function is called twice, so it decrements by 2; similarly, if we retry for the 3rd time, the timer function is called three times, and so on.
What is this issue? Please help, this is urgent.
Perhaps your timer is being started each time you retry. The first time, you have one timer running. Then the second time, you have two timers, and the third time three... You may need to stop the previous timer before retrying so that you only have one timer running.
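A minimal sketch of that fix, assuming the countdown happens to be driven by a QTimer (the names Game, m_timer, m_secondsLeft and onTick are invented for illustration); the key point is that the previous timer is stopped before a new one is started:

void Game::retry()
{
    if (m_timer) {
        m_timer->stop();               // make sure the previous countdown can no longer fire
        m_timer->deleteLater();
    }
    m_timer = new QTimer(this);
    connect(m_timer, &QTimer::timeout, this, &Game::onTick);
    m_secondsLeft = 100;
    m_timer->start(1000);              // exactly one timer driving the countdown
}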
I am writing a program which simulates an activity, and I am wondering how to speed up time for the simulation; let's say 1 hour in the real world is equal to 1 month in the program.
Thank you.
The program is actually similar to a restaurant simulation where you don't really know when customers come. Let's say we pick a random number (2-10) of customers every hour.
It depends on how the program gets the current time.
For example, if it calls the Linux system function time(), just replace that with your own function (say, mytime) which returns speedier times. Perhaps mytime calls time() and multiplies the elapsed time by whatever factor makes sense; 1 hour = 1 month is a factor of 720. The origin should be taken as the moment the program starts:
#include <time.h>

time_t t0;                       /* wall-clock time at program initialization */

time_t mytime(time_t *);

int main(void)
{
    t0 = time(NULL);             /* record the origin at program start-up */

    for (;;)
    {
        time_t sim_time = mytime(NULL);
        /* ... run the simulation using sim_time ... */
    }
}

time_t mytime(time_t *tloc)      /* drop-in replacement for time() */
{
    (void)tloc;                  /* parameter kept only to mirror time()'s signature */
    /* account for time since the program started and magnify by 720,
       so one real hour is one simulated month */
    return 720 * (time(NULL) - t0);
}
You just do it. You decide how many events take place in an hour of simulation time (e.g., if an event takes place once a second, then after 3600 simulated events you've simulated an hour of time). There's no need for your simulation to run in real time; you can run it as fast as you can calculate the relevant numbers.
It sounds like you are implementing a Discrete Event Simulation. You don't even need to have a free-running timer (no matter what scaling you may use) in such a situation. It's all driven by the events. You have a priority queue containing events, ordered by the event time. You have a processing loop which takes the event at the head of the queue, and advances the simulation time to the event time. You process the event, which may involve scheduling more events. (For example, the customerArrived event may cause a customerOrdersDinner event to be generated 2 minutes later.) You can easily simulate customers arriving using random().
The other answers I've read thus far are still assuming you need a continuous timer, which is usually not the most efficient way of simulating an event-driven system. You don't need to scale real time to simulation time, or have ticks. Let the events drive time!
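A minimal sketch of such a discrete event simulation in C++ (all names and the 2-minute follow-up are illustrative, not from the thread); the priority queue is ordered by event time, and the loop jumps straight from one event to the next:

#include <cstdio>
#include <cstdlib>
#include <functional>
#include <queue>
#include <vector>

struct Event {
    double time;                         // simulated time, in minutes
    std::function<void()> action;
};

struct Later {                           // orders the queue so the earliest event comes first
    bool operator()(const Event &a, const Event &b) const { return a.time > b.time; }
};

int main()
{
    std::priority_queue<Event, std::vector<Event>, Later> events;

    // Schedule 2-10 customer arrivals at random times within the first simulated hour.
    const int customers = 2 + std::rand() % 9;
    for (int i = 0; i < customers; ++i) {
        const double arrival = std::rand() % 60;
        events.push({arrival, [&events, arrival] {
            std::printf("t=%5.1f  customer arrives\n", arrival);
            // The arrival schedules a follow-up event two minutes later.
            events.push({arrival + 2.0, [arrival] {
                std::printf("t=%5.1f  customer orders dinner\n", arrival + 2.0);
            }});
        }});
    }

    // Processing loop: pop the earliest event, jump simulation time straight to it, run it.
    while (!events.empty()) {
        Event e = events.top();
        events.pop();
        e.action();   // e.time is the new "current" simulation time; nothing ticks in between
    }
    return 0;
}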
If the simulation is data dependent (like a stock market program), just speed up the rate at which the data is pumped. If it is something that depends on time() calls, you will have to do something like wallyk's answer (assuming you have the source code).
If time in your simulation is discrete, one option is to structure your program so that something happens "every tick".
Once you do that, time in your program is arbitrarily fast.
Is there really a reason for having a month of simulation time correspond exactly to an hour of time in the real world? If yes, you can always process the number of ticks that correspond to a month, and then pause for the appropriate amount of time to let an hour of "real time" finish.
Of course, a key variable here is the granularity of your simulation, i.e. how many ticks correspond to a second of simulated time.
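A small sketch of the tick-based idea (the one-simulated-second tick granularity and the names are choices made here); the inner loop runs as fast as the CPU allows, and the optional sleep_until call paces it so that one simulated month takes one real hour:

#include <chrono>
#include <cstdio>
#include <thread>

int main()
{
    const long long ticksPerSimMonth = 720LL * 3600;          // 720 simulated hours, 1 tick per simulated second
    const auto realTimePerSimMonth = std::chrono::hours(1);   // pace: 1 simulated month per real hour

    const auto start = std::chrono::steady_clock::now();

    for (long long month = 1; month <= 3; ++month) {
        for (long long tick = 0; tick < ticksPerSimMonth; ++tick) {
            // update the simulated world by one second here, as fast as the CPU allows
        }
        std::printf("simulated month %lld finished\n", month);

        // Optional pacing: only needed if a month really must take one real hour.
        std::this_thread::sleep_until(start + month * realTimePerSimMonth);
    }
    return 0;
}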
I was just trying the SetTimer method in Win32 with some low values, such as 10 ms, as the timeout period. I calculated the time it took to get 500 timer events and expected it to be around 5 seconds. Surprisingly, I found that it takes about 7.5 seconds to get that many events, which means it is timing out at about 16 ms. Is there any limitation on the value we can set for the timeout period (I couldn't find anything on MSDN)? Also, do the other processes running on my system affect these timer messages?
OnTimer is based on the WM_TIMER message, which has a low priority, meaning it will be sent only when there's no other message waiting.
Also, MSDN explains that you cannot set an interval less than USER_TIMER_MINIMUM, which is 10 ms.
Regardless of that, the scheduler will honor the time quantum.
Windows is not a real-time OS and can't handle that kind of precision (10 ms intervals). Having said that, there are multiple kinds of timers and some have better precision than others.
You can alter the granularity of the system timer down to 1 ms using the multimedia timer functions; this is intended for MIDI work.
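This is typically done with timeBeginPeriod/timeEndPeriod from the multimedia timer API; a small sketch (the function name and values here are chosen for illustration):

#include <windows.h>
#include <mmsystem.h>
#pragma comment(lib, "winmm.lib")

void timingSensitiveWork()
{
    timeBeginPeriod(1);   // request 1 ms system timer resolution (affects the whole system)
    Sleep(10);            // sleeps and timers now resolve much closer to the requested 10 ms
    timeEndPeriod(1);     // always pair with timeEndPeriod to restore the default resolution
}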
Basically, my experiences on w2k are that any requested wait period under 13ms returns a wait which oscillates randomly between two values, 0ms and 13ms. Timers longer than that are generally very accurate. Your 500 timer events - some were 0ms, some were 13ms (assuming 13ms is still correct). You ended up with a time shortfall.
As stated, Windows is not a real-time OS. Asking it to do anything and expecting it to happen at a specific time later is a fool's errand. Setting a timer asks Windows nicely to fire the WM_TIMER event as soon after the requested time has passed as is possible. This may be after other threads are dealt with and done. Therefore the actual time at which you see the WM_TIMER event can't realistically be predicted; all you know is that it is later than the time you set...
Check out this article on Windows time.