I am new to C++ and I just can't get this to work at all. I am trying to get the current system time in ms and do something with it, but it won't work. Here is what I have tried.
Qt
QDateTime qt = new QDateTime();
int x = qt.currentDateTimeUtc();
if (x % 5 == 0) {
    //something
}
C++
double sysTime = time(0);
if (sysTime % 5.00 == 0.00) {
}
I get an "invalid operands of type double to binary operator" error. I have no idea why. Can anyone point me in the right direction?
For Qt, try using the function QDateTime::toMSecsSinceEpoch()
http://doc.qt.io/qt-5/qdatetime.html#toMSecsSinceEpoch
This will return a qint64 http://doc.qt.io/qt-5/qtglobal.html#qint64-typedef
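For example, a minimal sketch (the helper name and the % 5 check are mine, just to mirror the question's code):

#include <QDateTime>

bool onFiveMillisecondBoundary()
{
    // qint64 is an integer type, so % compiles here (unlike on a double)
    qint64 ms = QDateTime::currentDateTimeUtc().toMSecsSinceEpoch();
    return ms % 5 == 0;
}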
If you're trying to get the Unix timestamp in milliseconds in C you can try this code:
#include <time.h>
...
time_t seconds;
time(&seconds);
unsigned long long millis = (unsigned long long)seconds * 1000;
Though please note this is just multiplied by 1000 - it looks like milliseconds, but the accuracy is still that of seconds. Judging by your x % 5 code, that might be enough if you're trying to do something every 5 seconds, in which case the following is sufficient:
time_t seconds; time(&seconds);
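For example, a minimal sketch of that seconds-only check (the helper name is mine); % compiles here because time_t is an integral type on typical platforms, which is also why the double version in the question failed:

#include <time.h>

int onFiveSecondBoundary(void)
{
    time_t seconds;
    time(&seconds);
    /* for a double you would need fmod() instead of % */
    return seconds % 5 == 0;
}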
Related
I'm trying to measure the CPU and Wall time for my program.
The code should run on Windows so it's alright to use platform specific functions.
For Wall time I use QueryPerformanceCounter() and it is precise.
When I use GetProcessTimes() I get a 15.625 millisecond precision.
On MSDN it says that the precision of the returned CPU time is 100 nanoseconds.
Here is the code I am using:
void getCPUtime(unsigned long long *pUser, unsigned long long *pKernel) {
    FILETIME user, kernel, exit, start;
    ULARGE_INTEGER userCPU, kernelCPU;
    if (::GetProcessTimes(::GetCurrentProcess(), &start, &exit, &kernel, &user) != 0) {
        userCPU.LowPart    = user.dwLowDateTime;
        userCPU.HighPart   = user.dwHighDateTime;
        kernelCPU.LowPart  = kernel.dwLowDateTime;
        kernelCPU.HighPart = kernel.dwHighDateTime;
    }
    *pUser   = (unsigned long long)userCPU.QuadPart;
    *pKernel = (unsigned long long)kernelCPU.QuadPart;
}
And I am calling it from:
void someFunction() {
    unsigned long long userStartCPU, userEndCPU, kernelStartCPU, kernelEndCPU;
    double userCPUTime, kernelCPUTime;
    getCPUtime(&userStartCPU, &kernelStartCPU);
    // Do stuff which takes longer than a millisecond
    getCPUtime(&userEndCPU, &kernelEndCPU);
    userCPUTime   = (userEndCPU   - userStartCPU)   / 10000.0; // 100 ns units -> milliseconds
    kernelCPUTime = (kernelEndCPU - kernelStartCPU) / 10000.0; // 100 ns units -> milliseconds
}
Does anyone know why this is happening, or has any other way to precisely measure CPU time on Windows?
MSDN has this page that outlines using a high resolution timer.
I would recommend looking at Google Benchmark. Looking at the Windows-specific code, you might need to use double instead of integers as used in the MakeTime function here.
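If it helps, here is a rough sketch of that idea - folding the two FILETIME words returned by GetProcessTimes into a double millisecond value (the helper name is mine, not part of Google Benchmark):

#include <windows.h>

// FILETIME stores CPU time as 100-nanosecond units split across two 32-bit words.
double fileTimeToMilliseconds(const FILETIME &ft)
{
    ULARGE_INTEGER t;
    t.LowPart  = ft.dwLowDateTime;
    t.HighPart = ft.dwHighDateTime;
    return (double)t.QuadPart / 10000.0; // 10,000 x 100 ns = 1 ms
}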
I develop games in C# and recently migrated to C++. I want to rewrite my game in C++, and I have almost successfully translated all of the code from C# to C++, but there is one problem: my game uses System.DateTime.Now.Ticks in C#... and I am stuck.
So, is there something equivalent to System.DateTime.Now.Ticks in C++?
Look at this tutorial on cplusplus.com for a platform-independent approach.
The clock() function returns the number of clock ticks this process has consumed. To access this function, you need to include: #include <time.h>.
To get the same thing in seconds, you can use the CLOCKS_PER_SEC macro, so you can do clock() / CLOCKS_PER_SEC to get the number of seconds this process took. But remember that you might lose some precision this way.
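For instance, a minimal sketch of that clock() / CLOCKS_PER_SEC idea:

#include <stdio.h>
#include <time.h>

int main(void)
{
    clock_t start = clock();          /* processor ticks consumed so far */

    /* ... do some work ... */

    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("used %f seconds of processor time\n", secs);
    return 0;
}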
You might not need this exact functionality though... If you are looking for the exact time that has elapsed since the program started, you (as far as I can remember) must use the difftime() function. You might lose some precision if you need exact ticks, depending on your platform.
This way, you save the current time at the beginning of your application and subtract it from the current time later in the application.
#include <stdio.h>
#include <time.h>

time_t programstart;

int main ()
{
    time(&programstart); //assign the current time

    /*... the program ...*/

    //get seconds since start
    double secondssincestart = difftime(time(NULL), programstart);
    printf ("the program took: %f seconds\n", secondssincestart);
    return 0;
}
EDIT:
Since this post still gets some attention, it is important to note that today's C++ (C++11 and later) has a standard, easy-to-use, and pretty handy library for this: <chrono>.
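For example, a minimal <chrono> sketch (the function name is mine; note the epoch is the system_clock epoch, not year 0001 as in .NET, so the values are not interchangeable with DateTime.Ticks):

#include <chrono>
#include <cstdint>
#include <ratio>

std::int64_t nowTicks()
{
    using namespace std::chrono;
    using tick = duration<std::int64_t, std::ratio<1, 10000000>>; // 100-nanosecond units
    return duration_cast<tick>(system_clock::now().time_since_epoch()).count();
}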
Use QueryUnbiasedInterruptTime. Like DateTime.Ticks, each tick of QueryUnbiasedInterruptTime is also 100 nanoseconds. If you need higher resolution, you need to go for QueryPerformanceCounter.
Link: http://msdn.microsoft.com/en-us/library/windows/desktop/ee662306(v=vs.85).aspx
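A minimal sketch of that call (the wrapper name is mine); the ticks are 100-nanosecond units counted since boot, excluding time spent suspended, so the unit matches DateTime.Ticks but the epoch does not:

#include <windows.h>

ULONGLONG unbiasedTicks()
{
    ULONGLONG ticks = 0;
    QueryUnbiasedInterruptTime(&ticks); // available on Windows 7 and later
    return ticks;
}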
There isn't an equivalent class or method, but you can do this:
SYSTEMTIME systemTime;
GetLocalTime(&systemTime);

FILETIME fileTime;
SystemTimeToFileTime(&systemTime, &fileTime);

ULARGE_INTEGER largeInteger;
largeInteger.LowPart = fileTime.dwLowDateTime;
largeInteger.HighPart = fileTime.dwHighDateTime;

// 100-nanosecond ticks, but counted from the FILETIME epoch (January 1, 1601),
// not from January 1, 0001 as DateTime.Ticks is
ULONGLONG ticks = largeInteger.QuadPart;
In Windows with MFC, here is the equivalent of System.DateTime.Now.Ticks:
ULONGLONG GetTicksNow()
{
    // COleDateTime cannot represent year 1, so start the epoch at year 100
    COleDateTime epoch(100, 1, 1, 00, 00, 00);
    COleDateTime currTime = COleDateTime::GetCurrentTime();
    COleDateTimeSpan span = currTime - epoch;

    CTimeSpan cSpan(span.GetDays(), span.GetHours(), span.GetMinutes(),
                    span.GetSeconds());
    ULONGLONG diff = cSpan.GetTotalSeconds();

    // add back the days of years 1..99 that the shifted epoch above skips
    LONG missingDays = 365 * 99 + 24;
    CTimeSpan centSpan(missingDays, 0, 0, 0);
    ULONGLONG centSeconds = centSpan.GetTotalSeconds();

    // convert whole seconds into 100-nanosecond ticks
    ULONGLONG totSec = (diff + centSeconds) * 10000000;
    return totSec;
}
I have found a function to get milliseconds since the Mac was started:
U32 Platform::getRealMilliseconds()
{
    // Duration is a S32 value:
    //  if negative, it is in microseconds;
    //  if positive, it is in milliseconds.
    Duration durTime = AbsoluteToDuration(UpTime());
    U32 ret;
    if (durTime < 0)
        ret = durTime / -1000;
    else
        ret = durTime;

    return ret;
}
The problem is that after ~20 days AbsoluteToDuration returns INT_MAX all the time until the Mac is rebooted.
I have tried to use the method below; it worked, but it looks like gettimeofday takes more time and slows the game down a bit:
timeval tim;
gettimeofday(&tim, NULL);
U32 ret = ((tim.tv_sec) * 1000 + tim.tv_usec/1000.0) + 0.5;
Is there a better way to get number of milliseconds elapsed since some epoch (preferably since the app started)?
Thanks!
Your real problem is that you are trying to fit an uptime-in-milliseconds value into a 32-bit integer. If you do that your value will always wrap back to zero (or saturate) in 49 days or less, no matter how you obtain the value.
One possible solution would be to track time values with a 64-bit integer instead; that way the day of reckoning gets postponed for a few hundred years and so you don't have to worry about the problem. Here's a MacOS/X implementation of that:
uint64_t GetTimeInMillisecondsSinceBoot()
{
    return UnsignedWideToUInt64(AbsoluteToNanoseconds(UpTime())) / 1000000;
}
... or if you don't want to return a 64-bit time value, the next-best thing would be to record the current time-in-milliseconds value when your program starts, and then always subtract that value from the values you return. That way things won't break until your own program has been running for at least 49 days, which I suppose is unlikely for a game.
uint32_t GetTimeInMillisecondsSinceProgramStart()
{
    static uint64_t _firstTimeMillis = GetTimeInMillisecondsSinceBoot();
    uint64_t nowMillis = GetTimeInMillisecondsSinceBoot();
    return (uint32_t) (nowMillis - _firstTimeMillis);
}
My preferred method is mach_absolute_time - see this tech note - I use the second method, i.e. mach_absolute_time to get time stamps and mach_timebase_info to get the constants needed to convert the difference between time stamps into an actual time value (with nanosecond resolution).
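Roughly, that second method could look like this (the helper name is mine):

#include <mach/mach_time.h>
#include <stdint.h>

// Converts the difference between two mach_absolute_time() stamps to milliseconds.
uint64_t elapsedMillis(uint64_t start, uint64_t end)
{
    static mach_timebase_info_data_t tb = { 0, 0 };
    if (tb.denom == 0)
        mach_timebase_info(&tb);        // numer/denom scale ticks to nanoseconds
    return (end - start) * tb.numer / tb.denom / 1000000;
}

You would take one stamp with mach_absolute_time() when the app starts and subtract it from later stamps.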
I've seen some other answers on SO that suggest we can get the time from epoch in milliseconds by subtracting the epoch time from the "other" time, but it doesn't work when I try it:
ptime epoch = time_from_string("1970-01-01 00:00:00.000");
ptime other = time_from_string("2011-08-09 17:27:00.000");
long diff = (other-epoch).total_milliseconds();
At this stage diff is -1349172576 and it should be a positive number since the "other" time is 2011. Does anybody know what might be causing this? What's the proper way to get the milliseconds since epoch?
Additionally, I've tried to construct a ptime object from milliseconds:
ptime result = from_time_t(diff);
Result then becomes: "1927-Apr-01 13:50:24" and it should be "2011-Aug-09 17:27:00.000". What's the catch here?
Update:
OK, so my mistake stems from the fact that I have 2 programs, one in C# (where long is 8 bytes/64 bits) and one in C++ (where long is 4 bytes/32 bits); in any case, that interaction is not depicted here.
However, when I use long long, the value is positive but the resulting date (constructed from_time_t) is still incorrect: "2012-Oct-02 10:09:36".
Presumably you're on a platform on which long is smaller than 64 bits.
Let's assume it's 32 bits – in that case, the maximum value of a long is 2147483647. However, it's been ~1312000000000 milliseconds since the epoch, so long is clearly insufficient to hold this value and consequently you're seeing overflow.
I'd do something like this instead:
ptime epoch = time_from_string("1970-01-01 00:00:00.000");
ptime other = time_from_string("2011-08-09 17:27:00.000");
time_duration const diff = other - epoch;
long long ms = diff.total_seconds();
ms *= 1000LL;
ms += diff.fractional_seconds() / 1000000L; // 1000L if you didn't build datetime
// with nanosecond resolution
Creating a ptime from the specified number of milliseconds has the same problem – ptime works in terms of long and you have a long long – so you'll essentially need to do the reverse:
// given long long ms
time_duration t = seconds(static_cast<long>(ms / 1000LL));
if (ms % 1000LL)
t += milliseconds(static_cast<long>(ms % 1000LL));
A shortened variation on ildjarn's great solution:
ptime epoch = time_from_string("1970-01-01 00:00:00.000");
ptime other = time_from_string("2011-08-09 17:27:00.001");
time_duration const diff = other - epoch;
long long ms = diff.total_milliseconds();
This would be independent of whether it was built with nanosecond resolution.
You could try:
ptime other = time_from_string("2011-08-09 17:27:00.000");
time_t posix_time = (other - ptime(boost::gregorian::date(1970, 1, 1))).total_seconds();
How do I convert from TickCounts to Milliseconds?
This is what I used:
long int before = GetTickCount();
// ... do something ...
long int after = GetTickCount();
I want the difference of it in seconds.
int seconds = (after - before) /1000;
For more precision, there is also QueryPerformanceCounter().
int seconds = (after - before + 500) / 1000;
or:
double seconds = (after - before) / 1000.0;
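If you do need the extra precision, here is a sketch of the QueryPerformanceCounter route mentioned above (the helper name is mine):

#include <windows.h>

double elapsedSeconds()
{
    LARGE_INTEGER freq, before, after;
    QueryPerformanceFrequency(&freq);   // counts per second
    QueryPerformanceCounter(&before);

    // ... do the work you want to time ...

    QueryPerformanceCounter(&after);
    return (double)(after.QuadPart - before.QuadPart) / (double)freq.QuadPart;
}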
GetTickCount() returns the time in milliseconds, so (after - before) / 1000 should give you the time in seconds.
I'm not sure what OS/platform you're using, but there should be a call that returns the tick time in milliseconds.
time = (after - before) * <tick time in milliseconds>;
Edit:
I see that this is a Windows function that returns milliseconds already. The other answers are better.