Is there a simple way to get a scaled Unix timestamp in C++?

I'm porting some PHP to C++. Some of our database code stores time values as Unix timestamps multiplied by 100.
The PHP contains code that looks a bit like this:
//PHP
static function getTickTime()
{
    return round(microtime(true) * 100);
}
I need something like this:
//C++
uint64_t getTickTime()
{
    ptime Jan1st1970(date(1970, 1, 1));
    ptime Now = microsecond_clock::local_time();
    time_duration diff = Now - Jan1st1970;
    return static_cast<uint64_t>(diff.total_seconds() * 100);
}
Is something like this sensible? Is there a neater solution?
Is there something nasty in this code that I can't see? (Guess I'm not experienced enough with boost::date_time to know these things)

The neatest and most portable solution is to use the time() function, defined in <ctime>, which returns the number of seconds since the Unix epoch.
If you do use boost, you'll want universal_time(), not local_time(), since the epoch is specified in UTC.
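For the asker's *100 scale, a minimal sketch using time() could look like this; note that time() only has second resolution, so the last two digits of the result are always zero:
#include <ctime>
#include <cstdint>

// Sketch: seconds since the Unix epoch, scaled by 100 as the PHP code does.
uint64_t getTickTime()
{
    return static_cast<uint64_t>(std::time(0)) * 100;
}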

The solution suggested by dauphic can be modified to something like this:
#include <sys/time.h>

uint64_t getTickTime()
{
    timeval tim;
    gettimeofday(&tim, NULL);
    // Cast before multiplying so a 32-bit time_t doesn't overflow.
    return static_cast<uint64_t>(tim.tv_sec) * 100 + tim.tv_usec / 10000;
}
I can't think of a neater solution than that.

Assuming you're on Unix, include <sys/time.h>:
timeval tim;
gettimeofday(&tim, NULL);
return tim.tv_usec;
If on Windows, there's no good way to get microsecond resolution, especially because it uses a different epoch. This sample requires only <windows.h>, I believe.
FILETIME tim;
GetSystemTimeAsFileTime(&tim);
ULARGE_INTEGER ms;
ms.LowPart = tim.dwLowDateTime;
ms.HighPart = tim.dwHighDateTime;
return ms.QuadPart * 100 - <nanoseconds from January 1, 1601 to January 1, 1970>; // ms.QuadPart counts 100-nanosecond intervals since 1601
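Putting the pieces together, here is a sketch of the asker's getTickTime() for Windows; the constant 116444736000000000 is the number of 100-nanosecond intervals between the FILETIME epoch (January 1, 1601) and the Unix epoch:
#include <windows.h>
#include <cstdint>

// Sketch: Unix time * 100 (hundredths of a second) on Windows.
uint64_t getTickTime()
{
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);
    ULARGE_INTEGER t;
    t.LowPart = ft.dwLowDateTime;
    t.HighPart = ft.dwHighDateTime;
    // 100-ns intervals since 1601 -> since 1970, then down to 1/100ths of a second.
    return (t.QuadPart - 116444736000000000ULL) / 100000ULL;
}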

Related

Getting time point with microseconds precision

I need to retrieve the current time point with a precision of microseconds. The time point can be relative to any fixed date.
How can this be achieved? Due to job policy, I really shouldn't use Boost or any other library.
I'm working on a multiplatform application. Under Linux, I can use the C++11 system_clock::now().time_since_epoch(), but under Windows I work with VS2010, so I have no std::chrono library.
I've seen the RtlTimeToSecondsSince1970 function, but its resolution is one second.
Timers and timing are a tricky enough subject that, in my opinion, current cross-platform implementations are not quite up to scratch. So I'd recommend a specific version for Windows with appropriate #ifdefs. See other answers if you want a cross-platform version.
If you've got to (or want to) use a Windows-specific call, then GetSystemTimeAsFileTime (or, on Windows 8, GetSystemTimePreciseAsFileTime) is the best call for getting UTC time, and QueryPerformanceCounter is good for high-resolution timestamps. GetSystemTimeAsFileTime gives back the number of 100-nanosecond intervals since January 1, 1601 (UTC) in a FILETIME structure.
This fine article goes into the gory details of measuring timers and timestamps in Windows and is well worth a read.
EDIT: To convert a FILETIME to microseconds, you need to go via a ULARGE_INTEGER.
FILETIME ft;
GetSystemTimeAsFileTime(&ft);
ULARGE_INTEGER li;
li.LowPart = ft.dwLowDateTime;
li.HighPart = ft.dwHighDateTime;
unsigned long long valueAsHns = li.QuadPart; // 100-ns intervals since 1601
unsigned long long valueAsUs = valueAsHns / 10;
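If you need the value relative to the Unix epoch rather than 1601, subtract the offset; the constant below is 11,644,473,600 seconds (1601 to 1970) expressed in microseconds:
// Continuing from the snippet above (valueAsUs is microseconds since 1601):
unsigned long long usSinceUnixEpoch = valueAsUs - 11644473600000000ULL;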
This code works for me in VS2010. The constructor tests whether high-precision timing is available on the processor, and currentTime() returns a timestamp in seconds. Compare timestamps to get a delta time. I use this in a game engine to get very small delta-time values. Note that precision isn't limited to seconds despite the return value being named so (it's a double).
Basically, you find out how many seconds per count with QueryPerformanceFrequency and read the current count with QueryPerformanceCounter.
////////////////////////
// Determines the performance counter frequency
////////////////////////
Timer::Timer()
{
    m_dSecondsPerCount = 0.0; // stays 0 if high-precision timing is unavailable
    __int64 _iCountsPerSec = 0;
    bool _bPerfExists = QueryPerformanceFrequency((LARGE_INTEGER*)&_iCountsPerSec) != 0;
    if (_bPerfExists)
    {
        m_dSecondsPerCount = 1.0 / static_cast<double>(_iCountsPerSec);
    }
}

////////////////////////
// Returns a timestamp in seconds (relative to an arbitrary origin)
////////////////////////
double Timer::currentTime() const
{
    __int64 time = 0;
    QueryPerformanceCounter((LARGE_INTEGER*)&time);
    double timeInSeconds = static_cast<double>(time) * m_dSecondsPerCount;
    return timeInSeconds;
}
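Usage might look like this (a sketch using the Timer class above):
Timer timer;
double t0 = timer.currentTime();
// ... do some work ...
double deltaSeconds = timer.currentTime() - t0; // elapsed time in seconds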
The following code works in Visual Studio:
#include <time.h>

clock_t start, end;

// Returns whole seconds elapsed since initSystemClock_bl() was called
// (the integer division truncates any fractional second).
int getTicks_u32()
{
    int cpu_time_used;
    end = clock();
    cpu_time_used = static_cast<int>(end - start) / CLOCKS_PER_SEC;
    return cpu_time_used;
}

void initSystemClock_bl(void)
{
    start = clock();
}

Inputs to sleep_until()

As per an answer to my previous question, I am trying to use sleep_until() but cannot find a tutorial that tells me how to use it with an actual time such as 14:00h.
There are examples like std::this_thread::sleep_until(system_clock::now() + seconds(10)); but I would like something that actually lets me specify a clock time. What format should this be in? I would appreciate any examples.
To wait until a specified clock time, you need to get a time_t that represents that clock time, and then use std::chrono::system_clock::from_time_t to create a std::chrono::system_clock::time_point which you can then use with the xxx_until functions such as std::this_thread::sleep_until.
e.g.
#include <ctime>
#include <chrono>
#include <thread>

void foo()
{
    tm timeout_tm = {0};
    // set timeout_tm to 14:00:01 on 10 July 2013
    timeout_tm.tm_year = 2013 - 1900;
    timeout_tm.tm_mon = 7 - 1;
    timeout_tm.tm_mday = 10;
    timeout_tm.tm_hour = 14;
    timeout_tm.tm_min = 0;
    timeout_tm.tm_sec = 1;
    timeout_tm.tm_isdst = -1; // let mktime determine whether DST is in effect
    time_t timeout_time_t = mktime(&timeout_tm);
    std::chrono::system_clock::time_point timeout_tp =
        std::chrono::system_clock::from_time_t(timeout_time_t);
    std::this_thread::sleep_until(timeout_tp);
}
Have a look here: std::chrono
There are all sorts of date/time representation formats you'll need.
You want to use mktime to build the time specification, which you can then use with sleep_until, as in the answer above.

C++: something equivalent to System.DateTime.Now.Ticks?

I develop games in C# and recently migrated to C++. I want to rewrite my game in C++, and I have translated almost all of the code from C# to C++, but there is one problem: my game uses System.DateTime.Now.Ticks in C#... and I am stuck.
So, is there something equivalent to System.DateTime.Now.Ticks in C++?
Look at this tutorial on cplusplus.com for a platform-independent way.
The clock() function returns the number of ticks this process has consumed. To access it, you need to #include <time.h>.
To get the same thing in seconds, use the CLOCKS_PER_SEC macro: clock() / CLOCKS_PER_SEC gives the number of seconds the process has taken. But remember that you might lose some precision this way.
You might not need this exact functionality, though. If you are looking for the exact time that has elapsed since the program started, you can (as far as I can remember) use the difftime() function. You might lose some precision if you need exact ticks, depending on your platform.
This way, you save the current time at the beginning of your application and subtract it from the current time later on:
#include <stdio.h>
#include <time.h>

time_t programstart;

int main()
{
    time(&programstart); // record the current time at startup
    /* ... the program ... */
    // get seconds since start
    double secondssincestart = difftime(time(NULL), programstart);
    printf("the program took: %f seconds\n", secondssincestart);
    return 0;
}
EDIT: Since this post still gets some attention, it is important to note that today's C++11 has the standard, easy-to-use, and pretty handy <chrono> library.
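For instance, here is a sketch of a DateTime.Now.Ticks-style value built on <chrono>. The constant 621355968000000000 is the number of 100-nanosecond ticks from .NET's epoch (January 1, 0001) to the Unix epoch; the code assumes system_clock counts from the Unix epoch, which is only guaranteed from C++20 but holds on common implementations:
#include <chrono>
#include <ratio>
#include <cstdint>

// Sketch: approximate System.DateTime.Ticks (UTC-based) with C++11.
int64_t dateTimeNowTicks()
{
    using namespace std::chrono;
    typedef duration<int64_t, std::ratio<1, 10000000> > net_ticks; // 100-ns units
    const int64_t epochShift = 621355968000000000LL; // 0001-01-01 -> 1970-01-01
    return epochShift +
           duration_cast<net_ticks>(system_clock::now().time_since_epoch()).count();
}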
Use QueryUnbiasedInterruptTime. Like DateTime.Ticks, each tick of QueryUnbiasedInterruptTime is also 100 nanoseconds. If you need higher resolution, go for QueryPerformanceCounter.
Link: http://msdn.microsoft.com/en-us/library/windows/desktop/ee662306(v=vs.85).aspx
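A minimal sketch of the call (note it counts 100-nanosecond ticks since boot, excluding time the machine spends suspended, so it is an interval timer rather than a calendar time):
#include <windows.h>

ULONGLONG getUnbiasedTicks()
{
    ULONGLONG ticks = 0;
    QueryUnbiasedInterruptTime(&ticks); // 100-ns units since boot
    return ticks;
}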
There isn't an equivalent class or method, but you can do this:
SYSTEMTIME systemTime;
GetLocalTime(&systemTime);
FILETIME fileTime;
SystemTimeToFileTime(&systemTime, &fileTime);
ULARGE_INTEGER largeInteger;
largeInteger.LowPart = fileTime.dwLowDateTime;
largeInteger.HighPart = fileTime.dwHighDateTime;
// 100-ns intervals since January 1, 1601 (note: .NET ticks count from January 1, 0001)
ULONGLONG ticks = largeInteger.QuadPart;
In Windows with MFC, the equivalent of System.DateTime.Now.Ticks:
ULONGLONG GetTicksNow()
{
    // COleDateTime cannot represent dates before the year 100, so start there
    COleDateTime epoch(100, 1, 1, 00, 00, 00);
    COleDateTime currTime = COleDateTime::GetCurrentTime();
    COleDateTimeSpan span = currTime - epoch;
    CTimeSpan cSpan(span.GetDays(), span.GetHours(), span.GetMinutes(),
                    span.GetSeconds());
    ULONGLONG diff = cSpan.GetTotalSeconds();
    // approximate the days from year 1 to year 100 that the epoch above misses
    LONG missingDays = 365 * 99 + 24;
    CTimeSpan centSpan(missingDays, 0, 0, 0);
    ULONGLONG centSeconds = centSpan.GetTotalSeconds();
    ULONGLONG totSec = (diff + centSeconds) * 10000000; // seconds -> 100-ns ticks
    return totSec;
}

Portable way of counting milliseconds in C++?

Is there any portable (Windows & Linux) way of counting how many milliseconds elapsed between two calls?
Basically, I want to achieve the same functionality as the StopWatch class of .NET (for those who have already used it).
In a perfect world, I would have used boost::date_time, but that's not an option here due to some silly rules I'm forced to respect.
For those who better read code, this is what I'd like to achieve:
Timer timer;
timer.start();
// Some instructions here
timer.stop();
// Print out the elapsed time
std::cout << "Elapsed time: " << timer.milliseconds() << "ms" << std::endl;
So, if there is a portable (set of) function(s) that can help me implement the Timer class, what is it? If there is no such function, what Windows & Linux APIs should I use to achieve this functionality (using #ifdef WINDOWS-like macros)?
Thanks!
On Linux (and generally on POSIX), you can use the gettimeofday function, which gives the number of seconds and microseconds since the Epoch. On Windows, there is the GetTickCount function, which returns the number of milliseconds since the system was started.
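A sketch wrapping the two calls named above behind one helper (the function name is mine, and note the two branches use different epochs, so only differences between two calls are meaningful):
#ifdef _WIN32
#include <windows.h>
// Milliseconds since the system started (wraps after ~49.7 days).
unsigned long long milliseconds_now() { return GetTickCount(); }
#else
#include <sys/time.h>
// Milliseconds since the Epoch.
unsigned long long milliseconds_now()
{
    timeval tv;
    gettimeofday(&tv, 0);
    return tv.tv_sec * 1000ULL + tv.tv_usec / 1000;
}
#endif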
clock() (in <time.h>) returns a value which increases by CLOCKS_PER_SEC every second; that's commonly 1000 on Windows, while POSIX defines it as 1,000,000.
On Windows, use the High Performance Timer; it's a doddle.
LARGE_INTEGER frequency;
LARGE_INTEGER one;
LARGE_INTEGER two;
QueryPerformanceFrequency(&frequency); // counts per second
QueryPerformanceCounter(&one);
// something
QueryPerformanceCounter(&two);
// elapsed milliseconds
std::cout << (((double)two.QuadPart - (double)one.QuadPart) / (double)frequency.QuadPart) * 1000.0;
In theory, this can go up to per-clock-cycle accuracy, depending on the CPU in question.
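If a C++11 compiler is an option, the exact interface from the question can be sketched portably on top of <chrono> (the class and method names mirror the question, not any standard API):
#include <chrono>

class Timer
{
public:
    void start() { m_begin = std::chrono::steady_clock::now(); }
    void stop() { m_end = std::chrono::steady_clock::now(); }
    long long milliseconds() const
    {
        return std::chrono::duration_cast<std::chrono::milliseconds>(
            m_end - m_begin).count();
    }
private:
    std::chrono::steady_clock::time_point m_begin, m_end;
};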
This is my code for this task (Boost license), where ptime is a class that represents time as seconds + nanoseconds, i.e. ptime(int sec, int nano), and ptime::microseconds creates a sec/nano pair from POSIX-time microseconds.
It is quite easy to rewrite it for your needs and write such a class.
ptime ptime::now()
{
#ifndef BOOSTER_WIN_NATIVE
    struct timeval tv;
    gettimeofday(&tv, 0);
    return ptime(tv.tv_sec, tv.tv_usec * 1000);
#else
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);
    unsigned long long tt = ft.dwHighDateTime;
    tt <<= 32;
    tt |= ft.dwLowDateTime;
    tt /= 10;                   // 100-ns intervals -> microseconds
    tt -= 11644473600000000ULL; // shift the epoch from 1601 to 1970
    return ptime(ptime::microseconds(tt));
#endif
}
And there is no portable C++ function for this...

Time difference in C++

Does anyone know how to calculate time difference in C++ in milliseconds?
I used difftime but it doesn't have enough precision for what I'm trying to measure.
I know this is an old question, but there's an updated answer for C++0x. There is a new header called <chrono> which contains modern time utilities. Example use:
#include <iostream>
#include <thread>
#include <chrono>

int main()
{
    typedef std::chrono::high_resolution_clock Clock;
    typedef std::chrono::milliseconds milliseconds;
    Clock::time_point t0 = Clock::now();
    std::this_thread::sleep_for(milliseconds(50));
    Clock::time_point t1 = Clock::now();
    milliseconds ms = std::chrono::duration_cast<milliseconds>(t1 - t0);
    std::cout << ms.count() << "ms\n";
}
Output: 50ms
More information can be found here:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2661.htm
There is also now a boost implementation of <chrono>.
You have to use one of the more specific time structures, either timeval (microsecond resolution) or timespec (nanosecond resolution), but you can do it manually fairly easily:
#include <sys/time.h>

int diff_ms(timeval t1, timeval t2)
{
    return (((t1.tv_sec - t2.tv_sec) * 1000000) +
            (t1.tv_usec - t2.tv_usec)) / 1000;
}
This obviously has some problems with integer overflow if the difference in times is really large (or if you have 16-bit ints), but that's probably not a common case.
If you are using Win32, FILETIME is the most accurate that you can get: it contains a 64-bit value representing the number of 100-nanosecond intervals since January 1, 1601 (UTC).
So if you want to calculate the difference between two times in milliseconds, you do the following:
UINT64 getTime()
{
    SYSTEMTIME st;
    GetSystemTime(&st);
    FILETIME ft;
    SystemTimeToFileTime(&st, &ft); // converts to file time format
    ULARGE_INTEGER ui;
    ui.LowPart = ft.dwLowDateTime;
    ui.HighPart = ft.dwHighDateTime;
    return ui.QuadPart; // 100-nanosecond intervals
}

int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
{
    //! Start counting time
    UINT64 start, finish;
    start = getTime();
    // do something...
    //! Stop counting elapsed time
    finish = getTime();
    // now you can calculate the difference any way that you want
    // in seconds:
    _tprintf(_T("Time elapsed executing this code: %.03f seconds."),
             (((float)(finish - start)) / ((float)10000)) / 1000);
    // or in milliseconds:
    _tprintf(_T("Time elapsed executing this code: %I64d milliseconds."),
             (finish - start) / 10000);
}
The clock function gives you a millisecond timer, but it's not the greatest. Its real resolution is going to depend on your system. You can try
#include <time.h>
clock_t clo = clock();
// do stuff
cout << (clock() - clo) << endl;
and see how your results are.
You can use gettimeofday to get the number of microseconds since the Epoch. The seconds field filled in by gettimeofday() is the same value returned by time() and can be used with difftime; a millisecond is 1000 microseconds.
After you use difftime, calculate the difference in the microseconds field yourself.
You can get micro and nanosecond precision out of Boost.Date_Time.
If you're looking to do benchmarking, you might want to see some of the other threads here on SO which discuss the topic.
Also, be sure you understand the difference between accuracy and precision.
I think you will have to use something platform-specific. Hopefully that won't matter?
E.g., on Windows, look at QueryPerformanceCounter(), which will give you something much better than milliseconds.