As per an answer to my previous question, I am trying to use sleep_until() but cannot find a tutorial that shows how to use it with an actual wall-clock time such as 14:00h.
There are examples like std::this_thread::sleep_until(system_clock::now() + seconds(10));, but I would like something that lets me specify an actual clock time. What format should this be in? I would appreciate any examples.
To wait until a specified clock time, you need to get a time_t representing that clock time, then use std::chrono::system_clock::from_time_t to create a std::chrono::system_clock::time_point, which you can then pass to the xxx_until functions such as std::this_thread::sleep_until.
e.g.
void foo()
{
    tm timeout_tm = {};
    // set timeout_tm to 2013-07-10 14:00:01 local time
    timeout_tm.tm_year = 2013 - 1900;
    timeout_tm.tm_mon = 7 - 1;
    timeout_tm.tm_mday = 10;
    timeout_tm.tm_hour = 14;
    timeout_tm.tm_min = 0;
    timeout_tm.tm_sec = 1;
    timeout_tm.tm_isdst = -1; // let mktime determine whether DST is in effect
    time_t timeout_time_t = mktime(&timeout_tm);
    std::chrono::system_clock::time_point timeout_tp =
        std::chrono::system_clock::from_time_t(timeout_time_t);
    std::this_thread::sleep_until(timeout_tp);
}
Have a look here: std::chrono
It covers all the date/time representations you'll need.
You want to use mktime to build a time specification that you can pass to sleep_until, as done here.
I have a few lines of code that takes the system clock on my Windows machine and converts it to a double.
std::chrono::time_point<std::chrono::system_clock> currentNow =
std::chrono::system_clock::now();
auto duration = currentNow.time_since_epoch();
double millis = static_cast<double>(std::chrono::duration_cast<std::chrono::milliseconds>(duration).count());
double origin_time = millis / 1000;
I would like to reverse this later on and convert the double to a string being the format YYYY-mm-dd HH:MM:SS.ffffffff
The first step I have right now is taking the double and passing it as a parameter to chrono::duration.
auto rep = std::chrono::duration<double>(origin_time);
How would I go about using the chrono library to achieve the string specified above, thanks!
Construct a time_point at the clock's epoch:
auto epoch = std::chrono::time_point<std::chrono::system_clock>();
Add your converted duration to this time_point:
auto oldNow = epoch + std::chrono::duration_cast<std::chrono::milliseconds>(std::chrono::duration<double>(origin_time));
Convert it into a std::time_t:
auto t_c = std::chrono::system_clock::to_time_t(oldNow);
Print it using formatting facilities for time_t:
std::cout << std::put_time(std::localtime(&t_c), "%F %T");
Example output:
2020-01-10 18:45:48
You'll notice it is missing the .ffffffff part. That's because std::put_time has no formatting option for it. You can easily calculate the value yourself and put it at the end of the string, if that's important for you.
Also note that this is much better accomplished using C++20 chrono or Howard's Date library (which is basically the same thing, except for a few bits here and there) as noted in the comments by the man himself. For an overview on it, you may be interested in this answer.
I want to see whether my data is 120 second old by looking at the timestamp of the data so I have below code:
uint64_t now = duration_cast<milliseconds>(steady_clock::now().time_since_epoch()).count();
bool is_old = (120 * 1000 < (now - data_holder->getTimestamp()));
In the above code, data_holder->getTimestamp() is a uint64_t that returns a timestamp in milliseconds. Does my code above look right?
I'd probably do something like this:
auto now = system_clock::now().time_since_epoch();
// Use the correct time duration below. Milliseconds could be wrong, see 1)
auto diff = now - std::chrono::milliseconds(data_holder->getTimestamp());
bool is_old = diff > std::chrono::seconds{120};
// bool is_old = diff > 120s; // From C++14 onwards.
1) As mentioned, milliseconds could be the wrong unit for getTimestamp(). The possible standard duration types are:
std::chrono::hours
std::chrono::minutes
std::chrono::seconds
std::chrono::milliseconds
std::chrono::microseconds
std::chrono::nanoseconds
You probably have to try out which one to use, because that depends on data_holder->getTimestamp().
Note: an important one.
Using system_clock to measure time since the epoch will most likely work, but the standard doesn't require a clock's epoch to be the Unix epoch; you have already run into this with steady_clock. (Since C++20, system_clock is specified to count Unix time.)
You'd have to calculate the difference between the clock's epoch and the Unix epoch yourself, and I don't know of a portable way to do that for an arbitrary clock. For system_clock, if you don't trust it to use the Unix epoch, you can use the following:
system_clock::duration time_since_unix_epoch()
{
    std::tm epoch = {}; // zero-initialize: passing an uninitialized tm to mktime is undefined behaviour
    epoch.tm_mday = 1;
    epoch.tm_mon = 0;
    epoch.tm_year = 70; // 1970
    std::time_t epocht = mktime(&epoch); // note: mktime interprets this as local time
    return system_clock::now() - system_clock::from_time_t(epocht);
}
instead of system_clock::now(). I'd prefer this method.
Unfortunately, you can't just replace system_clock with another clock from std::chrono, because only std::chrono::system_clock offers from_time_t(time_t), which converts a calendar date into the time_point used by the clock.
I need to retrieve the current time point with a precision of microseconds. The time point can be relative to any fixed date.
How can this be achieved? Because of company policy, I really shouldn't use Boost or any other library.
I'm working on a multi-platform application. Under Linux I can use the C++11 system_clock::now().time_since_epoch(), but under Windows I work with VS2010, so I have no std::chrono library.
I've seen the RtlTimeToSecondsSince1970 function, but its resolution is a second.
Timers and timing are a tricky enough subject that, in my opinion, current cross-platform implementations are not quite up to scratch. So I'd recommend a Windows-specific version with appropriate #ifdefs. See other answers if you want a cross-platform version.
If you've got to (or want to) use a Windows-specific call, then GetSystemTimeAsFileTime (or, on Windows 8 and later, GetSystemTimePreciseAsFileTime) is the best call for getting UTC time, and QueryPerformanceCounter is good for high-resolution timestamps. GetSystemTimeAsFileTime returns the number of 100-nanosecond intervals since January 1, 1601 UTC in a FILETIME structure.
This fine article goes into the gory details of measuring timers and timestamps in windows and is well worth a read.
EDIT: To convert a FILETIME to microseconds, go via a ULARGE_INTEGER.
FILETIME ft;
GetSystemTimeAsFileTime(&ft);
ULARGE_INTEGER li;
li.LowPart = ft.dwLowDateTime;
li.HighPart = ft.dwHighDateTime;
unsigned long long valueAsHns = li.QuadPart; // 100-ns intervals since 1601
unsigned long long valueAsUs = valueAsHns / 10;
This code works for me in VS2010. The constructor checks whether high-precision timing is available on the processor, and currentTime() returns a time stamp in seconds; compare two time stamps to get a delta time. I use this in a game engine to get very small delta-time values. Note that precision isn't limited to whole seconds despite the name: the return value is a double.
Basically, you find out how many seconds per CPU tick with QueryPerformanceFrequency and get the time using QueryPerformanceCounter.
////////////////////////
//Grabs speed of processor
////////////////////////
Timer::Timer()
    : m_dSecondsPerCount(0.0) // avoid an uninitialized member if the query fails
{
    __int64 _iCountsPerSec = 0;
    bool _bPerfExists = QueryPerformanceFrequency((LARGE_INTEGER*)&_iCountsPerSec) != 0;
    if (_bPerfExists)
    {
        m_dSecondsPerCount = 1.0 / static_cast<double>(_iCountsPerSec);
    }
}
////////////////////////
//Returns current real time
////////////////////////
double Timer::currentTime() const
{
    __int64 time = 0;
    QueryPerformanceCounter((LARGE_INTEGER*)&time);
    double timeInSeconds = static_cast<double>(time) * m_dSecondsPerCount;
    return timeInSeconds;
}
The following code works in visual studio.
#include <time.h>

clock_t start, end;

int getTicks_u32()
{
    int cpu_time_used;
    end = clock();
    cpu_time_used = static_cast<int>(end - start) / CLOCKS_PER_SEC; // whole seconds of CPU time
    return cpu_time_used;
}

void initSystemClock_bl(void)
{
    start = clock();
}
I develop games in C# and recently migrated to C++. I wanted to rewrite my game in C++ and managed to translate almost all of the code from C# to C++, but there is one problem: my game uses System.DateTime.Now.Ticks in C#, and I am stuck.
So, is there something equivalent to System.DateTime.Now.Ticks in C++?
Look at this tutorial on cplusplus.com for a platform-independent approach.
The clock() function returns the number of clock ticks this process has consumed. To access it, include <time.h>.
To get the same thing in seconds, use the CLOCKS_PER_SEC macro: clock() / CLOCKS_PER_SEC gives the number of seconds the process took. But remember that you might lose some precision this way.
You might not need this exact functionality, though. If you are looking for the time elapsed since the program started, you can (as far as I remember) use the difftime() function. You might lose some precision if you need exact ticks, depending on your platform.
This way you save the current time at the beginning of your application and subtract it from the current time later on.
#include <stdio.h>
#include <time.h>

time_t programstart;

int main()
{
    time(&programstart); // record the current time at startup
    /* ... the program ... */
    // get seconds since start
    double secondssincestart = difftime(time(NULL), programstart);
    printf("the program took: %f seconds\n", secondssincestart);
    return 0;
}
EDIT:
Since this post still gets some attention, it is important to note that today's C++11 has a standard, easy-to-use, and pretty handy library: chrono.
Use QueryUnbiasedInterruptTime. Like DateTime.Ticks, each tick of QueryUnbiasedInterruptTime is 100 nanoseconds. If you need higher resolution, go for QueryPerformanceCounter.
Link: http://msdn.microsoft.com/en-us/library/windows/desktop/ee662306(v=vs.85).aspx
There isn't an equivalent class or method, but you can do this:
SYSTEMTIME systemTime;
GetLocalTime(&systemTime);
FILETIME fileTime;
SystemTimeToFileTime(&systemTime, &fileTime);
ULARGE_INTEGER largeInteger;
largeInteger.LowPart = fileTime.dwLowDateTime;
largeInteger.HighPart = fileTime.dwHighDateTime;
ULONGLONG ticks = largeInteger.QuadPart;
Note that this yields 100-ns ticks since January 1, 1601 (the FILETIME epoch), whereas System.DateTime.Ticks counts from January 1, 0001; add the offset between the two epochs (504911232000000000 ticks) if you need exact parity.
In Windows with MFC, the equivalent of System.DateTime.Now.Ticks:
ULONGLONG GetTicksNow()
{
    COleDateTime epoch(100, 1, 1, 00, 00, 00); // COleDateTime cannot represent year 1
    COleDateTime currTime = COleDateTime::GetCurrentTime();
    COleDateTimeSpan span = currTime - epoch;
    CTimeSpan cSpan(span.GetDays(), span.GetHours(), span.GetMinutes(),
                    span.GetSeconds());
    ULONGLONG diff = cSpan.GetTotalSeconds();
    LONG missingDays = 365 * 99 + 24; // days from year 1 to year 100, including leap days
    CTimeSpan centSpan(missingDays, 0, 0, 0);
    ULONGLONG centSeconds = centSpan.GetTotalSeconds();
    ULONGLONG totSec = (diff + centSeconds) * 10000000; // seconds -> 100-ns ticks
    return totSec;
}
I'm porting some PHP to C++. Some of our database code stores time values as Unix timestamps multiplied by 100.
The php contains code that looks a bit like this.
//PHP
static function getTickTime()
{
return round(microtime(true)*100);
}
I need something like this:
//C++
uint64_t getTickTime()
{
    ptime Jan1st1970(date(1970, 1, 1));
    ptime Now = microsec_clock::local_time();
    time_duration diff = Now - Jan1st1970;
    return static_cast<uint64_t>(diff.total_seconds() * 100);
}
Is something like this sensible? Is there a neater solution?
Is there something nasty in this code that I can't see? (Guess I'm not experienced enough with boost::date_time to know these things)
The neatest and most portable solution is to use the time() function, defined in <ctime>, which returns the number of seconds since the Unix epoch.
If you do use boost, you'll want universal_time(), not local_time(), since the epoch is specified in UTC.
The solution suggested by dauphic can be modified to something like this:
uint64_t getTickTime()
{
    timeval tim;
    gettimeofday(&tim, NULL);
    return tim.tv_sec * 100ULL + tim.tv_usec / 10000;
}
I can't think of a neater solution than that.
Assuming you're on Unix, include <sys/time.h>
timeval tim;
gettimeofday(&tim, NULL);
return tim.tv_usec;
If you're on Windows, there's no good way to get microsecond resolution, especially because it uses a different epoch. This sample requires only <windows.h>, I believe.
FILETIME tim;
GetSystemTimeAsFileTime(&tim);
ULARGE_INTEGER ms;
ms.LowPart = tim.dwLowDateTime;
ms.HighPart = tim.dwHighDateTime;
// ms.QuadPart is the number of 100-ns intervals since January 1, 1601
return ms.QuadPart / 10 - <microseconds from January 1, 1601 to January 1, 1970>;