How to convert custom time "ticks" to TDateTime? (C++)

(I'm using C++Builder 2006, if it matters)
I'm not able to solve this problem.
What I have (and I cannot change this):
typedef struct {
    uint16_t Leaps; // How many Leaps from the previous event (see below)
    uint16_t Ticks; // Event "machine ticks" (see below)
    uint16_t Code;
} sMachineEvents;

typedef struct {
    TDateTime Date;
    uint16_t Code;
} sConvertEvents;

TDateTime Sync;  // Contains the date and time of Ev[0]
TICKS_PER_SECOND // #defined elsewhere: how many Ticks in a second
TICKS_PER_LEAP   // #defined elsewhere: how many Ticks make a "Leap"
                 // (when the Tick counter reaches TICKS_PER_LEAP it wraps
                 // to 0 and the Leaps counter increases by 1)
sMachineEvents Ev[3];
sConvertEvents cEv[3];
Ev[0].Leaps = 0x0005;
Ev[0].Ticks = 0x5975;
Ev[0].Code = 0x0001;
Ev[1].Leaps = 0x0001;
Ev[1].Ticks = 0x0124;
Ev[1].Code = 0x0002;
Ev[2].Leaps = 0x0000;
Ev[2].Ticks = 0x70AC;
Ev[2].Code = 0x0003;
I need to "convert" these "MachineEvents" in "ConvertEvents".
The first one is easy:
cEv[0].Date = Sync;
cEv[0].Code = Ev[0].Code;
Now: Ev[1] happened BEFORE Ev[0].
By how much? I need to go "back in time" by Ev[0].Leaps + Ev[0].Ticks and then go "forward in time" by Ev[1].Ticks.
Ev[2] is the same: it happened
Ev[1].Leaps + Ev[1].Ticks - Ev[2].Ticks
BEFORE Ev[1]....
HOW should I compute the DateTime for Ev[1] and Ev[2]?

TDateTime is actually a floating-point value representing a number of days; an hour is represented by 1.0 / 24.0, and a second by 1.0 / SecsPerDay (SecsPerDay is a constant equal to 60 * 60 * 24 = 86400).
So, if one tick is 1.0 / TICKS_PER_SECOND seconds:
TDateTime TimeAsTDateTime = TimeInTicks / (SecsPerDay * TICKS_PER_SECOND);
Also, I think your formula is not correct: instead of Ev[0].Leaps + Ev[0].Ticks, as far as I can see you need to use Ev[0].Leaps * TICKS_PER_LEAP + Ev[0].Ticks.
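Putting the two pieces together, a minimal sketch of the whole conversion (assuming the declarations from the question; the delta formula follows the question's own description, with the Leaps correction above applied):
const double TicksPerDay = (double)SecsPerDay * TICKS_PER_SECOND;

cEv[0].Date = Sync;
cEv[0].Code = Ev[0].Code;

for (int i = 1; i < 3; ++i)
{
    // event i happened earlier than event i-1 by this many ticks
    long deltaTicks = (long)Ev[i - 1].Leaps * TICKS_PER_LEAP
                    + Ev[i - 1].Ticks
                    - Ev[i].Ticks;
    cEv[i].Date = cEv[i - 1].Date - deltaTicks / TicksPerDay;
    cEv[i].Code = Ev[i].Code;
}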

If I understand you correctly, you are comparing "ticks" with "time".
You have to convert the ticks to a time value, which is tick_count / TICKS_PER_SECOND.
So float Seconds = Ticks / TICKS_PER_SECOND. As this is a float, you might prefer to work in milliseconds (note the 1000.0 to avoid integer division):
int ms = (int)(ticks * 1000.0 / TICKS_PER_SECOND);
With the seconds (or milliseconds) you can create a new date object and add the two date objects, or simply add the seconds to the old date.
Another way would be to save all ticks since the start of the program and simply use that as the "date reference".
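In C++Builder specifically, the RTL's DateUtils routines can do the date arithmetic for you; a small sketch, assuming the standard <DateUtils.hpp> header is available (ticks and earlier are placeholders):
#include <DateUtils.hpp>

int ms = (int)(ticks * 1000.0 / TICKS_PER_SECOND);
TDateTime later = IncMilliSecond(earlier, ms); // advance a TDateTime by ms milliseconds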

Related

Bad Values in QDateTimeAxis (QtCharts)

I am using QtCharts to display simulation data. The simulation starts at time zero, but my chart axis always seems to start at 19 hours. This confounds me. The setup of the chart is straightforward:
std::vector<SimData> data;
// ... Populate data
auto series = new QLineSeries();
for(auto i : data)
{
    // Append time in milliseconds and a value
    series->append(i.msTime, i.value);
}
this->legend()->hide();
this->addSeries(series);
this->axisX = new QDateTimeAxis;
this->axisX->setTickCount(10);
this->axisX->setFormat("HH:mm:ss");
this->axisX->setTitleText("Sim Time");
this->axisX->setMin(QDateTime());
this->addAxis(this->axisX, Qt::AlignBottom);
series->attachAxis(this->axisX);
this->axisY = new QValueAxis;
this->axisY->setLabelFormat("%i");
this->axisY->setTitleText(x->getID().c_str());
this->addAxis(this->axisY, Qt::AlignLeft);
series->attachAxis(this->axisY);
If I run with no data, but just display the chart, the axis starts at 19:00:00 (screenshot omitted).
If I add data starting at time zero, the total amount of data is correct, but the time still starts at 19:00:00. Why does the time not start at 00:00:00?
I believe this is because you are on the East Coast (UTC-5): while 0 ms represents 12 AM UTC, at UTC-5 it represents 5 hours earlier, i.e. 19:00 the previous day. I was having the same problem; I set my time zone to UTC (under Ubuntu) and voilà, the axis started at 0 hours instead of 19.
The problem was indeed confirmed to be the UTC offset. SO had a good example of how to get the UTC offset, which I then used to offset the data going into the chart:
Easy way to convert a struct tm (expressed in UTC) to time_t type
I created a utility function from this to use with QDateTimeAxis series data.
double GetUTCOffsetForQDateTimeAxis()
{
    time_t zero = 24 * 60 * 60L;
    struct tm* timeptr;
    int gmtime_hours;
    // get the local time for Jan 2, 1970 00:00 UTC
    timeptr = localtime(&zero);
    gmtime_hours = timeptr->tm_hour;
    // if the local time is the "day before" the UTC day, subtract 24 hours
    // from the hours to get the UTC offset
    if(timeptr->tm_mday < 2)
    {
        gmtime_hours -= 24;
    }
    return 24.0 + gmtime_hours;
}
Then the data conversion was simple.
std::vector<SimData> data;
// ... Populate data
auto series = new QLineSeries();
const auto utcOffset = sec2ms(hours2sec(GetUTCOffsetForQDateTimeAxis()));
for(auto i : data)
{
    // Append time in milliseconds and a value
    series->append(i.msTime - utcOffset, i.value);
}
// ...
For the lonesome wanderer that eventually ends up here:
my KISS solution is to set a fixed base date and time, add seconds to it, and finally append a new data point:
for(int i = 0; i <= points; i++) {
    QDateTime timeStamp;
    timeStamp.setDate(QDate(1980, 1, 1));
    timeStamp.setTime(QTime(0, 0, 0));
    timeStamp = timeStamp.addSecs(i);
    data->append(timeStamp.toMSecsSinceEpoch(), /* your y here */);
}
Later, where the diagram gets painted, I used:
QSplineSeries *temps1 = /* wherever you get your series */;
QChart *chTemp = new QChart();
QDateTimeAxis *tempAxisX = new QDateTimeAxis; // the time axis
tempAxisX->setTickCount(5);
tempAxisX->setFormat(QString("hh:mm:ss"));
tempAxisX->setTitleText("Time");
chTemp->addAxis(tempAxisX, Qt::AlignBottom);
temps1->attachAxis(tempAxisX);
Hope that helps a future visitor (including myself).
For the lonesome wanderer as well: since Tillo Reilly's solution didn't work for me, I did my own (which works). In addition, my code converts the timestamp to a particular time zone. The thing is, QtCharts knows nothing about time zones (though it really should), yet it force-uses the local PC time zone in a weird way. So if you want the charts to display dates/times in a specified time zone, the only thing you can do is modify the dataset.
Answer to the original question:
auto temp_time = QDateTime::fromSecsSinceEpoch( timestamp );
auto local_offset = temp_time.offsetFromUtc();
auto fixed_timestamp = timestamp - local_offset;
With a specified time zone:
auto temp_time = QDateTime::fromSecsSinceEpoch( timestamp );
auto local_offset = temp_time.offsetFromUtc();
temp_time.setTimeSpec( Qt::TimeZone );
temp_time.setTimeZone( QTimeZone("Europe/Moscow") ); //for example
auto timezone_offset = temp_time.offsetFromUtc();
auto fixed_timestamp = timestamp + timezone_offset - local_offset;
important notice 1: you shouldn't create a random QDateTime object to calculate offsets, for example with functions like QDateTime::currentDateTime() or QDateTime::fromSecsSinceEpoch(0) - the offset from UTC depends on the point in time! With those functions you'll get a valid QDateTime object, but its offset from UTC may differ from what you expect.
For the very same reason, you shouldn't calculate the offset once and use it for the whole dataset, especially for datasets spanning significant time ranges. The larger your dataset, the bigger the chances that you'll run into an unexpected Daylight Saving Time change somewhere in the middle of it.
important notice 2: I don't know whether QtCharts calculates the offset from UTC for the local time zone at the current time or at the timestamp passed to it via the data series. I really hope it's the latter.
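To make the per-point advice concrete, a minimal sketch (points, p.secs and p.value are hypothetical: a container of epoch-second timestamps with values) that recomputes the correction for every sample while filling the series:
// recompute the UTC offset per point, so a DST change inside the
// dataset cannot skew part of the chart
for (const auto& p : points) {
    auto t = QDateTime::fromSecsSinceEpoch(p.secs);
    qint64 fixed = p.secs - t.offsetFromUtc(); // shift so the chart displays UTC
    series->append(fixed * 1000.0, p.value);   // QDateTimeAxis expects milliseconds
}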

Converting time_duration to DATE

I want to convert a time_duration to a DATE format, which is the number of days since 1899, 12, 30.
DATE date_from_duration(time_duration td)
{
    double days = td.hours()/24. + td.minutes()/(24.*60.) + td.seconds()/(24.*60.*60.);
    return days;
}
This code almost works but sometimes gives rounding errors; for instance, time_duration(1007645, 15, 0) should result in 2014-12-12 00:15:00, but actually yields 2014-12-12 00:14:59.
The check of DATE is done with this method, stolen from here:
ptime pTime_from_DATE(double date)
{
    using boost::math::modf;
    static const ptime::date_type base_date(1899, Dec, 30);
    static const ptime base_time(base_date, ptime::time_duration_type(0,0,0));
    int dayOffset, hourOffset, minuteOffset, secondOffset;
    double fraction = fabs(modf(date, &dayOffset)) * 24; // fraction = hours
    fraction = modf(fraction, &hourOffset) * 60;         // fraction = minutes
    fraction = modf(fraction, &minuteOffset) * 60;       // fraction = seconds
    modf(fraction, &secondOffset);
    ptime t(base_time);
    t += ptime::time_duration_type(hourOffset, minuteOffset, secondOffset);
    t += ptime::date_duration_type(dayOffset);
    return t;
}
Any ideas how to correct this rounding issue efficiently?
I might be missing some of the complexity, but it seems really simple to me:
Live On Coliru
#include <boost/date_time/posix_time/posix_time.hpp>
#include <iostream>

using DATE = double;

boost::posix_time::ptime pTime_from_DATE(double date)
{
    static const boost::posix_time::ptime::date_type base_date(1899, boost::gregorian::Dec, 30);
    return boost::posix_time::ptime(
        base_date,
        boost::posix_time::milliseconds(date * 1000 * 60 * 60 * 24));
}

int main() {
    boost::posix_time::time_duration duration(1007645, 15, 0);
    DATE date = duration.total_milliseconds() / 1000.0 / 60 / 60 / 24;
    std::cout << date << ": " << pTime_from_DATE(date);
}
Prints
41985.2: 2014-Dec-12 05:15:00
See it Live On Coliru
That kind of depends on your circumstances.
The general problem is that 1 / (24. * 60. * 60.) is not exactly representable as a binary float (because 86400 is not a power of two). The DATE you get is very nearly exact, but there will be a rounding error: sometimes a tiny bit more, sometimes a tiny bit less, and there is really not a lot you can do to make it more precise. That you see a discrepancy of a second is arguably a problem with your check, in that you stop looking at seconds: if you check the milliseconds, you are likely to get 999, making the rounding error look a lot less extreme. This continues for microseconds and possibly nanoseconds, depending on the resolution of time_duration.
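A tiny probe (a sketch) makes the inexactness visible by printing the stored factor at full precision:
#include <cstdio>

int main() {
    // 1/86400 has no exact binary representation; print all 17 significant digits
    std::printf("%.17g\n", 1.0 / (24.0 * 60.0 * 60.0));
}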
So quite possibly there's nothing to do because the data is alright. If, however, you don't care about milliseconds and beyond and only want the seconds value to be stable in round-trip conversions, the simplest way to achieve that is to add an epsilon value:
DATE date_from_duration(time_duration td)
{
    double days =
          td.hours()   / 24.
        + td.minutes() / (24. * 60.)
        + td.seconds() / (24. * 60. * 60.)
        + 1e-8; // add roughly a millisecond
    return days;
}
This increases the overall rounding error but ensures that the error is in a "safe" direction, i.e., that converting it back to time_duration will give the same seconds() value and the visible changes will be at the milliseconds() level.
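A quick round-trip check (a sketch reusing the question's date_from_duration and pTime_from_DATE) shows the intended effect:
boost::posix_time::time_duration td(1007645, 15, 0);
std::cout << pTime_from_DATE(date_from_duration(td)) << '\n';
// expected: the time ends in :15:00 rather than :14:59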

fully separated date with milliseconds from std::chrono::system_clock

My current pattern (for Unix) is to call gettimeofday, cast the tv_sec field to a time_t, pass that through localtime, and combine the results with tv_usec. That gives me a full date (year, month, day, hour, minute, second, microseconds).
I'm trying to update my code to C++11 for portability and general good practice. I'm able to do the following:
auto currentTime = std::chrono::system_clock::now( );
const time_t time = std::chrono::system_clock::to_time_t( currentTime );
const tm *values = localtime( &time );
// read values->tm_year, etc.
But I'm stuck on the milliseconds/nanoseconds. For one thing, to_time_t claims that rounding is implementation defined (!) so I don't know if a final reading of 22.6 seconds should actually be 21.6, and for another I don't know how to get the number of milliseconds since the previous second (are seconds guaranteed by the standard to be regular? i.e. could I get the total milliseconds since the epoch and just modulo it? Even if that is OK it feels ugly).
How should I get the current date from std::chrono::system_clock with milliseconds?
I realised that I can use from_time_t to get a "rounded" value, and check which type of rounding occurred. This also doesn't rely on every second being exactly 1000 milliseconds, and works with out-of-the-box C++11:
const auto currentTime = std::chrono::system_clock::now( );
time_t time = std::chrono::system_clock::to_time_t( currentTime );
auto currentTimeRounded = std::chrono::system_clock::from_time_t( time );
if( currentTimeRounded > currentTime ) {
    --time;
    currentTimeRounded -= std::chrono::seconds( 1 );
}
const tm *values = localtime( &time );
int year = values->tm_year + 1900;
// etc.
int milliseconds = std::chrono::duration_cast<std::chrono::duration<int, std::milli> >( currentTime - currentTimeRounded ).count( );
Using this free, open-source library (Howard Hinnant's tz.h date library) you can get the local time with millisecond precision like this:
#include "tz.h"
#include <iostream>
int
main()
{
using namespace date;
using namespace std::chrono;
std::cout << make_zoned(current_zone(),
floor<milliseconds>(system_clock::now())) << '\n';
}
This just output for me:
2016-09-06 12:35:09.102 EDT
make_zoned is a factory function that creates a zoned_time<milliseconds>. The factory function deduces the desired precision for you. A zoned_time is a pairing of a time_zone and a local_time. You can get the local time out with:
local_time<milliseconds> lt = zt.get_local_time();
local_time is a chrono::time_point. You can break this down into date and time field types if you want like this:
auto zt = make_zoned(current_zone(), floor<milliseconds>(system_clock::now()));
auto lt = zt.get_local_time();
local_days ld = floor<days>(lt); // local time truncated to days
year_month_day ymd{ld}; // {year, month, day}
time_of_day<milliseconds> time{lt - ld}; // {hours, minutes, seconds, milliseconds}
// auto time = make_time(lt - ld); // another way to create time_of_day
auto y = ymd.year(); // 2016_y
auto m = ymd.month(); // sep
auto d = ymd.day(); // 6_d
auto h = time.hours(); // 12h
auto min = time.minutes(); // 35min
auto s = time.seconds(); // 9s
auto ms = time.subseconds(); // 102ms
Instead of using to_time_t, which rounds off, you can do it like this:
auto tp = std::chrono::system_clock::now();
auto s = std::chrono::duration_cast<std::chrono::seconds>(tp.time_since_epoch());
auto t = (time_t)(s.count());
That way you get the seconds without the round-off. It is more efficient than checking the difference between to_time_t and from_time_t.
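To also get the millisecond remainder from the same time_point, a sketch following the same duration_cast approach:
auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
              tp.time_since_epoch()) - s; // milliseconds minus the whole seconds
// ms.count() is in [0, 999]: milliseconds since the last full second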
I read the standard like this:
It is implementation defined whether the value is rounded or truncated, but naturally the rounding or truncation only occurs in the most detailed part of the resulting time_t. That is: the combined information you get from time_t is never off by more than 0.5 of its granularity.
If time_t on your system only supported seconds, you would be right that there could be 0.5 seconds of systematic uncertainty (unless you find out how things were implemented).
tv_usec is not standard C++, but a field of struct timeval on POSIX. To conclude, you should not expect any rounding effects bigger than half of the smallest time difference your system supports, so certainly not more than 0.5 microseconds.
The most straightforward way is to use boost's ptime. It has methods such as fractional_seconds():
http://www.boost.org/doc/libs/1_53_0/doc/html/date_time/posix_time.html#date_time.posix_time.ptime_class
For interop with std::chrono, you can convert as described here: https://stackoverflow.com/a/4918873/1149664
Or, have a look at this question: How to convert std::chrono::time_point to calendar datetime string with fractional seconds?

Adding two epoch milliseconds in C++

My goal is to determine the expiry of an item from when it was acquired (bought) and when it is sold. There is a TTL value associated with each item.
I am doing the following:
time_t currentSellingTime;
long currentSystemTime = time(&currentSellingTime); // this gives me epoch millisec of now()
long TTL = <some_value>L;
long BuyingTime = <some_value>; // this is also in epoch millisec
if(currentSystemTime > TTL + BuyingTime)
{
    // throw exception
    // item is expired
}
My question is how to sum two epoch millisec and compare it with another epoch millsec in C++
There may be some misconceptions about how time() works:
epoch time as given by time() is expressed in seconds, not milliseconds;
time() returns the current time value and can optionally also store it in the variable whose address is passed as its sole argument. This means that
long currentSystemTime = time(&currentSellingTime);
will set both currentSystemTime and currentSellingTime to the current time, and that's probably not what you intend to do... You should probably do
long currentSystemTime = time(NULL);
or
time(&currentSellingTime);
but the "double form" you are using is quite suspicious. For completeness' sake the MS Help reference for time()
You want to use another function because, as previously pointed out, time() returns seconds. Try:
#include <time.h>

long current_time() {
    struct timespec t;
    clock_gettime(CLOCK_REALTIME, &t);
    return t.tv_sec * 1000l + t.tv_nsec / 1000000l;
}
Your code should work then. This approach is also POSIX compatible. Example usage:
const long TTL = 100;
long start_time = current_time();
while (!(current_time() > start_time + TTL))
{
    // do the stuff that can expire
}
Note: I know that the condition in the while loop could be written differently, but this way it reads more like "until not expired".
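For reference, a portable C++11 alternative to the POSIX clock_gettime version; a sketch using std::chrono (mainstream implementations of system_clock use the Unix epoch, which C++20 finally guarantees):
#include <chrono>

long long current_time_ms() {
    using namespace std::chrono;
    // milliseconds since the Unix epoch, no POSIX APIs required
    return duration_cast<milliseconds>(system_clock::now().time_since_epoch()).count();
}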

The best way to measure milliseconds for deltatime on Mac OS?

I have found a function to get milliseconds since the Mac was started:
U32 Platform::getRealMilliseconds()
{
    // Duration is a S32 value.
    // if negative, it is in microseconds.
    // if positive, it is in milliseconds.
    Duration durTime = AbsoluteToDuration(UpTime());
    U32 ret;
    if( durTime < 0 )
        ret = durTime / -1000;
    else
        ret = durTime;
    return ret;
}
The problem is that after ~20 days AbsoluteToDuration returns INT_MAX all the time until the Mac is rebooted.
I have tried the method below; it worked, but it looks like gettimeofday takes more time and slows the game down a bit:
timeval tim;
gettimeofday(&tim, NULL);
U32 ret = ((tim.tv_sec) * 1000 + tim.tv_usec/1000.0) + 0.5;
Is there a better way to get number of milliseconds elapsed since some epoch (preferably since the app started)?
Thanks!
Your real problem is that you are trying to fit an uptime-in-milliseconds value into a 32-bit integer. If you do that your value will always wrap back to zero (or saturate) in 49 days or less, no matter how you obtain the value.
One possible solution would be to track time values with a 64-bit integer instead; that way the day of reckoning gets postponed for a few hundred years and so you don't have to worry about the problem. Here's a MacOS/X implementation of that:
uint64_t GetTimeInMillisecondsSinceBoot()
{
    return UnsignedWideToUInt64(AbsoluteToNanoseconds(UpTime())) / 1000000;
}
... or if you don't want to return a 64-bit time value, the next-best thing would be to record the current time-in-milliseconds value when your program starts, and then always subtract that value from the values you return. That way things won't break until your own program has been running for at least 49 days, which I suppose is unlikely for a game.
uint32_t GetTimeInMillisecondsSinceProgramStart()
{
    static uint64_t _firstTimeMillis = GetTimeInMillisecondsSinceBoot();
    uint64_t nowMillis = GetTimeInMillisecondsSinceBoot();
    return (uint32_t) (nowMillis - _firstTimeMillis);
}
My preferred method is mach_absolute_time - see this tech note. I use the second method described there, i.e. mach_absolute_time to get time stamps and mach_timebase_info to get the constants needed to convert the difference between time stamps into an actual time value (with nanosecond resolution).
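A minimal sketch of that approach (assuming <mach/mach_time.h>; note the multiplication can overflow for very long uptimes, which the tech note discusses):
#include <mach/mach_time.h>
#include <stdint.h>

uint64_t GetUptimeNanoseconds()
{
    static mach_timebase_info_data_t tb = { 0, 0 };
    if (tb.denom == 0)
        mach_timebase_info(&tb); // fetch the numer/denom conversion constants once
    return mach_absolute_time() * tb.numer / tb.denom;
}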