How do I update a custom date in C++? - c++

I created my own DateTime class. It accepts the current date/time, and it also accepts a custom date/time. The custom date/time is what I'm interested in.
If I set the date to 1/5/1953 with a time of 1:05:31 PM, and call updateTime(), I want the time to update based on the difference between when it was first created and how many milliseconds followed afterwards.
However, when I do this, it's always giving me today's date and time, which is not the desired result.
This is my current code.
if (m_isCustomDate)
{
time_t currentRawTime;
// Get the current raw time
time(&currentRawTime);
// Get the time lapse
time_t time_diff = (time_t)difftime(currentRawTime, m_rawTime);
// Increment the time difference to the old raw time
m_rawTime += time_diff;
// Update the tm structure
localtime_s(&m_tm, &m_rawTime);
}
Updated problem:
With a date of 1/5/1953 and a time of 1:05:10 PM, when I call getSecond() it doesn't give me the 10 I expect; instead it gives me the current second on my computer (4 in this case). Is localtime_s() not the right function to use in this case?
My getSecond() function:
/// <summary>
/// Gets the current second between 0 and 60.
/// </summary>
/// <returns>Returns the second.</returns>
int DateTime::getSecond()
{
updateTime();
return m_tm.tm_sec;
}
Updated specific question:
How can I get the time elapsed since the app's start-up, apply it to the original custom date (m_rawTime), and then update the tm structure?
Edit:
This solution worked. Posting it in case anyone needs a working example:
const DWORD curr_time = GetTickCount();
DWORD time_diff = (curr_time - m_init_time) / 1000;
m_rawTime += time_diff;
localtime_s(&m_tm, &m_rawTime);
m_init_time = curr_time;

Given that m_rawTime holds the 'custom' time, I noticed the following. With time_t time_diff = (time_t)difftime(currentRawTime, m_rawTime); you get the difference between the current time and m_rawTime. Then, with m_rawTime += time_diff; you make m_rawTime equal to the current time. I think this is not what you want to do.
You say you want the time to "update based on the difference between when it was first created and how many milliseconds followed afterwards". So you effectively want the difference between the time the object was last updated and the current time. To do that, initialize a counter (say, this->init_time) to the current time in the constructor, and have each call to updateTime() add the difference between the current time and this->init_time to m_rawTime, then set this->init_time to that current time:
void DateTime::updateTime() {
const auto curr_time = get_time(); // this is not an actual function
const auto diff = curr_time - this->init_time;
m_rawTime += diff;
// update the tm structure here...
this->init_time = curr_time;
}
Now, if you want to work with milliseconds, microseconds or smaller time periods, you should use std::chrono::high_resolution_clock, but struct tm doesn't support time periods shorter than one second, so you can't actually make your custom time any more precise than that with it. In other words, if you stick to struct tm, you can only work with precision of one second, no more.
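For a portable variant of the same idea, std::chrono::steady_clock can play the role that GetTickCount() plays in the snippet above. This is only a minimal sketch: it assumes the class keeps a steady_clock time_point member (here called m_init_time, set to steady_clock::now() in the constructor) alongside the existing m_rawTime and m_tm members.
#include <chrono>
#include <ctime>

void DateTime::updateTime()
{
    using namespace std::chrono;
    const auto now = steady_clock::now();
    // Whole seconds elapsed since the last update; tm only has one-second resolution anyway
    const auto elapsed = duration_cast<seconds>(now - m_init_time);
    m_rawTime += static_cast<time_t>(elapsed.count());
    localtime_s(&m_tm, &m_rawTime);
    // Advance the reference point only by the seconds actually consumed,
    // so fractional seconds carry over to the next call instead of being dropped
    m_init_time += elapsed;
}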

Related

Creating a custom millisecond-based chrono clock that increments counters, and resets every year?

I'm starting out with a basic task in C++, coming from Pascal and Lua. I picked this as it seems simple enough, and useful for the project I'm working on.
I need to create a millisecond timer that starts at 0 for everything: year 0, month 0, week 0, day 0. It will also run at a faster timescale than real life; every second that passes is 20 seconds for this timer. That should be easy, right? Just multiply each ms by 20 and save that.
As the timer progresses, increment various counters that track seconds/minutes/hours/days/weeks/months/seasons/years passed. And after a year has passed, reset the timer back to 0ms and continue incrementing.
I will also be stopping, saving, loading, resuming. That bit I'll be able to handle, as I have access to a MySQL database to take care of serializing data. I'm just perplexed by std::chrono, and couldn't understand how to set my own arbitrary starting time and reference date.
Don't have any code yet, but here's the layout for the C++ file:
namespace FyTyGameTime
{
//Number of realtime milliseconds for each timescale second.
uint32_t TimeScaleMsMult = 20;
//Current iteration's total number of milliseconds
uint32_t TimeScaleCounter = 0;
//Total accumulated time
uint32_t TimeScaleMinutes = 0;
uint32_t TimeScaleHours = 0;
uint32_t TimeScaleDays = 0;
uint32_t TimeScaleWeeks = 0;
uint32_t TimeScaleMonths = 0;
uint32_t TimeScaleSeasons = 0;
uint32_t TimeScaleYears = 0; //Number of years passed since timer started
//Current iteration's counters.
//Current start date is 01/01/0000 7am
uint32_t TimeScaleCurrentSecond = 0; //0-86399, Resets every day
uint32_t TimeScaleCurrentMinute = 420; //0-1439, Resets every day
uint32_t TimeScaleCurrentHour = 7; //0-23, Resets every day
uint32_t TimeScaleCurrentDay = 0; //0-6, Resets every week
uint32_t TimeScaleCurrentWeek = 0; //0-3, Resets every month
uint32_t TimeScaleCurrentMonth = 0; //0-11, Resets every year
uint32_t TimeScaleCurrentSeason = 0; //0-3, Resets every year
uint32_t TimeScaleCurrentYear = 0; //Starting year, incremented every year. !Does not reset!
uint32_t RealTimeToTimescale(uint32_t iMilliseconds)
{
return uint32_t(iMilliseconds * TimeScaleMsMult);
}
bool SaveTimescale()
{
//
}
void LoadTimescale()
{
//
}
bool ResetTimescale()
{
//
}
}
Any help and clarification on std::chrono is appreciated, thank you. C++ is quite a bit more terse than the languages I usually use and feel comfortable with.
edit: From what I've been able to read, std::chrono::steady_clock is the way to go. But it doesn't have any member functions to pause, resume, or modify its values?
The time points of this clock cannot decrease as physical time moves forward and the time between ticks of this clock is constant. This clock is not related to wall clock time (for example, it can be time since last reboot), and is most suitable for measuring intervals.
The only member function it has is now()...
edit2: No, that is still based on some external timer, while I need to increment every ms. It seems like the three clocks provided by chrono are for duration checks, rather than for actually doing something every time interval?
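For what it's worth, here is one way the question's requirements could be approached with std::chrono: keep a steady_clock start point, scale the elapsed real milliseconds by 20, and derive the calendar counters from the scaled total. This is only a rough sketch under simplifying assumptions (4-week months, 30-day months, a 360-day year, no 7am start offset); the names are illustrative and not taken from the question.
#include <chrono>
#include <cstdint>

namespace FyTyGameTime
{
    constexpr std::uint64_t TimeScaleMsMult = 20; // 1 real ms == 20 game ms

    // Real-time reference point, captured when the timer (re)starts
    std::chrono::steady_clock::time_point StartPoint = std::chrono::steady_clock::now();

    // Total scaled game time elapsed since StartPoint, in milliseconds
    std::uint64_t ScaledGameMs()
    {
        using namespace std::chrono;
        const auto realMs = duration_cast<milliseconds>(steady_clock::now() - StartPoint).count();
        return static_cast<std::uint64_t>(realMs) * TimeScaleMsMult;
    }

    // Calendar counters derived from the scaled total
    struct GameTime
    {
        std::uint32_t second, minute, hour, day, week, month, year;
    };

    GameTime CurrentGameTime()
    {
        const std::uint64_t totalSeconds = ScaledGameMs() / 1000;
        const std::uint64_t totalDays = totalSeconds / 86400;
        GameTime t{};
        t.second = static_cast<std::uint32_t>(totalSeconds % 60);
        t.minute = static_cast<std::uint32_t>((totalSeconds / 60) % 60);
        t.hour   = static_cast<std::uint32_t>((totalSeconds / 3600) % 24);
        t.day    = static_cast<std::uint32_t>(totalDays % 7);
        t.week   = static_cast<std::uint32_t>((totalDays / 7) % 4);
        t.month  = static_cast<std::uint32_t>((totalDays / 30) % 12);
        t.year   = static_cast<std::uint32_t>(totalDays / 360);
        return t;
    }
}
Pausing and resuming then amounts to remembering the scaled total at pause time and resetting StartPoint on resume.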

C++ std::tm returns wrong value after converting it from std::chrono::time_point

TL;DR: How to use a std::chrono::system_clock::time_point to compare based on only certain parameters (e.g. I just want hours, minutes and seconds, but not day, month, etc.).
Also: after converting the std::chrono::system_clock::time_point to a std::tm, the std::tm::tm_hour field contains a value one higher than was originally input to the std::chrono::system_clock::time_point.
My theoretical approach on getting a std::chrono::system_clock::time_point to work:
typedef std::chrono::system_clock::time_point TimePoint;
TimePoint MainWindow::createTimePoint(int h, int m)
{
TimePoint createdTime = std::chrono::system_clock::time_point{std::chrono::hours(h) + std::chrono::minutes(m)};
time_t tt = std::chrono::system_clock::to_time_t(createdTime);
tm timeExtracted = *localtime(&tt);
std::cout << "input:\t\t" << "H = " << h << ", M = " << m << std::endl;
std::cout << "timeExtracted:\t" << "H = " << timeExtracted.tm_hour << ", M = " << timeExtracted.tm_min << std::endl;
return createdTime;
}
If I run this, the hours of timeExtracted are always +1 from the input h.
Why is that so? And how do I fix this? I went over a few other posts that covered this, but they couldn't help me. Probably also because of this:
I think that when I create a TimePoint, the day, month, etc. are also set to some arbitrary value or initialized to a certain value. The point is: I want them to always be the same value, so that my TimePoint (after converting) basically shows this:
timeExtracted.tm_sec = 0
timeExtracted.tm_min = m
timeExtracted.tm_hour = h
timeExtracted.tm_mon = 0
timeExtracted.tm_wday = 0
timeExtracted.tm_mday = 0
timeExtracted.tm_yday = 0
timeExtracted.tm_year = 0
timeExtracted.tm_isdst = 0
How can I compare two of these TimePoints using the comparison operations of std::chrono, but only compare the hour and minute?
If my question is unclear, I'm sorry, it's late in the evening. I'll check again next morning. Thank you.
I'm going to start an answer, but this isn't going to be a complete answer because I'm not yet sure of the complete question. However, I can help.
TimePoint createdTime = system_clock::time_point{hours(h) + minutes(m)};
(I've clipped the std::chrono:: qualifiers so that this is easier to read and discuss)
This creates a time stamp that is 1970-01-01 hh:mm:00 UTC. In a nutshell, system_clock::time_point measures the duration of time (in some unit such as microseconds or nanoseconds) since New Year's 1970, UTC. Technically the above is an approximation (system_clock doesn't count leap seconds), but we can (and should) ignore that detail for now.
This:
tm timeExtracted = *localtime(&tt);
is going to introduce UTC offset corrections based on your computer's setting for the local time zone. The time zone adjustment rules are (hopefully) going to be based on what was in effect in 1970 in your area.
There exist techniques and libraries for taking a system_clock::time_point and breaking it up into fields such as {year, month, day, hours, minutes, seconds, microseconds}. But that conversion also depends on if you want these fields in UTC, local time, or some other arbitrary time zone.
And the very first step is to apply the UTC offset associated with some time zone if desired. It may be that your {h, m} input needs a UTC offset adjustment prior to putting them into system_clock::time_point if the intent is that {h, m} represent local time instead of UTC.
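If the comparison is only ever wanted in UTC, one way to do it with plain std::chrono (no extra library; a minimal sketch) is to strip the whole days off each time_point and compare what is left, truncated to minutes:
#include <chrono>

using TimePoint = std::chrono::system_clock::time_point;

// Duration since the most recent UTC midnight
std::chrono::system_clock::duration time_of_day_utc(TimePoint tp)
{
    using days = std::chrono::duration<int, std::ratio<86400>>;
    const auto since_epoch = tp.time_since_epoch();
    return since_epoch - std::chrono::duration_cast<days>(since_epoch);
}

// Compare two time points by hour and minute only, ignoring date and seconds
bool same_hour_and_minute(TimePoint a, TimePoint b)
{
    using std::chrono::duration_cast;
    using std::chrono::minutes;
    return duration_cast<minutes>(time_of_day_utc(a)) ==
           duration_cast<minutes>(time_of_day_utc(b));
}
If the {h, m} values are meant as local time, the UTC offset still has to be applied first, as described above.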
Update: Store hours example
This example will use my free, open-source time zone library, because I feel it is much easier to work with and allows for more readable and expressive code.
This example takes a system_clock::time_point as input, compares it to a list of open/close times for each day of the week, and determines whether the input time is inside or outside of those time-of-day ranges for the weekday associated with the input time tp. The store hours are presumed to be stated with respect to the store's local time zone, which is also the current time zone set for the computer running this code.
#include "date/tz.h"
#include <algorithm>
#include <cassert>
#include <chrono>
bool
is_store_open_at(std::chrono::system_clock::time_point tp)
{
using namespace date;
using namespace std::chrono;
struct day_schedule
{
weekday wd;
minutes open;
minutes close;
};
// hours are expressed in terms of local time
static constexpr day_schedule store_hours[]
{
// week day open-time close-time
{Monday, 0h, 0h}, // closed all day
{Tuesday, 8h, 18h},
{Wednesday, 8h, 18h},
{Thursday, 8h, 18h},
{Friday, 8h, 18h},
{Saturday, 8h, 15h+30min},
{Sunday, 9h+30min, 15h}
};
auto local_tp = current_zone()->to_local(tp);
auto local_day = floor<days>(local_tp);
auto local_time_of_day = local_tp - local_day;
weekday local_weekday{local_day};
auto ds = std::find_if(std::begin(store_hours), std::end(store_hours),
[local_weekday](day_schedule const& x)
{
return x.wd == local_weekday;
});
assert(ds != std::end(store_hours));
return ds->open <= local_time_of_day && local_time_of_day < ds->close;
}
#include <iostream>
int
main()
{
std::cout << is_store_open_at(std::chrono::system_clock::now()) << '\n';
}
The function begins by defining some handy data structures to store the open and close times for each day of the week. The open and close members of day_schedule measure "minutes since midnight" in local time.
The input time tp is in terms of UTC, since its type is system_clock::time_point. This is not currently specified by the C++ standard, but will be for next year's C++20.
The input tp is converted from UTC into local time according to the computer's time zone setting, obtained by calling current_zone(). An earlier version of this answer used zoned_seconds and truncated tp to seconds to simplify some of the syntax; that isn't strictly necessary, and while zoned_seconds can be really useful in other examples, in this one it was more trouble than it was worth. auto local_tp = current_zone()->to_local(tp) is a simpler way to translate UTC to a local time point.
local_tp is a chrono::time_point that is considered "local time", and is distinct from the family of chrono::time_points associated with system_clock. The advantage of doing this is that if local time and UTC time are accidentally mixed, it is a compile-time error.
local_day is simply local_tp truncated to days precision. It is still a chrono::time_point, just a coarse one that points to the beginning of the day as described by the local time zone.
The time duration since the local midnight is simply local_tp - local_day.
The day of the week (as defined by the local time zone) can be obtained by converting local_day to type weekday. This is the local day of the week associated with tp.
Now it is a simple matter to search store_hours for the entry that matches local_weekday.
The store is open if local_time_of_day is at or past the open time and has not yet reached the close time.
If the "store hours" are specified in UTC instead of local time, then this program simplifies somewhat, but is still similar.

Bad Values in QDateTimeAxis (QtCharts)

I am using QtCharts to display simulation data. The simulation starts at time zero, but my chart axis always seems to start at 19 hours. This confounds me. The setup of the chart is straightforward:
std::vector<SimData> data;
// ... Populate data
auto series = new QLineSeries();
for(auto i : data)
{
// Append time in milliseconds and a value
series->append(i.msTime, i.value);
}
this->legend()->hide();
this->addSeries(series);
this->axisX = new QDateTimeAxis;
this->axisX->setTickCount(10);
this->axisX->setFormat("HH:mm:ss");
this->axisX->setTitleText("Sim Time");
this->axisX->setMin(QDateTime());
this->addAxis(this->axisX, Qt::AlignBottom);
series->attachAxis(this->axisX);
this->axisY = new QValueAxis;
this->axisY->setLabelFormat("%i");
this->axisY->setTitleText(x->getID().c_str());
this->addAxis(this->axisY, Qt::AlignLeft);
series->attachAxis(this->axisY);
If I run with no data, but just display the chart, I get this:
If I add data, starting at time zero, the total amount of data is correct, but the time still starts at 19:00:00. Why does the time not start at 00:00:00?
I believe this is because you are on the East Coast (UTC-5): while 0 ms represents midnight (00:00) in UTC, at UTC-5 it displays as 5 hours earlier, i.e. 19:00 the previous day. I was having the same problem; I set my time zone to UTC (under Ubuntu) and voilà, the axis started at 0 hours instead of 19.
The problem was indeed confirmed to be the UTC offset. SO had a good example of how to get the UTC offset, which I then used to offset the data going into the chart:
Easy way to convert a struct tm (expressed in UTC) to time_t type
I created a utility function from this to use with QDateTimeAxis series data.
double GetUTCOffsetForQDateTimeAxis()
{
time_t zero = 24 * 60 * 60L;
struct tm* timeptr;
int gmtime_hours;
// get the local time for Jan 2, 1970 00:00 UTC
timeptr = localtime(&zero);
gmtime_hours = timeptr->tm_hour;
// if the local time is the "day before" the UTC, subtract 24 hours
// from the hours to get the UTC offset
if(timeptr->tm_mday < 2)
{
gmtime_hours -= 24;
}
return 24.0 + gmtime_hours;
}
Then the data conversion was simple.
std::vector<SimData> data;
// ... Populate data
auto series = new QLineSeries();
const auto utcOffset = sec2ms(hours2sec(GetUTCOffsetForQDateTimeAxis()));
for(auto i : data)
{
// Append time in milliseconds and a value
series->append(i.msTime - utcOffset, i.value);
}
// ...
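The sec2ms and hours2sec helpers aren't shown in the post; presumably they are just unit conversions along these lines (an assumption, not code from the answer):
// Hypothetical helpers matching the usage above
constexpr double hours2sec(double h) { return h * 3600.0; }
constexpr double sec2ms(double s)    { return s * 1000.0; }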
For the lonesome wanderer that probably ends up here.
My KISS solution is to set the date and time explicitly, then add seconds to it, and finally append a new data point:
for(int i = 0; i <= points; i++) {
QDateTime timeStamp;
timeStamp.setDate(QDate(1980, 1, 1));
timeStamp.setTime(QTime(0, 0, 0));
timeStamp = timeStamp.addSecs(i);
data->append(timeStamp.toMSecsSinceEpoch(), /* your y here */);
}
Later, where the diagram gets painted, I used:
QSplineSeries *temps1 = /* wherever you get your series */;
QChart *chTemp = new QChart();
QDateTimeAxis *tempAxisX = new QDateTimeAxis(); // the axis configured below
tempAxisX->setTickCount(5);
tempAxisX->setFormat(QString("hh:mm:ss"));
tempAxisX->setTitleText("Time");
chTemp->addAxis(tempAxisX, Qt::AlignBottom);
temps1->attachAxis(tempAxisX);
Hope that helps a future visitor (including myself).
For the lonesome wanderer as well: since Tillo Reilly's solution didn't work for me, I wrote my own (which works). In addition, my code converts the timestamp to a particular timezone. The thing is, QtCharts doesn't know anything about timezones (though it really should), but it does know about, and force-uses, the local PC timezone in a weird way. So, if you want to change the displayed date/time on a chart to a specified timezone, the only thing you can do is modify the dataset.
Answer to original question
auto temp_time = QDateTime::fromSecsSinceEpoch( timestamp );
auto local_offset = temp_time.offsetFromUtc();
auto fixed_timestamp = timestamp - local_offset;
With specified timezone
auto temp_time = QDateTime::fromSecsSinceEpoch( timestamp );
auto local_offset = temp_time.offsetFromUtc();
temp_time.setTimeSpec( Qt::TimeZone );
temp_time.setTimeZone( QTimeZone("Europe/Moscow") ); //for example
auto timezone_offset = temp_time.offsetFromUtc();
auto fixed_timestamp = timestamp + timezone_offset - local_offset;
Important notice 1: you shouldn't create just any QDateTime object to calculate offsets, for example with functions like QDateTime::currentDateTime() or QDateTime::fromSecsSinceEpoch(0) - the offset from UTC depends on the point in time! With the functions above you'll get a QDateTime object, but its offset from UTC may differ from what you expect.
For the very same reason, you shouldn't calculate the offset once and use it for the whole dataset, especially for datasets spanning significant time ranges. The larger your dataset, the bigger the chance that you'll run into trouble with an unexpected Daylight Saving Time transition somewhere in the middle of it.
Important notice 2: I don't know whether QtCharts calculates the local timezone's offset from UTC for the current time or for the timestamp passed to it via the data series. I really hope it's the latter.

C++ beginner how to use GetSystemTimeAsFileTime

I have a program that reads the current time from the system clock and saves it to a text file. I previously used the GetSystemTime function, which worked, but the times weren't completely consistent, e.g. one of the times is 32567.789 and the next is 32567.780, which is backwards in time.
I am using this program to save the time up to 10 times a second. I read that the GetSystemTimeAsFileTime function is more accurate. My question is, how do I convert my current code to use the GetSystemTimeAsFileTime function? I tried to use the FileTimeToSystemTime function but that had the same problems.
SYSTEMTIME st;
GetSystemTime(&st);
WORD sec = (st.wHour*3600) + (st.wMinute*60) + st.wSecond; //convert to seconds in a day
lStr.Format( _T("%d %d.%d\n"),GetFrames() ,sec, st.wMilliseconds);
std::wfstream myfile;
myfile.open("time.txt", std::ios::out | std::ios::in | std::ios::app );
if (myfile.is_open())
{
myfile.write((LPCTSTR)lStr, lStr.GetLength());
myfile.close();
}
else
{
lStr.Format( _T("open file failed: %d"), WSAGetLastError());
}
EDIT To add some more info, the code captures an image from a camera which runs 10 times every second and saves the time the image was taken into a text file. When I subtract the 1st entry of the text file from the second and so on eg: entry 2-1 3-2 4-3 etc I get this graph, where the x axis is the number of entries and the y axis is the subtracted values.
All of them should be around the 0.12 mark which most of them are. However you can see that a lot of them vary and some even go negative. This isn't due to the camera because the camera has its own internal clock and that has no variations. It has something to do with capturing the system time. What I want is the most accurate method to extract the system time with the highest resolution and as little noise as possible.
Edit 2 I have taken on board your suggestions and ran the program again. This is the result:
As you can see it is a lot better than before but it is still not right. I find it strange that it seems to do it very incrementally. I also just plotted the times and this is the result, where x is the entry and y is the time:
Does anyone have any idea on what could be causing the time to go out every 30 frames or so?
First of all, you want to get the FILETIME as follows:
FILETIME fileTime;
GetSystemTimeAsFileTime(&fileTime);
// Or for higher precision, use
// GetSystemTimePreciseAsFileTime(&fileTime);
According to FILETIME's documentation,
It is not recommended that you add and subtract values from the FILETIME structure to obtain relative times. Instead, you should copy the low- and high-order parts of the file time to a ULARGE_INTEGER structure, perform 64-bit arithmetic on the QuadPart member, and copy the LowPart and HighPart members into the FILETIME structure.
So, what you should be doing next are
ULARGE_INTEGER theTime;
theTime.LowPart = fileTime.dwLowDateTime;
theTime.HighPart = fileTime.dwHighDateTime;
__int64 fileTime64Bit = theTime.QuadPart;
And that's it. The fileTime64Bit variable now contains the time you're looking for.
If you want to get a SYSTEMTIME object instead, you could just do the following:
SYSTEMTIME systemTime;
FileTimeToSystemTime(&fileTime, &systemTime);
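If you also want that value as milliseconds since the Unix epoch (often more convenient for logging or subtraction), note that FILETIME counts 100-nanosecond intervals since January 1, 1601 UTC, so the conversion is a fixed offset plus a division. A small sketch:
#include <windows.h>

// Convert a FILETIME (100-ns ticks since 1601-01-01 UTC)
// to milliseconds since the Unix epoch (1970-01-01 UTC).
long long FileTimeToUnixMillis(const FILETIME& ft)
{
    ULARGE_INTEGER t;
    t.LowPart  = ft.dwLowDateTime;
    t.HighPart = ft.dwHighDateTime;
    // 11644473600 seconds between 1601-01-01 and 1970-01-01
    const unsigned long long EPOCH_DIFF_100NS = 11644473600ULL * 10000000ULL;
    return static_cast<long long>((t.QuadPart - EPOCH_DIFF_100NS) / 10000ULL);
}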
Getting the system time out of Windows with decent accuracy is something that I've had fun with, too... I discovered that Javascript code running on Chrome seemed to produce more consistent timer results than I could with C++ code, so I went looking in the Chrome source. An interesting place to start is the comments at the top of time_win.cc in the Chrome source. The links given there to a Mozilla bug and a Dr. Dobb's article are also very interesting.
Based on the Mozilla and Chrome sources, and the above links, the code I generated for my own use is here. As you can see, it's a lot of code!
The basic idea is that getting the absolute current time is quite expensive. Windows does provide a high resolution timer that's cheap to access, but that only gives you a relative, not absolute time. What my code does is split the problem up into two parts:
1) Get the system time accurately. This is in CalibrateNow(). The basic technique is to call timeBeginPeriod(1) to get accurate times, then call GetSystemTimeAsFileTime() until the result changes, which means that the timeBeginPeriod() call has had an effect. This gives us an accurate system time, but is quite an expensive operation (and the timeBeginPeriod() call can affect other processes) so we don't want to do it each time we want a time. The code also calls QueryPerformanceCounter() to get the current high resolution timer value.
bool NeedCalibration = true;
LONGLONG CalibrationFreq = 0;
LONGLONG CalibrationCountBase = 0;
ULONGLONG CalibrationTimeBase = 0;
void CalibrateNow(void)
{
// If the timer frequency is not known, try to get it
if (CalibrationFreq == 0)
{
LARGE_INTEGER freq;
if (::QueryPerformanceFrequency(&freq) == 0)
CalibrationFreq = -1;
else
CalibrationFreq = freq.QuadPart;
}
if (CalibrationFreq > 0)
{
// Get the current system time, accurate to ~1ms
FILETIME ft1, ft2;
::timeBeginPeriod(1);
::GetSystemTimeAsFileTime(&ft1);
do
{
// Loop until the value changes, so that the timeBeginPeriod() call has had an effect
::GetSystemTimeAsFileTime(&ft2);
}
while (FileTimeToValue(ft1) == FileTimeToValue(ft2));
::timeEndPeriod(1);
// Get the current timer value
LARGE_INTEGER counter;
::QueryPerformanceCounter(&counter);
// Save calibration values
CalibrationCountBase = counter.QuadPart;
CalibrationTimeBase = FileTimeToValue(ft2);
NeedCalibration = false;
}
}
2) When we want the current time, get the high resolution timer by calling QueryPerformanceCounter(), and use the change in that timer since the last CalibrateNow() call to work out an accurate "now". This is in GetNow() below. It also periodically re-runs CalibrateNow() to ensure that the system time doesn't go backwards or drift out.
FILETIME GetNow(void)
{
for (int i = 0; i < 4; i++)
{
// Calibrate if needed, and give up if this fails
if (NeedCalibration)
CalibrateNow();
if (NeedCalibration)
break;
// Get the current timer value and use it to compute now
FILETIME ft;
::GetSystemTimeAsFileTime(&ft);
LARGE_INTEGER counter;
::QueryPerformanceCounter(&counter);
LONGLONG elapsed = ((counter.QuadPart - CalibrationCountBase) * 10000000) / CalibrationFreq;
ULONGLONG now = CalibrationTimeBase + elapsed;
// Don't let time go back
static ULONGLONG lastNow = 0;
now = max(now,lastNow);
lastNow = now;
// Check for clock skew
if (LONGABS(FileTimeToValue(ft) - now) > 2 * GetTimeIncrement())
{
NeedCalibration = true;
lastNow = 0;
}
if (!NeedCalibration)
return ValueToFileTime(now);
}
// Calibration has failed to stabilize, so just use the system time
FILETIME ft;
::GetSystemTimeAsFileTime(&ft);
return ft;
}
It's all a bit hairy but works better than I had hoped. This also seems to work well as far back on Windows as I have tested (which was Windows XP).
I believe you are looking for the GetSystemTimePreciseAsFileTime() function, or even QueryPerformanceCounter() - in short, for something that is guaranteed to produce monotonic values.
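If QueryPerformanceCounter() fits your case (i.e. you only care about how much time elapsed between frames, not about the wall-clock time of day), the usage pattern is short; a minimal sketch:
#include <windows.h>

// Elapsed milliseconds between two QueryPerformanceCounter() samples.
// Suitable for measuring intervals; not a wall-clock time.
double ElapsedMs(const LARGE_INTEGER& start, const LARGE_INTEGER& end)
{
    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);
    return (end.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
}

// Usage:
//   LARGE_INTEGER t0, t1;
//   QueryPerformanceCounter(&t0);
//   /* capture a frame */
//   QueryPerformanceCounter(&t1);
//   double ms = ElapsedMs(t0, t1);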

Adding two epoch millseconds in C++

My goal is to determine the expiry of an item based on when it was acquired (bought) and when it is sold. There is a TTL value associated with each item.
I am doing following :
time_t currentSellingTime;
long currentSystemTime = time(&currentSellingTime); // this gives me epoch millisec of now()
long TTL = <some_value>L;
long BuyingTime = <some_value> // this is also in epoch millsec
if(currentSystemTime > TTL+BuyingTime)
{
//throw exception
// item is expired
}
My question is how to sum two epoch millisecond values and compare the result with another epoch millisecond value in C++.
There may be some misconceptions on how time() works:
epoch time as given by time() is expressed in seconds, not milliseconds
time() returns the current time value and can optionally also store it in the variable whose address is passed as its sole argument. This means that
long currentSystemTime = time(&currentSellingTime);
will set both currentSystemTime and currentSellingTime to the current time, and that's probably not what you intend to do... You should probably do
long currentSystemTime = time(NULL);
or
time(&currentSellingTime);
but the "double form" you are using is quite suspicious. For completeness' sake the MS Help reference for time()
You want to use another function since, as previously pointed out, time() returns seconds. Try:
#include <time.h>
long current_time() {
struct timespec t;
clock_gettime(CLOCK_REALTIME, &t);
return t.tv_sec * 1000L + t.tv_nsec / 1000000L;
}
Your code should work then. This approach is also POSIX compatible. Example usage:
const long TTL = 100;
long start_time = current_time();
while (!(current_time() > start_time + TTL))
{
// do the stuff that can expire
}
note: I know that the condition in the while loop can be constructed differently, but this way it is more like "until not expired".
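For completeness, a portable way to get epoch milliseconds is std::chrono (a minimal sketch; system_clock is only formally guaranteed to use the Unix epoch from C++20 on, but does so on common implementations):
#include <chrono>

long long epoch_millis()
{
    using namespace std::chrono;
    return duration_cast<milliseconds>(system_clock::now().time_since_epoch()).count();
}

// Expiry check with everything in milliseconds:
//   if (epoch_millis() > BuyingTime + TTL) { /* item is expired */ }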