I am using GetTimeZoneInformation on Windows, but I am not able to find any equivalent on Linux.
Any idea?
I figured it out after some more time debugging the time-related structures.
It is better to use struct tm * gmtime ( const time_t * timer ), which will give the UTC time.
You can then use tzset() and tzname[0] to get the time zone info.
You can use the following on Unix:
// Requires <ctime>, <string>, <vector>, <iostream> and <boost/algorithm/string.hpp>.
// Note: tm_zone is a BSD/GNU extension to struct tm.
std::string LocalTimeZone()
{
    time_t now = time(NULL);
    struct tm tnow = *localtime(&now);
    std::string tz = tnow.tm_zone;
    std::cout << "Local timezone: " << tz << std::endl;
    char buff[100];
    strftime(buff, sizeof buff, "%a %b %d %Y %T %Z%z", &tnow);
    std::vector<std::string> vec;
    const std::string s(buff);
    boost::split(vec, s, boost::is_any_of(" "));
    return vec.back(); // last token is the "%Z%z" field, e.g. "CEST+0200"
}
This will give the current time zone according to the host machine's system settings.
Boost also has Boost.Date_Time, which contains a time zone database you can use.
Please consider the following code; it compiles and runs on an ESP32 board:
unsetenv("TZ");
String payload = http.getString();
payload.replace("\"", "");
Serial.print("Payload: ");
Serial.println(payload);
const char* format = "%Y-%m-%dT%H:%M:%S";
strptime(payload.c_str(), format, &_time);
//debug only
Serial.print("Chamber time(UTC): ");
char chDate[11] = "";
char chTime[9] = "";
strftime(chDate, 11, "%m/%d/%Y", &_time);
strftime(chTime, 9, "%H:%M:%S", &_time);
Serial.print(chDate);
Serial.print(" ");
Serial.println(chTime);
int epoch_time = mktime(&_time);
timeval epoch = { epoch_time, 0 };
const timeval* tv = &epoch;
settimeofday(tv, NULL);
int rcode = setenv("TZ", "EST+5", 1);
tzset();
Serial.print("SetEnv reply");
Serial.println(rcode);
// verify
struct tm now;
getLocalTime(&now, 0);
Serial.println(&now, " %B %d %Y %H:%M:%S (%A)");
producing the following output:
Payload: 2020-04-08T21:59:10.736+0000
Chamber time(UTC): 04/08/2020 21:59:10
SetEnv reply0
April 08 2020 21:59:10 (Wednesday)
I expected the last date to be local time according to the "EST+5" timezone in this example. In fact, I followed this readme, as I am using an ESP32 board, which says:
To set local timezone, use setenv and tzset POSIX functions. First,
call setenv to set TZ environment variable to the correct value
depending on device location. Format of the time string is described
in libc documentation. Next, call tzset to update C library runtime
data for the new time zone. Once these steps are done, localtime
function will return correct local time, taking time zone offset and
daylight saving time into account.
What am I missing or doing wrong, apart from my rusty C++? The perfect solution would be to use a format like ":Europe/Rome". Thanks.
The TZ string "EST+5" might be unknown to the OS, supported by the fact that your output showed UTC time. EST and EDT are the abbreviations used for US Eastern (America/New_York). Assuming this is a Linux-like OS, look in the /usr/share/zoneinfo/ directory for the available zones, or try the tzselect command to see if you can find the correct TZ string.
Try this sequence of C functions:
time_t tnow;
time(&tnow);
struct tm when;
localtime_r(&tnow, &when);                   // POSIX, thread-safe
// errno_t ret = localtime_s(&when, &tnow); // Microsoft CRT equivalent
// struct tm *when = localtime(&tnow);      // not thread-safe; deprecated for some compilers
The getLocalTime() function is not standard C and may not honor tzset().
We have a UTC time in string format, and we need to convert it to the current local time zone, also in string format.
string strUTCTime = "2017-03-17T10:00:00Z";
We need to convert the above value to local time in the same string format.
For example, in IST it would be "2017-03-17T15:30:00Z".
I found a solution:
1) Convert the string-formatted time to time_t.
2) Use localtime_s to convert UTC to local time.
3) Use strftime to convert the local time (struct tm format) to string format.
// Requires <iostream>, <sstream>, <string>, <iomanip>, <ctime>.
std::time_t getEpochTime(const std::wstring& dateTime); // forward declaration

int main()
{
    std::string strUTCTime = "2017-03-17T13:20:00Z";
    std::wstring wstrUTCTime = std::wstring(strUTCTime.begin(), strUTCTime.end());
    time_t utctime = getEpochTime(wstrUTCTime.c_str());
    struct tm tm;
    /* Convert UTC time to local time */
    localtime_s(&tm, &utctime); // Microsoft CRT; use localtime_r(&utctime, &tm) on POSIX
    char CharLocalTimeofUTCTime[30];
    strftime(CharLocalTimeofUTCTime, 30, "%Y-%m-%dT%H:%M:%SZ", &tm);
    std::string strLocalTimeofUTCTime(CharLocalTimeofUTCTime);
    std::cout << "\n\nUTC to local time conversion: " << strLocalTimeofUTCTime;
}

std::time_t getEpochTime(const std::wstring& dateTime)
{
    /* Standard UTC format */
    static const std::wstring dateTimeFormat{ L"%Y-%m-%dT%H:%M:%SZ" };
    std::wistringstream ss{ dateTime };
    std::tm dt{}; // zero-initialize: std::get_time may not fill every field
    ss >> std::get_time(&dt, dateTimeFormat.c_str());
    /* Convert the tm structure to a time_t value and return the epoch. */
    return _mkgmtime(&dt); // MSVC-specific; timegm on Linux/BSD
}
I'm working on a custom client-server application (to be run on Linux), and one of the frames I send includes a timestamp (i.e. the time at which the frame is sent).
To make my application reliable, I used gmtime to produce the time on the client. I'm in Belgium, so right now the clock of the client VM is 2 hours ahead of UTC (because of daylight saving time).
On the server side, I first convert the received string to a time_t value. I do this so I can use the difftime function to check that the timestamp is not too old.
Then I generate a timestamp (in UTC) again with gmtime, and I convert it to a time_t.
I then compare the two time_t values to see the time difference.
I have a problem with the conversion of the time on the server side. I use the same code as in the client, but the resulting gmtime output is different...
Client side: function to generate the timestamp and export it to a string (time_str):
std::string getTime()
{
time_t rawtime;
struct tm * timeinfo;
char buffer[80];
time (&rawtime); // Get time of the system
timeinfo = gmtime(&rawtime); // Convert it to UTC time
strftime(buffer,80,"%d-%m-%Y %H:%M:%S",timeinfo);
std::string time_str(buffer); // Cast it into a string
cout<<"Time Stamp now (client) : "<<time_str<<endl;
return time_str;
}
And it produces this (at 9:33 local time):
Time Stamp now : 06-04-2016 07:33:30
Server side: function to retrieve the timestamp, generate the new timestamp, and compare them:
bool checkTimeStamp(std::string TimsStamp_str, double delay)
{
cout<<"TimeStamp recieved: "<<TimsStamp_str<<endl;
/* Construct tm from string */
struct tm TimeStampRecu;
strptime(TimsStamp_str.c_str(), "%d-%m-%Y %I:%M:%S", &TimeStampRecu);
time_t t_old = mktime(&TimeStampRecu);
/* Generate New TimeStamp */
time_t rawtime;
struct tm * timeinfo;
time (&rawtime); // Get time of the system
timeinfo = gmtime(&rawtime); // convert it to UTC time_t
time_t t2 = mktime(timeinfo); // Re-Cast it to timt_t struct
/* Convert it into string (for output) */
char buffer[80];
strftime(buffer,80,"%d-%m-%Y %H:%M:%S",timeinfo);
std::string time_str(buffer); // Cast it into a string
cout<<"Time Stamp now (server) : "<<time_str<<endl;
/* Comparison */
double diffSecs = difftime(t2, t_old);
cout<<diffSecs<<endl;
bool isTimeStampOK;
if (diffSecs < delay)
isTimeStampOK = true;
else
isTimeStampOK = false;
return isTimeStampOK;
}
And it produces this (at 9:33 in Belgium):
TimeStamp recieved : 06-04-2016 07:33:30
Time Stamp now (server) : 06-04-2016 08:33:31
Why is the server time (8:33) neither local time (9:33) nor UTC time (7:33)?
Have I made a mistake in its generation? I don't understand where, because this is exactly the same code as on the client side...
There are a couple of errors in your code, some your fault, some not. The biggest problem here is that the C <time.h> API is so poor, confusing, incomplete and error-prone that errors like this are very nearly mandatory. More on that later.
The first problem is this line:
struct tm TimeStampRecu;
It creates an uninitialized tm and then passes that into strptime. strptime may not fill in all the fields of TimeStampRecu. You should zero-initialize TimeStampRecu like this:
struct tm TimeStampRecu{};
Next problem:
strptime(TimsStamp_str.c_str(), "%d-%m-%Y %I:%M:%S", &TimeStampRecu);
The 12-hour time denoted by %I is ambiguous without an AM/PM specifier. I suspect this is just a typo, as it is the only place you use it.
Next problem:
gmtime : time_t -> tm (UTC to UTC)
mktime : tm -> time_t (local to UTC)
That is, mktime interprets the input tm according to the local time of the computer it is running on. What you need instead is:
timegm : tm -> time_t (UTC to UTC)
Unfortunately timegm isn't standard C (or C++). Fortunately it probably exists anyway on your system.
With these changes, I think your code will run as expected.
If you are using C++11, there is a safer date/time library to do this here (free/open-source):
https://github.com/HowardHinnant/date
Here is your code translated to use this higher-level library:
std::string getTime()
{
using namespace std::chrono;
auto rawtime = time_point_cast<seconds>(system_clock::now());
auto time_str = date::format("%d-%m-%Y %H:%M:%S", rawtime);
std::cout<<"Time Stamp now (client) : "<<time_str<< '\n';
return time_str;
}
bool checkTimeStamp(std::string TimsStamp_str, std::chrono::seconds delay)
{
using namespace std::chrono;
std::cout<<"TimeStamp recieved: "<<TimsStamp_str<< '\n';
/* Construct tm from string */
std::istringstream in{TimsStamp_str};
time_point<system_clock, seconds> t_old;
date::parse(in, "%d-%m-%Y %H:%M:%S", t_old);
/* Generate New TimeStamp */
auto rawtime = time_point_cast<seconds>(system_clock::now());
auto time_str = date::format("%d-%m-%Y %H:%M:%S", rawtime);
in.str(time_str);
time_point<system_clock, seconds> t2;
date::parse(in, "%d-%m-%Y %H:%M:%S", t2);
std::cout<<"Time Stamp now (server) : "<<time_str<< '\n';
/* Comparison */
auto diffSecs = t2 - t_old;
std::cout<<diffSecs.count()<< '\n';
bool isTimeStampOK;
if (diffSecs < delay)
isTimeStampOK = true;
else
isTimeStampOK = false;
return isTimeStampOK;
}
A timestamp is an absolute value. It doesn't depend on timezones or DST. It represents the number of seconds since a fixed moment in the past. The value returned by time() is the same, no matter what the timezone of the server is.
Your code doesn't produce timestamps but dates formatted for human consumption.
I recommend you use timestamps internally and format them to dates readable by humans only in the UI. However, if you need to pass dates as strings between various components of your application, also put the timezone in them.
I would write your code like this (using only timestamps):
// The client side
time_t getTime()
{
return time(NULL);
}
// The server side
bool checkTimeStamp(time_t clientTime, double delay)
{
time_t now = time(NULL);
return difftime(now, clientTime) < delay;
}
Update
If you have to use strings to communicate between client and server, then all you have to do is update the format string used to format the timestamps as dates (on the client) and to parse the dates back into timestamps (on the server), so that it includes the time zone you used to format the date.
This means adding %Z to the format everywhere in your code. Use "%d-%m-%Y %H:%M:%S %Z". By the way, the code you posted now reads "%d-%m-%Y %I:%M:%S" in the call to strptime(), and that will bring you another problem in the afternoon.
Remark
I always use "%Y-%m-%d %H:%M:%S %Z" because it is unambiguous and locale-independent. 06-04-2016 can be interpreted as April 6 or June 4, depending on the native language of the reader.
I appended a date to MongoDB like this:
bson_append_date(b,"uploadDate",(bson_date_t)1000*time(NULL));
Do remember that this will append "milliseconds since epoch UTC", saved as 2014-06-27 06:11:56.
Now I am reading it back and it gives 1403852029, which is exactly right. Now I want to convert it into local time. I tried to use the localtime function of C++ but didn't succeed, as the time returned by MongoDB is an int64_t.
if(bson_iterator_type(&it)==BSON_DATE)
bson_date_t date_it = bson_iterator_date( &it );
where bson_date_t is typedef int64_t bson_date_t;. Can anyone tell me how I can get the local time from the milliseconds?
Getting a valid time_t that would work with localtime should be exactly the opposite of what you are doing in the forward conversion:
bson_append_date(b,"uploadDate",(bson_date_t)1000*time(NULL));
To have a workable time_t, you should do the following:
time_t rawTime = (time_t)(bson_iterator_date( &it ) / 1000);
struct tm * timeinfo = localtime (&rawTime);
One more method.
bson_date_t date_it = bson_iterator_date( &it );
struct tm* ts;
time_t epoch_time_as_time_t = date_it / 1000; // milliseconds -> seconds
ts = localtime(&epoch_time_as_time_t);
char upload_date[64];
strftime(upload_date, sizeof(upload_date), "%a %Y-%m-%d %H:%M:%S %Z", ts);
I was writing a quick utility to dump out some of the details of the stat structure, but hit an issue: the time attributes of stat seem to be of type timestruc_t, which on my platform seems to be two 64-bit ints.
struct stat statBuf;
return_code = stat( aFileName, &statBuf );
if ( !return_code )
{
struct tm res;
localtime_r( statBuf.st_mtim.tv_sec, &res ); // problem!
I thought I could maybe use localtime_r to convert the seconds attribute into a struct tm, but I get casting issues when using statBuf.st_mtim.tv_sec as the first parameter.
I'm sure this isn't the best solution; maybe you know a better one. I just want to get the date and time, down to sub-seconds if possible, out as a string in the format YYYY-MM-DD HH.MM.SS.SSS or something similar. Any suggestions would be very welcome.
UPDATE
This was a simple issue, my mistake: I just forgot that the first parameter needs to be the address of the time value, not the value itself. So the amended and partially completed code looks like this:
localtime_r( &statBuf.st_mtim.tv_sec, &res );
const int bufLen=24;
char buffer[ bufLen + 1];
strftime( buffer, bufLen, "%Y-%m-%d %H:%M:%S", &res );
printf(" %s, %s\n", aFileName, buffer);
The first parameter needs to be the address of the time_t, not the time_t by value.