I'm using time_point for the first time.
I want to parse a datetime from a string, and when converting back to a string I found a difference of one hour.
std::chrono::system_clock::time_point timePoint;
std::stringstream ss("2021-01-01 00:00:09+01");
std::chrono::from_stream(ss, "%F %T%z", timePoint);
// timePoint == {_MyDur={_MyRep=16094556090000000 } }
std::string timePointStr = std::format("{:%Y/%m/%d %T}", floor<std::chrono::seconds>(timePoint));
// timePointStr = "2020/12/31 23:00:09"
I don't know which part is wrong: the time point and the parsing, or the formatting back to a string?
How can I get the same format as the one I parsed?
This is the expected behavior.
Explanation:
system_clock::time_point, and more generally, all time_points based on system_clock, have the semantics of Unix Time. This is a count of time since 1970-01-01 00:00:00 UTC, excluding leap seconds.
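A quick way to see that epoch convention (a minimal sketch, assuming a C++20 <chrono>/<format> implementation):
std::chrono::sys_seconds epoch{};  // default-constructed: 0s since 1970-01-01 00:00:00 UTC
std::string s = std::format("{:%F %T}", epoch);
// s == "1970-01-01 00:00:00"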
So when "2021-01-01 00:00:09+01" is parsed into a system_clock::time_point, the "2021-01-01 00:00:09" is interpreted as local time, and the "+01" is used to transform that local time into UTC. When formatting back out, there is no corresponding transformation back to local time (though that is possible with additional syntax1). The format statement simply prints out the UTC time (an hour earlier).
If you would prefer to parse "2021-01-01 00:00:09+01" without the transformation to UTC, that can be done by parsing into a std::chrono::local_time of whatever precision you desire. For example:
std::chrono::local_seconds timePoint;
std::stringstream ss("2021-01-01 00:00:09+01");
from_stream(ss, "%F %T%z", timePoint);
...
Now when you print it back out, you will get "2021/01/01 00:00:09". However, the value in the rep is now 1609459209 (3600 seconds later).
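For completeness, a minimal sketch of that round trip (same C++20 assumptions as above):
std::chrono::local_seconds timePoint;
std::stringstream ss("2021-01-01 00:00:09+01");
std::chrono::from_stream(ss, "%F %T%z", timePoint);
std::string timePointStr = std::format("{:%Y/%m/%d %T}", timePoint);
// timePointStr == "2021/01/01 00:00:09"
// timePoint.time_since_epoch() == std::chrono::seconds{1609459209}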
[1] To format a sys_time as a local_time with a UTC offset of 1h, it is necessary to choose a time_zone that has a UTC offset of 1h, at least at the UTC time you are formatting. For example, the IANA time zone "Etc/GMT-1" always has an offset of 1h (yes, the signs of these offsets are reversed). Using this to transform 2020-12-31 23:00:09 UTC back to 2021-01-01 00:00:09 would look like:
std::chrono::sys_seconds timePoint;
std::stringstream ss("2021-01-01 00:00:09+01");
std::chrono::from_stream(ss, "%F %T%z", timePoint);
// timePoint == 2020-12-31 23:00:09 UTC
std::string timePointStr = std::format("{:%Y/%m/%d %T}",
std::chrono::zoned_time{"Etc/GMT-1", timePoint});
std::cout << timePointStr << '\n'; // 2021/01/01 00:00:09
Disclaimer: I do not currently have a way to verify that MSVC supports the "Etc/GMT-1" std::chrono::time_zone.
Fwiw, using "Africa/Algiers" or "Europe/Amsterdam" in place of "Etc/GMT-1" should give the same result for this specific time stamp. And if your computer has its local time zone set to something that has a 1h UTC offset for this timestamp, then std::chrono::current_zone() in place of of "Etc/GMT-1" will also give the same result.
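For example, a sketch using current_zone() (this prints the same string only if your machine's UTC offset is +1h at this instant):
std::string timePointStr = std::format("{:%Y/%m/%d %T}",
    std::chrono::zoned_time{std::chrono::current_zone(), timePoint});
// "2021/01/01 00:00:09" when the local UTC offset is +1h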
What is the best way to initialize a Date to midnight using the AWS AppSync utilities?
I need to know if we have something like this:
var d = (new Date()).setUTCHours(0,0,0,0)
By using $util.time.nowEpochSeconds(), I can get the epoch time, but how do I work out the difference I need to apply to get the midnight time?
AppSync doesn't offer that capability through $util just yet, and this is good feedback; I'll make sure the team sees it.
In the meantime, as a workaround, you could manipulate the formatted date string to achieve what you need:
#set($time = $util.time.nowFormatted("yyyy-MM-dd/HH:mm:ss/Z"))
#set ($split = $time.split("/"))
#set ($midnight = $split[0] + " 00:00:00" + $split[2])
time: $time
midnight: $midnight
midnight epoch milliseconds: $util.time.parseFormattedToEpochMilliSeconds($midnight, "yyyy-MM-dd HH:mm:ssZ")
will print:
time: 2019-07-15/22:33:57/+0000
midnight: 2019-07-15 00:00:00+0000
midnight epoch milliseconds: 1563148800000
I am trying to convert a UTC timestamp to a simple binary encoded message.
I would like to achieve what is mentioned in the example here.
Binary Encoding Example
The following timestamp:
UTC timestamp 14:17:22 Friday, October 4, 2024
is expressed in binary code (nanoseconds since Unix epoch) this way:
007420bf8838fb17 (8 bytes, nanoseconds since the Unix epoch, synced to a master clock to microsecond accuracy).
What I have done so far is:
import struct
from datetime import datetime
dt = datetime(2024, 10, 4, 14, 17, 22, 0)
timestamp = (dt - datetime(1970, 1, 1)).total_seconds() * 1000000000
utc_timestamp = struct.pack('d', timestamp)
The output I see on the CLI is '\xf4\x02\x82\xa1E\xfb\xb7C' but, as per the example in the shared link, the expected output is 007420bf8838fb17.
TL;DR: I think there's either an error in the linked example, or the description of the representation is incorrect or incomplete.
import struct
from datetime import datetime, timedelta, timezone

utc_tz = timezone(timedelta(0))
t = datetime(2024, 10, 4, 14, 17, 22, 0, utc_tz)
nanos = int(t.timestamp()) * 1_000_000_000
print(hex(nanos))                # 0x17fb45a18202f400
print(struct.pack('<Q', nanos))  # the same value as a little-endian uint64, as discussed below
This gives the result of 0x17fb45a18202f400, not the quoted value (as presented, I take the quoted bytes to be a little-endian byte string; reversing the byte order gives 0x17fb3888bf207400).
The quoted answer, converted to decimal seconds (dividing by 1_000_000_000), is exactly 1728037042, with no sub-second component, so the disagreement is in the seconds value itself.
Decoding the seconds component of the quoted answer:
import time
answer = 0x17fb3888bf207400
secs = answer // 1_000_000_000  # integer division; int(answer / 1e9) can lose precision as a float
print(time.gmtime(secs))
gives:
time.struct_time(tm_year=2024, tm_mon=10, tm_mday=4, tm_hour=10, tm_min=17, tm_sec=22, tm_wday=4, tm_yday=278, tm_isdst=0)
which looks almost correct, save that the hour component is 4 hours earlier. So ... my guess is that this is actually a US/Eastern wall-clock time (UTC-4 on that date) encoded as if it were UTC, and the example is misleading.
If the schema you're using encodes timestamps as a 64-bit number of nanoseconds since the Unix epoch (midnight, 1 January 1970), then I think your example code needs only two small fixes: use format '<q' (or '<Q') rather than 'd', so you get a little-endian 64-bit integer as required, and convert the timestamp to an int first, since struct.pack() will not accept a float for an integer format.
I have a question about Python. I am saving the current month and year using strftime() in the format "%y %m", so the value displays like '17 01'. This is how I get the year and month:
from time import gmtime, strftime
ym = strftime("%y %m", gmtime())
print(ym)  # e.g. '17 01'
I am currently using SQLite, and I wish to call a function that checks my DB before the unique ID is generated and displayed. I wish to know whether the month has changed: if yes, the auto-increment needs to restart from 0; if no, the auto-increment just continues.
Both values will be displayed as an ID and saved as an int to the database, with an auto-incremented integer after the year and month (the string will look like this):
(y)(m)(auto increment)
17 10 001
How do I write code so that when %y or %m changes, I trigger a command and run:
db.engine.execute("ALTER TALBE myDB.myTable AUTO_INCREMENT=0;")
I am trying to display the client's time zone beside the timestamp, e.g. 4:13 PST.
I tried using GetTimeZoneInfo(), but the only way I could think of is getting the offset in hours and then mapping it through an array of hard-coded values.
The other way I found was using the java.util.TimeZone class. Here is the code I have tried:
<cfset tz = CreateObject("java", "java.util.TimeZone")>
<cfset tz = tz.getDefault()>
<cfoutput>TimeZone:#tz.getDisplayName(false, 1)#</cfoutput>
This gives me the output "Central Standard Time".
Any further help would be appreciated.
The code you mention above gets the server's TZ, not the client's.
If you want the client's TZ, you should read the comments against this other, similar question. These all revolve around using the Date.getTimezoneOffset() method. This only gives you the offset from UTC, though, not the more familiar GMT / BST abbreviations.
If you are allowing your users to select their time zone (instead of getting it from the browser, which could potentially be inaccurate), or the values come from a database (such as a time zone per city), or you simply need to extract the abbreviation from any datetime value, you can parse it out of the return value of LSDateTimeFormat() with the "long" mask.
function tzabbr(required date dttm, string tz = "", string locale = GetLocale()) {
var str = tz == ""
? LSDateTimeFormat(dttm, "long", locale)
: LSDateTimeFormat(dttm, "long", locale, tz)
return ListLast(str, " ")
}
// Usage Examples
dttm = Now()
tzServ = tzabbr(dttm)
tzWest = tzabbr(dttm, "US/Pacific")
tzEast = tzabbr(dttm, "US/Eastern")
https://trycf.com/gist/144aa0399ea80127a3aa1d11a74fc79b/acf2021?theme=monokai
I would prefer to do this with Qt methods if at all possible.
Currently our code can detect that Windows is on a 24-hour clock, but not the Mac.
We have a method that returns a string such as 1/9/2012 9:53:42 AM, which gives us a previous time rather than the current one (which is what we want); I don't want to mess with this method, though.
I've been playing around with a way to determine whether the current system clock is in military time, and to adjust the previous time returned in the string to reflect that. I can get this to work on Windows, but on the Mac it displays a normal 12-hour time regardless of whether we're on a 24-hour clock.
Ignore my crude debugging messages, or tell me if I'm not going at the problem correctly; I haven't been able to test it yet and tweak as necessary. (Explanation after the code.)
QLocale ql;  // the system locale; no need to heap-allocate
QString qlTF = ql.timeFormat();
QString fileTime = QString::fromUtf8(str.GetSafeStringPtr());
if (qlTF.left(1) == (QString("H"))) // Our system clock is set to military time
{
QString newTime;
QStringList fileTimeDateSplit = fileTime.split(" ");
QStringList fileTimeSplit = fileTimeDateSplit.at(1).split(":");
m_editModified->setText(qlTF);
if (fileTimeSplit.at(0).toInt() < 12 && (fileTimeDateSplit.at(2) == "PM"))
{
int newHour = 12 + (fileTimeSplit.at(0).toInt()%12);
newTime.append(QString::number(newHour));
newTime.append(":");
newTime.append(fileTimeSplit.at(1));
newTime.append(":");
newTime.append(fileTimeSplit.at(2));
m_editModified->setText(QString("military after noon"));
}
}
else m_editModified->setText(qlTF);
Basically I'm grabbing the locale of the current machine to retrieve the system's time format.
fileTime is set to a string such as "1/9/2012 9:53:42 AM".
qlTF returns a format such as HH:mm:ss, H:mm:ss, hh:mm:ss, or h:mm:ss; a capital H means a 24-hour clock.
I tokenize the strings by their delimiters and then check whether the hour is less than 12 and marked PM; if so, I add 12 hours and assemble the new time string.
You can see that I did:
m_editModified->setText(qlTF);
for debugging purposes. On Windows, this will be set to HH:mm:ss; however, even with a 24-hour clock enabled on a Mac, it still returns h:mm:ss, which completely defeats the purpose.
Any ideas would be very much appreciated!
Why don't you just convert the string you have ("1/9/2012 9:53:42 AM") to a QDateTime and then convert that QDateTime back to a string in the format you want (I use Qt::ISODate in the example):
QString timeFormat = "M/d/yyyy h:m:s AP";
QDateTime dt = QDateTime::fromString("1/9/2012 9:53:42 AM", timeFormat);
QString text = "";
if (dt.isValid())
    text = dt.toString(Qt::ISODate);  // "2012-01-09T09:53:42"
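If the goal is specifically a 24-hour rendering regardless of the platform's locale, you can also pass an explicit format string; a capital HH section is always 24-hour (a sketch reusing dt from above):
if (dt.isValid())
    text = dt.toString("M/d/yyyy HH:mm:ss");  // "1/9/2012 09:53:42", 24-hour on every platform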