We are trying to get the timezone from std::tm with strftime:
char timezone[50];
strftime(timezone, sizeof(timezone), "%Z", &timeCreated);
On iOS, we get "EST" which is what we want. But on Windows, we get "Eastern Summer Time". Anybody know how to consistently get the current timezone in C++ in abbreviation form?
I considered building the abbreviation from the full time zone name by simply picking out the first character of each word. But checking the list of abbreviations, I noticed that there are time zones like "Chuuk Time", which is abbreviated "CHUT", so adjusting the names manually is not possible.
This is not the same as the question Windows Timezone and their abbreviations? I don't need a full list of all time zones and abbreviations. Instead, I need a systematic way to get the current time zone, using for example strftime, that respects the system's current time zone and the current locale.
Using this free, open-source library that has been ported to VS-2013 and later:
#include "tz.h"
#include <iostream>
int
main()
{
using namespace std::chrono;
using namespace date;
std::cout << format("%Z\n", make_zoned(current_zone(), system_clock::now()));
}
This just output for me:
EDT
Fully documented.
This library uses the IANA timezone database, and when current_zone() is called, translates the current Windows timezone into the appropriate IANA timezone.
Time zone information under Windows is kept in the registry; you can find it under the following registry key:
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Time Zones\time_zone_name
You will not find any abbreviations there. The reason is mostly that they are not standardised, and one abbreviation can be assigned to many time zone names; for more, read here. Your approach of taking the first letters is fine; you can also look up names on this wiki:
https://en.wikipedia.org/wiki/List_of_time_zone_abbreviations
Also see this thread from MSDN forums:
https://social.msdn.microsoft.com/Forums/vstudio/en-US/3aa4420a-a5bf-48a3-af13-17a0905ce366/is-there-any-way-to-get-timezone-abbreviations?forum=csharpgeneral
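For illustration, here is a minimal, untested sketch of reading the localized standard-time name (the "Std" value) for one of the time_zone_name subkeys mentioned above; the key layout is as described, but treat the exact value names as an assumption to verify:

#include <windows.h>
#include <string>
#include <iostream>

// Reads the "Std" display string for a given time zone key name,
// e.g. "Eastern Standard Time", from the Time Zones key shown above.
std::wstring std_name_for_zone(const std::wstring &time_zone_name)
{
    std::wstring subkey =
        L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Time Zones\\" + time_zone_name;

    wchar_t value[256]{};
    DWORD size = sizeof(value);
    if (RegGetValueW(HKEY_LOCAL_MACHINE, subkey.c_str(), L"Std",
                     RRF_RT_REG_SZ, nullptr, value, &size) != ERROR_SUCCESS)
        return L"";
    return value;
}

int main()
{
    std::wcout << std_name_for_zone(L"Eastern Standard Time") << L"\n";
}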
GetDynamicTimeZoneInformation is probably a good choice. However, the minimum supported versions are Windows Vista, Windows Server 2008 and Windows Phone 8, so for anything older GetTimeZoneInformation is better.
Another issue is that both sometimes return an empty StandardName or DaylightName.
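A minimal sketch of the GetDynamicTimeZoneInformation call (not from the original answer), showing the fields you would check, including the possibly-empty names:

#include <windows.h>
#include <iostream>

int main()
{
    DYNAMIC_TIME_ZONE_INFORMATION dtzi{};
    if (GetDynamicTimeZoneInformation(&dtzi) != TIME_ZONE_ID_INVALID)
    {
        // StandardName / DaylightName are localized and may come back empty.
        std::wcout << L"Standard: " << dtzi.StandardName << L"\n"
                   << L"Daylight: " << dtzi.DaylightName << L"\n"
                   << L"Key name: " << dtzi.TimeZoneKeyName << L"\n";
    }
}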
In that case you have to fall back to the Windows registry, as marcinj stated. Here is the function taken from GnuCash, which was in turn adapted from glib.
#include <windows.h>
#include <cstring>
#include <string>

// Returns the Windows time zone key name, e.g. "Eastern Standard Time",
// or an empty string on failure.
static std::string
windows_default_tzname(void)
{
    const char *subkey =
        "SYSTEM\\CurrentControlSet\\Control\\TimeZoneInformation";
    constexpr size_t keysize{128};
    HKEY key;
    char key_name[keysize]{};
    unsigned long tz_keysize = keysize;
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, subkey, 0,
                      KEY_QUERY_VALUE, &key) == ERROR_SUCCESS)
    {
        if (RegQueryValueExA(key, "TimeZoneKeyName", nullptr, nullptr,
                             (LPBYTE)key_name, &tz_keysize) != ERROR_SUCCESS)
        {
            memset(key_name, 0, tz_keysize);
        }
        RegCloseKey(key);
    }
    return std::string(key_name);
}
This can be done with the Windows Runtime (WinRT) API, specifically the Calendar.GetTimeZone method. I don't have the C++ code to do so, but here it is with Rust/WinRT version 0.7 from the Rust crate iana-time-zone. The C++ version will be similar.
use windows::globalization::Calendar;

let cal = Calendar::new()?;
let tz_hstring = cal.get_time_zone()?;
// convert the Windows HString to a Rust std::string::String
tz_hstring.to_string()
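For reference, a hedged C++/WinRT sketch of the same call (assuming the C++/WinRT projection headers are available; untested, so treat the exact names as approximate):

#include <winrt/Windows.Globalization.h>
#include <iostream>
#include <string_view>

int main()
{
    winrt::init_apartment();
    winrt::Windows::Globalization::Calendar cal;
    // GetTimeZone() returns the time zone identifier, e.g. "America/New_York".
    winrt::hstring tz = cal.GetTimeZone();
    std::wcout << std::wstring_view{ tz } << L"\n";
}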
Related
I have this function:
#include <cstdlib>
#include <clocale>

double SimplifiedExample ()
{
    setlocale(LC_ALL, "");
    char txt[] = "3.14";
    // Replace '.' with the locale's decimal point before parsing.
    for (int x = 0; txt[x]; x++)
        if (txt[x] == '.')
            txt[x] = localeconv()->decimal_point[0];
    char *endptr;
    return strtod(txt, &endptr);
}
This works on every Windows 10 and Linux system I've tried it on. It fails on at least one Windows 7 machine (which reports "," for localeconv ()->decimal_point [0], but uses "." in both sprintf and strtod). Is the use of localeconv to supply the system decimal point to strtod correct, or not?
Note that I can't remove the setlocale() call. The application MUST respect locale settings.
Have you tried "man strtod"? On my Slackware Linux system, the strtod man page mentions that the decimal point depends on the locale, and says the function lives in stdlib.h.
You're falling into a bog of implementation-defined behaviour.
The problem is probably that setlocale on Windows doesn't behave as the standard describes, because it is a library wrapper around the OS's idiosyncrasies; for example, UTF-8 is supported in the C++ runtime only on Windows 10. Moreover, some C++ library implementations of sprintf/printf/strtod respect only std::locale::global, and some runtimes support nothing but "C" and "", formatting output only as ANSI or IEEE-754 (hence the dot). Older builds of Windows 10 and earlier versions do not even respect IEEE-754.
For example, an arbitrary MinGW64 build on Windows 7 responds to this code:
auto old_locale = std::locale::global(std::locale(""));
std::cout << old_locale.name() << "\n";
std::cout << std::locale("").name() << "\n";
by reporting the ANSI "C" locale:
C
C
regardless of the system locale, and it doesn't recognize the OEM locale ".OCP" or any locale name that is actually present. This behaviour is hard-coded in the library linked with the application, not in some system-wide one.
Note that you can check the value returned by setlocale:
auto loc = setlocale(LC_ALL, "");
if (loc == NULL) {
    printf("setlocale failed!\n");
}
else {
    printf("setlocale %s!\n", loc);
}
In my case it returned "English_United States.1252" with a US user locale set, and "Russian_Russia.1251" for a Russian user locale, which doesn't match the more usual "en-US" format; Windows uses underscores.
There is another problem to solve when interfacing with a database service that runs as a different user. Existing databases already implement support for defining which locale a particular database uses.
Moreover, the output of the library functions didn't change when the locale changed.
Developers of multi-platform applications are urged to use proper library packages for formatted input/output, which eliminates the struggle with platform- and implementation-specific behaviour.
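As one concrete locale-independent alternative (my own sketch, not part of the answer above), you can parse the literal with a stream imbued with the classic "C" locale, so the global or system locale no longer affects the decimal point:

#include <sstream>
#include <locale>

double parse_c_locale(const char *txt)
{
    std::istringstream in(txt);
    in.imbue(std::locale::classic());  // always treats '.' as the decimal point
    double value = 0.0;
    in >> value;
    return value;
}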
I need the modification time, creation time and change time of a file on Windows using C++. I am using the following code:
#include <sys/stat.h>
#include <iostream>
#include <string>
using namespace std;

string filename = "D:\\hi.txt";
struct stat result;
if (stat(filename.c_str(), &result) == 0)
{
    auto mod_time = result.st_mtime;
    cout << "modified time is: " << mod_time << endl;
}
Using this I am able to get the modification and creation time, but I am not able to get the change time for the file. How should I get the change time of a file using C++?
The definition of "change time" follows: modification time changes when the content of the file changes, whereas change time changes even when only the properties of the file change, such as access permissions.
MSDN defines three timestamps for files: Creation Time, Last Access Time, Last Write Time. What you ask for looks, in fact, to be the Last Access Time.
In your example you use the libc function stat(), which is meant to work on any system that has a C compiler. As such it may be too generic, i.e. it does not expose all capabilities available in a particular environment (MS Windows in your case), only a subset of generic properties.
At this link you can find the description of the GetFileTime() WinAPI function, which returns the file times supported on Windows. If you are writing an application that is not meant to be ported to other platforms, you are better off using the WinAPI for system-level things.
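A minimal sketch of GetFileTime (the file path is just the one from the question; error handling kept deliberately short):

#include <windows.h>
#include <iostream>

static void print_time(const char *label, const FILETIME &ft)
{
    SYSTEMTIME st;
    if (FileTimeToSystemTime(&ft, &st))
        std::cout << label << st.wYear << '-' << st.wMonth << '-' << st.wDay
                  << ' ' << st.wHour << ':' << st.wMinute << '\n';
}

int main()
{
    HANDLE h = CreateFileA("D:\\hi.txt", GENERIC_READ, FILE_SHARE_READ,
                           nullptr, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, nullptr);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    FILETIME created, accessed, written;
    if (GetFileTime(h, &created, &accessed, &written))
    {
        print_time("creation:    ", created);
        print_time("last access: ", accessed);
        print_time("last write:  ", written);
    }
    CloseHandle(h);
}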
I am using the following code for setting the time in a Date Control in MFC using C++
CTime date;
date = date.GetCurrentTime();
this->m_headerDate.SetTime(&date);
This will get the date and set it into the control in whatever format the user's machine uses. But I want to set it to a format of ONLY mm/dd/yyyy.
There should be some way of doing this in MFC. Are there any utility functions for this?
Thanks,
Without MFC:
#include <iostream>
#include <ctime>
using namespace std;

int main() {
    const int MAXLEN = 80;
    char s[MAXLEN];
    time_t t = time(0);
    strftime(s, MAXLEN, "%m/%d/%Y", localtime(&t));
    std::cout << s << '\n';
}
With MFC:
This function formats a date as a date string for a specified locale. The function formats either a specified date or the local system date.
int GetDateFormat(
    LCID Locale,               // locale
    DWORD dwFlags,             // options
    CONST SYSTEMTIME *lpDate,  // date to format (or NULL for the local system date)
    LPCTSTR lpFormat,          // format picture string
    LPTSTR lpDateStr,          // output buffer
    int cchDate                // size of output buffer
);
Pass "MM/dd/yyyy" as the lpFormat picture string to get the month/day/year layout you want.
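A short, untested sketch of that call with an explicit picture string (buffer size chosen arbitrarily):

#include <windows.h>
#include <iostream>

int main()
{
    SYSTEMTIME st;
    GetLocalTime(&st);

    TCHAR buf[64];
    if (GetDateFormat(LOCALE_USER_DEFAULT, 0, &st, TEXT("MM/dd/yyyy"), buf, 64) > 0)
    {
#ifdef UNICODE
        std::wcout << buf << L"\n";
#else
        std::cout << buf << "\n";
#endif
    }
}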
If you're talking about getting a specific textual representation of a date/time, you can use strftime() to format a date in many different ways, including the one specified in your question.
You will need a variable of type time_t using the facilities in the ctime header. So you can either switch to using those times, or I believe CTime::GetTime( ) will give you one.
However, if you're talking about forcing a control to display its date/time in a specific format, that's a property of the control itself. For example, CDateTimeCtrl provides a SetFormat() method which will modify how it displays its data.
Here I'm storing the value in a CString variable. Try this code:
CString strTime;
CTime date = CTime::GetCurrentTime();
strTime = date.Format(_T("%m/%d/%Y"));
I have found a way of doing this, though I'm not sure it is the simplest way.
Since we already have a date-time control, all we have to do is use the SetFormat function of CDateTimeCtrl. Please find below an example:
CDateTimeCtrl m_DateTimeCtrl;
m_DateTimeCtrl.SetFormat(_T("MM/dd/yyyy"));
The above sets it up in the desired format, e.g. 01/14/2015. Thanks to Paxdiablo, Himanshu and Irrational Person for the inputs; they pointed me in the right direction with different options.
Windows seems to keep track of at least four dimensions of "current locale":
http://www.siao2.com/2005/02/01/364707.aspx
DEFAULT USER LOCALE
DEFAULT SYSTEM LOCALE
DEFAULT USER INTERFACE LANGUAGE
DEFAULT INPUT LOCALE
My brain hurts just trying to keep track of what the hell four separate locales are useful for...
However, I don't grok the relationship between code page and locale (or LCID, or Language ID), all of which appear to be different (e.g. Japanese (Japan) is LANGID = 0x411 location code 1, but the code page for Japan is 932).
How can I configure our application to use the user's desired language as the default MBCS target when converting between Unicode and narrow strings?
That is to say, we used to be an MBCS application. Then we switched to Unicode. Things work well in English, but fail in Asian languages, apparently because Windows conversion functions WideCharToMultiByte and MultiByteToWideChar take an explicit code page (not a locale ID or language ID), which can be set to CP_ACP (default to ANSI code page), but don't appear to have a value for "default to user's default interface language's code page".
I mean, this is some seriously convoluted twaddle. Four separate dimensions of "current language", three different identifier types, as well as (different) string-identifiers for C library and C++ standard library.
In our previous MBCS builds, disk I/O and user I/O worked correctly: everything remained in the DEFAULT SYSTEM LOCALE (Windows XP term: "Language for non-Unicode Programs"). But now, in our UNICODE builds, everything tries to use "C" as the locale, and file I/O fails to properly transcode UNICODE to the user's locale, and vice versa.
We want to have text files written out (when narrow) using the current user's language's code page. And when read in, the current user's language's code page should be converted back to UNICODE.
Help!!!
Clarification: I would ideally like to use the MUI language code page rather than the OS default code page. GetACP() returns the system default code page, but I am unaware of a function that returns the user's chosen MUI language (which auto-reverts to system default if no MUI specified / installed).
I agree with the comments by Jon Trauntvein: the GetACP function does reflect the user's language settings in the Control Panel. Also, based on the link to the "Sorting It All Out" blog that you provided, DEFAULT USER INTERFACE LANGUAGE is the language that the Windows user interface will use, which is not the same as the language to be used by programs.
However, if you really want to use DEFAULT USER INTERFACE LANGUAGE, you get it by calling GetUserDefaultUILanguage, and you can then map the language id to a code page using the following table.
Language Identifiers and Locales
You can also use the GetLocaleInfo function to do the mapping, but first you would have to convert the language id that you got from GetUserDefaultUILanguage into a locale id, and I think you will get the name of the code page instead of a numeric value, but you could try it and see.
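A hedged sketch of that mapping; with LOCALE_RETURN_NUMBER you should get a numeric code page rather than a string, but verify on your target:

#include <windows.h>
#include <iostream>

int main()
{
    LANGID lang = GetUserDefaultUILanguage();
    LCID lcid = MAKELCID(lang, SORT_DEFAULT);

    UINT cp = 0;
    // LOCALE_RETURN_NUMBER writes a UINT into the buffer instead of a string.
    if (GetLocaleInfoW(lcid, LOCALE_IDEFAULTANSICODEPAGE | LOCALE_RETURN_NUMBER,
                       reinterpret_cast<LPWSTR>(&cp), sizeof(cp) / sizeof(WCHAR)))
    {
        std::cout << "UI language ANSI code page: " << cp << "\n";
    }
}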
If all you want to be able to do is configure a locale object to use the currently selected locale settings, you should be able to do something like this:
std::locale loc = std::locale("");
You can also access the current code page in Windows using the Win32 ::GetACP() function. Here is an example that I implemented in a string class to append multi-byte characters to a Unicode string:
void StrUni::append_mb(char const *buff, size_t buff_len)
{
    UINT current_code_page = ::GetACP();
    int space_needed;
    if (buff_len == 0)
        return;
    // First call: ask how many wide characters the conversion needs.
    space_needed = ::MultiByteToWideChar(
        current_code_page,
        MB_PRECOMPOSED | MB_ERR_INVALID_CHARS,
        buff,
        buff_len,
        0,
        0);
    if (space_needed > 0)
    {
        reserve(this->buff_len + space_needed + 1);
        // Second call: convert into the reserved storage.
        MultiByteToWideChar(
            current_code_page,
            MB_PRECOMPOSED | MB_ERR_INVALID_CHARS,
            buff,
            buff_len,
            storage + this->buff_len,
            space_needed);
        this->buff_len += space_needed;
        terminate();
    }
}
Just use CW2A() or CA2W() which will take care of the conversion for you using the current system locale (or language used for non-Unicode applications).
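For illustration, a small sketch of CW2A; by default ATL uses the thread's ANSI code page, and the next answer shows the define that pins it to CP_ACP:

#include <atlstr.h>
#include <atlconv.h>
#include <string>

std::string narrow_copy(const CStringW &wide)
{
    // CW2A converts wide -> narrow using ATL's default ANSI code page handling.
    return std::string(CW2A(wide));
}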
FWIW, this is what I ended up doing:
#define _CONVERSION_DONT_USE_THREAD_LOCALE // force CP_ACP *not* CP_THREAD_ACP for MFC CString auto-converters!!!
In application startup, construct the desired locale: m_locale(FStringA(".%u", GetACP()).GetString(), LC_CTYPE)
force it to agree with GetACP(): // force C++ and C libraries based on setlocale() to use system locale for narrow strings
m_locale = ::std::locale::global(m_locale); // we store the previous global so we can restore before termination to avoid memory loss
This gives me relatively ideal use of MFC's built-in narrow<->wide conversions in CString to automatically use the user's default language when converting to or from MBCS strings for the current locale.
Note: m_locale is type ::std::locale
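The same idea as a standalone sketch using only standard C++ and Win32 calls (FStringA is the poster's own helper; here snprintf builds the ".<codepage>" locale name, and the MSVC-style name is an assumption):

#include <windows.h>
#include <clocale>
#include <locale>
#include <cstdio>

int main()
{
    char name[16];
    std::snprintf(name, sizeof(name), ".%u", GetACP());   // e.g. ".1252"

    // Make both the C and C++ libraries use the ANSI code page for narrow strings.
    std::setlocale(LC_ALL, name);
    std::locale previous = std::locale::global(std::locale(name));

    // ... application runs here ...

    std::locale::global(previous);   // restore before termination, as noted above
}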
I have inherited some MFC C++ code (it's an ActiveX OCX control running on a Windows Mobile 6.5 device) and I need to acquire the system date and time and append it as part of an existing string which gets passed via the com port to another device.
I can get the system date and time, but I cannot figure out how to convert it into a string so that I can append it (via strcat).
I've found a number of different answers on Google and Bing for what at first glance seemed like such a simple problem... :( but I don't know enough MFC C++ to adapt any of it to my needs. Any help would be greatly appreciated.
CTime t = CTime::GetCurrentTime();
CString s = t.Format(_T("%A, %B %d, %Y"));
LPCTSTR str = s;   // implicit CString -> LPCTSTR conversion
Note, I believe that str is only valid while s is in scope. Probably should copy it off somewhere if you need it to be around after s is destroyed. If you are passing it to strcat() you're probably OK.
In MFC, the following code gives the current date in MMDDYYYY format:
CTime t = CTime::GetCurrentTime();
CString strDate = t.Format("%m%d%Y");
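To tie it back to the original goal of appending via strcat, here is a minimal sketch (it assumes an MBCS/ANSI build, which is common on Windows Mobile projects; the buffer handling is deliberately simple):

#include <afx.h>       // CTime, CString
#include <cstring>

void append_timestamp(char *existing, size_t capacity)
{
    CTime t = CTime::GetCurrentTime();
    CString stamp = t.Format("%m%d%Y %H:%M:%S");

    if (std::strlen(existing) + stamp.GetLength() + 1 <= capacity)
        std::strcat(existing, (LPCSTR)stamp);   // (LPCSTR) cast is valid in an MBCS build
}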