Why is _tcstod using my windows region settings when parsing a string? - c++

Windows-related C++ question!
I'm trying to use _tcstod( ) to parse a string to obtain a float value. Normally, if I call
TCHAR* endPtr;
float result = static_cast<float>(_tcstod(_T("12.345678"), &endPtr));
I get a float value of 12.345678 and endPtr behaves as expected. However, this misbehaves if I change my regional decimal separator in the Windows Region and Language settings. Specifically, if I change the decimal separator from "." to ",", _tcstod suddenly returns only 12 rather than the whole value; anything after the "." is chopped off.
Is there some way for me to parse the float value from the string while being agnostic to my Region settings?

Why is _tcstod using my windows region settings when parsing a string?
Because it is supposed to.
Is there some way for me to parse the float value from the string while being agnostic to my Region settings?
Of course. The simplest way, in C++, is to use a stringstream and imbue it with a default or "C" locale.
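For illustration, here is a minimal sketch of that approach (my own example, not the asker's code): a wide stringstream imbued with the classic "C" locale parses the value the same way no matter what the Windows region settings say.
#include <iostream>
#include <locale>
#include <sstream>

int main()
{
    std::wstringstream ss(L"12.345678");
    ss.imbue(std::locale::classic());    // "C" locale: '.' is always the decimal point

    double value = 0.0;
    ss >> value;                         // parses 12.345678 regardless of region settings

    std::wcout << value << L'\n';
}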

Related

How to append a long to a string in C++?

I'm working on a school project and I'm making a VEX Robotics program. It's in C++, and I am very new to this language. I can't import any new libraries to help me and I want to display a value on a screen. Unfortunately, I need to make it say "Left Stick tilt: " and then the tilt value of the VEX controller's left stick and the same with the right stick. The code should work aside from the fact that I can't simply add the two together and have the value of the controller tilt converted to numerical characters. Here's my code:
Controller1.Screen.setCursor(1, 1);
Controller1.Screen.print("Left Stick tilt: " + Controller1.Axis3.position());
Controller1.Screen.setCursor(2, 1);
Controller1.Screen.print("Right Stick tilt: " + Controller1.Axis2.position());
Could anyone experienced with the VEX system help me? (I'm using VEXcode V5 on a chromebook, if it makes any difference)
Edit: so far everyone has recommended things within libraries. I was not clear enough; I cannot use any libraries, including the standard library, due to the limitations of VEXcode V5
How to append a long to a string in C++?
In order to append a long to a string, you must first convert the integer to a string, for example with std::to_string.
You can append to another string like this:
long l = 42;
std::string s = "look at this long: ";
s += std::to_string(l);
Alternatively, you can use a format string. For example:
std::string s = std::format("look at this long: {}", l);
However, for output purposes you don't necessarily need to append to the string at all. Instead, you could keep the two separate: output the string, then output the long.
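Putting the snippet above into a small self-contained program (this uses the standard library, so it may not apply if VEXcode really forbids all library use):
#include <iostream>
#include <string>

int main()
{
    long l = 42;
    std::string s = "look at this long: ";
    s += std::to_string(l);      // convert the long, then append it

    std::cout << s << '\n';      // prints: look at this long: 42
}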

SetLocale vs Custom Data format on Windows

I'm working on a C++/MFC Windows application and I want the formatting of numbers, dates, etc. to be correct all over the world.
Example: some users prefer to write numbers with "," as the decimal separator, others prefer the dot, and so on.
I started by calling setlocale at the startup of my Windows application:
setlocale(LC_ALL, "" );
This works for the language format, because when the second parameter is empty this function asks the OS for the locale settings for numbers, dates, etc.
This works for Region -> Format but not for the "Additional settings": for example, if your system format is English (United States) and you change it through the Format list to another language with a different number format, it works in my application.
By "works" I mean that numbers inside textboxes, ListCtrls, etc. use the correct decimal symbol, digit grouping, and so on.
But if you go into "Additional settings" and change an individual setting, such as the decimal separator, without changing the format language, it doesn't work.
In that case, when I use setlocale, the numbers are formatted according to the language defaults; they do not follow the custom rules. Other programs, like Microsoft Excel, are able to show numbers following the custom rules.
How can I do the same in my software?
How do I get the correct format for numbers (language format + custom settings)?
Thanks
This is the additional setting dialog:
Take a look at MSDN topics:
"Using National Language Support"->"Working with Custom Locales".
Also try GetLocaleInfoEx.
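As a hedged sketch of that second suggestion: GetLocaleInfoEx with LOCALE_NAME_USER_DEFAULT reports the user's effective settings, including overrides made under "Additional settings", so you can query the individual symbols instead of relying on the setlocale defaults alone.
#include <windows.h>
#include <iostream>

int main()
{
    // Query the user's effective decimal separator, including any
    // override made under "Additional settings".
    wchar_t decimalSep[8] = {};
    if (GetLocaleInfoEx(LOCALE_NAME_USER_DEFAULT, LOCALE_SDECIMAL,
                        decimalSep, ARRAYSIZE(decimalSep)) > 0)
    {
        std::wcout << L"Decimal separator: " << decimalSep << L'\n';
    }

    // Same for the digit grouping symbol.
    wchar_t thousandSep[8] = {};
    if (GetLocaleInfoEx(LOCALE_NAME_USER_DEFAULT, LOCALE_STHOUSAND,
                        thousandSep, ARRAYSIZE(thousandSep)) > 0)
    {
        std::wcout << L"Digit grouping symbol: " << thousandSep << L'\n';
    }
}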

QDoubleValidator accepts multiple decimal points

I'm using a QDoubleValidator for my QLineEdit. The application locale (set in QtCreator) is QLocale::German.
Now when I enter a valid double (either using a dot or a comma as decimal separator) writing to the textedit as well as converting the string to a float works perfectly fine. But the validator also lets me write stuff with multiple decimal separators. Strings like 123.567,890 or ,,03.4... get validated but can't get converted into a float.
Is there a way to tell QDoubleValidator to only validate real numbers and not just strings without alphabetical characters?
I basically want to have a validator that only validates strings that can be converted to floats using either the default locale or the German locale.
I have not used QDoubleValidator so far, but I was able to achieve such behaviour by using a QRegExpValidator:
QRegExpValidator* rxv = new QRegExpValidator(QRegExp("[+-]?\\d*[\\.,]?\\d+"), this);
lineedit->setValidator(rxv);
If you only want to convert the content to a float and don't care about locale-specific formats, you can use a QRegExpValidator with the following, more thorough regexp:
ui->lineEdit->setValidator(new QRegExpValidator(QRegExp("[-+]?[0-9]*\\.?[0-9]+([eE][-+]?[0-9]+)?")));
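As a follow-up sketch (assuming the same ui->lineEdit as above; the surrounding names are mine, not from the question): once the text passes the validator, QLocale can do the actual conversion, with the German locale handling the decimal comma and the "C" locale handling the dot.
bool ok = false;
const QString text = ui->lineEdit->text();

// Try the German decimal comma first, then fall back to the "C" locale dot.
double value = QLocale(QLocale::German).toDouble(text, &ok);
if (!ok)
    value = QLocale::c().toDouble(text, &ok);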

Possible to pass UTF-8/UTF-16 options to JVM invoked from C++?

I've got a Windows C++ program where I want to invoke a JVM and be able to pass it an option that might be given from the command line invocation of the C++ program (the command line option might not be plain text, for example "-Dblah=japan日本"). The JavaVMOption struct in jni.h appears to define the option string as chars only, so it looks like I can't just pass it a wide string.
I tried converting it to UTF-8 and storing it as a narrow string on the C++ side, then converting it back on the Java side, but it seems the "日本" gets replaced with literal "??" characters, so those characters are lost in the conversion round trip.
Am I thinking about this incorrectly? Would this not be expected to work?
The invocation api documentation makes it clear:
typedef struct JavaVMOption {
char *optionString; /* the option as a string in the default platform encoding */
void *extraInfo;
} JavaVMOption;
The term "default platform encoding" is unambiguous, that does not mean utf-8 on Windows. It means the encoding used by the default system code page. If your machine is not configured to use a Japanese code page (like 932) then the conversion from the utf-16 string is going to produce question marks for Japanese characters that cannot be converted. This is not normally a problem since a Japanese user will have the correct code page selected. No workaround for having the wrong one.
Ensure you've got the correct system code page selected (Control Panel > Region and Language to change it), and use WideCharToMultiByte() with CP_ACP to make the conversion.
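A hedged sketch of that conversion (the helper name ToPlatformEncoding is mine, not part of any API):
#include <windows.h>
#include <string>

// Convert a wide (UTF-16) option string to the system default code page (CP_ACP).
std::string ToPlatformEncoding(const std::wstring& wide)
{
    int len = WideCharToMultiByte(CP_ACP, 0, wide.c_str(), -1,
                                  nullptr, 0, nullptr, nullptr);
    std::string narrow(len, '\0');
    WideCharToMultiByte(CP_ACP, 0, wide.c_str(), -1,
                        &narrow[0], len, nullptr, nullptr);
    if (!narrow.empty())
        narrow.pop_back();   // drop the terminating NUL that the -1 length included
    return narrow;
}

// Usage sketch:
// std::string opt = ToPlatformEncoding(L"-Dblah=japan\u65E5\u672C");
// JavaVMOption option;
// option.optionString = const_cast<char*>(opt.c_str());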

utf-8 encoding a std::string?

I use a drawing API which takes a const char* to a UTF-8 encoded string. Passing myStdString.c_str() does not work; the API fails to draw the string.
for example:
std::string someText = "■□▢▣▤▥▦▧▨▩▪▫▬▭▮▯▰▱";
will render ???????????????
So how do I get the std::string to act properly?
Thanks
Try using std::codecvt_utf8 to write the string to a stringstream and then pass the result (stringstream::str) to the API.
There are so many variables here it's hard to know where to begin.
First, verify that the API really and truly supports UTF-8 input, and that it doesn't need some special setup or O/S support to do so. In particular make sure it's using a font with full Unicode support.
Your compiler is responsible for converting the string literal in your source file into the bytes of the string. It probably does not produce UTF-8 by default, and may not have an option to do so at all. In that case you may have to declare the string as a std::wstring and convert it to UTF-8 from there. Alternatively, you can look up each character beyond the first 128 and encode it as hex escape values in the string, but that's a hassle and makes for an unreadable source.
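Combining the two answers, here is a hedged sketch of the wstring route using std::codecvt_utf8 (deprecated since C++17 but still available); the example string is the one from the question:
#include <codecvt>
#include <locale>
#include <string>

int main()
{
    // Keep the text as a wide string so the compiler's narrow execution
    // character set cannot mangle it, then convert explicitly to UTF-8.
    std::wstring wide = L"■□▢▣▤▥▦▧▨▩▪▫▬▭▮▯▰▱";

    // On Windows, characters outside the BMP would need codecvt_utf8_utf16 instead.
    std::wstring_convert<std::codecvt_utf8<wchar_t>> converter;
    std::string utf8 = converter.to_bytes(wide);

    // utf8.c_str() is what the drawing API should receive.
}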