Using the code below, the text variable sometimes contains a huge, strange number, something like "1552505576255083400000000000000000000000000000000000000000000000000000.000".
A "0.000" string is expected.
I've also tried it with a basic dialog app, executing these two lines of code in OnInitDialog().
I'm using VS 2013. With VS 2003 it seems to work correctly.
Can somebody tell me why?
CString text;
text.Format(_T("%.3f"), 0);
Your code has a bug. The %f format specifier requires a floating-point argument, but you pass an integer. Format is a variadic function, so the int is not converted to a double automatically; %.3f then reads bytes that were never written as a double, which is why you see those huge garbage values. To fix the bug, change the 0 to 0.0.
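The corrected call looks like this:
CString text;
text.Format(_T("%.3f"), 0.0); // pass a double literal so %.3f reads a valid floating-point argument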
I use MinGW64 to compile C++ programs. But since I upgraded to Windows 10, I've found that my program's Chinese output comes out garbled.
I followed a method I found online, adding a line at the top of the program: SetConsoleOutputCP(65001);, and that fixes it. But I think it's troublesome to do this for every C++ program. What should I do?
I think this is a problem with my system; the same code works fine on Windows 7. I just want to find a more convenient solution instead of adding the same line to every file.
Here's the code:
#include <iostream>
#include <windows.h>
using namespace std;
int main()
{
    SetConsoleOutputCP(65001);
    cout << "文本"; // Output will be garbled code if there's no line above
    return 0;
}
The console in most operating systems only expects ASCII character input by default. In order to show some other character set, you have to specify that in your code. The SetConsoleOutputCP function sets the "code page" Windows should use when rendering output. By the way, not all versions of Windows ship with the same code pages.
Please refer to the documentation found here.
The documentation suggests using EnumSystemCodePages to make sure the code page for that language exists on that system.
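As a rough illustration (my own sketch, not from the documentation), EnumSystemCodePages could be used like this to list the code pages installed on the system and check whether the one you need, such as 65001 for UTF-8, is available:
#include <windows.h>
#include <iostream>

// Callback invoked once per code page; the identifier arrives as a string such as L"65001".
BOOL CALLBACK PrintCodePage(LPWSTR lpCodePageString)
{
    std::wcout << lpCodePageString << L"\n";
    return TRUE; // keep enumerating
}

int main()
{
    // CP_INSTALLED restricts the enumeration to code pages actually installed on this system.
    EnumSystemCodePagesW(PrintCodePage, CP_INSTALLED);
    return 0;
}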
P.S.
Your English is very good :)
EDIT
I tested your code on my computer with Visual Studio 2019 and got the following
Warning C4566 character represented by universal-character-name '\u6587' cannot be represented in the current code page (1255)
even with the SetConsoleOutputCP call you added. I assume you need to have Chinese language support installed for this to work. The problem is that I don't have the relevant code page for the Windows console to look in for the character set; see this answer and this answer.
I have a hard-coded string in my code (which should be used as a file mask), but the compiler always changes the "??-" sequence to "~", for example:
const wchar_t textW[] = L"test-??-??-??.txt";
textW will be "test-~~??.txt" (without quotes).
The same happens for non-unicode strings as well:
const char textA[] = "test-????-??-??.txt";
textA will be "test-??~~??.txt" (without quotes).
My compiler is Microsoft Visual C++ 2008.
I have just tried this with Visual Studio 2013: the string at runtime is correct, and IntelliSense displays the correct value in the tooltip when I'm tracing the app, but... in editing mode (when the app isn't running) IntelliSense shows the incorrect value with tildes in the tooltip.
That's a trigraph, a way to express characters that are not always available on keyboards.
This behavior is controlled by the /Zc:trigraphs option, which is off by default. It appears to be enabled for your project; I would suggest you disable it.
It's called a trigraph; trigraphs are replaced by the preprocessor.
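If you need the literal question marks regardless of compiler settings, a common workaround (my suggestion, not from the answers above) is to escape them with the standard \? escape sequence:
const wchar_t textW[] = L"test-?\?-?\?-?\?.txt"; // "\?" is a plain '?', so the preprocessor never sees a "??-" trigraph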
When I was using Qt up to 4.8 (Qt Quick 1.1) for the GUI, I was able to print the degree symbol with \260, but when things were upgraded to Qt 5 and above this stopped working. I searched the net and found many relevant links, such as http://www.fileformat.info/info/unicode/char/00b0/index.htm, and tried them, but no help. Do I need to include some library for using the UTF format, or is the problem something else? Please, someone help. What should I do?
Revised:
Here is a description of what is being done.
First I store the printable statement in the string text.
As in the C++ function:
sprintf(text, "%02d\260 %03d\260 ",latD, longD);
QString positionText(text.c_str());
return positionText;
And then positionText is used in the QML file to display on the window.
So, could someone please tell me what I need to do to display the degree symbol?
Thanks.
The problem is simple: you used \260, most probably inside an ANSI C string (const char []). In such cases Qt has to use some codec to convert it to Unicode characters. When you changed the Qt version, the default codec changed, and this is why it stopped working.
Anyway, your approach is wrong. You shouldn't use C strings, which are codec dependent (this usually leads to exactly this kind of problem). You can define a QChar constant as QChar(0260), or the best approach is to use tr and provide a translation.
It would be best if you gave a representative example of a string with the degree character; then someone could provide the best solution.
Edit:
I would change your code like this:
const QChar degreeChar(0260); // octal value
return QString("%1%3 %2%3").arg(latD, 2, 10, '0').arg(longD, 3, 10, '0').arg(degreeChar);
or add a translation which will handle this line:
return tr("%1degree %2degree").arg(latD, 2, 10, '0').arg(longD, 3, 10, '0');
Note that the translation for this line has to be provided in every case, no matter what the current locale is.
Try
return QString::fromLatin1(text.c_str());
or, if that doesn't work, another static QString::fromXXX method.
Qt 5 changed Qt's default codec from Latin-1 to UTF-8, as described here:
https://www.macieira.org/blog/2012/05/source-code-must-be-utf-8-and-qstring-wants-it/
Latin-1 and Unicode both use 176 (0xB0 or 0260) as the degree symbol, so your usage of it coincidentally worked, since it was interpreted as Latin-1 and converted to the same value in Unicode.
That first line could be changed to:
sprintf(text, "%02d\302\260 %03d\302\260 ",latD, longD);
As mentioned before, going directly to a QString is indeed better, but if you had to go through a std::string, you could simply substitute the UTF-8 encoding of Unicode 176. Code point 176 is 10110000 in binary: the lower 6 bits (110000) get the prefix 10, giving the second byte 10110000 (\260), and the remaining upper bits (10) get the prefix 110000, giving the first byte 11000010 (\302). This becomes: \302\260.
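A tiny sketch of that arithmetic (just to verify the bytes, not part of the original answer):
unsigned char first  = 0xC0 | (176 >> 6);   // 110xxxxx -> 0xC2 == \302
unsigned char second = 0x80 | (176 & 0x3F); // 10xxxxxx -> 0xB0 == \260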
To easily print angles with degree symbols to the console, try this:
#include <QDebug>
double v = 7.0589;
qDebug().noquote() << "value=" << v << QString(248);
Console output:
value= 7.0589 °
This works out-of-the-box under Windows.
While this question has probably been asked a thousand times before (I'm pretty sure of it; I have read a thousand answers), I still don't get it.
Let's say I have a function that creates a ComboBox like this:
scopeComboSelector=CreateCombobox(hwnd,
GetModuleHandle(0),
CBS_DROPDOWNLIST,
re,
IDCC_DROPDOWNLIST_SCOPE_SELECTOR,
_T("Scopes"));
Where "re" is a positioning rectangle. And IDCC_DROPDOWNLIST_SCOPE_SELECTOR (pretty long name) is the id of the combobox. Now the point is, I can actually fill this "drop down select list" but I have no clue as how I can simply get the currently selected value as a string.
I have seen about 10 ways to do it, which all give errors straight away (need to Convert to LPWSTR -> fixing results in more terror).
Maybe I'm just to used to Java where one can simply say:
textfield.getText();
How would one achieve this in Win32 C++ (Microsoft Visual Studio)?
Edit
Code I've used:
char userName[_MAX_PATH+1];
GetDlgItemTextW(scopeComboSelector,
IDCC_DROPDOWNLIST_SCOPE_SELECTOR,
(LPWSTR)userName,
200);
Returns: userName == empty
Update
Now using GetDlgItemText(). The debugger tells me the value of userName is "".
The documentation has a C-style Windows 9x code example.
You simply need to replace the C with C++ and the silly Windows 9x T macros with wchar_t and friends.
It's always a good idea to read the documentation.
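For illustration, here is a minimal sketch (my own, not from the documentation) of reading the currently selected item's text straight from the combo box HWND, which is roughly the Win32 equivalent of textfield.getText():
#include <windows.h>
#include <string>

// hCombo is assumed to be the HWND returned by CreateCombobox (scopeComboSelector above).
std::wstring GetSelectedComboText(HWND hCombo)
{
    LRESULT sel = SendMessageW(hCombo, CB_GETCURSEL, 0, 0);      // index of the selected item, CB_ERR if none
    if (sel == CB_ERR)
        return L"";

    LRESULT len = SendMessageW(hCombo, CB_GETLBTEXTLEN, sel, 0); // text length, excluding the terminator
    if (len <= 0)
        return L"";

    std::wstring text(static_cast<size_t>(len), L'\0');
    SendMessageW(hCombo, CB_GETLBTEXT, sel, reinterpret_cast<LPARAM>(&text[0])); // copy the item's text into the buffer
    return text;
}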
While trying to convert some existing code to support Unicode characters, this problem popped up. If I try to pass a Unicode character (in this case I'm using the euro symbol) into any of the *wprintf functions, it fails, but seemingly only in Xcode. The same code works fine in Visual Studio, and I was even able to get a friend to test it successfully with GCC on Linux. Here is the offending code:
wchar_t _teststring[10] = L"";
int _iRetVal = swprintf(_teststring, 10, L"A¥€");
wprintf(L"return: %d\n", _iRetVal);
// print values stored in string to check if anything got corrupted
for (int i=0; i<wcslen(_teststring); ++i) {
    wprintf(L"%d: (%d)\n", i, _teststring[i]);
}
In Xcode the call to swprintf will return -1, while in Visual Studio it succeeds and proceeds to print out the correct values for each of the 3 characters (65, 165, 8364).
I have googled long and hard for solutions; one suggestion that has appeared a number of times is using a call such as:
setlocale(LC_CTYPE, "UTF-8");
I have tried various combinations of arguments with this function with no success; upon further investigation, it appears to return NULL if I try to set the locale to any value other than the default "C".
I'm at a loss as to what else I can try to solve this problem, and the fact that it works with other compilers/platforms just makes it all the more frustrating. Any help would be much appreciated!
EDIT:
Just thought I would add that when the swprintf call fails, it sets an error code (92), which is defined as:
#define EILSEQ 92 /* Illegal byte sequence */
It should work if you fetch the locale from the environment:
#include <stdio.h>
#include <wchar.h>
#include <locale.h>
int main(void) {
    setlocale(LC_ALL, "");
    wchar_t _teststring[10] = L"";
    int _iRetVal = swprintf(_teststring, 10, L"A¥€");
    wprintf(L"return: %d\n", _iRetVal);
    // print values stored in string to check if anything got corrupted
    for (int i=0; i<wcslen(_teststring); ++i) {
        wprintf(L"%d: (%d)\n", i, _teststring[i]);
    }
}
On my OS X 10.6, this works as expected with GCC 4.2.1, but when compiled with Clang 1.6, it places the UTF-8 bytes in the result string.
I could also compile this with Xcode (using the standard C++ console application template), but because graphical applications on OS X don't have the required locale environment variables, it doesn't work in Xcode's console. On the other hand, it always works in the Terminal application.
You could also set the locale to en_US.UTF-8 (setlocale(LC_ALL, "en_US.UTF-8")), but that is non-portable. Depending on your goal, there may be better alternatives to swprintf.
If you are using Xcode 4+, make sure you have set an appropriate encoding for the files that contain your strings. You can find the encoding settings in the right pane, under the "Text Settings" group.
Microsoft planned to become compatible with other compilers starting with VS 2015, but in the end it never happened because of problems with legacy code; see the link.
Fortunately, you can still enable the ISO C (C99) standard behavior in VS 2015 by defining the _CRT_STDIO_ISO_WIDE_SPECIFIERS preprocessor macro. It is recommended when writing portable code.
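A minimal sketch of how that could look (typically you would set the macro as a project-wide preprocessor definition rather than in a single file, so every translation unit agrees):
// Defining this before any CRT header opts the wide printf/scanf family into the
// ISO C meaning of %s/%ls instead of the legacy Microsoft behavior.
#define _CRT_STDIO_ISO_WIDE_SPECIFIERS
#include <wchar.h>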
I found that using "%S" (upper case) in the formatting string works.
"%s" is for 8-bit characters, and "%S" is for 16-bit or 32-bit characters.
See: https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/Strings/Articles/formatSpecifiers.html
I'm using Qt Creator 4.11, which uses Clang 10.