I have an Eclipse-based application. I want to set a name for an object in the German locale. In Java it works fine, but in C++, if I hard-code a string containing German characters, those characters are displayed as question marks.
I am using Visual Studio for the C++ side. Am I missing a compiler configuration?
Related
I'm trying to make a program which prints ASCII art in the console using characters such as ⣿, but when run it just prints question marks (?). I understand that it's either because I'm using the wrong encoding or because Microsoft Visual Studio doesn't have these characters available (strictly speaking, ⣿ is not an ASCII character at all, which is part of the confusion).
If you have any idea on how to either change the encoding or fix the issue, it would be much appreciated.
Possible solutions:
Try changing the source file encoding to UTF-8 without signature, or UTF-8 with signature.
Try using a wchar_t (wide) literal, e.g. std::wcout << L"Your String";.
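A minimal sketch of the wide-character route, assuming a Windows console and MSVC; the _setmode call is an extra step that is often needed so wide characters are not squashed through the console's ANSI code page:

#include <fcntl.h>   // _O_U16TEXT
#include <io.h>      // _setmode
#include <cstdio>    // _fileno
#include <iostream>

int main() {
    // Put stdout into UTF-16 mode so wide output reaches the
    // console intact.
    _setmode(_fileno(stdout), _O_U16TEXT);
    // \u28FF is ⣿; the escape sidesteps source-encoding issues.
    std::wcout << L"\u28FF\u28FF\u28FF wide output\n";
    return 0;
}

Note that after switching to _O_U16TEXT you must stick to wide output (wcout/wprintf); mixing in narrow printf/cout will fail. Whether the glyph then appears also depends on the console font containing it.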
Learn more:
how to change source file encoding in csharp project (visual studio / msbuild machine)? (Also applies to C++)
What does the 'L' in front a string mean in C++?
There is not a problem with your code but rather with the console that shows your output. It does not display Unicode characters correctly. For it to show these characters, it needs both to interpret the output as Unicode and to use a font that actually contains those glyphs. To verify this, simply open a cmd window, copy/paste the character into it, and see what happens.
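If the console's code page is the culprit, here is a hedged sketch of the UTF-8 route; the \x escapes force the bytes to be UTF-8 regardless of how the source file is saved:

#include <windows.h>
#include <cstdio>

int main() {
    // Tell the console to interpret output bytes as UTF-8.
    SetConsoleOutputCP(CP_UTF8);
    // UTF-8 encoding of U+28FF (⣿), written out byte by byte.
    std::printf("\xE2\xA3\xBF\n");
    return 0;
}

Even with the right code page, the classic raster console fonts do not contain Braille-pattern glyphs, so a TrueType console font may also be required.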
Gothic II Returning 2.0 / Visual Studio 2015
I'm trying to translate a few lines of an AST.dll file from Russian to Polish. I have Visual Studio 2015 and the project for it. I found the lines with Russian text and translated them into Polish, with Polish characters such as Ł (L with stroke), but in game they show up as strange symbols.
I tried saving this part of the script (the lines I translated), ConstText.cpp, with Unicode (UTF-8) encoding and replacing the original script with it. That changed things a little, in that I got different kinds of garbage symbols, but they still aren't the Polish characters.
1>ConstText.cpp(95): warning C4566: character represented by universal-character-name '\u00F3' cannot be represented in the current code page (1251)
In my opinion I should save ConstText.cpp with a specific encoding, but I don't know what type of encoding.
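The C4566 warning points at the underlying issue: the compiler is converting the translated literals into code page 1251 (Cyrillic), which simply has no Ł or ó. If the game reads its strings as Windows-1250 (Central European), one hedged workaround, assuming that code page really is what the game expects, is to spell the bytes out so no conversion happens at all:

// Hypothetical example: "Łódka" written as raw Windows-1250 bytes.
// From memory, 0xA3 is Ł and 0xF3 is ó in CP-1250; verify against a
// code page table. The literal is split because 'd' would otherwise
// be consumed as an extra hex digit of the \xF3 escape.
const char* sampleText = "\xA3\xF3" "dka"; // Ł ó d k a

Alternatively, Visual Studio 2015 Update 2 added the /source-charset and /execution-charset compiler switches, which make both sides of the conversion explicit instead of relying on the system code page.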
I am trying to display a Unicode character (the Euro sign) on a button using Qt and C++ in Visual Studio 2013. I tried the following code:
_rotateLeftButton->setText("\u20AC");
and
_rotateLeftButton->setText("€");
and
_rotateLeftButton->setText(QString::fromUtf8("\u20AC"));
and
_rotateLeftButton->setText(QString::fromUtf8("€"));
However, all of those lines result in the Euro sign being displayed incorrectly.
All my code files are UTF-8 encoded, except for the moc files (.cxx); for whatever reason the moc executable does not generate them as UTF-8. Still, I was not able to get this Unicode symbol displayed correctly. I also tried setting a font other than the default one, without success. Does anyone know what the problem could be?
Thank you for your help.
QString::fromUtf8("€")
will work if the file really is handled as UTF-8. As #n.m. commented, Visual Studio requires some help from a faux-BOM (the UTF-8 signature) to ensure this.
QString::fromUtf8("\u20AC")
A \u escape in a narrow string literal is converted to the compiler's execution character set, which for MSVC is the ANSI code page rather than UTF-8, so fromUtf8 will misinterpret the bytes. You could instead spell out the UTF-8 encoding with \x byte escapes:
QString::fromUtf8("\xE2\x82\xAC")
Or use a wide string literal:
QString::fromWCharArray(L"\u20AC")
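Putting the reliable variants together, a minimal self-contained sketch (the QApplication scaffolding is illustrative; any QPushButton will do):

#include <QApplication>
#include <QPushButton>

int main(int argc, char* argv[]) {
    QApplication app(argc, argv);
    QPushButton button;
    // Either line avoids any dependency on the source file's
    // narrow execution character set:
    button.setText(QString::fromUtf8("\xE2\x82\xAC"));    // UTF-8 bytes for U+20AC
    // button.setText(QString::fromWCharArray(L"\u20AC")); // wide literal
    button.show();
    return app.exec();
}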
I want to embed non-ASCII Unicode characters directly in string literals and use them in printf. This implies my source code must be saved as UTF-8 or UTF-16. Visual Studio 2010 does support editing and saving C++ source files in either format, but when compiled and executed, the program does not produce the correct Unicode characters. Does the compiler support string literals with embedded Unicode characters?
e.g.
wprintf(L" chinese characters:中文字\n"); the trailing chinese characters cannot be displayed
I don't have a Chinese version of Windows to test with, so this is complete speculation.
The console and file output functions are aware that files are not coded in UTF-16, so they attempt to convert the characters to a code page before output. Just as the default locale is "C" rather than anything based on your system settings, so too the default code page is probably an inappropriate one that does not include Chinese characters.
There is a function SetConsoleOutputCP to change the code page for the console. It is not clear if this function changes the code page used by the actual console window, or if it only affects conversions from Unicode within the program.
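A hedged sketch of that experiment (936 is the Windows code page for Simplified Chinese; whether the console font can render the result is a separate question):

#include <windows.h>
#include <clocale>
#include <cstdio>

int main() {
    // Ask the console window to interpret output as code page 936,
    // and move the CRT off the default "C" locale so wprintf can
    // convert the wide characters instead of dropping them.
    SetConsoleOutputCP(936);
    setlocale(LC_ALL, "");  // picks up the system's ANSI code page
    wprintf(L"chinese characters:\u4E2D\u6587\u5B57\n"); // 中文字
    return 0;
}

On a non-Chinese system the ANSI code page picked up by setlocale still will not contain these characters, which is exactly the speculation above.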
The easy way to test wide literals is to skip the formatting part of printf, and give your string straight to the OS: WriteConsoleW(GetStdHandle(STD_OUTPUT_HANDLE), L" chinese characters:中文字", ....
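Spelled out in full, that call would look something like this (error handling omitted):

#include <windows.h>

int main() {
    const wchar_t text[] = L" chinese characters:\u4E2D\u6587\u5B57\n"; // 中文字
    DWORD written = 0;
    // Hand the UTF-16 data directly to the console, bypassing the
    // CRT's locale-based conversion entirely.
    WriteConsoleW(GetStdHandle(STD_OUTPUT_HANDLE),
                  text,
                  (DWORD)(sizeof(text) / sizeof(text[0]) - 1), // drop the NUL
                  &written,
                  NULL);
    return 0;
}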
It's possible that #pragma setlocale may be what you need.
So I was just going through the basic Windows Programming guide over at MSDN and attempted the D2D1Circle sample in Module 3. The problem I encountered was an error that my VC++ 2008 was throwing:
" 'CreateWindowExA' : cannot convert parameter 2 from 'PCWSTR' to 'LPCSTR'"
So, figuring that I had made a slight error while typing the code in, I downloaded the sample code rar and opened it up, and it threw the exact same error. Any ideas on how I can fix this so it will work? Also, does the fact that I'm programming on an x64 machine have anything to do with why it won't work? I know pointers carry different sizes depending on the machine, and both of the parameters being passed are pointers.
Update @Jollymorphic: In the first few modules, the MSDN tutorial says there really isn't any reason to continue using ASCII, since Unicode covers ASCII and also supports all other languages such as Chinese, Japanese, etc. Wouldn't implementing your solution cause my program to only support ASCII and consequently drop support for East Asian languages?
A PCWSTR is a pointer to wide (16-bit) characters. An LPCSTR is a pointer to regular (8-bit) characters. Your project is probably set to generate code based on the UNICODE character set. If you open the properties for your project in Visual Studio and navigate to the "General" page, you'll see a "Character Set" property. If it is currently set to "Use Unicode character set," you can change it to "Use Multi-Byte character set," and API names like CreateWindowEx will then map to their 8-bit "A" variants, which accept your narrow string literals.
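Given the concern in the update, the opposite fix may be preferable: keep the project on the Unicode character set and make the string literal wide instead, so the program retains support for East Asian text. A minimal sketch (the class name and title are placeholders; the class would be registered elsewhere):

#include <windows.h>

// With UNICODE defined (the "Use Unicode character set" setting),
// CreateWindowEx maps to CreateWindowExW, which expects PCWSTR.
// Wide literals L"..." (or the TEXT() macro) satisfy that.
HWND CreateMainWindow(HINSTANCE instance)
{
    return CreateWindowExW(
        0,                      // extended styles
        L"SampleWindowClass",   // hypothetical, registered via RegisterClassEx
        L"Sample Title",        // window title as a wide literal
        WS_OVERLAPPEDWINDOW,
        CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT, CW_USEDEFAULT,
        NULL, NULL, instance, NULL);
}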