I have a hard-coded string in my code (which should be used as a file mask), but the compiler always changes the "??-" sequence to "~". For example:
const wchar_t textW[] = L"test-??-??-??.txt";
textW will be "test-~~??.txt" (without quotes).
The same happens for non-Unicode strings as well:
const char textA[] = "test-????-??-??.txt";
textA will be "test-??~~??.txt" (without quotes).
My compiler is Microsoft Visual C++ 2008.
I have just tried this with Visual Studio 2013: the string is correct at runtime, and IntelliSense displays the correct value in the tooltip when I'm tracing the app. But in editing mode (when the app isn't running), IntelliSense displays the incorrect value, with tildes, in the tooltip.
That's a trigraph, a way to express characters that are not always available on keyboards.
This behavior is controlled by the /Zc:trigraphs option, which is off by default. It appears it is enabled for your project; I would suggest you disable it.
It's called a trigraph; trigraphs are replaced by the preprocessor.
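If disabling trigraph translation isn't possible, escaping one of the question marks also works, because \? is an ordinary escape sequence for '?' and the backslash prevents the preprocessor from ever seeing a complete "??-" sequence. A minimal sketch using the strings from the question:
// The backslash ensures "??" is never directly followed by a trigraph character such as '-'.
const wchar_t textW[] = L"test-?\?-?\?-?\?.txt";
const char textA[] = "test-??\?\?-?\?-?\?.txt";
At run time both strings contain exactly the characters written, question marks included. Splitting the literal so that the "??" and the following "-" land in adjacent string pieces (for example L"test-??" L"-??" L"-??" L".txt") works as well, because trigraph replacement happens before adjacent literals are concatenated.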
I have a rather complicated problem, but I'll try to break it down to the basic part I can't get my head around.
In the end, I need to get a hash of a user input string, and it must be the same on Mac and Windows. After failing with all kinds of approaches, I started by just hashing (picosha2.h) a simple char array. While strings in the ASCII range hash equally on both platforms, e.g. const char* cc = "ø"; gives different results. Judging from the error produced when changing the declaration to unsigned, Xcode treats "ø" as const char[3] while Visual Studio treats it as const char[2].
Another example is "書", which yields the correct char array size of 4 in Xcode (three UTF-8 bytes plus the terminator) but still only 2 in Visual Studio.
If I look at the actual memory content, I see that Xcode stores c3 b8 for "ø", which matches the UTF-8 code, while Visual Studio shows f8, which would be the value for the Unicode code point for that letter, which confuses me even more (https://www.utf8-chartable.de).
Is there any chance to make Visual Studio interpret (at runtime) strings as UTF-8 and store them the same way as Xcode does?
You could set UTF-8 in Visual Studio.
Properties->C/C++->Command Line->Additional Options->add /utf-8
Also, you could save the file in utf-8 format.
Tools->Customize->Commands->Menu bar: File->Add Command->File->Advanced Save Options
Then, save files
File->Advanced Save Options->select utf-8
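With the file saved as UTF-8 and /utf-8 enabled, the narrow literal should contain the same bytes Xcode produces (c3 b8 for "ø"), so hashing it should give the same result on both platforms. A small sketch to verify the stored bytes (the variable name is taken from the question):
#include <cstdio>
#include <cstring>

int main() {
    const char* cc = "ø";  // expected to be the two bytes c3 b8 under /utf-8
    std::printf("length: %zu\nbytes:", std::strlen(cc));
    for (std::size_t i = 0; i < std::strlen(cc); ++i)
        std::printf(" %02x", (unsigned char)cc[i]);
    std::printf("\n");
    return 0;
}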
I added a breakpoint in Visual Studio 2015, with an action to output a string to the Output window. A line break is automatically added at the end. The problem is that my previous output message (which is not output by a breakpoint) has no line break.
So I want to add a newline character at the beginning of my string, to stop it running into the previous message. I tried adding \n, but the \n is output literally instead of being interpreted as a newline.
How do I add a newline character in a breakpoint's action?
Here are four things for you to try:
You can produce a line break using the debugger expression {"\n",s8b} which makes use of the C++ debugger format specifier s8b (unquoted 8-bit string).
Here's an example with a two-line message: First{"\n",s8b}Second
(Other than that, I am not aware of any other way to include line breaks in the message. While there are ways to enter a multi-line message (by entering line break characters' Unicode code points using the numpad), Visual Studio will just throw away everything but the first text line entered.)
Just before your current breakpoint, add an additional breakpoint with a very short action message (a dot or comma) in order to get an additional line break before your real message.
If you're on Windows (which appears likely, given Visual Studio), you can send a message to the debugger using the Windows API function OutputDebugString. This is the currently suggested solution to the SO question, "How do I print to the debug output window in a Win32 app?" (A short sketch of this and the next option follows below.)
Write a message to clog: std::clog << message << std::endl;.
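For the last two options, a minimal sketch of writing a newline-terminated message from the application itself (assuming a Windows build for OutputDebugString; the message text is just a placeholder):
#include <windows.h>
#include <iostream>

int main() {
    // OutputDebugString goes straight to the debugger's Output window; the
    // trailing \n keeps the breakpoint's message on its own line.
    OutputDebugStringA("previous message\n");

    // std::clog writes to stderr; std::endl appends '\n' and flushes.
    std::clog << "previous message" << std::endl;
    return 0;
}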
In addition to the answer from stakx, which matches the original question for debugging C++ applications, I would like to add a character sequence that instead works for debugging .NET applications:
{"\n",nq}
The C++ sequence would otherwise result in this error message: 's8b' is not a valid format specifier
I am facing a strange problem in VC++6.0.
CString m_strData = "W" + CString(char(165));
m_strData.MakeUpper();
MessageBox(m_strData, "Alert from C++",MB_ICONEXCLAMATION|MB_OK);
If I build the project with Win32 Debug, the alert value is correct. But if I build with Win32 Release MinDependency, the value is different from Win32 Debug. Why? Is there any article that shows me the reason? I have tried setlocale(LC_ALL,"English_United States.1250") in front of the code, but it didn't work.
(screenshot: the value in Win32 Debug)
(screenshot: the value in Win32 Release MinDependency)
I've taken a look at the CString::MakeUpper function. It calls the _tcsupr() function. But the MSDN page only says that this function depends on the locale, not on the project's build mode...?
What char(165) represents depends on the code page you are using. It could be the yen symbol or an N with a ~ above it (Ñ). I assume that the code generated for debugging either uses a different code page or replaces the character with a question mark to tell you that it is a non-printable character.
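As a minimal sketch of the idea: whether byte 165 is even a letter, and what uppercasing does with it, depends entirely on the locale (and hence the code page) in effect, which is why _tcsupr, and therefore MakeUpper, can behave differently between builds that end up with different effective locales. The locale name passed to setlocale is an assumption and depends on the CRT/OS:
#include <cctype>
#include <clocale>
#include <cstdio>

int main() {
    unsigned char ch = 165;

    // In the default "C" locale, bytes above 127 are not letters, so
    // toupper leaves them untouched.
    std::setlocale(LC_ALL, "C");
    std::printf("C locale:    alpha=%d upper=%u\n", std::isalpha(ch) != 0, (unsigned)std::toupper(ch));

    // In a Windows-1252 locale byte 165 is the yen sign (not a letter);
    // in an OEM code page such as 437 it is already an uppercase 'Ñ'.
    if (std::setlocale(LC_ALL, "English_United States.1252"))
        std::printf("1252 locale: alpha=%d upper=%u\n", std::isalpha(ch) != 0, (unsigned)std::toupper(ch));
    return 0;
}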
While this question has probably been asked a thousand times before (I'm pretty sure I have read a thousand answers), I still don't get it.
Let's say I have a function that creates a ComboBox like this:
scopeComboSelector=CreateCombobox(hwnd,
GetModuleHandle(0),
CBS_DROPDOWNLIST,
re,
IDCC_DROPDOWNLIST_SCOPE_SELECTOR,
_T("Scopes"));
Where "re" is a positioning rectangle. And IDCC_DROPDOWNLIST_SCOPE_SELECTOR (pretty long name) is the id of the combobox. Now the point is, I can actually fill this "drop down select list" but I have no clue as how I can simply get the currently selected value as a string.
I have seen about 10 ways to do it, which all give errors straight away (need to Convert to LPWSTR -> fixing results in more terror).
Maybe I'm just to used to Java where one can simply say:
textfield.getText();
How would one achieve this in Win32 C++ (Microsoft Visual Studio)?
Edit
Code I've used:
char userName[_MAX_PATH+1];
GetDlgItemTextW(scopeComboSelector,
IDCC_DROPDOWNLIST_SCOPE_SELECTOR,
(LPWSTR)userName,
200);
Returns: userName == empty
Update
Now using GetDlgItemText(). The debugger tells me the value of userName is ""
The documentation has a C-style Windows 9x code example.
You simply need to replace C with C++ and the silly Windows 9x T macros with wchar_t and friends.
It's always a good idea to read the documentation.
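As a minimal sketch of what that boils down to with the Win32 wide-character API (assuming scopeComboSelector is the combo box HWND returned by the CreateCombobox helper in the question):
#include <windows.h>
#include <string>

std::wstring GetSelectedComboText(HWND combo)
{
    // Index of the currently selected item; CB_ERR if nothing is selected.
    LRESULT index = SendMessageW(combo, CB_GETCURSEL, 0, 0);
    if (index == CB_ERR)
        return L"";

    // Length of that item's text, excluding the terminating null.
    LRESULT length = SendMessageW(combo, CB_GETLBTEXTLEN, (WPARAM)index, 0);
    if (length == CB_ERR || length <= 0)
        return L"";

    std::wstring text((size_t)length + 1, L'\0');
    SendMessageW(combo, CB_GETLBTEXT, (WPARAM)index, (LPARAM)&text[0]);
    text.resize((size_t)length);   // drop the spare terminator slot
    return text;
}

// Usage, e.g. after the combo box has been filled:
//   std::wstring selected = GetSelectedComboText(scopeComboSelector);
That would also explain the empty result in the question's edit: GetDlgItemText takes the parent dialog's handle plus the control ID, not the control's own handle, and casting a char buffer to LPWSTR compounds the problem.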
While trying to convert some existing code to support Unicode characters, this problem popped up. If I try to pass a Unicode character (in this case I'm using the euro symbol) into any of the *wprintf functions, it will fail, but seemingly only in Xcode. The same code works fine in Visual Studio, and I was even able to get a friend to test it successfully with GCC on Linux. Here is the offending code:
wchar_t _teststring[10] = L"";
int _iRetVal = swprintf(_teststring, 10, L"A¥€");
wprintf(L"return: %d\n", _iRetVal);
// print values stored in string to check if anything got corrupted
for (int i=0; i<wcslen(_teststring); ++i) {
wprintf(L"%d: (%d)\n", i, _teststring[i]);
}
In Xcode the call to swprintf will return -1, while in Visual Studio it will succeed and proceed to print out the correct values for each of the 3 characters (65, 165, 8364).
I have googled long and hard for solutions; one suggestion that has appeared a number of times is a call such as:
setlocale(LC_CTYPE, "UTF-8");
I have tried various combinations of arguments with this function with no success; upon further investigation it appears to return null if I try to set the locale to any value other than the default "C".
I'm at a loss as to what else I can try to solve this problem, and the fact that it works with other compilers/platforms just makes it all the more frustrating. Any help would be much appreciated!
EDIT:
Just thought I would add that when the swprintf call fails, it sets an error code (92), which is defined as:
#define EILSEQ 92 /* Illegal byte sequence */
It should work if you fetch the locale from the environment:
#include <stdio.h>
#include <wchar.h>
#include <locale.h>
int main(void) {
setlocale(LC_ALL, "");
wchar_t _teststring[10] = L"";
int _iRetVal = swprintf(_teststring, 10, L"A¥€");
wprintf(L"return: %d\n", _iRetVal);
// print values stored in string to check if anything got corrupted
for (int i=0; i<wcslen(_teststring); ++i) {
wprintf(L"%d: (%d)\n", i, _teststring[i]);
}
}
On my OS X 10.6, this works as expected with GCC 4.2.1, but when compiled with CLang 1.6, it places the UTF-8 bytes in the result string.
I could also compile this with Xcode (using the standard C++ console application template), but because graphical applications on OS X don't have the required locale environment variables, it doesn't work in Xcode's console. On the other hand, it always works in the Terminal application.
You could also set the locale to en_US.UTF-8 (setlocale(LC_ALL, "en_US.UTF-8")), but that is non-portable. Depending on your goal there may be better alternatives to swprintf.
If you are using Xcode 4+, make sure you have set an appropriate encoding for the files that contain your strings. You can find the encoding settings in the right pane, under the "Text Settings" group.
Microsoft had a plan to become compatible with other compilers starting with VS 2015, but in the end it never happened because of problems with legacy code; see link.
Fortunately, you can still enable the ISO C (C99) behavior in VS 2015 by adding the _CRT_STDIO_ISO_WIDE_SPECIFIERS preprocessor macro. This is recommended when writing portable code.
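A minimal sketch of the portable spelling: %ls always means a wchar_t string, both with the legacy Microsoft behavior and with the ISO (C99) behavior, whereas the meaning of %s and %S in the wide printf functions differs between the two:
#include <clocale>
#include <cwchar>

int main() {
    std::setlocale(LC_ALL, "");   // pick up the environment's locale, as in the answer above

    const wchar_t* text = L"A¥€";
    // %ls is unambiguous: it takes a wchar_t* argument under both behaviors,
    // so portable code can use it instead of the ambiguous %s / %S.
    std::wprintf(L"%ls\n", text);
    return 0;
}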
I found that using "%S" (uppercase) in the format string works.
"%s" is for 8-bit characters, and "%S" is for 16-bit or 32-bit characters.
See: https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/Strings/Articles/formatSpecifiers.html
I'm using Qt Creator 4.11, which uses Clang 10.