C2664 error c++ Visual Studio

I am trying to modify an old MFC program. After opening the project in Visual Studio 2013 there are many errors of the type below.
In AviPlay.cpp
#include "stdafx.h"
#include "AviPlay.h"
#define OPEN_AVI_VIDEO "open avivideo"
BOOL initAVI()
{
    return mciSendString(OPEN_AVI_VIDEO, NULL, 0, NULL) == 0;
}
The error thrown is error C2664: 'MCIERROR mciSendStringW(LPCWSTR,LPWSTR,UINT,HWND)' : cannot convert argument 1 from 'const char [14]' to 'LPCWSTR'
Would setting the Strict compiler option to off, or some other compiler option, resolve this error? If not, I can modify the many lines of code manually. In that case, what has changed in the last 15 years that would make code like this OK before but not OK now?
Thank you in advance.

LPCWSTR tells you the function is expecting a wchar_t string, not a char string. By default, the Windows API functions now resolve to their wide-character (Unicode) versions, which take wchar_t strings. You can change this back to char strings in the project properties, on the General page, under Character Set: setting it to 'Use Multi-Byte Character Set' will get the code working as it used to.
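If you would rather keep the project on Unicode and change the code instead, the usual pattern is to wrap the literals in the TEXT() (or _T()) macro so they compile under either character set. A rough sketch of how the snippet from the question might look; the mmsystem.h include and winmm.lib pragma are assumptions about what stdafx.h/AviPlay.h do not already pull in:

#include "stdafx.h"
#include "AviPlay.h"
#include <windows.h>
#include <mmsystem.h>               // declares mciSendString
#pragma comment(lib, "winmm.lib")   // mciSendString lives in winmm

// TEXT() expands to L"open avivideo" under Unicode and to a plain narrow
// literal under MBCS, so the argument matches whichever mciSendString
// (W or A) the build selects.
#define OPEN_AVI_VIDEO TEXT("open avivideo")

BOOL initAVI()
{
    return mciSendString(OPEN_AVI_VIDEO, NULL, 0, NULL) == 0;
}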

Also, if you are creating a new project and run into this issue, set Conformance mode to Default or No.

Related

Difference in storing Unicode strings on OSX (XCode) and Windows (Visual Studio)

I have a rather complicated problem, but I will try to break it down to some basic things I can't get my head around.
In the end, I need to get a hash of a user input string, which must be the same on Mac and Windows. After failing with all kinds of approaches, I started with just hashing (picosha2.h) a simple char. While characters in the ASCII range hash the same on both platforms, e.g. const char* cc = "ø"; gives different results. Guessing from the error when changing the declaration to unsigned, Xcode treats the "ø" as const char[3] while Visual Studio treats it as const char[2].
Another example is "書", which comes out with a correct Unicode char array size of 4 in Xcode, but still only 2 in Visual Studio.
If I look at the actual memory contents, I see that Xcode stores c3 b8 for "ø", which matches the UTF-8 encoding, while Visual Studio shows f8, which would be the Unicode code point value for that letter, and that confuses me even more (https://www.utf8-chartable.de).
Is there any chance to make Visual Studio interpret (at runtime) strings as UTF-8 and store them the same way as Xcode does?
You could enable UTF-8 in Visual Studio:
Properties->C/C++->Command Line->Additional Options->add /utf-8
Also, you could save the file in UTF-8 format. To make the command available:
Tools->Customize->Commands->Menu bar: File->Add Command->File->Advanced Save Options
Then save the files:
File->Advanced Save Options->select UTF-8
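As a quick sanity check, here is a small sketch that prints the bytes of the literal; assuming the source file is saved as UTF-8 and MSVC is given /utf-8, both compilers should show c3 b8 for "ø":

#include <cstdio>
#include <cstring>

int main()
{
    const char* cc = "ø";   // stored as the UTF-8 pair 0xC3 0xB8 when /utf-8 is in effect
    for (size_t i = 0; i < std::strlen(cc); ++i)
        std::printf("%02x ", static_cast<unsigned char>(cc[i]));
    std::printf("\n");      // expected output with matching settings: c3 b8
    return 0;
}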

C++ multiline CString literal in VS2013 Unicode

A multiline CString literal, 'str1', accepted without a wink in VS2012 (with MBCS), is now refused at build time after upgrading to VS2013 (with Unicode, to alleviate tons of errors from the newly deprecated MBCS, even after installing its add-on), with the output message:
error C2308: concatenating mismatched strings
as in the following example (A):
str1 = _T(" HELP - available commands \n\n\n"
"F1 : the present help message \n\n");
The first line is reported 'wide' and the second 'narrow'.
I then tried (B) adding single-line CString literals:
str1 = _T(" HELP - available commands \n\n\n")
+ _T("F1 : the present help message \n\n");
but the IDE already complains with
Error: expression must have integral or unscoped enum type
and the builder with
error C2110: '+' : cannot add two pointers
It does indeed work if I build (C) the CString str1 from single-line literals one by one:
str1 = _T(" HELP - available commands \n\n\n");
str1 += _T("F1 : the present help message \n\n");
but I would like to understand why (A) and (B) do not work here as expected, and as they did until now. There are several such problems in this (large) program, but in most other similar instances it works just fine.
Is it due to changes in VS2013, or (and?) the switch from MBCS to Unicode? Are there special characters I overlooked in these strings? And then, how can I fix these problems?
Thanks in advance for your responses.
It should be:
str1 = _T(" HELP - available commands \n\n\n") // no semicolon here
_T("F1 : the present help message \n\n");
The reason it worked before is that _T() is a no-op for MBCS but expands to a wide-string prefix for Unicode. In C/C++, string literals are concatenated simply by placing them adjacent to each other, separated only by whitespace, but they need the same prefix, at least with MSVC.
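To make the mismatch concrete, this is roughly what the preprocessor produces in the two builds (a sketch, not actual compiler output):

// MBCS build: _T(x) expands to x, so example (A) is narrow + narrow -- OK.
str1 = " HELP - available commands \n\n\n"
       "F1 : the present help message \n\n";

// Unicode build: _T(x) expands to L##x, so only the first literal gets the
// L prefix and example (A) becomes wide + narrow -- C2308 in VS2013.
str1 = L" HELP - available commands \n\n\n"
       "F1 : the present help message \n\n";

// Wrapping every adjacent literal keeps the prefixes consistent in both builds.
str1 = _T(" HELP - available commands \n\n\n")
       _T("F1 : the present help message \n\n");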

Why MSVC compiler converts "??-" sequence to "~" in string literals?

I have a hard-coded string in my code (which should be used as a file mask), but the compiler always changes the "??-" sequence to "~", for example:
const wchar_t textW[] = L"test-??-??-??.txt";
textW will be "test-~~??.txt" (without quotes).
The same happens for non-unicode strings as well:
const char textA[] = "test-????-??-??.txt";
textA will be "test-??~~??.txt" (without quotes).
My compiler is Microsoft Visual C++ 2008.
I have just tried this with Visual Studio 2013; the string at runtime is correct, and IntelliSense displays the correct value in the tooltip when I'm tracing the app, but... in editing mode (when the app isn't running) IntelliSense displays the incorrect value with tildes in the tooltip.
That's a trigraph, a way to express characters that are not always available on keyboards.
This behavior is controlled by the /Zc:trigraphs option, which is off by default. It appears to be enabled for your project; I would suggest you disable it.
It's called a trigraph. They are replaced by the preprocessor.
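If you cannot, or would rather not, rely on the compiler option, a common workaround is to keep the preprocessor from ever seeing "??-" by escaping a question mark or splitting the literal; a sketch based on the strings from the question:

// \? is a valid escape that still yields '?', so no "??-" appears in the source.
const wchar_t textW[] = L"test-?\?-?\?-?\?.txt";       // stays "test-??-??-??.txt"

// Splitting the literal also works: trigraph replacement runs on each piece
// before adjacent literals are concatenated.
const char textA[] = "test-??" "??" "-??" "-??.txt";   // stays "test-????-??-??.txt"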

const char* to LPCWSTR in Qt

I have faced a problem in Qt with error: cannot convert parameter 1 from 'const char *' to 'LPCWSTR'
when calling: OutputDebugString( "wtf!" );
In simple C++ projects I've always set "Character Set" to "Not Set", but this time it doesn't work; the error keeps appearing. I tried other options, like "Use Multi-Byte Character Set", but still no effect. What's going on?
Thanks.
I found out that when I use qDebug instead of OutputDebugString it works.
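Another way out, independent of the project's character-set setting, is to match the argument to the variant you call; a small sketch (the function name is just for illustration):

#include <windows.h>

void logSomething()
{
    OutputDebugStringA("wtf!");        // ANSI variant, takes const char*
    OutputDebugStringW(L"wtf!");       // wide variant, takes const wchar_t*
    OutputDebugString(TEXT("wtf!"));   // TEXT() follows the current character-set setting
}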

swprintf fails with unicode characters in xcode, but works in visual studio

While trying to convert some existing code to support Unicode characters, this problem popped up. If I try to pass a Unicode character (in this case I'm using the euro symbol) into any of the *wprintf functions, it fails, but seemingly only in Xcode. The same code works fine in Visual Studio, and I was even able to get a friend to test it successfully with GCC on Linux. Here is the offending code:
wchar_t _teststring[10] = L"";
int _iRetVal = swprintf(_teststring, 10, L"A¥€");
wprintf(L"return: %d\n", _iRetVal);
// print values stored in string to check if anything got corrupted
for (int i=0; i<wcslen(_teststring); ++i) {
wprintf(L"%d: (%d)\n", i, _teststring[i]);
}
In Xcode the call to swprintf will return -1, while in Visual Studio it will succeed and proceed to print out the correct values for each of the 3 chars (65, 165, 8364).
I have googled long and hard for solutions; one suggestion that has appeared a number of times is using a call such as:
setlocale(LC_CTYPE, "UTF-8");
I have tried various combinations of arguments with this function with no success; upon further investigation it appears to return NULL if I try to set the locale to any value other than the default "C".
I'm at a loss as to what else I can try to solve this problem, and the fact that it works with other compilers/platforms just makes it all the more frustrating. Any help would be much appreciated!
EDIT:
Just thought i would add that when the swprintf call fails it sets an error code (92) which is defined as:
#define EILSEQ 92 /* Illegal byte sequence */
It should work if you fetch the locale from the environment:
#include <stdio.h>
#include <wchar.h>
#include <locale.h>
int main(void) {
    setlocale(LC_ALL, "");
    wchar_t _teststring[10] = L"";
    int _iRetVal = swprintf(_teststring, 10, L"A¥€");
    wprintf(L"return: %d\n", _iRetVal);
    // print values stored in string to check if anything got corrupted
    for (int i = 0; i < wcslen(_teststring); ++i) {
        wprintf(L"%d: (%d)\n", i, _teststring[i]);
    }
}
On my OS X 10.6, this works as expected with GCC 4.2.1, but when compiled with Clang 1.6, it places the UTF-8 bytes in the result string.
I could also compile this with Xcode (using the standard C++ console application template), but because graphical applications on OS X don't have the required locale environment variables, it doesn't work in Xcode's console. On the other hand, it always works in the Terminal application.
You could also set the locale to en_US.UTF-8 (setlocale(LC_ALL, "en_US.UTF-8")), but that is non-portable. Depending on your goal, there may be better alternatives to swprintf.
If you are using Xcode 4+, make sure you have set an appropriate encoding for the files that contain your strings. You can find the encoding settings in the right pane, under the "Text Settings" group.
Microsoft planned to become compatible with other compilers starting with VS 2015, but in the end it never happened because of problems with legacy code (see link).
Fortunately, you can still enable the ISO C (C99) standard behavior in VS 2015 by defining the _CRT_STDIO_ISO_WIDE_SPECIFIERS preprocessor macro. It is recommended when writing portable code.
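For illustration, a sketch of what that looks like, assuming VS 2015 or later and that the macro is defined before any CRT header (or added to the project's Preprocessor Definitions):

#define _CRT_STDIO_ISO_WIDE_SPECIFIERS   // must come before the CRT headers
#include <wchar.h>

int main(void)
{
    wchar_t buf[32];
    // With the ISO behaviour enabled, %ls means a wide (wchar_t*) string and
    // %s means a narrow (char*) string, in wide and narrow functions alike --
    // the same meaning GCC and Clang use.
    swprintf(buf, 32, L"%ls / %s", L"wide", "narrow");
    wprintf(L"%ls\n", buf);
    return 0;
}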
I found that using "%S" (upper case) in the formatting string works.
"%s" is for 8-bit characters, and "%S" is for 16-bit or 32-bit characters.
See: https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/Strings/Articles/formatSpecifiers.html
I'm using Qt Creator 4.11, which uses Clang 10.
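A short sketch of the difference, assuming a toolchain (such as Apple's or glibc) where %S in a wide format string is the non-standard spelling of %ls:

#include <wchar.h>

int main(void)
{
    wchar_t buf[64];
    // %S takes a wchar_t* here (equivalent to %ls); %s takes a narrow char*.
    swprintf(buf, 64, L"%S and %s", L"a wide string", "a narrow string");
    wprintf(L"%S\n", buf);
    return 0;
}

The portable spelling of the wide form is %ls; %S is an extension, so it is worth checking the documentation of each target platform.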