Error: Cannot convert char to wchar_t* - C++

I'm trying to get the active window title using the GetForegroundWindow and GetWindowText functions. This is my code:
HWND hwnd = GetForegroundWindow();
char wname[255];
GetWindowText(hwnd,wname,255);
Every time I try to build the project I get this error message: "Error: Cannot convert char to wchar_t*".
I'm using C++Builder XE7.
So, what's wrong?

You are calling the TCHAR version of GetWindowText(). In your Project Options, you have the "TCHAR maps to" option set to wchar_t, so GetWindowText() maps to GetWindowTextW(), which takes a wchar_t* parameter. That is why you cannot pass in a char[] buffer.
So, you need to either:
Change "TCHAR maps to" to char so that GetWindowText() maps to GetWindowTextA() instead (also similarly affects every other TCHAR-based API function call in your code. Use this approach only when migrating legacy pre-Unicode code to C++Builder 2009+).
Change your code to use TCHAR instead:
TCHAR wname[255];
GetWindowText(hwnd,wname,255);
Change your code to use the ANSI or Unicode version of GetWindowText() directly:
char wname[255];
GetWindowTextA(hwnd,wname,255);
wchar_t wname[255];
GetWindowTextW(hwnd,wname,255);
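For completeness, here is the third option as a compilable console sketch (the std::wcout output is just for demonstration; any wchar_t-aware sink will do):
#include <windows.h>
#include <iostream>

int main()
{
    HWND hwnd = GetForegroundWindow();
    wchar_t wname[255];
    // GetWindowTextW always takes wchar_t*, regardless of the "TCHAR maps to" setting.
    if (GetWindowTextW(hwnd, wname, 255) > 0)
        std::wcout << wname << L'\n';
    return 0;
}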

You're building your application in Unicode-aware mode; a char is not large enough to hold a UTF-16 character. The type system is saving you from a lot of potential headache here by catching this for you. Either change to ASCII mode (easy but bad solution), switch to using wide strings everywhere (annoying solution), or use the provided macros to choose at compile time based on build parameters (even more annoying but most correct solution).
This is what the code snippet looks like with the wide-string and TCHAR solutions implemented, respectively:
HWND hwnd = GetForegroundWindow();
wchar_t wname[255];
GetWindowTextW(hwnd, wname, 255);
HWND hwnd = GetForegroundWindow();
TCHAR wname[255];
GetWindowText(hwnd, wname, 255);
If you choose to build a Unicode-aware application (which you should), you must also remember to use wmain or _tmain as applicable, rather than plain old boring main. Because Windows.
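For reference, a sketch of the TCHAR-flavored entry point (console apps only; _tmain comes from <tchar.h> and expands to wmain in a Unicode build and to main otherwise):
#include <tchar.h>

int _tmain(int argc, _TCHAR* argv[])
{
    // argv strings use the build's character type (wchar_t or char).
    return 0;
}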

Related

Understanding Multibyte/Unicode

I'm just getting back into programming C++, MFC, and Unicode. A lot has changed over the past 20 years.
Code on another project compiled just fine, but had errors when I pasted it into my code. It took me 1-1/2 days of wasted time to solve the function call below:
CString CFileOperation::ChangeFileName(CString sFileName)
{
    char drive[MAX_PATH], dir[MAX_PATH], name[MAX_PATH], ext[MAX_PATH];
    _splitpath_s(sFileName, drive, dir, name, ext); //error
    // ------- other code
}
After reading help, I changed the CString sFileName to use a cast:
_splitpath_s((LPCTSTR)sFileName, drive, dir, name, ext); //error
This created an error too. So then I used GetBuffer() which is really the same as above.
char* s = sFileName.GetBuffer(300);
_splitpath_s(s, drive, dir, name, ext); //same error for the 3rd time
sFileName.ReleaseBuffer();
At this point I was pretty upset, but finally realized that I needed to change the CString to ASCII (I think because I'm set up as Unicode).
Hence:
CT2A strAscii(sFileName); //convert CString to ASCII, for _splitpath_s()
then use strAscii.m_psz in the function _splitpath_s().
This finally worked. So after all this, to make a long story short, I need help focusing on:
1. Unicode vs Multi-Byte (library calls)
2. Variable types to use
I'm willing to purchase another book, please recommend one.
Also, is there a way to filter the help in VS2015 so that when I'm on a variable and press F1, it only gives me help for Unicode and for ways to convert old code or Multi-Byte code to Unicode?
Hope this is not too confusing, but I have some catching up to do. Be patient if my verbiage is not perfect.
Thanks in advance.
The documentation of _splitpath lists a Unicode (wchar_t based) version, _wsplitpath. That's the one you should be using (see the sketch after the checklist below). Don't convert to ASCII or Windows ANSI; that will in general lose information and not produce a valid path when you recombine the pieces.
Modern Windows programming is Unicode based.
A Visual Studio C++ project is Unicode-based by default; in particular, it defines the macro symbol UNICODE, which affects the declarations from <windows.h>.
All supported versions of Windows use Unicode internally throughout, and your application should, too. Windows uses UTF-16 encoding.
To make your application Unicode-enabled you need to perform the following steps:
Set your project's Character Set to "Use Unicode Character Set" (if it's currently set to "Use Multi-Byte Character Set"). This is not strictly required, but it deals with the cases where you aren't using the Unicode version explicitly.
Use wchar_t (in place of char or TCHAR) for your strings.
Use wide character string literals (L"..." in place of "...").
Use CStringW (in place of CStringA or CString) in an MFC project.
Explicitly call the Unicode version of the CRT (e.g. wcslen in place of strlen or _tcslen).
Explicitly call the Unicode version of any Windows API call where it exists (e.g. CreateWindowExW in place of CreateWindowExA or CreateWindowEx).
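Applied to the function from the question, a minimal sketch, assuming a Unicode build where CString is wchar_t based (using the secure _wsplitpath_s to match the original call; the C++ template overloads of the secure CRT deduce the buffer sizes from the arrays):
CString CFileOperation::ChangeFileName(CString sFileName)
{
    wchar_t drive[MAX_PATH], dir[MAX_PATH], name[MAX_PATH], ext[MAX_PATH];
    // In a Unicode build, CString converts implicitly to const wchar_t*.
    _wsplitpath_s(sFileName, drive, dir, name, ext);
    // ------- other code
}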
Try using _tsplitpath_s and TCHAR.
So the final code looks something like:
CString CFileOperation::ChangeFileName(CString sFileName)
{
TCHAR drive[MAX_PATH], dir[MAX_PATH], name[MAX_PATH], ext[MAX_PATH];
_tsplitpath_s(sFileName, drive, dir, name, ext);
------- other code
}
This enables the C++ compiler to use the correct character width at build time, depending on the project settings. (The four-buffer call compiles because the secure CRT provides C++ template overloads that deduce each array's size; the underlying C function takes an explicit size after every buffer.)

VS2010 MFC code doesn't work anymore in VS2013, any suggestions?

How should I change this VS2010 code to work in VS2013?
This is part of an MFC application. I have two Edit Controls with CString variables m_Name and m_Age. There is also a print button that, when clicked, shows this information in a message box.
void CMyProgramDlg::OnBUTTON_PRINT()
{
    UpdateData(TRUE);
    char szText[100];
    sprintf(szText, "Name: %s\n"
                    "Age: %d",
            m_Name, m_Age);
    MessageBox(szText, m_Name + "Message", NULL);
}
The problem is MessageBox() doesn't take char* any more, so I converted to CString. But the new problem is the printed message only shows the first letter of the name and age: if I put in 'Jack' for the name and '40' for the age, it only shows 'J' and '4'.
The new project apparently compiles in Unicode mode, so TCHAR is wchar_t, and all WinAPI functions accept wchar_t instead of char (or pointers to them).
More precisely: The macro MessageBox that expanded to MessageBoxA in the old project now expands to MessageBoxW, and where MessageBoxA accepted pointers to char, MessageBoxW expects pointers to wchar_t. This mechanism exists for all WinAPI "functions" that accept strings.
Go into the project properties and set "Character Set" under Configuration Properties -> General to "Not Set" or "Use Multi-Byte Character Set" instead of Unicode; then it should behave as before. Alternatively, use MessageBoxA instead of MessageBox to invoke the ANSI version explicitly, or change the code so that it uses TCHAR everywhere.
See https://msdn.microsoft.com/en-us/library/c426s321.aspx for details.
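If you keep the project as Unicode and convert the code instead, here is a TCHAR-flavored sketch of the handler (assuming m_Age is an int bound via DDX; _stprintf_s from <tchar.h> maps to the matching CRT flavor):
void CMyProgramDlg::OnBUTTON_PRINT()
{
    UpdateData(TRUE);
    TCHAR szText[100];
    // Explicit LPCTSTR casts keep the CString object out of the varargs.
    _stprintf_s(szText, 100, _T("Name: %s\nAge: %d"),
                (LPCTSTR)m_Name, m_Age);
    MessageBox(szText, m_Name + _T(" Message"), MB_OK);
}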
VS 2013 no longer installs MFC's MBCS libraries by default (they are a separate, deprecated download), so the cleanest fix is to convert your code to Unicode:
UpdateData(TRUE);
CString szText;
szText.Format(_T("Name: %s\nAge: %d"), m_Name, m_Age);
MessageBox(szText, m_Name + _T("Message"), NULL);
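One caveat: the question describes m_Age as a CString bound to an edit control; if it really is a CString rather than an int, %d is the wrong specifier. A hedged variant assuming both members are CStrings:
CString szText;
// Explicit LPCTSTR casts are the traditionally safe way to pass CString through varargs.
szText.Format(_T("Name: %s\nAge: %s"), (LPCTSTR)m_Name, (LPCTSTR)m_Age);
MessageBox(szText, m_Name + _T("Message"), NULL);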

Set Unicode text on MFC form controls in Multi-Byte Char Set application

I have a Multi-Byte Char Set MFC Windows application. Now I need to display international single-byte ASCII characters on Windows controls. I can't use the ASCII characters directly, because displaying them correctly requires the Windows locale to be set to the corresponding country, and I need the characters to display under every Windows locale. For this purpose I must convert ASCII to Unicode. I can display the required international characters with MessageBoxW, but how do I display them on Windows MFC controls using SetWindowText?
To show a Unicode string in MessageBoxW, I construct it in a wstring:
WORD g [] = {0x105,0x106,0x107,0x108,0x109,0x110,0x111,0x112,0x113,0x114,0x115,0x116,0x117,0x118,0x119,0x120};
wstring gg (reinterpret_cast<wchar_t*>(g),15);
MessageBoxW(NULL, gg.c_str() , gg.c_str() , MB_ICONEXCLAMATION | MB_OK);
Setting the MFC form control text:
class MyFrm : public CDialogEx
{
    virtual BOOL OnInitDialog();
};
...
BOOL MyFrm::OnInitDialog()
{
    GetDlgItem(IDC_EDIT_TICKET_NUMBER)->SetWindowText( ???);
}
Is it possible to somehow convert the wstring gg to a CString and show Unicode chars on a window control?
You could try casting your CDialogEx 'this' object to HWND and then explicitly calling the Win32 API to set the text using wide chars. Your code would look something like this:
BOOL MyFrm::OnInitDialog()
{
    SetDlgItemTextW((HWND)(*this), IDC_EDIT_TICKET_NUMBER, gg.c_str());
}
But as I mentioned earlier, Unicode has been supported since Windows XP, and using ASCII is really not a good idea unless you're targeting those very, very old OSes before it. Using ANSI strings nowadays causes ALL strings you pass to be converted to Unicode by the Win32 API first. So it is a better idea to switch your project entirely to Unicode.
First, note that you can simply directly initialize a std::wstring with your Unicode hex character data, without any ugly useless reinterpret_cast<wchar_t*>, etc.
Instead of this:
WORD g [] = {0x105,0x106,0x107,0x108,...,0x120};
wstring gg (reinterpret_cast<wchar_t*>(g),15);
just write:
wstring text = L"\x0105\x0106\x0107\x0108...\x0120";
The latter seems much cleaner to me.
Second, if you want to pass an instance of std::wstring to an MFC method that expects a const wchar_t* input string pointer, just use the wstring::c_str() method.
In addition, the best suggestion I can give you is to just port your app to Unicode.
ASCII/MBCS should be considered a programming model of the past for MFC; it brings lots of problems when you want to write "international" code.
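To directly answer the conversion question: a minimal sketch, assuming ATL/MFC headers are available. Plain CString is char based in an MBCS project, but the explicit CStringW works regardless of the project's character set (hWndDlg below stands in for your dialog's window handle):
#include <atlstr.h>   // CStringW
#include <string>

std::wstring gg = L"\x0105\x0106\x0107";   // sample Unicode text
CStringW cs(gg.c_str());                   // wide CString, even in an MBCS build
// Pass it to an explicitly wide API, e.g.:
// ::SetDlgItemTextW(hWndDlg, IDC_EDIT_TICKET_NUMBER, cs);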

GetLastError returns error 2 in SystemParametersInfo

#include <iostream>
#include <windows.h>
using namespace std;
int main(){
    LPWSTR test = L"c:/aizen.png";
    int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, test, SPIF_UPDATEINIFILE);
    if (result)
        cout << "Wallpaper set!";
    else
        cout << "Error: " << GetLastError();
    cin >> result;
    return 0;
}
The code is simply meant to change the desktop wallpaper, but I keep getting Error: 2, which means "file not found". However, the file is there! I'm using Microsoft Visual Studio 2010, and I've tried running as admin, changing case, changing slashes, etc. What am I doing wrong?
Error 2 is File not found.
First, make sure that aizen.png is actually located in the root folder of drive C:\ (which on Vista and above is not likely, given that non-admin users don't typically have write access there).
If the file is indeed there, the problem is most likely that you're not properly escaping backslashes:
LPWSTR test = L"c:\\aizen.png";
The problem is you are passing a UNICODE string - LPWSTR - to an API that takes ANSI.
Nearly all Win32 APIs (all that take strings, at any rate) come in two versions: one that ends in ...A for ANSI (8-bit characters), and one that ends in ...W for wide-char, aka UNICODE (technically not 'real' Unicode, but that's more than is worth getting into in this reply).
If you have UNICODE #defined at compilation time, then the plain unadorned version gets #defined as the ...W version; otherwise it gets #defined as the ...A version. Take a look at winuser.h, and you'll see:
WINUSERAPI
BOOL
WINAPI
SystemParametersInfoA(
    __in UINT uiAction,
    __in UINT uiParam,
    __inout_opt PVOID pvParam,
    __in UINT fWinIni);
WINUSERAPI
BOOL
WINAPI
SystemParametersInfoW(
    __in UINT uiAction,
    __in UINT uiParam,
    __inout_opt PVOID pvParam,
    __in UINT fWinIni);
#ifdef UNICODE
#define SystemParametersInfo SystemParametersInfoW
#else
#define SystemParametersInfo SystemParametersInfoA
#endif // !UNICODE
Note that Windows has two SystemParametersInfo functions; the W one expects wide LPWSTRs and the A one expects plain LPSTRs, and whether you have UNICODE defined selects which one is the 'default'. (You can always add the A or W manually to call either explicitly.)
What's likely happening in your original code is that because you do not have UNICODE defined, you end up using the ...A version, which expects an ANSI string, but you're passing in a UNICODE string - so it doesn't work.
The "bit of a change" you made to get it working is more than just a bit: you're now passing an ANSI string to the ...A version of the API so it works fine:
int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (void*)"c:/aizen.jpg", SPIF_UPDATEINIFILE);
Alternatively, you could call the W version explicitly with a LPWSTR:
int result = SystemParametersInfoW(SPI_SETDESKWALLPAPER, 0, L"c:\\aizen.jpg", SPIF_UPDATEINIFILE);
Or, you could define UNICODE at the start of your app, use L"..." strings, and the plain versions of the APIs - just add #define UNICODE at the top of your original app before #include <windows.h>. (UNICODE is more usually defined in a makefile or in compiler settings instead of being defined explicitly in code, so if you're new to Win32 programming, it can come as something of a surprise.)
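For reference, a minimal sketch of the corrected program along those lines (assuming the image really is at C:\aizen.png; calling the W version explicitly so no #define is needed):
#include <windows.h>
#include <iostream>

int main()
{
    const wchar_t* path = L"C:\\aizen.png";
    // The W version always takes wide strings, so the types match.
    BOOL ok = SystemParametersInfoW(SPI_SETDESKWALLPAPER, 0,
                                    (PVOID)path, SPIF_UPDATEINIFILE);
    if (ok)
        std::cout << "Wallpaper set!\n";
    else
        std::cout << "Error: " << GetLastError() << '\n';
    return 0;
}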
Note that LPWSTR is not deprecated; if anything, it's the opposite; typical Win32 practice since XP or so has been to use W-flavor strings across the board, so it's effectively the plain "..." strings that are considered 'deprecated' on Win32. (For example, many COM APIs use only wide strings.)
Most other functions have some protection against this; if you accidentally try to pass an ANSI string to, say, SetWindowTextW, you'll get a compile-time error, because the LPSTR you are passing doesn't match the expected LPWSTR type. But SystemParametersInfo is tricky: it takes a void* for the data parameter, so it will accept [almost] anything at compile time, and it's only when you call the function at runtime that you'll hit the error.
--
This, by the way, is what David Heffernan pointed out in the answer to your question the first time you posted:
Some possible causes spring to mind:
...
You have an ANSI/Unicode encoding mismatch.
It's very weird; it looks like if you compile with the MinGW C++ compiler, it actually compiles with a bit of a change:
int main(){
    int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (void*)"c:/aizen.jpg", SPIF_UPDATEINIFILE);
    // ...
}
This would work, and apparently LPWSTR is deprecated...
Visual Studio was probably having privilege issues. Run it as Admin and try again.

How to view the value of a Unicode CString in VC6?

I'm using Visual Studio 6 to debug a C++ application. The application is compiled for Unicode string support, and the CString type is used for manipulating strings. When I am using the debugger, the watch window displays the first character of the string, but not the full string. I tried using XDebug, but this tool does not handle Unicode strings properly. As a workaround, I can create a custom watch for each character of the string by indexing into the private array the CString maintains, but this is very tedious.
How can I view the full, Unicode value of a CString in the VC6 debugger?
Go to Tools -> Options -> Debug, and check the "Display unicode string" check-box. That will probably fix the problem. Two other options:
In the watch window, if you have a Unicode string variable named szText, add it to the watch as szText,su. This tells the debugger to interpret it as a Unicode string (see "Symbols for Watch Variables" for more of these format specifiers).
If worst comes to worst, you can have a global ANSI string buffer, and a global function that takes a Unicode CString and stores its content as ANSI in that global variable. Then, when needed, call that function with the string whose content you'd like to see in the watch window, and watch the ANSI buffer.
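A minimal sketch of that last workaround (the names g_dbgBuf and DbgDump are illustrative, not from the original answer):
char g_dbgBuf[1024];   // global ANSI buffer to watch in the debugger

void DbgDump(const CString& s)
{
    // Convert the UTF-16 CString to the system ANSI code page.
    WideCharToMultiByte(CP_ACP, 0, s, -1,
                        g_dbgBuf, sizeof(g_dbgBuf), NULL, NULL);
}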
But the "Display unicode string" thing is probably the problem...