Issue with RegQueryValueEx - C++

I am using RegQueryValueEx to read a string value (REG_SZ) from the registry.
The value of this registry key contains some Japanese characters along with English.
For example: C:\Program Files\MyReg\チチチ\helloworld
I am using the following code snippet:
BYTE* dwValue = 0;
DWORD dwSize = 0;
DWORD dwType = REG_SZ;
TCHAR* valueName = TEXT("test");
// First call with a NULL buffer to get the required size in bytes
if (RegQueryValueEx(hKey, valueName, NULL, &dwType, NULL, &dwSize) == ERROR_SUCCESS)
{
    dwValue = new BYTE[dwSize];
    if (RegQueryValueEx(hKey, valueName, NULL, &dwType, dwValue, &dwSize) == ERROR_SUCCESS)
        _tprintf(TEXT("The value is %s"), (TCHAR*)dwValue);
    delete[] dwValue;
}
The output I get is:
The value is C:\Program Files\MyReg\
This exe is a console application and has the UNICODE preprocessor symbol defined.
If I remove it, then it works and gives the correct string.
I am not sure what's going wrong due to UNICODE.
-Thanks

This is what fixed my problem, but it still confuses me.
My project was based on the VS 2003 IDE. If I compile the code in VS 2003 with UNICODE/_UNICODE defined, it fails to give me the complete string (it drops everything starting from the first double-byte character).
If I build with the VS 2003 IDE WITHOUT the _UNICODE/UNICODE flags, it WORKS CORRECTLY and gives me the complete string; then, depending on my code page, I may see the actual characters or a question mark (?) for each double-byte character.
If I compile the same code with VS 2005, I get the correct and complete string (from the registry) regardless of whether or not _UNICODE/UNICODE is defined.
However, the characters then can't be displayed on either a Japanese-OS console or an English-OS console. MessageBox(), though, displays them correctly.
To display the characters correctly on a Japanese-OS console I am forced to compile without the _UNICODE/UNICODE flags (which I am not able to understand).
There is no change I have to make in my code at all; just compiling it with VS 2005 fixes the problem (apart from the display problem).
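For anyone who lands here later, here is a minimal sketch (mine, not from the original post) that sidesteps TCHAR entirely by calling the wide-char API explicitly, so the data comes back as UTF-16 regardless of how the project is compiled. It assumes hKey is already open; the _setmode call is one common way to get UTF-16 text onto the Windows console:
#include <windows.h>
#include <fcntl.h>
#include <io.h>
#include <stdio.h>
#include <vector>

// Read a REG_SZ value as UTF-16, independent of the UNICODE macro.
void PrintRegValue(HKEY hKey)
{
    DWORD dwType = 0, dwSize = 0;
    if (RegQueryValueExW(hKey, L"test", NULL, &dwType, NULL, &dwSize) != ERROR_SUCCESS)
        return;
    // dwSize is in bytes; reserve one extra wchar_t because the registry
    // does not guarantee that the stored string is null-terminated.
    std::vector<wchar_t> buf(dwSize / sizeof(wchar_t) + 1, L'\0');
    if (RegQueryValueExW(hKey, L"test", NULL, &dwType,
                         reinterpret_cast<BYTE*>(buf.data()), &dwSize) == ERROR_SUCCESS)
    {
        _setmode(_fileno(stdout), _O_U16TEXT); // switch stdout to UTF-16 mode
        wprintf(L"The value is %s\n", buf.data());
    }
}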

Related

Robust ways to get the App Data Folder Path for a Non-English Windows User

I was using the following code to get the App Data folder path for my C++ application:
char* actFilePath = NULL;
TCHAR szPath[MAX_PATH];
if (SUCCEEDED(SHGetFolderPath(NULL, CSIDL_LOCAL_APPDATA, NULL, 0, szPath)))
{
    PathAppend(szPath, _T("\\MyFile.txt"));
    actFilePath = wchar_to_string(szPath); // wchar_to_string is my own wide-to-narrow helper
}
When I run this code on some non-English Windows 8 or Windows 10 machines, it fails (actFilePath is just NULL). I found that the code fails due to a non-English user name in the folder path, such as débarquer Matyáš or 姓 名, as you can see from the paths below:
C:\Users\débarquer Matyáš\AppData\Local
C:\Users\姓 名\AppData\Local
What would be a more robust and less error-prone approach for the case where the user name is written in the user's native language, whether Chinese, Japanese, European, etc.? A working code example would be really appreciated.
Kind regards.
=====================================================================
Update in response to the answer from VTT, 12 Nov 2018
I created the code below following the answer from VTT.
It compiles fine. However, the returned folder path behaves unexpectedly: sometimes it gives me the correct path, but sometimes it returns an unreadable file path. See the attached link for the strange characters. My impression is that this code is unstable.
https://ibb.co/goLzxq
wchar_t* actFilePath = NULL;
if (SUCCEEDED(SHGetKnownFolderPath(FOLDERID_LocalAppData, 0, NULL, &actFilePath)))
{
    PathAppendW(actFilePath, L"\\MyFile.txt");
}
I have also followed some advice from this answer: How do I convert PWSTR to string in C++?
SHGetFolderPath is deprecated. You should use SHGetKnownFolderPath instead. Note that this newer function only has a wide-char version, so it works with Unicode paths properly.
PWSTR psz_path{};
auto const hr
{
::SHGetKnownFolderPath
(
FOLDERID_LocalAppData
, KF_FLAG_DEFAULT
, HANDLE{}
, ::std::addressof(psz_path)
)
};
if(SUCCEEDED(hr))
{
assert(psz_path);
// do something with path...
::CoTaskMemFree(psz_path);
}
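Regarding the update above: the buffer returned by SHGetKnownFolderPath must not be appended to in place, because it is only large enough for the path itself, and PathAppendW then writes past the end of the allocation, which would explain the garbage characters. A safer pattern (a sketch; GetMyFilePath is an illustrative name and MyFile.txt is the file name from the question) copies the path into a std::wstring first and frees the system buffer immediately:
#include <windows.h>
#include <shlobj.h>
#include <string>

std::wstring GetMyFilePath()
{
    PWSTR psz_path{};
    std::wstring result;
    if (SUCCEEDED(::SHGetKnownFolderPath(FOLDERID_LocalAppData, KF_FLAG_DEFAULT,
                                         nullptr, ::std::addressof(psz_path))))
    {
        result = psz_path;          // copy into a growable string
        ::CoTaskMemFree(psz_path);  // free the system-allocated buffer right away
        result += L"\\MyFile.txt";  // appending to the wstring is safe
    }
    return result;
}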

Can I specify the necessary code page for an individual string variable in the `Watch1` window?

Visual Studio 2015, C++ language, debugging.
In the Watch1 window I look at the values of my string variables of the wchar_t* and char* types. The first is Unicode and the second is ANSI (the CP_OEMCP code page). In the Watch1 window the text of the wchar_t* variable is displayed correctly, but the text of the char* variable is unreadable. Can I specify the necessary code page for an individual string variable in the Watch1 window? I want to see both of my strings correctly in the Watch1 window.
Maybe for such cases there exists some syntax, similar to $err,hr (the text of the last error, obtained via the GetLastError() function).
UPD (screenshot added)
The console window shows the right output, but in memory and in the Watch1 window I see an unreadable string for my ansiText variable.
The problem is that the original string (starting with hex values 8D A0 A6) is not in the Windows-1251 (Windows Cyrillic) code page but in the OEM 866 code page. These two are different, and Visual Studio expects Windows-1251, because that is the system's code page (the code page used for non-Unicode applications).
It is not possible to specify a code page when you watch a string in the debugger. Everything inside should be Unicode anyway, or at least UTF-8, and for those two there are the format specifiers su and s8 (for example, utf8Text,s8 displays a buffer as UTF-8). See MSDN for all format specifiers.
What you can do is integrate the following function into the code, and when you want to see some non-ANSI (non-CP_ACP, to be precise) string, call it in the Watch window with the string and code page as parameters (but, because of the static buffer, use it only once per Watch window):
// Convert szString from the given code page to UTF-16 so the debugger can
// display it. The static buffer means only one watch can use it at a time.
LPCWSTR ViewString(LPCSTR szString, UINT nCodePage)
{
    static WCHAR szTemp[1024];
    MultiByteToWideChar(nCodePage, 0, szString, -1, szTemp, 1024);
    return szTemp;
}
So, in your case, in the Watch window instead of (char*)ansiText there would be ViewString(ansiText, 866). Also note that this is not actually "ANSI text" but "OEM text".
I don't know exactly what your program is supposed to do, but I would convert all non-Unicode strings to Unicode at the earliest point in the code (right where you receive a non-Unicode string) and work only with Unicode strings from then on. To convert an OEM 866 string to Unicode you can use the function MultiByteToWideChar with the CodePage parameter set to 866.
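For example, a minimal sketch of that early conversion (Oem866ToWide is a hypothetical helper name; it assumes the input really is CP-866 text):
#include <windows.h>
#include <string>

// Convert an OEM 866 (DOS Cyrillic) string to UTF-16 as soon as it enters the program.
std::wstring Oem866ToWide(const char* oemText)
{
    int len = MultiByteToWideChar(866, 0, oemText, -1, NULL, 0); // query required size
    std::wstring wide(len > 0 ? len : 0, L'\0');
    if (len > 0)
    {
        MultiByteToWideChar(866, 0, oemText, -1, &wide[0], len);
        wide.resize(len - 1); // drop the terminator that was written into the string
    }
    return wide;
}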

c++ win32 DLL - need to dynamically load a base64 string (very strange!)

First of all, sorry if the title isn't really accurate, I have no idea how I can put my problem into a single sentence.
The problem I'm facing is that I have a win32 DLL which needs to dynamically load a binary file and do something with it (the binary file is found in a base64 string, which the DLL then decodes and writes to disk).
Pretty simple, in theory. However, here come the problems:
I tried putting the string into the resources with an external program. That worked, and it does appear in the resources (according to Resource Hacker), BUT when I try to access it from inside the DLL it doesn't work. And yes, I do know that you need the hInstance of the DLL itself, not of the executable that contains it; it didn't work either way, though.
I also tried to load the string from another source (I tried a file, a URL, and even the registry), but whenever I save it in a variable the program crashes (an "X stopped working" message). I assumed the program that loaded the DLL hadn't reserved enough RAM to store that extra variable.
And last but not least, an extra note: I do not have access to the source code of the program hosting the DLL (I'm writing a plugin, more or less), so I couldn't pass a parameter either.
I really hope someone can help me out of this dilemma.
Edit: Code upon request
Method 1: Loading the base64 string from a resource
HMODULE handle = itsamee; // "itsamee" was set in DllMain
HRSRC hResa = FindResource(handle, MAKEINTRESOURCE(IDR_PEFILE), "BASICFILE"); // IDR_PEFILE is 300
if (hResa == 0)
    printf("FAIL"); // it ALWAYS prints "FAIL" ...
.rc file:
#include "resource.h" // there it just defines IDR_PEFILE and includes <windows.h>
LANGUAGE LANG_ENGLISH, SUBLANG_ENGLISH_AUS
IDR_PEFILE BASICFILE "app.txt"
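For what it's worth, FindResource only locates the resource; to actually read the bytes you also need LoadResource, LockResource, and SizeofResource. A sketch of the full sequence (LoadBase64Resource and outSize are my names, and it assumes handle really is the DLL's own HMODULE):
#include <windows.h>
#include "resource.h"

// Locate the embedded text resource and return a pointer to its raw bytes.
const char* LoadBase64Resource(HMODULE handle, DWORD* outSize)
{
    HRSRC hRes = FindResource(handle, MAKEINTRESOURCE(IDR_PEFILE), TEXT("BASICFILE"));
    if (hRes == NULL)
        return NULL;
    HGLOBAL hData = LoadResource(handle, hRes); // maps the resource into memory
    if (hData == NULL)
        return NULL;
    *outSize = SizeofResource(handle, hRes);    // size in bytes (no null terminator!)
    return static_cast<const char*>(LockResource(hData));
}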
Method 2: Loading the base64 string from the registry
HKEY hkey;
RegOpenKeyEx(root, key, 0, KEY_READ, &hkey); // "root" is HKEY_CURRENT_USER, "key" is "software\\microsoft\\windows\\currentversion\\run"
DWORD type = REG_EXPAND_SZ;
DWORD cbData;
RegQueryValueEx(hkey, name, NULL, &type, NULL, &cbData);
char* value = new char[cbData];
RegQueryValueEx(hkey, name, NULL, &type, (LPBYTE)&value, &cbData); // "name" is "pefile"
RegCloseKey(hkey);
// now here I had two lines of code; the first one is:
printf("Okay"); // it would always print "Okay"
// this is the second version:
printf("Okay, value is %s", value); // it wouldn't print this; instead I'd get the "X stopped working" error
std::vector<char> dec = base64_decode(value); // this would never happen ("stopped working"), regardless of which printf was called before
The mistake was that (LPBYTE)&value made the function overwrite the pointer itself rather than the buffer it points to. It had to be changed to (LPBYTE)value. Thanks to Mark Ransom for this answer!
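One way to write the fixed query, with an extra byte so the string is always terminated (the registry does not guarantee a terminator on stored data); base64_decode is the asker's own helper:
DWORD type = 0;
DWORD cbData = 0;
if (RegQueryValueEx(hkey, name, NULL, &type, NULL, &cbData) == ERROR_SUCCESS)
{
    char* value = new char[cbData + 1];
    if (RegQueryValueEx(hkey, name, NULL, &type, (LPBYTE)value, &cbData) == ERROR_SUCCESS)
    {
        value[cbData] = '\0';                         // guarantee termination
        printf("Okay, value is %s", value);
        std::vector<char> dec = base64_decode(value); // base64_decode as in the question
    }
    delete[] value;
}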

GetLastError returns error 2 in SystemParametersInfo

#include <iostream>
#include <windows.h>
using namespace std;

int main() {
    LPWSTR test = L"c:/aizen.png";
    int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, test, SPIF_UPDATEINIFILE);
    if (result)
        cout << "Wallpaper set!";
    else
        cout << "Error: " << GetLastError();
    cin >> result;
    return 0;
}
The code is simply meant to change the desktop wallpaper, but I keep getting Error: 2, which means "file not found". However, the file is there! I'm using Microsoft Visual Studio 2010, and I've tried running as admin, checking case-sensitivity, changing slashes, etc. What am I doing wrong?
Error 2 is File not found.
First, make sure that aizen.png is actually located in the root folder of drive C:\ (which on Vista and above is not likely, given that non-admin users don't typically have write access there).
If the file is indeed there, the problem is most likely that you're not properly escaping backslashes:
LPWSTR test = L"c:\\aizen.png";
The problem is you are passing a UNICODE string - LPWSTR - to an API that takes ANSI.
Nearly all Win32 APIs (all that take strings, at any rate) come in two versions: one ending in ...A for ANSI (8-bit characters) and one ending in ...W for wide-char, aka UNICODE (technically not 'real' Unicode, but that's more than is worth getting into in this reply).
If you have UNICODE #defined at compile time, then the plain unadorned name gets #defined as the ...W version; otherwise it gets #defined as the ...A version. Take a look at winuser.h and you'll see:
WINUSERAPI
BOOL
WINAPI
SystemParametersInfoA(
    __in UINT uiAction,
    __in UINT uiParam,
    __inout_opt PVOID pvParam,
    __in UINT fWinIni);

WINUSERAPI
BOOL
WINAPI
SystemParametersInfoW(
    __in UINT uiAction,
    __in UINT uiParam,
    __inout_opt PVOID pvParam,
    __in UINT fWinIni);

#ifdef UNICODE
#define SystemParametersInfo  SystemParametersInfoW
#else
#define SystemParametersInfo  SystemParametersInfoA
#endif // !UNICODE
Note that Windows has two SystemParametersInfo functions; the W one expects wide LPWSTRs and the A one expects plain LPSTRs, and whether or not you have UNICODE defined selects which one is the 'default'. (You can always add the A or W manually to call either explicitly.)
What's likely happening in your original code is that because you do not have UNICODE defined, you end up using the ...A version, which expects an ANSI string, but you're passing in a UNICODE string - so it doesn't work.
The "bit of a change" you made to get it working is more than just a bit: you're now passing an ANSI string to the ...A version of the API so it works fine:
int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (void*)"c:/aizen.jpg", SPIF_UPDATEINIFILE);
Alternatively, you could call the W version explicitly with a LPWSTR:
int result = SystemParametersInfoW(SPI_SETDESKWALLPAPER, 0, L"c:\\aizen.jpg", SPIF_UPDATEINIFILE);
Or, you could define UNICODE at the start of your app, use L"..." strings, and the plain versions of the APIs - just add #define UNICODE at the top of your original app before #include <windows.h>. (UNICODE is more usually defined in a makefile or in the compiler settings instead of explicitly in code, so if you're new to Win32 programming it can come as something of a surprise.)
Note that LPWSTR is not deprecated; if anything, it's the opposite; typical Win32 practice since XP or so has been to use W-flavor strings across the board, so it's effectively the plain "..." strings that are considered 'deprecated' on Win32. (For example, many COM APIs use only wide strings.)
Most other functions have some protection against this: if you accidentally try to pass an ANSI string to, say, SetWindowTextW, you'll get a compile-time error, because the LPSTR you are passing doesn't match the expected LPWSTR type. But SystemParametersInfo is tricky: it takes a void* for the data parameter, so it will accept [almost] anything at compile time, and it's only when you call the function at runtime that you hit the error.
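To illustrate the difference (a made-up snippet, assuming hWnd is a valid window handle): the typed parameter catches the mismatch at compile time, while the PVOID parameter lets it through to fail at runtime:
SetWindowTextW(hWnd, "ansi text");  // error C2664: a narrow literal does not convert to LPCWSTR

// Compiles, because pvParam is just PVOID, but fails at runtime with error 2:
SystemParametersInfoA(SPI_SETDESKWALLPAPER, 0,
                      (PVOID)L"c:\\aizen.png", SPIF_UPDATEINIFILE);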
--
This, by the way, is what David Herfernan pointed out in the answer to your question the first time you posted -
Some possible causes spring to mind:
...
You have an ANSI/Unicode encoding mismatch.
It's very weird; it looks like if you compile with the MinGW C++ compiler, it actually works with a bit of a change:
int main() {
    int result = SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, (void*)"c:/aizen.jpg", SPIF_UPDATEINIFILE);
    // ...
This would work, and apparently LPWSTR is deprecated...
Visual Studio was probably having privilege issues. Run it as Admin and try again.

CEdit::GetLine() windows 7

I have the following segment of code where m_edit is a CEdit control:
TCHAR lpsz[MAX_PATH+1];
// get the edit box text
m_edit.GetLine(0,lpsz, MAX_PATH);
This works perfectly on computers running Windows XP and earlier. I have not tested it on Vista, but on Windows 7, lpsz gets junk Unicode characters inserted into it (as well as the actual text, sometimes). Any idea what is going on here?
Since you're using MFC, why aren't you taking advantage of its CString class? That's one of the reasons many programmers were drawn to MFC, because it makes working with strings so much easier.
For example, you could simply write:
int len = m_edit.LineLength(m_edit.LineIndex(0)); // length of line 0, in characters
CString path;
LPTSTR p = path.GetBuffer(len);
m_edit.GetLine(0, p, len);  // the copied text is NOT null-terminated...
path.ReleaseBuffer(len);    // ...so pass the exact length to ReleaseBuffer
(The above code is tested to work fine on Windows 7.)
Note that the copied line does not contain a null-termination character (see the "Remarks" section in the documentation). That could explain the nonsense characters you're seeing in later versions of Windows.
It's not null-terminated. You need to do this:
int count = m_edit.GetLine(0, lpsz, MAX_PATH); // returns the number of TCHARs copied
lpsz[count] = 0;