I'm creating a dll file.
My code:
BOOL CALLBACK EnumWindowsProc(HWND hwnd, LPARAM lParam);

void test() {
    EnumWindows(EnumWindowsProc, NULL);
}

BOOL CALLBACK EnumWindowsProc(HWND hwnd, LPARAM lParam)
{
    char class_name[80];
    char title[80];
    GetClassName(hwnd, (LPWSTR) class_name, sizeof(class_name));
    GetWindowText(hwnd, (LPWSTR) title, sizeof(title));
    std::string titlas(title);
    std::string classas(class_name);
    Loggerc(titlas);
    Loggerc("Gooing");
    return TRUE;
}
Then I just call test().
In the log, titlas is empty and the code stops.
When I try this code in a Win32 app built with Code::Blocks, everything works and all of the titles show, but in a DLL it does not work.
Where is the problem?
char class_name[80];
char title[80];
GetClassName(hwnd, (LPWSTR) class_name, sizeof(class_name));
GetWindowText(hwnd, (LPWSTR) title,sizeof(title));
std::string titlas(title);
std::string classas(class_name);
Considering that since VS2005 the default has been building in Unicode mode (instead of ANSI/MBCS) and that you have those (ugly C-style) (LPWSTR) casts, I'm assuming that you got compile-time errors when passing your char-based string buffers to APIs like GetClassName() and GetWindowText(), and you tried to fix those errors with casts.
That's wrong. The compiler was actually helping you with those errors, so please follow its advice instead of casting the compiler errors away.
Assuming Unicode builds, you may want to use wchar_t and std::wstring instead of char and std::string, and _countof() instead of sizeof() to get the size of the buffers in wchar_t units, not in bytes (chars).
E.g.:
// Note: wchar_t used instead of char
wchar_t class_name[80];
wchar_t title[80];
// Note: no need to cast to LPWSTR (i.e. wchar_t*)
GetClassName(hwnd, class_name, _countof(class_name));
GetWindowText(hwnd, title, _countof(title));
// Note: std::wstring used instead of std::string
std::wstring titlas(title);
std::wstring classas(class_name);
If other parts of your code do use std::string, you may want to convert from UTF-16-encoded text stored in std::wstring (returned by Windows APIs) to UTF-8-encoded text and store it in std::string instances.
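For example, a minimal conversion helper might look like the following sketch (the name Utf16ToUtf8 is just for illustration, and error handling is kept to a minimum):

#include <windows.h>
#include <string>

// Sketch: convert UTF-16 text (std::wstring) to UTF-8 text (std::string).
std::string Utf16ToUtf8(const std::wstring& utf16)
{
    if (utf16.empty())
        return std::string();

    // First call: ask for the required buffer size in bytes.
    const int size = WideCharToMultiByte(CP_UTF8, 0,
        utf16.data(), static_cast<int>(utf16.size()),
        nullptr, 0, nullptr, nullptr);

    std::string utf8(size, '\0');

    // Second call: perform the actual conversion into the buffer.
    WideCharToMultiByte(CP_UTF8, 0,
        utf16.data(), static_cast<int>(utf16.size()),
        &utf8[0], size, nullptr, nullptr);

    return utf8;
}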
Related
In the function below I need to dereference a shared pointer to an array of TCHAR;
however, none of the operators available on std::shared_ptr seem to work.
The FormatMessage API expects a PTSTR, which in the case of UNICODE is wchar_t*.
How do I dereference the given pointer (see the comment in the code)?
If you think the same thing could be achieved with more elegant syntax, it would be great if you could provide example code.
const std::shared_ptr<TCHAR[]> FormatErrorMessage(const DWORD& error_code)
{
    constexpr short buffer_size = 512;
    std::shared_ptr<TCHAR[]> message = std::make_shared<TCHAR[]>(buffer_size);

    const DWORD dwChars = FormatMessage(
        FORMAT_MESSAGE_FROM_SYSTEM,
        nullptr,
        error_code,
        MAKELANGID(LANG_ENGLISH, SUBLANG_ENGLISH_US),
        *message, // no operator "*" matches these operands
        buffer_size,
        nullptr);

    return message;
}
EDIT
Thanks to the answers and comments, the (only) way I could make it work with the Microsoft compiler is this:
const std::shared_ptr<std::array<WCHAR, buffer_size>>
FormatErrorMessageW(const DWORD& error_code, DWORD& dwChars)
{
    const std::shared_ptr<std::array<WCHAR, buffer_size>> message =
        std::make_shared<std::array<WCHAR, buffer_size>>();

    dwChars = FormatMessageW(
        FORMAT_MESSAGE_FROM_SYSTEM,
        nullptr, // The location of the message definition.
        error_code,
        MAKELANGID(LANG_ENGLISH, SUBLANG_ENGLISH_US),
        message.get()->data(),
        buffer_size,
        nullptr);

    return message;
}
*message returns a TCHAR&, whereas FormatMessage requires a TCHAR* there. Instead of *message, use message.get().
Also, since this function doesn't keep a reference to the formatted message, it should return std::unique_ptr<TCHAR[]> to document the fact that the caller is now the sole owner.
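For illustration, a sketch of the corrected function under those assumptions (std::make_unique<TCHAR[]> requires C++14; the rest mirrors the original code):

#include <windows.h>
#include <memory>

// Sketch: return a unique_ptr so ownership clearly passes to the caller,
// and pass the raw buffer via get() instead of dereferencing.
std::unique_ptr<TCHAR[]> FormatErrorMessage(DWORD error_code)
{
    constexpr DWORD buffer_size = 512;
    std::unique_ptr<TCHAR[]> message = std::make_unique<TCHAR[]>(buffer_size);

    FormatMessage(
        FORMAT_MESSAGE_FROM_SYSTEM,
        nullptr,
        error_code,
        MAKELANGID(LANG_ENGLISH, SUBLANG_ENGLISH_US),
        message.get(),   // raw TCHAR* expected by FormatMessage
        buffer_size,
        nullptr);

    return message;
}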
I need to use a WinAPI function to restart a Windows service, and I am not familiar with strings in C++.
My function receives a const CStringA& serviceName as its parameter:
bool MyClassName::RestartServer(const CStringA& serviceName)
When I obtain the SC handle via OpenService(), I need to provide a value of type LPCWSTR:
SC_HANDLE SHandle = OpenService(hSCManager, LPCWSTR serviceNameAsWideString, SC_MANAGER_ALL_ACCESS);
How do I convert CStringA to LPCWSTR?
I tried the following:
CA2W(serviceName, CP_UTF8);
CString str("MyServiceName"); CStringW strw(str); LPCWSTR ptr = strw;
Neither worked properly: they compiled, but when I tried to execute the code, OpenService() failed.
What worked:
LPCWSTR newString = serviceName.AllocSysString();
What am I missing here? Why did the first two not work, and why did the third work?
How do I properly deallocate newString?
Your code requires a conversion because you are calling the TCHAR-based OpenService() macro, which maps to either OpenServiceW() or OpenServiceA() depending on whether UNICODE is defined:
__checkReturn
WINADVAPI
SC_HANDLE
WINAPI
OpenServiceA(
__in SC_HANDLE hSCManager,
__in LPCSTR lpServiceName,
__in DWORD dwDesiredAccess
);
__checkReturn
WINADVAPI
SC_HANDLE
WINAPI
OpenServiceW(
__in SC_HANDLE hSCManager,
__in LPCWSTR lpServiceName,
__in DWORD dwDesiredAccess
);
#ifdef UNICODE
#define OpenService OpenServiceW
#else
#define OpenService OpenServiceA
#endif // !UNICODE
In your case, UNICODE is clearly being defined in your project, so your code is really calling OpenServiceW(), which is why it expects an LPCWSTR as input.
Your RestartServer() method takes a CStringA (char-based ANSI) string as input, so you should use OpenServiceA() explicitly to match the same character type, no conversion needed:
bool MyClassName::RestartServer(const CStringA& serviceName)
{
...
SC_HANDLE SHandle = OpenServiceA(hSCManager, serviceName, SC_MANAGER_ALL_ACCESS);
...
}
Otherwise, if you are going to continue using TCHAR-based functionality in your code [1], then you should change your RestartServer() method to take a CString instead of a CStringA so it adopts the same ANSI/Unicode mapping that OpenService() does (and other TCHAR-based functions do), again avoiding a conversion:
bool MyClassName::RestartServer(const CString& serviceName)
[1]: which you should not do, since there is rarely a need to write code for Win9x/ME nowadays; Windows has been a Unicode-based OS since NT4.
If that is not an option for you, then CA2W() will work just fine:
bool MyClassName::RestartServer(const CStringA& serviceName)
{
USES_CONVERSION;
...
SC_HANDLE SHandle = OpenService(hSCManager, ATL::CA2W(serviceName), SC_MANAGER_ALL_ACCESS);
...
}
Though, you might consider just using CString internally and letting it handle a conversion if needed:
bool MyClassName::RestartServer(const CStringA& serviceName)
{
...
SC_HANDLE SHandle = OpenService(hSCManager, CString(serviceName), SC_MANAGER_ALL_ACCESS);
...
}
Or, make the code conditional:
bool MyClassName::RestartServer(const CStringA& serviceName)
{
...
SC_HANDLE SHandle = OpenService(hSCManager,
#ifdef UNICODE
CStringW(serviceName)
#else
serviceName
#endif
, SC_MANAGER_ALL_ACCESS);
...
}
CStringA and CStringW have constructors taking both const char* and const wchar_t* C strings.
Write the following:
CStringW serviceNameW( serviceName );
About AllocSysString: it creates a copy as a BSTR. BSTRs are more complex than C strings; they are null-terminated too, but they also store their length at a negative offset. If you want to do manual memory management, call SysFreeString on the pointer. Or, if you want a BSTR but don't want manual memory management, use the CComBSTR class.
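For example, a minimal sketch of both approaches (the Example function and its layout are just for illustration):

#include <atlbase.h>   // CComBSTR
#include <atlstr.h>    // CStringA

void Example(const CStringA& serviceName)
{
    // Option 1: manual management of the BSTR returned by AllocSysString().
    BSTR bstrName = serviceName.AllocSysString();
    // ... pass bstrName wherever an LPCWSTR is expected ...
    SysFreeString(bstrName);   // must be freed explicitly

    // Option 2: let CComBSTR own the BSTR and free it in its destructor.
    CComBSTR autoName(serviceName.GetString());   // converts char* to a wide BSTR
    // ... autoName converts to BSTR (a WCHAR*), usable where LPCWSTR is needed ...
}   // autoName is released automatically here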
I can't change the whole project to Unicode.
void CreateDir(string dirname)
{
    char my_dir[247];
    WCHAR wcmy_dir[UNLEN+1];
    sprintf_s(my_dir, dirname.c_str());
    MultiByteToWideChar(CP_ACP, 0, my_dir, (int)strlen(my_dir)+1, wcmy_dir,
        sizeof(wcmy_dir)/sizeof(wcmy_dir[0]));
    CreateDirectory(wcmy_dir, NULL);
}
Your project is not configured to use Unicode, so CreateDirectory() will map to CreateDirectoryA() instead of CreateDirectoryW() like your code is assuming. Passing a WCHAR string to CreateDirectoryA() is indeed an error.
Since you are not actually calling CreateDirectoryW(), you don't need to call MultiByteToWideChar() at all. Just call CreateDirectoryA() explicitly, passing it your input string as-is:
void CreateDir(string dirname)
{
    CreateDirectoryA(dirname.c_str(), NULL);
}
Internally, it will convert the char data to WCHAR using CP_ACP and then call CreateDirectoryW() for you.
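For illustration only, the ANSI entry point behaves roughly like the following sketch (this is not the actual Windows implementation; MyCreateDirectoryA and the MAX_PATH-sized buffer are assumptions made for the example):

#include <windows.h>

// Rough illustration of what CreateDirectoryA() conceptually does
// before delegating to CreateDirectoryW(). The real implementation differs.
BOOL MyCreateDirectoryA(LPCSTR path, LPSECURITY_ATTRIBUTES sa)
{
    WCHAR widePath[MAX_PATH];

    // Convert the ANSI path to UTF-16 using the system ANSI code page.
    if (MultiByteToWideChar(CP_ACP, 0, path, -1, widePath, MAX_PATH) == 0)
        return FALSE;

    return CreateDirectoryW(widePath, sa);
}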
If you ever decide to update your project to use Unicode but do not change your function to use wstring, this same code will still work without change.
If you ever decide to change your function to use wstring instead, simply call CreateDirectoryW() explicitly to match:
void CreateDir(wstring dirname)
{
    CreateDirectoryW(dirname.c_str(), NULL);
}
I have a Win32 C++ dll (A) that calls another Win32 C++ dll (B). (B) is loaded using LoadLibrary and contains a method:
Draw(HDC hDC, LPRECT lpRect, LPBUFFER buffer, LPOPTIONS options)
The Buffer structure is defined as:
struct Buffer
{
    char* pData;
    long Length;
    TCHAR FileName[MAX_PATH];
    Extension Extension;
};
typedef Buffer BUFFER, *LPBUFFER;
(A) fills BUFFER with the filename, length, etc. and calls the Draw function. The Draw function then uses the values from BUFFER. It all works fine when the DLLs are compiled as 64-bit, but if I compile them as 32-bit then I start getting garbage values in the BUFFER fields in (B). Logs show that the values are good in (A) but turn into garbage when they reach (B).
I tried changing the structure alignment option /ZpX and the calling convention of the Draw method (__cdecl, __stdcall), but neither helped. I think it is related to the calling convention, because if I change the Draw function's signature and put BUFFER as the first parameter, then (B) gets correct values. What's going on here?
Function pointer type:
typedef bool (__cdecl *DrawFunc)(HDC hDC, LPRECT lpRect, LPBUFFER buffer, LPOPTIONS options);
Then in InitInstance:
pDrawFunc = (DrawFunc)GetProcAddress(dllHandle, "Draw");
UPDATE
1. As mentioned above, if I put BUFFER as the first parameter, then it receives correct values.
2. HDC, being a single numeric value, always receives the correct value.
3. RECT gets incorrect values, very large ones.
I believe the problem has something to do with structs. Only structs get incorrect values.
UPDATE 2
OK, I found my own silly mistake: the declaration of the Draw method had LPRECT whereas the implementation had RECT. My bad, sorry about that.
But I am still not sure:
1. Why were the other parameters showing garbage values?
2. Why did it work in 64-bit?
OK, I created a solution with 3 projects: library B, which contains Draw(); library A, which has Test(), which loads library B and calls Draw() with some Buffer*; and an application test, which links with library A and calls Test(). Everything works fine, both for 32-bit and 64-bit. A small snippet of Test():
#include "stdafx.h"
#include "A.h"
#include "../B/B.h"
namespace {
LPBUFFER CreateBuffer(const char* const data, LPCTSTR const name)
{
if(!data || !name)
return NULL;
LPBUFFER buffer = new BUFFER();
buffer->Length = static_cast<long>(strlen(data) + 1);
buffer->pData = new char[buffer->Length];
strcpy_s(buffer->pData, buffer->Length * sizeof(char), data);
buffer->Extension = 0;
::ZeroMemory(buffer->FileName, _countof(buffer->FileName) * sizeof(TCHAR));
_tcscpy_s(buffer->FileName, name);
return buffer;
}
void DestroyBuffer(LPBUFFER buffer)
{
delete [] buffer->pData;
buffer->Length = 0;
buffer->pData = NULL;
buffer->Extension = 0;
::ZeroMemory(buffer->FileName, _countof(buffer->FileName) * sizeof(TCHAR));
delete buffer;
}
} // namespace
A_API void Test()
{
    HMODULE b_lib = ::LoadLibrary(_T("B.dll"));
    if(!b_lib)
    {
        ::OutputDebugString(_T("Can't load library\n"));
        return;
    }

    typedef bool (*DrawFunction)(HDC hDC, LPRECT lpRect, LPBUFFER buffer, LPOPTIONS options);
    DrawFunction draw = reinterpret_cast<DrawFunction>(::GetProcAddress(b_lib, "Draw"));
    LPBUFFER buffer = NULL; // declared before the goto so no initialization is skipped
    if(!draw)
    {
        ::OutputDebugString(_T("Can't get address of Draw()"));
        goto FINISH_LABEL;
    }

    buffer = CreateBuffer("test", _T("path"));
    draw(NULL, NULL, buffer, NULL);
    DestroyBuffer(buffer);

FINISH_LABEL:
    ::FreeLibrary(b_lib);
    b_lib = NULL;
}
And a whole solution: https://www.dropbox.com/s/5ei6ros9e8s94e2/B.zip
Here is what I need:
BSTR l_strArgs;
LPCWSTR sth;
//----
//---
OutputDebugStringW(sth);
How do I convert a BSTR to an LPCWSTR?
Is there any header-only library that converts any (Microsoft) string type to LPCWSTR?
Just cover the NULL scenario and you're good to go:
BSTR l_strArgs;
LPCWSTR sth = l_strArgs ? l_strArgs : L"";
As you mentioned ATL in the tags, here is an ATL-style one-liner:
OutputDebugString(CString(l_strArgs));
or, to make sure you are staying in the Unicode domain:
OutputDebugStringW(CStringW(l_strArgs));
I just found this one:
BSTR l_strArgs;
LPCWSTR sth;
char* converted = _com_util::ConvertBSTRToString(l_strArgs);
CString cs(converted);
delete [] converted; // ConvertBSTRToString allocates with new[]
sth = cs;
OutputDebugStringW(sth);
BSTRs become easier to handle when you use a wrapper like _bstr_t instead. Here's the Microsoft documentation on them:
http://msdn.microsoft.com/en-us/library/zthfhkd6%28v=VS.100%29.aspx
As you would expect, one of the _bstr_t constructors takes a BSTR parameter. There is also an operator that returns a const wchar_t*, which is exactly what LPCWSTR is.
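For illustration, a minimal sketch of that approach (the Log function and its layout are assumptions made for the example):

#include <windows.h>
#include <comdef.h>    // _bstr_t

void Log(BSTR l_strArgs)   // assumes l_strArgs is non-null
{
    // fCopy = true makes _bstr_t copy the string, so the original
    // BSTR still belongs to its current owner.
    _bstr_t wrapped(l_strArgs, true);

    // operator const wchar_t*() yields exactly what LPCWSTR is.
    LPCWSTR sth = static_cast<const wchar_t*>(wrapped);
    OutputDebugStringW(sth);
}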
Hope this helps