How to get quick launch bar size in Windows XP? - c++

I want to get the Quick Launch bar size, but my code returns FALSE. Where is the problem?
REBARBANDINFOW prbi;
memset(&prbi, 0, sizeof(REBARBANDINFOW));
prbi.cbSize = sizeof(REBARBANDINFOW);
prbi.fMask = 892;
HWND hWndTray = ::FindWindow(L"Shell_TrayWnd", 0);
HWND hRebar = ::FindWindowEx(hWndTray, NULL, L"ReBarWindow32", 0);
int i = ::SendMessage(hRebar, RB_GETBANDINFOW, 0, (LPARAM)(LPREBARBANDINFOW)&prbi);

The problem is with the LPARAM of RB_GETBANDINFOW. The address of the structure you are sending is only valid in your own address space, not that of Explorer. Luckily, Explorer detects this and fails gracefully instead of blowing up.
One way to solve this is to use VirtualAllocEx to allocate the REBARBANDINFOW in Explorer's memory, use WriteProcessMemory to initialize it, send the message, and finally call ReadProcessMemory to read the result.
I've successfully used this technique in a Python script to automatically set the Quick Launch's size.
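A minimal sketch of that technique in C++ (assumptions: band index 0 is the Quick Launch band, as in the question, and both your process and Explorer have the same bitness) could look like this:
#include <windows.h>
#include <commctrl.h>

bool GetQuickLaunchBandInfo(REBARBANDINFOW* result)
{
    HWND hWndTray = ::FindWindowW(L"Shell_TrayWnd", NULL);
    HWND hRebar   = ::FindWindowExW(hWndTray, NULL, L"ReBarWindow32", NULL);
    if (hRebar == NULL)
        return false;

    DWORD pid = 0;
    ::GetWindowThreadProcessId(hRebar, &pid);
    HANDLE hExplorer = ::OpenProcess(PROCESS_VM_OPERATION | PROCESS_VM_READ | PROCESS_VM_WRITE,
                                     FALSE, pid);
    if (hExplorer == NULL)
        return false;

    // Allocate a REBARBANDINFOW inside Explorer's address space and initialize it.
    REBARBANDINFOW rbbi = { 0 };
    rbbi.cbSize = sizeof(rbbi);
    rbbi.fMask  = RBBIM_SIZE | RBBIM_CHILDSIZE;
    LPVOID remote = ::VirtualAllocEx(hExplorer, NULL, sizeof(rbbi),
                                     MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    ::WriteProcessMemory(hExplorer, remote, &rbbi, sizeof(rbbi), NULL);

    // Let the rebar fill in the remote copy, then read the result back.
    LRESULT ok = ::SendMessageW(hRebar, RB_GETBANDINFOW, 0, (LPARAM)remote);
    ::ReadProcessMemory(hExplorer, remote, &rbbi, sizeof(rbbi), NULL);

    ::VirtualFreeEx(hExplorer, remote, 0, MEM_RELEASE);
    ::CloseHandle(hExplorer);

    *result = rbbi;   // rbbi.cx holds the band width on success
    return ok != 0;
}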

You are trying to get too much info at once. Set prbi.fMask = 32; or prbi.fMask = 64;.
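For reference, 32 and 64 are the commctrl.h flags RBBIM_CHILDSIZE and RBBIM_SIZE, so the intent is clearer if written with the named constants:
prbi.fMask = RBBIM_CHILDSIZE;   // 0x20 == 32
// or
prbi.fMask = RBBIM_SIZE;        // 0x40 == 64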

Related

SendMessage to 64bit application is not working

I use the following function to send a message to an application. It seems to work fine for a 32-bit application but doesn't work for a 64-bit application (the 64-bit application does not seem to get any message). What is wrong and how can I fix it? Thank you.
void MyTest::SendCmd(HWND hwnd, QString cmd)
{
    COPYDATASTRUCT data;
    data.dwData = FIXHEADER;
    data.cbData = cmd.size() + 1;
    data.lpData = cmd.toLocal8Bit().data();
    LPARAM lpdwResult;
    LRESULT err = SendMessageTimeout(hwnd, WM_COPYDATA, 0, (LPARAM)&data,
                                     SMTO_ABORTIFHUNG, 2000, &lpdwResult);
}
// FIXHEADER is a fixed hex value that the receiving application checks against to make sure the message was sent intentionally to it.
You need to compile your application in 64-bit mode; otherwise this cannot work, because 64-bit handles get truncated and become invalid.
Some reading for you:
https://learn.microsoft.com/en-us/windows/desktop/winauto/32-bit-and-64-bit-interoperability
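As a rough sketch of the receiving side (this is an assumption about how the target application handles the message, and the FIXHEADER value below is only a placeholder), one thing worth keeping in mind is that COPYDATASTRUCT::dwData is a ULONG_PTR, so its width differs between 32-bit and 64-bit builds:
#include <windows.h>

// Sketch of a receiver-side window procedure (assumed, not taken from the original code).
// dwData is 32 bits in a 32-bit build and 64 bits in a 64-bit build, so keep the marker
// value within 32 bits if sender and receiver may be built for different bitness.
const ULONG_PTR FIXHEADER = 0xF1F1F1F1;   // placeholder value for illustration

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_COPYDATA)
    {
        const COPYDATASTRUCT* cds = reinterpret_cast<const COPYDATASTRUCT*>(lParam);
        if (cds != NULL && cds->dwData == FIXHEADER)
        {
            const char* cmd = static_cast<const char*>(cds->lpData);
            OutputDebugStringA(cmd);   // handle the command string (cds->cbData bytes)
            return TRUE;               // message handled
        }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}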

ERROR_INSUFFICIENT_BUFFER returned from GetAdaptersAddresses

Using the following code, more or less copy-pasted from the MSDN example of
GetAdaptersAddresses, I get the return value 122, which means ERROR_INSUFFICIENT_BUFFER (according to this system error code list).
ULONG outBufLen = 150000; // Tried for different (large) values here...
PIP_ADAPTER_ADDRESSES pAddresses = (IP_ADAPTER_ADDRESSES *) malloc(outBufLen);
DWORD dwRetVal = GetAdaptersAddresses(AF_INET, 0, NULL, pAddresses, &outBufLen);
// ....
free(pAddresses);
The documentation of GetAdaptersAddresses does not list ERROR_INSUFFICIENT_BUFFER as one of the expected return values. (It lists ERROR_BUFFER_OVERFLOW, which should adjust outBufLen to the needed value, but that remains unchanged).
Using GetAdaptersInfo instead leads to the same symptoms.
This error does not occur on my development machine, but it does on one virtual and one real clean Windows 7 x86 SP1 installation (with the VC++ redistributables added).
As a C++ newbie, am I doing something wrong? What could cause this error, and how can I fix it? =)
First of all, you can, as others suggested, make two calls: one to find out the required buffer size, and then the actual query. Especially since you are seeing this error, your first step should be to ask the API what size it expects.
Second, you need to know that this API is not quite safe in 32-bit processes that consume large amounts of memory, where buffers can end up in the upper 2 GB of the address space. The API might start acting in a weird way, either due to its own bug or a bug in an underlying layer. See the details on MS Connect here: GetAdaptersAddresses API incorrectly returns no adapters for a process with high memory consumption.
The fact that the error code is not "one of the expected return values" supports the theory that the error comes from an underlying layer and this API just passes it up on internal failure. As a clue, disabling one of the network adapters on the system might make the error go away.
Visual Studio deployed a library named "IPHLPAPI.dll" together with my project, which caused the problem. Deleting this file solved it.
Why this was the case is subject to further research =)
First, a buffer is just a block of memory, so "insufficient buffer" means the call was not given enough memory to write its results into, or was handed a block it cannot actually use.
Look at the documented meaning:
ERROR_INSUFFICIENT_BUFFER
122 (0x7A)
The data area passed to a system call is too small.
This really sounds like the buffer does not have enough allocated memory, or that outBufLen does not match the size of the block you actually allocated; some APIs validate the stated length against the real buffer rather than trusting the caller.
So I would take a closer look at these two lines:
ULONG outBufLen = 150000; // Tried for different (large) values here...
PIP_ADAPTER_ADDRESSES pAddresses = (IP_ADAPTER_ADDRESSES *) malloc(outBufLen);
Good luck!
To know the exact buffer size required, you can just pass NULL as pAddresses and size will be set to the required size. You may want to rewrite your code slightly to make that work:
DWORD rv, size = 0;
PIP_ADAPTER_ADDRESSES adapter_addresses;

rv = GetAdaptersAddresses(AF_INET, 0, NULL, NULL, &size);
if (rv != ERROR_BUFFER_OVERFLOW)
    return false; // ERROR

adapter_addresses = (PIP_ADAPTER_ADDRESSES)malloc(size);

rv = GetAdaptersAddresses(AF_INET, 0, NULL, adapter_addresses, &size);
if (rv != ERROR_SUCCESS) {
    free(adapter_addresses);
    return false; // ERROR
}
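As an aside (not part of the answer above): the MSDN sample wraps this in a small retry loop, because the size reported by the sizing call can grow before the real call (for example if an adapter appears in between). A sketch of that variant (link against Iphlpapi.lib):
#include <winsock2.h>
#include <iphlpapi.h>
#include <stdlib.h>

// Retry while the buffer keeps being too small; give up after a few attempts.
PIP_ADAPTER_ADDRESSES GetAdapters(void)
{
    ULONG size = 0;
    PIP_ADAPTER_ADDRESSES addresses = NULL;
    DWORD rv = ERROR_BUFFER_OVERFLOW;

    for (int attempts = 0; attempts < 3 && rv == ERROR_BUFFER_OVERFLOW; ++attempts)
    {
        free(addresses);
        addresses = (PIP_ADAPTER_ADDRESSES)malloc(size);
        rv = GetAdaptersAddresses(AF_INET, 0, NULL, addresses, &size);
    }

    if (rv != ERROR_SUCCESS)
    {
        free(addresses);
        return NULL;
    }
    return addresses;   // caller releases the buffer with free()
}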

How to create a partition without Windows assigning a drive letter?

I am trying to initialize and partition an attached virtual hard disk through the Windows API. I have been successful using DeviceIoControl() to do so; however, whenever I apply the desired drive layout, Windows automatically assigns a drive letter to the partition and pops up an annoying "Would you like to format?" dialog.
My intent is to handle the formatting and mounting of this partition later in the program, but I'm not sure how to stop this behavior. I have tried setting RecognizedPartition to FALSE, but this seems to have no effect.
Relevant code:
Layout.PartitionStyle = PARTITION_STYLE_MBR;
Layout.PartitionCount = 4;
Layout.Mbr.Signature = MY_DISK_MBR_SIGNATURE;

Layout.PartitionEntry[0].PartitionStyle = PARTITION_STYLE_MBR;
Layout.PartitionEntry[0].PartitionNumber = 1;
Layout.PartitionEntry[0].StartingOffset.QuadPart = MY_DISK_OFFSET;
Layout.PartitionEntry[0].PartitionLength.QuadPart =
    (Geom.DiskSize.QuadPart - MY_DISK_OFFSET);
Layout.PartitionEntry[0].Mbr.PartitionType = PARTITION_IFS;
Layout.PartitionEntry[0].Mbr.BootIndicator = FALSE;
Layout.PartitionEntry[0].Mbr.RecognizedPartition = FALSE;
Layout.PartitionEntry[0].Mbr.HiddenSectors =
    (MY_DISK_OFFSET / Geom.Geometry.BytesPerSector);

for (int i = 0; i < 4; i++)
{
    Layout.PartitionEntry[i].RewritePartition = TRUE;
}

if (!DeviceIoControl(hDisk, IOCTL_DISK_SET_DRIVE_LAYOUT_EX,
                     Layout, dwLayoutSz, NULL, 0, &dwReturn, NULL))
{
    // Handle error
}

DeviceIoControl(hDisk, IOCTL_DISK_UPDATE_PROPERTIES,
                NULL, 0, NULL, 0, &dwReturn, NULL);
What can I do to prevent automatic drive letter assignment?
The only reliable way I could find to work around this issue was to stop the "Shell Hardware Detection" service while the volume was created and formatted. However, this approach is so unapologetically silly that I refused to put it into code.
Another "hackish" option is to have the service start up and then immediately spawn itself (or a "worker" executable) in a hidden window via CreateProcess() with the CREATE_NO_WINDOW flag.
Since this software runs as a system service, and I'd rather not complicate the code for something that only happens once or twice over the lifetime of the system, I've just had to accept that occasionally an Interactive Services Detection window will pop up for a few moments while creating the partitions.
If anyone discovers a good method for preventing the format prompt while programmatically creating and formatting a drive, I'll happily change the accepted answer (and owe you a beer).
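For completeness, the service-stopping workaround described above boils down to something like the following sketch (assuming the service's internal name is ShellHWDetection and the caller has the rights to control it):
#include <windows.h>

// Stop or restart the "Shell Hardware Detection" service around the partitioning work.
static bool ControlShellHWDetection(bool stop)
{
    SC_HANDLE scm = OpenSCManagerW(NULL, NULL, SC_MANAGER_CONNECT);
    if (scm == NULL)
        return false;

    SC_HANDLE svc = OpenServiceW(scm, L"ShellHWDetection",
                                 SERVICE_STOP | SERVICE_START | SERVICE_QUERY_STATUS);
    bool ok = false;
    if (svc != NULL)
    {
        SERVICE_STATUS status;
        if (stop)
            ok = ControlService(svc, SERVICE_CONTROL_STOP, &status) != 0;
        else
            ok = StartServiceW(svc, 0, NULL) != 0;
        CloseServiceHandle(svc);
    }
    CloseServiceHandle(scm);
    return ok;
}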
It's been a while since I've used this API, but from memory you can't. That doesn't stop you from removing the drive letter assignment after the fact, though.
I'm not sure if it will stop the format prompt; all the times I have done this, the partition had already been formatted correctly before I did the disk layout update.
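Removing the letter after the fact is a single call; a sketch, assuming the automatically assigned letter turned out to be X: (in practice you would discover it first, e.g. via GetVolumePathNamesForVolumeNameW, rather than hard-code it):
// Sketch: remove an automatically assigned drive letter after partitioning.
if (!DeleteVolumeMountPointW(L"X:\\"))   // "X:\\" is a placeholder
{
    // Handle error (requires administrative rights)
}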
I just solved this problem by waiting several seconds for the drive to become available and then directly issuing a format action. See my answer here.
Rufus has an interesting workaround: it installs a window event hook that detects the "do you want to format this drive?" prompts and immediately closes them. See source code here.
It then goes on to arrange to mount only the partitions it cares about, but that's orthogonal.
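The general shape of that trick, as a sketch rather than Rufus' actual code (the title check below is a placeholder; the real prompt's window class and text would have to be matched), looks roughly like this:
#include <windows.h>
#include <wchar.h>

// Callback fired whenever a window comes to the foreground; if it looks like the
// "do you want to format" prompt, ask it to close.
static VOID CALLBACK OnForeground(HWINEVENTHOOK hHook, DWORD event, HWND hwnd,
                                  LONG idObject, LONG idChild,
                                  DWORD idEventThread, DWORD dwmsEventTime)
{
    wchar_t title[256] = L"";
    GetWindowTextW(hwnd, title, 256);
    if (wcsstr(title, L"format") != NULL)   // placeholder match for the prompt
        PostMessageW(hwnd, WM_CLOSE, 0, 0);
}

// Install for the duration of the partitioning/formatting work; the calling thread
// must pump messages for an out-of-context hook to deliver events.
HWINEVENTHOOK hHook = SetWinEventHook(EVENT_SYSTEM_FOREGROUND, EVENT_SYSTEM_FOREGROUND,
                                      NULL, OnForeground, 0, 0, WINEVENT_OUTOFCONTEXT);
// ... create and format the partitions ...
// UnhookWinEvent(hHook);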

Remote thread is failing on call to LoadLibrary with error 87

I am trying to create a remote thread that will load a DLL I wrote and run a function from it.
The DLL is working fine (checked), but for some reason the remote thread fails and the process in which it was created stops responding.
I used OllyDbg to try to see what is going wrong, and I noticed two things:
My strings (DLL name and function name) are passed to the remote thread correctly.
The thread fails on LoadLibrary with last-error code 87, "ERROR_INVALID_PARAMETER".
My best guess is that somehow the remote thread can't find LoadLibrary (is this because the linking was done with respect to my process? Just a guess...).
What am I doing wrong?
This is the code to the remote function:
static DWORD WINAPI SetRemoteHook (DATA *data)
{
HINSTANCE dll;
HHOOK WINAPI hook;
HOOK_PROC hookAdress;
dll = LoadLibrary(data->dll);
hookAdress = (HOOK_PROC) GetProcAddress(dll,data->func);
if (hookAdress != NULL)
{
(hookAdress)();
}
return 1;
}
Edit:
This is the part in which I allocate the memory in the remote process:
typedef struct
{
char* dll;
char* func;
} DATA;
char* dllName = "C:\\Windows\\System32\\cptnhook.dll";
char* funcName = "SetHook";
char* targetPrgm = "mspaint.exe";
DATA lData;
lData.dll = (char*) VirtualAllocEx( explorer, 0, sizeof(char)*strlen(dllName), MEM_COMMIT, PAGE_READWRITE );
lData.func = (char*) VirtualAllocEx( explorer, 0, sizeof(char)*strlen(funcName), MEM_COMMIT, PAGE_READWRITE );
WriteProcessMemory( explorer, lData.func, funcName, sizeof(char)*strlen(funcName), &v );
WriteProcessMemory( explorer, lData.dll, dllName, sizeof(char)*strlen(dllName), &v );
rDataP = (DATA*) VirtualAllocEx( explorer, 0, sizeof(DATA), MEM_COMMIT, PAGE_READWRITE );
WriteProcessMemory( explorer, rDataP, &lData, sizeof(DATA), NULL );
Edit:
It looks like the problem is that the remote thread is calling a "garbage" address
instead of the LoadLibrary base address. Is there a possibility that Visual Studio
linked the remote process's LoadLibrary address wrong?
Edit:
When I try to run the same exact code as a local thread (I pass a handle to the current process to CreateRemoteThread), the entire thing works just fine. What can cause this?
Should I add the code of the calling function? It seems to be doing its job, as
the code is being executed in the remote thread with the correct parameters...
The code is compiled under VS2010.
data is a simple struct with char*'s to the names. (Explicitly writing the strings in code would lead to pointers into my original process.)
What am I doing wrong?
Failing with ERROR_INVALID_PARAMETER indicates that there is a problem with the parameters passed.
So one should look at data->dll, which is the only parameter in question.
It is initialised here:
lData.dll = VirtualAllocEx(explorer, 0, sizeof(char) * (strlen(dllName) + 1), MEM_COMMIT, PAGE_READWRITE);
So let's add a check of whether the allocation of the memory whose address should be stored into lData.dll really succeeded.
if (!lData.dll) {
// do some error logging/handling/whatsoever
}
Having done so, you might have detected that the call as implemented failed because (verbatim from MSDN for VirtualAllocEx()):
The function fails if you attempt to commit a page that has not been
reserved. The resulting error code is ERROR_INVALID_ADDRESS.
So you might like to modify the fourth parameter of the call in question as recommended (again verbatim from MSDN):
To reserve and commit pages in one step, call VirtualAllocEx with
MEM_COMMIT | MEM_RESERVE.
PS: Repeat this exercise for the call to allocate lData.func. ;-)
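Putting both points together, the allocation part might look roughly like this (a sketch using the names from the question):
// Sketch: reserve and commit in one step, and leave room for the terminating NUL
// so the strings read by the remote thread are properly terminated.
SIZE_T dllLen  = strlen(dllName)  + 1;
SIZE_T funcLen = strlen(funcName) + 1;

lData.dll  = (char*)VirtualAllocEx(explorer, NULL, dllLen,
                                   MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
lData.func = (char*)VirtualAllocEx(explorer, NULL, funcLen,
                                   MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
if (lData.dll == NULL || lData.func == NULL)
{
    // Handle the allocation failure (GetLastError() tells you why)
}

WriteProcessMemory(explorer, lData.dll,  dllName,  dllLen,  NULL);
WriteProcessMemory(explorer, lData.func, funcName, funcLen, NULL);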
It's possible that LoadLibrary is actually aliasing LoadLibraryW (depending on project settings), which is the Unicode version. Whenever you use the Windows API with "char" strings instead of "TCHAR", you should explicitly use ANSI version names. This will prevent debugging hassles when the code is written, and also in the future for you or somebody else in case the project ever flips to Unicode.
So, in addition to fixing that horrible unterminated string problem, make sure to use:
LoadLibraryA(data->dll);

First and last window don't show up

I'm creating a WinApi application for my programming course. The program is supposed to show an LED clock using a separate window for each 'block'. I have figured most of it out, except for one thing: when creating the two-dimensional array of windows, the first and last window never show up. Here's the piece of code from the InitInstance function:
for (int x = 0; x < 8; x++)
    for (int y = 0; y < 7; y++) {
        digitWnd[x][y] = CreateWindowEx((WS_EX_LAYERED | WS_EX_TRANSPARENT | WS_EX_NOACTIVATE | WS_EX_STATICEDGE),
                                        szWindowClass, szTitle, (WS_POPUP | WS_BORDER),
                                        NULL, NULL, NULL, NULL, dummyWnd, NULL, hInstance, NULL);
        ShowWindow(digitWnd[x][y], nCmdShow);
        UpdateWindow(digitWnd[x][y]);
    }
The same loop bounds are used every time I interact with the windows (set position and enable/disable). All the windows seem to be working fine, except for digitWnd[0][0] and digitWnd[7][6]... Any ideas as to what is happening?
Open Spy++ and check whether the missing windows are really missing or just overlapped by other windows. It's possible that you have some small error in the position calculation code that puts them behind another window or outside the screen.
To validate your creation mechanism I would check:
the array initialisation HWND digitWnd[8][7]
if the parent window dummyWnd is valid
the return value of CreateWindowEx() != NULL
Another point that comes to mind is that you create windows with dimension 0, that is, no width or height. So it would probably be a good idea to set the size within CreateWindowEx(...).
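For example (a sketch; the coordinates and sizes below are placeholders that would come from your own layout calculation):
// Sketch: pass an explicit position and size instead of NULL/0 for x, y, width, height.
digitWnd[x][y] = CreateWindowEx(WS_EX_LAYERED | WS_EX_TRANSPARENT | WS_EX_NOACTIVATE | WS_EX_STATICEDGE,
                                szWindowClass, szTitle, WS_POPUP | WS_BORDER,
                                10 + x * 20, 10 + y * 20,   // placeholder position
                                16, 16,                      // placeholder size
                                dummyWnd, NULL, hInstance, NULL);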
Is this your first call to ShowWindow()? If so, according to MSDN, "nCmdShow: [in] Specifies how the window is to be shown. This parameter is ignored the first time an application calls ShowWindow". This could mean that you can fix your program by simply calling ShowWindow() twice. Give it a try and see if it works. Other than that, you'll probably have to provide more of the code for us to look at.
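That is, something along these lines (a sketch based on the loop from the question):
// Sketch: the first ShowWindow() call may ignore nCmdShow, so call it a second
// time to make sure the window is actually shown.
ShowWindow(digitWnd[x][y], nCmdShow);
ShowWindow(digitWnd[x][y], SW_SHOW);
UpdateWindow(digitWnd[x][y]);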