ReadFile, COM and NULL characters in C++

I have a problem with the ReadFile function on a virtual serial port:
char tmp[128];
int multiplo = 0;
DWORD err;
COMSTAT stt;
ClearCommError(hcom, &err, &stt);
do {
    if (ReadFile(hcom, tmp, stt.cbInQue, &err, NULL)) {
        tmp[err] = '\0';
        memcpy(bfIn + multiplo, tmp, err);
        multiplo = multiplo + err;
    } else
        return 0;
} while (err > 0);
This code works when ReadFile receives ordinary bytes like 0x01, 0x02, 0x03..., but there is a problem with 0x00: the code doesn't read it the way I expected. I tried with HyperTerminal and that works perfectly.
I've set this in the DCB structure:
    dcb.fNull = false;
but I still have the same problem. Any help?

The problem seems to be not in ReadFile() but rather in your use of tmp[], since the terminating '\0' happens to be 0x00 too.
What do you mean by "doesn't read like I expected"? Can you describe the symptoms in more detail?
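If the data can legitimately contain 0x00, treat it as binary: rely on the byte count ReadFile reports instead of NUL-terminating the buffer and treating it as a string. A minimal sketch along those lines, reusing the question's variables (bytesRead is an illustrative name):

    char tmp[128];
    DWORD bytesRead = 0;
    // Cap the read at the buffer size; 0x00 is a perfectly valid payload byte.
    if (ReadFile(hcom, tmp, sizeof(tmp), &bytesRead, NULL) && bytesRead > 0) {
        memcpy(bfIn + multiplo, tmp, bytesRead);  // copy by count, never by strlen()
        multiplo += bytesRead;
    }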

Related

How to pass ESC/POS commands to WriteFile FileAPI method?

I have some trouble with ESC/POS commands. The code below prints regular text fine, but when it comes to ESC/POS commands the printer does nothing. I have tried many different ways of passing the data (see cData1 and cData2). Can anyone point out the correct way to pass the command? Thanks, Oliver.
HANDLE CreateFileResult;
BOOL bResult;
PSP_INTERFACE_DEVICE_DETAIL_DATA dummy = GetDevices();
if (dummy != 0)
{
    CreateFileResult = CreateFile(dummy->DevicePath,
                                  GENERIC_WRITE | GENERIC_READ,
                                  FILE_SHARE_WRITE | FILE_SHARE_READ,
                                  NULL,
                                  OPEN_ALWAYS,
                                  FILE_ATTRIBUTE_NORMAL | FILE_FLAG_OVERLAPPED,
                                  NULL);
    if (INVALID_HANDLE_VALUE == CreateFileResult)
    {
        std::cout << "Handle Failed " << std::endl;
    }
    else
    {
        char cData1[] = { (CHAR)27, (CHAR)112, CHAR(0), CHAR(255), CHAR(0) };
        char cData2[] = { 0x1b, 0x70, 0x00, 0xFF, 0x00 };
        DWORD dwBytesToWrite = (DWORD)strlen(cData2);
        DWORD bytesWritten;
        OVERLAPPED osWrite = { 0, 0, 0 };
        WriteFile(CreateFileResult, cData2, dwBytesToWrite, &bytesWritten, &osWrite);
    }
}
So after hours of debugging I found out there are two problems with my code:
1) For some reason the ESC/POS command that works with C# + the Windows.Devices.USB namespace doesn't work with the current WriteFile approach. My guess is that Windows.Devices.USB does a lot more behind the scenes and somehow manages to make it work. The command that works with plain WriteFile is 0x1B, 0x70, 0x00, 0x98, 0x00.
2) As written by @JohnnyMopp and @MarkRanson, the way I was measuring the buffer size was wrong: strlen() stops at the first 0x00 byte, so only the first two bytes of the command were ever written. Thank you very much for helping out.
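For reference, a sketch of the fix for point 2: take the array size with sizeof instead of strlen(), so the embedded 0x00 bytes are counted (BYTE is used here to avoid the signed-char narrowing on 0xFF):

    BYTE cData2[] = { 0x1b, 0x70, 0x00, 0xFF, 0x00 };
    DWORD dwBytesToWrite = (DWORD)sizeof(cData2);  // 5 bytes; strlen() would report 2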

Writing wstring to a pipe process with WriteFile

I have an application where a user can write to the Windows shell from a Linux machine and then receive the output. But when the user tries to access a directory whose name contains accented characters (such as "TÉST_DIR" or "ÇOME_DIR"), it fails. The only way I found to read accented characters was to convert the user's message to a wstring; however, I'm having problems passing this conversion to the pipe input with WriteFile. For some reason it does not work; I'm probably doing something wrong.
Another method I found was setting the character encoding of the Linux-side terminal to cp-1252 and using SetConsoleCP / SetConsoleOutputCP with the same encoding on the Windows side. However, that is a solution I do not want to use, because the application should eventually be multi-platform.
Below are the functions I use to receive client messages and write to the pipe input, and the function that converts a string to a wstring.
Any solution to my problem?
/*---------------------------------------------------------------------*/
/*--- s2ws - Convert string to wstring                               ---*/
/*---------------------------------------------------------------------*/
std::wstring Gear::s2ws(const std::string& str)
{
    using convert_typeX = std::codecvt_utf8<wchar_t>;
    std::wstring_convert<convert_typeX, wchar_t> converterX;
    return converterX.from_bytes(str);
}
/*----------------------------------------------------------------------------------------*/
/*--- ClientSocketToShell - Read client response then write to pipe process            ---*/
/*----------------------------------------------------------------------------------------*/
DWORD WINAPI Gear::ClientSocketToShell(LPVOID lpParameter)
{
    sThreadInfo *stI = (sThreadInfo*)lpParameter;
    BYTE buffer[4096];
    fd_set rds;
    DWORD BytesWritten;
    while (!stI->Stop) {
        FD_ZERO(&rds);
        FD_SET(stI->sockfd, &rds);
        timeval timeout;
        timeout.tv_sec = 1;
        timeout.tv_usec = 0;
        int ret = select(0, &rds, NULL, NULL, &timeout);
        if (ret < 0)
            break;
        if (ret > 0) {
            ret = SSL_read(stI->ssl, (char *)buffer, sizeof(buffer) - 1);
            if (ret <= 0)
                break;
            if (!WriteFile(stI->hStdIn, buffer, ret, &BytesWritten, NULL))
                break;
        }
    }
    return 0;
}

Win32: changing a child's stdout (pipe) to binary mode

Hello to this great community,
I have a problem with the automatic conversion of '\n' (0x0A) to "\r\n" (0x0D 0x0A) when using a pipe to redirect a child's stdout to a file; the child's output is bytes, not text.
First, I used these examples: MSDN - Creating a Child Process with Redirected Input and Output, and http://support.microsoft.com/kb/190351. Now I have this basic application; it creates a pipe and redirects the child's STDOUT to a binary file. All this is in a Win32 Console Application in Visual C++ 6.0 (yes, it is old, but it is a requirement).
#define BUFSIZE 256

HANDLE g_hChildStd_OUT_Rd = NULL;
HANDLE g_hChildStd_OUT_Wr = NULL;

int _tmain(int argc, TCHAR *argv[])
{
    SECURITY_ATTRIBUTES saAttr;
    saAttr.nLength = sizeof(SECURITY_ATTRIBUTES);
    saAttr.bInheritHandle = TRUE;
    saAttr.lpSecurityDescriptor = NULL;
    if ( ! CreatePipe(&g_hChildStd_OUT_Rd, &g_hChildStd_OUT_Wr, &saAttr, 0) )
        ErrorExit(TEXT("StdoutRd CreatePipe"));
    if ( ! SetHandleInformation(g_hChildStd_OUT_Rd, HANDLE_FLAG_INHERIT, 0) )
        ErrorExit(TEXT("Stdout SetHandleInformation"));
    CreateChildProcess();
    if (!CloseHandle(g_hChildStd_OUT_Wr))
        ErrorExit("CloseHandle");
    ReadFromPipe();
    if (!CloseHandle(g_hChildStd_OUT_Rd))
        ErrorExit("CloseHandle");
    return 0;
}
void CreateChildProcess()
{
    TCHAR szCmdline[] = TEXT("child.exe");
    PROCESS_INFORMATION piProcInfo;
    STARTUPINFO siStartInfo;
    BOOL bSuccess = FALSE;
    ZeroMemory( &piProcInfo, sizeof(PROCESS_INFORMATION) );
    ZeroMemory( &siStartInfo, sizeof(STARTUPINFO) );
    siStartInfo.cb = sizeof(STARTUPINFO);
    siStartInfo.hStdOutput = g_hChildStd_OUT_Wr;
    siStartInfo.dwFlags |= STARTF_USESTDHANDLES;
    bSuccess = CreateProcess(NULL,
                             szCmdline,     // command line
                             NULL,          // process security attributes
                             NULL,          // primary thread security attributes
                             TRUE,          // handles are inherited
                             0,             // creation flags
                             NULL,          // use parent's environment
                             NULL,          // use parent's current directory
                             &siStartInfo,  // STARTUPINFO pointer
                             &piProcInfo);  // receives PROCESS_INFORMATION
    if ( ! bSuccess )
        ErrorExit(TEXT("CreateProcess"));
    else
    {
        CloseHandle(piProcInfo.hProcess);
        CloseHandle(piProcInfo.hThread);
    }
}
void ReadFromPipe(void)
{
    DWORD dwRead, dwWritten;
    CHAR chBuf[BUFSIZE];
    BOOL bSuccess = FALSE;
    HANDLE hParentStdOut = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD nTotalBytesRead = 0;
    fstream filePk;
    filePk.open("result.out", ios::out | ios::trunc | ios::binary);
    for (;;)
    {
        bSuccess = ReadFile( g_hChildStd_OUT_Rd, chBuf, BUFSIZE, &dwRead, NULL);
        if ( ! bSuccess || dwRead == 0 ) {
            if (GetLastError() == ERROR_BROKEN_PIPE)
                break; // pipe done - normal exit path.
            else
                ErrorExit("ReadFile"); // Something bad happened.
        }
        filePk.write(chBuf, dwRead);
        nTotalBytesRead += dwRead;
    }
    filePk.close();
    char ibuff[24];
    sprintf(ibuff, "%d bytes.", (int)nTotalBytesRead);
    ::MessageBox(NULL, ibuff, "", 0);
}
And in this dummy child.cpp you'll notice that if I set stdout to binary mode, everything works just fine (I get just 0x0A 0x0A!). But my real child is an EXE, and I don't have access to that code.
#include <stdio.h>
#include <io.h>
#include <fcntl.h>

int main(int argc, char* argv[])
{
    _setmode( _fileno( stdout ), _O_BINARY );
    printf("\n");
    unsigned char buffer[] = {'\n'};
    fwrite(buffer, sizeof(unsigned char), sizeof(buffer), stdout);
    return 0;
}
So, after searching for about 2 days, and considering that I have basic C++ knowledge, I ask: is there a way I could do _setmode on the child's stdout from the parent, considering that I don't have access to the child's code?
As a solution, I am seriously considering finding every 0x0D 0x0A and replacing it with 0x0A. I am really going crazy with this problem, so if someone could help me I would be very grateful.
Related question: Win32 Stream Handles - Changing To Binary Mode (but he has access to the child's code!)
Edit
As @librik points out, the final solution will have to replace every single occurrence of 0x0D 0x0A with 0x0A. For this to work, the whole file contents must be in memory. There are certain issues, but I can live with them (excess memory allocated). I hope this will be helpful:
void ReadFromPipe(void)
{
    DWORD dwRead, dwWritten;
    CHAR *chBuf = NULL, *chBufTmp = NULL;
    BOOL bSuccess = FALSE;
    HANDLE hParentStdOut = GetStdHandle(STD_OUTPUT_HANDLE);
    DWORD nTotalBytesRead = 0;
    fstream filePk;
    filePk.open("result.out", ios::out | ios::trunc | ios::binary);
    int nIter = 0;
    for (;;)
    {
        if (chBuf == NULL) {
            if ((chBuf = (CHAR*)malloc(BUFSIZE*sizeof(CHAR))) == NULL) {
                ErrorExit("Malloc");
            }
        } else {
            chBufTmp = chBuf; // save pointer in case realloc fails
            if ((chBuf = (CHAR*)realloc(chBuf, (nIter+1)*(BUFSIZE*sizeof(CHAR)))) == NULL) {
                free(chBufTmp); // free original block
                ErrorExit("Realloc");
            }
        }
        CHAR* chBufNew = chBuf + nTotalBytesRead;
        bSuccess = ReadFile(g_hChildStd_OUT_Rd, chBufNew, BUFSIZE, &dwRead, NULL);
        if ( ! bSuccess || dwRead == 0 ) {
            if (GetLastError() == ERROR_BROKEN_PIPE) {
                break; // pipe done - normal exit path.
            } else {
                ErrorExit("ReadFile"); // Something bad happened.
            }
        }
        nTotalBytesRead += dwRead;
        nIter++;
    }
    // 0xD 0xA -> 0xA
    nTotalBytesRead = ClearBuffer(chBuf, nTotalBytesRead);
    filePk.write(chBuf, nTotalBytesRead);
    filePk.close();
    free(chBuf);
    char ibuff[24];
    sprintf(ibuff, "%d bytes.", (int)nTotalBytesRead);
    ::MessageBox(NULL, ibuff, "", 0);
}
int ClearBuffer(char *buffer, int bufferlength) {
    // lmiguelhm: the ENTIRE buffer is required to be in memory
    int chdel = 0;
    for (int i = 0; (i+chdel) < bufferlength; i++) {
        char firstChar = buffer[i+chdel];
        buffer[i] = firstChar;
        if (firstChar == 0x0D) {
            if ((i+chdel+1) < bufferlength) {
                char secondChar = buffer[i+chdel+1];
                if (secondChar == 0x0A) {
                    buffer[i] = secondChar;
                    chdel++;
                }
            }
        }
    }
    return bufferlength - chdel;
}
Your problem is that the "stream mode" isn't part of Windows, so it isn't something you can change from outside the other program. It's part of the C and C++ system, and so it's a private part of each separate C or C++ program you run.
There's a library of functions that is combined with every program compiled in C++, called the "C++ Standard Library." The C++ Standard Library holds all the functions for streams like stdout. It's inside the other program's C++ Standard Library that the 0x0A is being converted to 0x0D 0x0A before it's written to the stream. _setmode is a function inside the C++ Standard Library which turns on and off that conversion, so when you add a call to it inside child.cpp, that tells child.cpp's C++ Standard Library to leave stdout alone. But you have no way to force the other program to call its _setmode function.
So the best thing really is the "crazy" solution you suggested:
As a solution, I am seriously considering finding every '0x0D' '0x0A'
and replacing it with '0x0A'.
So long as you know that child.exe is writing in text mode, not binary mode, then every single occurrence of 0x0D 0x0A must have originally been a single 0x0A. (If the program tried to write the two bytes 0x0D 0x0A, it would come out as the three bytes 0x0D 0x0D 0x0A.) Therefore you are absolutely safe and correct to "fix" the output by converting back again.
I think the easiest approach is just to write result.out exactly like you're doing it now, but then translate 0x0D 0x0A to 0x0A at the end, creating a new file that is correct. There are little tool programs you can download that will do this sort of thing for you -- one of them is called dos2unix. That might be the easiest way, in fact -- just make the last step of your program run dos2unix < result.out.with_bad_newlines > result.out. If, for some reason, you can't do this, you could have your program change 0x0D 0x0A to 0x0A inside chBuf before you write it out, translating as you go. (But be careful when chBuf ends in 0x0D...)
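A sketch of that translate-as-you-go variant, including the chunk-boundary case where a read ends in 0x0D (illustrative code, not the poster's; it writes through the same fstream used above):

    // Streaming 0x0D 0x0A -> 0x0A translation; carries a trailing 0x0D
    // across ReadFile chunks so CR/LF pairs split between reads still collapse.
    bool pendingCR = false;
    void WriteTranslated(fstream& out, const char* buf, DWORD len)
    {
        for (DWORD i = 0; i < len; ++i) {
            char c = buf[i];
            if (pendingCR) {
                pendingCR = false;
                if (c != 0x0A)
                    out.put((char)0x0D);  // lone CR: keep it
            }
            if (c == 0x0D)
                pendingCR = true;         // hold it until we see the next byte
            else
                out.put(c);
        }
    }
    // After ERROR_BROKEN_PIPE, flush a final lone CR: if (pendingCR) out.put((char)0x0D);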
(There are certain techniques that can "inject" a little bit of code into another program that's under your control on Windows. They are a little dangerous, and a lot of trouble. If you're really unhappy with the translation idea, you can look up "DLL injection.")

Bad Data error at CryptDecrypt when using AES 256 (MS CryptoAPI)

I'm trying to decrypt, using Microsoft's CryptoAPI in C++, a short message encrypted using mcrypt_encrypt in PHP. The PHP line is:
    mcrypt_encrypt(MCRYPT_RIJNDAEL_256, $key, $msg, MCRYPT_MODE_CBC);
where $key and $msg are strings.
In C++ I have the key, and my decryption function looks like this:
bool decrypt( const unsigned char* input_buffer,
const size_t& input_size,
const unsigned char* key,
const size_t& key_size,
unsigned char* output_buffer,
DWORD* out_size)
{
Log(L"START init_crypto");
bool ret = false;
HCRYPTKEY hKey = NULL;
HCRYPTPROV hCryptProv = NULL;
DWORD dwDataLen = 0;
// Attempt to acquire a handle to the crypto provider for AES
if(0 == CryptAcquireContext(&hCryptProv, NULL, NULL, PROV_RSA_AES, CRYPT_VERIFYCONTEXT) ){//PROV_RSA_AES
Log(L"CryptAcquireContext failed with code %ld", GetLastError());
goto end;
}
// key creation based on
// http://mirror.leaseweb.com/NetBSD/NetBSD-release-5-0/src/dist/wpa/src/crypto/crypto_cryptoapi.c
struct {
BLOBHEADER hdr;
DWORD len;
BYTE key[32];
} key_blob;
key_blob.hdr.bType = PLAINTEXTKEYBLOB;
key_blob.hdr.bVersion = CUR_BLOB_VERSION;
key_blob.hdr.reserved = 0;
key_blob.hdr.aiKeyAlg = CALG_AES_256;
key_blob.len = 32;//key_size;
memset(key_blob.key, '\0', sizeof(key_blob.key));
assert(key_size <= sizeof(key_blob.key));
CopyMemory(key_blob.key, key, min(key_size, sizeof(key_blob.key)));
if (!CryptImportKey( hCryptProv,
(BYTE *)&key_blob,
sizeof(key_blob),
0,
CRYPT_EXPORTABLE,
&hKey)){
Log(L"Error in CryptImportKey 0x%08x \n", GetLastError());
goto free_context;
}
else{
//---------------------------
// Set Mode
DWORD dwMode = CRYPT_MODE_CBC;
if(!CryptSetKeyParam( hKey, KP_MODE, (BYTE*)&dwMode, 0 )){
// Handle error
Log(L"Cannot set the cbc mode for decryption 0x%08x \n", GetLastError());
goto free_key;
}
//----------------------------
// Set IV
DWORD dwBlockLen = 0;
DWORD dwDataLen = sizeof(dwBlockLen);
if (!CryptGetKeyParam(hKey, KP_BLOCKLEN, (BYTE *)&dwBlockLen, &dwDataLen, 0)){
// Handle error
Log(_USTR("Cannot get the block length 0x%08x \n"), GetLastError());
goto free_key;
}
dwBlockLen /= 8;
BYTE *pbTemp = NULL;
if (!(pbTemp = (BYTE *)LocalAlloc(LMEM_FIXED, dwBlockLen))){
// Handle error
Log(L"Cannot allcoate the IV block 0x%08x \n", GetLastError());
goto free_key;
}
memset(pbTemp, '\0', dwBlockLen);
if (!CryptSetKeyParam(hKey, KP_IV, pbTemp, 0)){
// Handle error
Log(L"Cannot set the IV block 0x%08x \n", GetLastError());
LocalFree(pbTemp);
goto free_key;
}
LocalFree(pbTemp);
}
CopyMemory(output_buffer, input_buffer, min(*out_size, input_size));
*out_size = input_size;
if (!CryptDecrypt(hKey, NULL, TRUE, 0, output_buffer, out_size)){
Log(L"CryptDecrypt failed with code %ld", GetLastError());
goto free_key;
}
else{
Log(L"Decryption...");
ret = true;
}
free_key:
if (hKey)
CryptDestroyKey( hKey );
free_context:
if (hCryptProv)
CryptReleaseContext(hCryptProv, 0);
end:
return ret;
}
I consistently get the "bad data" error at CryptDecrypt()... I may be missing something obvious - if so, please help.
EDIT - To isolate the cause of the problem I replaced the two lines before CryptDecrypt (the CopyMemory stuff) with the following code:
....
strcpy((char*)output_buffer, "stuff");
DWORD plain_txt_len = strlen((char*)output_buffer);
if (!CryptEncrypt(hKey, NULL, TRUE, 0, output_buffer, &plain_txt_len, *out_size)) {
    Log(L"CryptEncrypt failed with code 0x%08x", GetLastError());
    goto free_key;
}
...
and the CryptDecrypt is working, which makes me believe that the problem is the key and/or message transmission from PHP to C++. If you agree, can you give me a hint on how to make sure that the strings I use in PHP are the same as the ones in C++ (at the byte level)?
EDIT2 --
After I changed my strings to binary streams (using pack()) in PHP, and after I implemented the workaround(?) for AES vs. Rijndael from here: http://kix.in/2008/07/22/aes-256-using-php-mcrypt/, I finally have CryptDecrypt decrypting my PHP message... the problem is that it still fails, even though the output contains the decrypted text. Any ideas about why this could happen?
Try passing NULL instead of CRYPT_VERIFYCONTEXT when acquiring the context.
With block encryption algorithms such as AES you need to pad the data being encrypted up to a multiple of the block length. Using your code example, which already retrieves the block size, you can calculate the padding required for encryption:
    dwPadding = dwBlockLen - dwDataLen % dwBlockLen
Then append dwPadding characters (NUL bytes work fine) to the data being encrypted and you will no longer get "Bad Data" errors in decryption.
You can also find out the size of the required encryption buffer directly by making an additional call to "CryptEncrypt" with a NULL buffer before the actual encryption:
    dwBuffSize = dwDataLen
    CryptEncrypt(hKey, NULL, TRUE, 0, NULL, &dwBuffSize, 0)
    dwPadding = dwBuffSize - dwDataLen
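As actual C++ that might read (a sketch; hKey as in the question, pbPlain is an illustrative plaintext pointer, error handling omitted):

    DWORD dwBuffSize = dwDataLen;
    // Pass NULL as the data pointer: CryptEncrypt then just reports the
    // required ciphertext size for dwDataLen bytes of plaintext.
    CryptEncrypt(hKey, 0, TRUE, 0, NULL, &dwBuffSize, 0);
    BYTE* pbBuf = (BYTE*)malloc(dwBuffSize);
    memcpy(pbBuf, pbPlain, dwDataLen);
    DWORD dwLen = dwDataLen;
    // Real call: dwLen is in/out (plaintext size in, ciphertext size out),
    // and dwBuffSize tells CryptEncrypt how big pbBuf really is.
    CryptEncrypt(hKey, 0, TRUE, 0, pbBuf, &dwLen, dwBuffSize);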
Both methods are equivalent and produce the same desired result. The "CryptDecrypt" function already knows about padding and will return the actual size of the decryption buffer in one go so you will know exactly where your decrypted data ends.

How to get the vendor ID and product ID of a plugged-in USB device on Windows

I am using Qt on the Windows platform.
I want to get and display the vendor ID and product ID of a plugged-in USB device on my local system.
Below is my full source code to get the vendor ID and product ID from the USB device.
When I run my Qt application it does not throw any errors, so I plug the USB device into the system. But my print statement
    qDebug() << pDetData->DevicePath;
gives the result 0x4.
Do I have any implementation mistakes in my source code? If so, please guide me on what I am doing wrong.
Have I missed out any other functions?
Is it possible to get the vendor ID and product ID from the USB device based on my source code (my implementation of the code)?
Kindly find my source code below.
static GUID GUID_DEVINTERFACE_USB_DEVICE = { 0xA5DCBF10L, 0x6530, 0x11D2,
                                             { 0x90, 0x1F, 0x00, 0xC0, 0x4F, 0xB9, 0x51, 0xED } };

HANDLE hInfo = SetupDiGetClassDevs(&GUID_DEVINTERFACE_USB_DEVICE, NULL, NULL,
                                   DIGCF_PRESENT | DIGCF_INTERFACEDEVICE);
if ( hInfo == INVALID_HANDLE_VALUE )
{
    qDebug() << "invalid";
}
else
{
    qDebug() << "valid handle";
    SP_DEVINFO_DATA DeviceInfoData;
    DeviceInfoData.cbSize = sizeof(SP_DEVINFO_DATA);
    SP_INTERFACE_DEVICE_DATA Interface_Info;
    Interface_Info.cbSize = sizeof(Interface_Info);
    BYTE Buf[1024];
    DWORD i;
    DWORD InterfaceNumber = 0;
    PSP_DEVICE_INTERFACE_DETAIL_DATA pspdidd =
        (PSP_DEVICE_INTERFACE_DETAIL_DATA) Buf;
    for (i = 0; SetupDiEnumDeviceInfo(hInfo, i, &DeviceInfoData); i++)
    {
        DWORD DataT;
        LPTSTR buffer = NULL;
        DWORD buffersize = 0;
        while (!SetupDiGetDeviceRegistryProperty( hInfo,
                                                  &DeviceInfoData,
                                                  SPDRP_DEVICEDESC,
                                                  &DataT,
                                                  (PBYTE)buffer,
                                                  buffersize,
                                                  &buffersize))
        {
            if (GetLastError() == ERROR_INSUFFICIENT_BUFFER)
            {
                // Change the buffer size.
                if (buffer) LocalFree(buffer);
                buffer = (LPTSTR)LocalAlloc(LPTR, buffersize);
            }
            else
            {
                // Insert error handling here.
                break;
            }
            qDebug() << (TEXT("Device Number %i is: %s\n"), i, buffer);
            if (buffer) LocalFree(buffer);
            if ( GetLastError() != NO_ERROR
                 && GetLastError() != ERROR_NO_MORE_ITEMS )
            {
                // Insert error handling here.
                qDebug() << "return false";
            }
            InterfaceNumber = 0; // this just returns the first one, you can iterate on this
            if (SetupDiEnumDeviceInterfaces(hInfo,
                                            NULL,
                                            &GUID_DEVINTERFACE_USB_DEVICE,
                                            InterfaceNumber,
                                            &Interface_Info))
            {
                printf("Got interface");
                DWORD needed;
                pspdidd->cbSize = sizeof(*pspdidd);
                SP_DEVICE_INTERFACE_DETAIL_DATA *pDetData = NULL;
                DWORD dwDetDataSize = sizeof(SP_DEVICE_INTERFACE_DETAIL_DATA) + 256;
                SetupDiGetDeviceInterfaceDetail(hInfo,
                                                &Interface_Info, pDetData, dwDetDataSize, NULL,
                                                &DeviceInfoData);
                qDebug() << pDetData->DevicePath;
                //qDebug() << QString::fromWCharArray(pDetData->DevicePath);
            }
            else
            {
                printf("\nNo interface");
                //ErrorExit((LPTSTR) "SetupDiEnumDeviceInterfaces");
                if ( GetLastError() == ERROR_NO_MORE_ITEMS )
                    printf(", since there are no more items found.");
                else
                    printf(", unknown reason.");
            }
            // Cleanup
            SetupDiDestroyDeviceInfoList(hInfo);
            qDebug() << "return true";
        }
    }
}
--------------- Edited to add: -----------------
Hi... the application comes up and prints this:
    \\?\usb#vid_04f2&pid_0111#5&1ba5a77f&0&2#{a5dcbf10-6530-11d2-901f-00c04fb951ed}
Then it goes into the while loop again... and there it breaks out in the else statement...
Qt Code:
if (GetLastError() == ERROR_INSUFFICIENT_BUFFER) {
    // Change the buffer size.
    if (buffer) LocalFree(buffer);
    buffer = (LPTSTR)LocalAlloc(LPTR, buffersize);
} else {
    qDebug() << "Here it quits the application";
    // Insert error handling here. break;
}
Any ideas on this?
After this line:
SP_DEVICE_INTERFACE_DETAIL_DATA *pDetData = NULL;
Add this:
DWORD dwDetDataSize = sizeof (SP_DEVICE_INTERFACE_DETAIL_DATA) + 256;
pDetData = (_SP_DEVICE_INTERFACE_DETAIL_DATA_A*) malloc (dwDetDataSize);
pDetData->cbSize = sizeof (SP_DEVICE_INTERFACE_DETAIL_DATA);
After this line:
qDebug ()<<pDetData->DevicePath;
Add this:
free(pDetData);
But eventually you're going to have to read the docs for SetupDiGetDeviceInterfaceDetail(). Do it, there are lots of functions that work like this, with pointers to variable-size structs.
-------- Edited to add: --------
You're really going about this the wrong way. I see you're following the advice you got here, and it's taken you down the wrong path. idVendor and idProduct can only be found in the USB_DEVICE_DESCRIPTOR (MSDN).
It looks like you already know how to get the device handle (using CreateFile()). After that, you call WinUsb_Initialize() (MSDN). That gets you a WINUSB_INTERFACE_HANDLE.
Once you have that handle, you want to call WinUsb_GetDescriptor() (MSDN), with the DescriptorType set to URB_FUNCTION_GET_DESCRIPTOR_FROM_DEVICE. I can't test code now, but it will look something like this:
USB_DEVICE_DESCRIPTOR udd;
memset(&udd, 0, sizeof(udd));
ULONG LengthTransferred = 0;

WinUsb_GetDescriptor(
    winusb_interface_handle,  // returned by WinUsb_Initialize
    URB_FUNCTION_GET_DESCRIPTOR_FROM_DEVICE,
    0,      // not sure if we need this
    0x409,  // not sure if we need this
    &udd,
    sizeof(udd),
    &LengthTransferred);
After that, udd.idVendor and udd.idProduct should have what you want.
Microsoft used to supply sample code for all this in the DDK, and probably still does, but I don't have access to one.
---------- Edited to add: ----------
Daniel K writes that the code should really be:
USB_DEVICE_DESCRIPTOR udd;
memset(&udd, 0, sizeof(udd));
ULONG LengthTransferred = 0;

WinUsb_GetDescriptor(
    winusb_interface_handle,     // returned by WinUsb_Initialize
    USB_DEVICE_DESCRIPTOR_TYPE,  // Daniel K's suggestion
    0,
    0x409,  // asks for English
    &udd,
    sizeof(udd),
    &LengthTransferred);
See the comments for further details.
An alternative is to obtain the hardwareID which includes the VID and PID.
Call SetupDiGetDeviceRegistryProperty with SPDRP_HARDWAREID like so:
DWORD requiredLength = 0;
wchar_t *hardwareID;
// First call: get requiredLength
SetupDiGetDeviceRegistryProperty(deviceInfoList, &deviceInfoData, SPDRP_HARDWAREID,
                                 NULL, NULL, 0, &requiredLength);
hardwareID = (wchar_t*)(new char[requiredLength]());
// Second call: populate hardwareID
SetupDiGetDeviceRegistryProperty(deviceInfoList, &deviceInfoData, SPDRP_HARDWAREID,
                                 NULL, (PBYTE)hardwareID, requiredLength, NULL);
// Display the string
qDebug() << "hardwareID =" << QString::fromWCharArray(hardwareID);
This will give you a string like USB\ROOT_HUB20&VID1002&PID4396&REV0000 which you can parse.
*Note: not all devices will have a VID and PID, such as non-USB devices.
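A minimal parsing sketch for pulling the VID and PID out of such a string (illustrative only; note the separator varies, e.g. ordinary devices report USB\VID_04F2&PID_0111... with an underscore, while the root-hub example above has none):

    unsigned long vid = 0, pid = 0;
    const wchar_t* v = wcsstr(hardwareID, L"VID");
    const wchar_t* p = wcsstr(hardwareID, L"PID");
    if (v && p) {
        // Skip "VID_"/"PID_" or "VID"/"PID", then read the hex digits.
        vid = wcstoul(v + ((v[3] == L'_') ? 4 : 3), NULL, 16);
        pid = wcstoul(p + ((p[3] == L'_') ? 4 : 3), NULL, 16);
    }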
You are enumerating the device "interface". Interfaces do not have a VID or PID; device instances do. I am not sure whether you are enumerating the interfaces deliberately, to narrow down the devices you are interested in, or whether it's an error.
If you just enumerate the device instances, then you can call SetupDiGetDeviceProperty with DEVPKEY_Device_HardwareIds and then grep the resulting hardware id for the VID and PID.
If you are using the device interfaces on purpose, then you need to call SetupDiGetDeviceInterfaceDetail once with a NULL PSP_DEVICE_INTERFACE_DETAIL_DATA parameter and a valid RequiredSize pointer to get the size of the memory to allocate, allocate that memory, and then call the function again. In that call, the last parameter is an SP_DEVINFO_DATA structure which, once retrieved, you can use in the call to SetupDiGetDeviceProperty as mentioned above; a sketch of the pattern follows.
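A sketch of that two-call pattern (variable names follow the question's code; error handling trimmed):

    // Call 1: pass no output buffer, just learn the required size.
    DWORD requiredSize = 0;
    SP_DEVINFO_DATA devInfo = { sizeof(SP_DEVINFO_DATA) };
    SetupDiGetDeviceInterfaceDetail(hInfo, &Interface_Info,
                                    NULL, 0, &requiredSize, &devInfo);
    // Call 2: allocate that much and fetch the detail data for real.
    PSP_DEVICE_INTERFACE_DETAIL_DATA pDetail =
        (PSP_DEVICE_INTERFACE_DETAIL_DATA)malloc(requiredSize);
    pDetail->cbSize = sizeof(SP_DEVICE_INTERFACE_DETAIL_DATA);
    if (SetupDiGetDeviceInterfaceDetail(hInfo, &Interface_Info,
                                        pDetail, requiredSize, NULL, &devInfo)) {
        // devInfo now identifies the device instance; hand it to
        // SetupDiGetDeviceProperty with DEVPKEY_Device_HardwareIds as described.
    }
    free(pDetail);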