MessageBox printing extra Unicode characters from TCHAR buffer - C++

So I'm in the middle of getting a named pipe IPC project working so that my C# GUI can communicate with my C++ code. I should mention I'm a fair bit out of my depth on the C++ side at the moment, although for the most part I have it working.
I cannot for the life of me figure out how to convert the chReply buffer received by the C++ listener into a simple string and MessageBox it; I always get extra Unicode characters at the end. I have added what I think are the most important parts.
C# Pipe Write
byte[] bReply = Encoding.Unicode.GetBytes("#TEST 123 456");
uint cbBytesWritten;
uint cbReplyBytes = (uint)bReply.Length;
bool bResult = PipeNative.WriteFile(hPipe, bReply, cbReplyBytes, out cbBytesWritten, IntPtr.Zero);
C++ Pipe Read
// Project's Character Set: Unicode
// BUFFER_SIZE = 1024
TCHAR chRequest[BUFFER_SIZE];
DWORD cbBytesWritten, cbRequestBytes;
TCHAR chReply[BUFFER_SIZE];
DWORD cbBytesRead, cbReplyBytes;
cbReplyBytes = sizeof(TCHAR) * BUFFER_SIZE;
do
{
bResult = ReadFile(hPipe, chReply, cbReplyBytes, &cbBytesRead, NULL);
}
while(!bResult);
MessageBox(NULL, chReply, _T("GUI Request"), MB_OK);
If somebody could save me from drowning I would be extremely grateful.

You have a few problems. The first is that you read and discard data. The second is that you don't pay attention to where the received data actually ends in the buffer.
// Project's Character Set: Unicode
// BUFFER_SIZE = 1024
std::basic_string<TCHAR> result;
BOOL bResult;
do {
    TCHAR chReply[BUFFER_SIZE];
    DWORD cbBytesRead = 0;
    bResult = ReadFile(hPipe, chReply, sizeof(chReply), &cbBytesRead, NULL);
    // A partial read of a message-mode pipe fails with ERROR_MORE_DATA but still fills
    // the buffer, so append what was actually read instead of discarding it.
    if (bResult || GetLastError() == ERROR_MORE_DATA)
        result.insert(result.end(), chReply, chReply + cbBytesRead / sizeof(TCHAR));
    else
        break;  // a real error (e.g. broken pipe): stop reading
} while (!bResult);
MessageBox(NULL, result.c_str(), _T("GUI Request"), MB_OK);
Here we copy the received characters into a std::basic_string<TCHAR>. It automatically handles null termination and growth, and permits long messages to be passed.
We read them up to 1024 characters (BUFFER_SIZE) at a time.

VTT is right: the received data is not null-terminated. You need to zero chReply before each call to ReadFile (or explicitly null-terminate it at cbBytesRead characters) so that MessageBox stops at the end of what was actually received.
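A minimal sketch of that idea against the question's loop (the ZeroMemory call and the explicit terminator are illustrative additions, not part of the original code):
TCHAR chReply[BUFFER_SIZE];
DWORD cbBytesRead = 0;
BOOL bResult;
do
{
    ZeroMemory(chReply, sizeof(chReply));   // clear leftovers from any previous read
    bResult = ReadFile(hPipe, chReply, sizeof(chReply) - sizeof(TCHAR), &cbBytesRead, NULL);
}
while (!bResult);
chReply[cbBytesRead / sizeof(TCHAR)] = _T('\0');   // terminate at what was actually read
MessageBox(NULL, chReply, _T("GUI Request"), MB_OK);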

Related

Windows API ReadFile() skips one out of every two characters

My aim is to read all the text located in a file. For some reason, whenever I read from the file and print the result (drawText), the output seems to skip one character out of every two. HELLO becomes HLO and SCAVENGER becomes SAEGR.
This is with the Windows API. I wonder whether CreateFile() and ReadFile() are fine and whether it's something else causing the issue.
void init(HDC hdc)
{
HANDLE hFile;
LPCSTR fileName = "c:\\Users\\kanaa\\Desktop\\code\\HW2_StarterCode\\words.txt";
hFile = CreateFileA(fileName, GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
DWORD dwFileSize = GetFileSize(hFile, NULL);
DWORD dwBytesRead;
WCHAR* buffer = new WCHAR[dwFileSize / 2 + 1];
buffer[dwFileSize / 2] = 0;
bool read = ReadFile(hFile, buffer, dwFileSize, &dwBytesRead, NULL);
std::wstring wstr(buffer);
std::string str(wstr.begin(), wstr.end());
delete[] buffer;
CloseHandle(hFile);
if (read) parse(str, hdc);
}
void parse(std::string word, HDC hdc)
{
std::string to = word;
std::wstring wword = std::wstring(to.begin(), to.end());
const WCHAR* wcword = wword.c_str();
Graphics graphics(hdc);
drawText(&graphics, wcword);
}
The problem was the WCHAR buffer. Below are the corrections:
CHAR* buffer = new CHAR[dwFileSize + 1];
bool read = ReadFile(hFile, buffer, dwFileSize, &dwBytesRead, NULL);
buffer[dwBytesRead] = 0;
You are processing the file data using a wchar_t[] buffer. wchar_t is 2 bytes in size on Windows. So, in the statement:
std::string str(wstr.begin(), wstr.end());
You are iterating through the file data 2 bytes at a time, interpreting each byte pair as a single wchar_t that gets truncated to a 1-byte char, discarding the other byte. That is why your str ends up skipping every other character.
Process the file data using a char[] buffer instead. However, there are easier ways to read 7/8-bit file data into a std::string.
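For instance, here is a minimal sketch using standard C++ streams instead of the Win32 calls above (readAll is a hypothetical helper name):
#include <fstream>
#include <sstream>
#include <string>

// Read the whole file into a std::string of raw bytes; no wide characters involved.
std::string readAll(const char* path)
{
    std::ifstream in(path, std::ios::binary);
    std::ostringstream ss;
    ss << in.rdbuf();
    return ss.str();
}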
Lastly, in this statement:
std::wstring wword = std::wstring(to.begin(), to.end());
This is not the correct way to convert a std::string to a std::wstring. All you are doing is iterating through the chars converting each one as-is into a 2-byte wchar_t. Windows APIs expect wchar_t strings to be encoded in UTF-16, which your code is not converting to. You need to use MultiByteToWideChar(), std::wstring_convert, or other equivalent Unicode library call to perform that conversion. In which case, you first need to know the encoding of the source file in order to convert it to Unicode correctly.
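As a sketch of that last step, assuming the file turns out to be UTF-8 (swap in the correct code page once you know the real encoding; toWide is a hypothetical helper name):
#include <windows.h>
#include <string>

// Convert an 8-bit string in the given code page to UTF-16 for Windows APIs.
std::wstring toWide(const std::string& s, UINT codePage = CP_UTF8)
{
    if (s.empty()) return std::wstring();
    int len = MultiByteToWideChar(codePage, 0, s.data(), (int)s.size(), NULL, 0);
    std::wstring w(len, L'\0');
    MultiByteToWideChar(codePage, 0, s.data(), (int)s.size(), &w[0], len);
    return w;
}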

How to fix garbled text with using ReadFile?

I have a Win32 application that I'm making.
I use ReadFile to retrieve a text file that is written in Unicode,
and print it in an edit box.
const TCHAR FILE_DIRECTORY[] = TEXT("data/");
const TCHAR FILE_LIST[][MAX_LOADSTRING] = {
TEXT("fputs_fgets.h"), TEXT("fprintf_fscanf.h"),
TEXT("fprintfs_fscanfs.h"), TEXT("fread_fwrite.h"), TEXT("freads_fwrite.h") };
const int FILE_NAME_LENGTH = _tcslen(FILE_LIST[idx]);
const int FILE_DIRECTORY_LENGTH = _tcslen(FILE_DIRECTORY);
TCHAR* filePath = (TCHAR*)calloc(FILE_NAME_LENGTH + FILE_DIRECTORY_LENGTH + 1, sizeof(TCHAR));
_tcscpy_s(filePath, FILE_DIRECTORY_LENGTH + 1, FILE_DIRECTORY);
_tcscat_s(filePath, FILE_NAME_LENGTH + FILE_DIRECTORY_LENGTH + 1, FILE_LIST[idx]);
HANDLE file = CreateFile(filePath, GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
DWORD fileSize = GetFileSize(file, NULL);
DWORD dwRead;
if (editText != NULL)
free(editText);
editText = (TCHAR*)calloc(1, fileSize + 1);
ReadFile(file, editText, fileSize, &dwRead, NULL);
CloseHandle(file);
free(filePath);
However, there are some strange characters at the end of the output:
printf("y좌표(정수): %d\n", point.y);
}
fclose(file);
}ﴀ﷽ý
How can I fix it?
Thank you.
Assuming your file is UTF-16 and you are compiling with _UNICODE defined (assumptions justified by the fact that the rest of your text is read correctly), in this line:
editText = (TCHAR*)calloc(1, fileSize + 1);
you should actually use fileSize + sizeof(TCHAR) if you want to exploit the zeroing that calloc does to get a NUL-terminated string. As it is now, you have a wide string whose last character has only its low byte set to zero, so the rest of your code goes on reading garbage until it happens to find two adjacent zero bytes (suitably aligned).
Mind you, I'm extremely dubious about this code in general: if you use TCHAR, it means you want to compile both as ANSI (TCHAR == char) and as Unicode (TCHAR == wchar_t), and having that choice change how you interpret the bytes of external files is a questionable idea.
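A minimal sketch of that fix against the question's snippet (assuming a UTF-16 file and a Unicode build, so TCHAR is wchar_t):
// Allocate one extra TCHAR of zeroed padding so the buffer ends in a full NUL character,
// and/or terminate explicitly at the number of bytes actually read.
editText = (TCHAR*)calloc(fileSize + sizeof(TCHAR), 1);   // calloc zero-fills the whole block
ReadFile(file, editText, fileSize, &dwRead, NULL);
editText[dwRead / sizeof(TCHAR)] = _T('\0');
CloseHandle(file);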

the system command opens up a command prompt [duplicate]

I'm having a serious problem here. I need to execute a CMD command line via C++ without the console window displaying. Therefore I cannot use system(cmd), since the window will display.
I have tried WinExec(cmd, SW_HIDE), but this does not work either. CreateProcess is another one I tried, but that is for running programs or batch files.
I have ended up trying ShellExecute:
ShellExecute( NULL, "open",
"cmd.exe",
"ipconfig > myfile.txt",
"c:\projects\b",
SW_SHOWNORMAL
);
Can anyone see anything wrong with the above code? I have used SW_SHOWNORMAL until I know this works.
I really need some help with this. Nothing has come to light, and I have been trying for quite a while. Any advice anyone could give would be great :)
Redirecting the output to your own pipe is a tidier solution because it avoids creating the output file, but this works fine:
ShellExecute(0, "open", "cmd.exe", "/C ipconfig > out.txt", 0, SW_HIDE);
You don't see the cmd window and the output is redirected as expected.
Your code is probably failing (apart from the /C thing) because you specify the path as "c:\projects\b" rather than "c:\\projects\\b".
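For reference, a sketch of the corrected call with /C, escaped backslashes, and the window hidden (paths taken from the question; ShellExecuteA is used to match its narrow string literals):
ShellExecuteA(NULL, "open", "cmd.exe",
              "/C ipconfig > myfile.txt",
              "c:\\projects\\b",          // working directory, backslashes escaped
              SW_HIDE);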
Here is my implementation of a DosExec function that allows you to (silently) execute any DOS command and retrieve the generated output as a Unicode string.
// Convert an OEM string (8-bit) to a UTF-16 string (16-bit)
#define OEMtoUNICODE(str) CHARtoWCHAR(str, CP_OEMCP)
/* Convert a single/multi-byte string to a UTF-16 string (16-bit).
We take advantage of the MultiByteToWideChar function that allows to specify the charset of the input string.
*/
LPWSTR CHARtoWCHAR(LPSTR str, UINT codePage) {
size_t len = strlen(str) + 1;
int size_needed = MultiByteToWideChar(codePage, 0, str, len, NULL, 0);
LPWSTR wstr = (LPWSTR) LocalAlloc(LPTR, sizeof(WCHAR) * size_needed);
MultiByteToWideChar(codePage, 0, str, len, wstr, size_needed);
return wstr;
}
/* Execute a DOS command.
If the function succeeds, the return value is a non-NULL pointer to the output of the invoked command.
The command will produce an 8-bit character stream using the OEM code page.
As that charset depends on the OS configuration (e.g. CP437 [OEM-US/Latin-US], CP850 [OEM 850/Latin-1]),
the output is converted to a wide-char string with the OEMtoUNICODE function before being returned.
The resulting buffer is allocated with LocalAlloc.
It is the caller's responsibility to free the returned buffer when it is no longer needed,
using a single call to the LocalFree function.
*/
LPWSTR DosExec(LPWSTR command){
// Allocate 1 MB to store the output (the final buffer will be sized to the actual output).
// If the output exceeds that size, it will be truncated.
const SIZE_T RESULT_SIZE = sizeof(char)*1024*1024;
char* output = (char*) LocalAlloc(LPTR, RESULT_SIZE);
HANDLE readPipe, writePipe;
SECURITY_ATTRIBUTES security;
STARTUPINFOA start;
PROCESS_INFORMATION processInfo;
security.nLength = sizeof(SECURITY_ATTRIBUTES);
security.bInheritHandle = true;
security.lpSecurityDescriptor = NULL;
if ( CreatePipe(
&readPipe, // address of variable for read handle
&writePipe, // address of variable for write handle
&security, // pointer to security attributes
0 // number of bytes reserved for pipe
) ){
GetStartupInfoA(&start);
start.hStdOutput = writePipe;
start.hStdError = writePipe;
start.hStdInput = readPipe;
start.dwFlags = STARTF_USESTDHANDLES | STARTF_USESHOWWINDOW;
start.wShowWindow = SW_HIDE;
// We have to start the DOS app the same way cmd.exe does (using the current Win32 ANSI code-page).
// So, we use the "ANSI" version of CreateProcess, to be able to pass an LPSTR (single/multi-byte character string)
// instead of a LPWSTR (wide-character string) and we use the UNICODEtoANSI function to convert the given command
if (CreateProcessA(NULL, // pointer to name of executable module
UNICODEtoANSI(command), // pointer to command line string
&security, // pointer to process security attributes
&security, // pointer to thread security attributes
TRUE, // handle inheritance flag
NORMAL_PRIORITY_CLASS, // creation flags
NULL, // pointer to new environment block
NULL, // pointer to current directory name
&start, // pointer to STARTUPINFO
&processInfo // pointer to PROCESS_INFORMATION
)){
// wait for the child process to finish (WaitForSingleObject polls until it stops returning WAIT_TIMEOUT)
for(UINT state = WAIT_TIMEOUT; state == WAIT_TIMEOUT; state = WaitForSingleObject(processInfo.hProcess, 100) );
DWORD bytesRead = 0, count = 0;
const int BUFF_SIZE = 1024;
char* buffer = (char*) malloc(sizeof(char)*BUFF_SIZE+1);
strcpy(output, "");
do {
DWORD dwAvail = 0;
if (!PeekNamedPipe(readPipe, NULL, 0, NULL, &dwAvail, NULL)) {
// error, the child process might have ended
break;
}
if (!dwAvail) {
// no data available in the pipe
break;
}
ReadFile(readPipe, buffer, BUFF_SIZE, &bytesRead, NULL);
buffer[bytesRead] = '\0';
if ((count + bytesRead) >= RESULT_SIZE) break;   // leave room for the terminating NUL
strcat(output, buffer);
count += bytesRead;
} while (bytesRead >= BUFF_SIZE);
free(buffer);
}
}
CloseHandle(processInfo.hThread);
CloseHandle(processInfo.hProcess);
CloseHandle(writePipe);
CloseHandle(readPipe);
// convert result buffer to a wide-character string
LPWSTR result = OEMtoUNICODE(output);
LocalFree(output);
return result;
}
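A hypothetical usage of the sketch above (the command string is only an example):
wchar_t cmd[] = L"cmd.exe /C ipconfig";   // mutable buffer, since DosExec takes an LPWSTR
LPWSTR output = DosExec(cmd);
if (output != NULL) {
    MessageBoxW(NULL, output, L"ipconfig", MB_OK);
    LocalFree(output);                    // the result was allocated with LocalAlloc
}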
You should use CreateProcess on cmd.exe with the /C parameter to run the ipconfig command. The > redirection does not work on its own as a command-line argument; it is interpreted by cmd.exe, so otherwise you have to redirect stdout programmatically.
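A minimal sketch of that approach, assuming a Unicode build; it runs the command hidden and sends stdout/stderr straight to an inheritable file handle (the file name and command are only examples):
SECURITY_ATTRIBUTES sa = { sizeof(sa), NULL, TRUE };          // make the handle inheritable
HANDLE hOut = CreateFileW(L"out.txt", GENERIC_WRITE, FILE_SHARE_READ, &sa,
                          CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);

STARTUPINFOW si = { sizeof(si) };
si.dwFlags = STARTF_USESTDHANDLES | STARTF_USESHOWWINDOW;
si.wShowWindow = SW_HIDE;
si.hStdOutput = hOut;
si.hStdError = hOut;

PROCESS_INFORMATION pi = {};
wchar_t cmdLine[] = L"cmd.exe /C ipconfig";                   // mutable buffer for CreateProcessW
if (CreateProcessW(NULL, cmdLine, NULL, NULL, TRUE,
                   CREATE_NO_WINDOW, NULL, NULL, &si, &pi))
{
    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hThread);
    CloseHandle(pi.hProcess);
}
CloseHandle(hOut);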
I have a similar program [windows7 and 10 tested] on github
https://github.com/vlsireddy/remwin/tree/master/remwin
This is a server program which
listens on the interface named "Local Area Connection" in Windows on UDP port 5555 and receives UDP packets.
The content of each received UDP packet is executed on cmd.exe (note that cmd.exe is NOT closed after running the command, and the output of the executed command is fed back to the client program over the same UDP port).
In other words:
command received in UDP packet -> packet parsed -> executed on cmd.exe -> output sent back on the same port to the client program
This does not show a console window,
and nobody has to run the command on cmd.exe manually;
remwin.exe can run in the background as a thin server program.
To add to @Cédric Françoys' answer, I fixed a few things in his code for a Windows build:
Missing function definition:
To make the code compile, add the following function definition:
#define UNICODEtoANSI(str) WCHARtoCHAR(str, CP_OEMCP)
LPSTR WCHARtoCHAR(LPWSTR wstr, UINT codePage) {
int len = (int)wcslen(wstr) + 1;
int size_needed = WideCharToMultiByte(codePage, 0, wstr, len, NULL, 0, NULL, NULL);
LPSTR str = (LPSTR)LocalAlloc(LPTR, sizeof(CHAR) * size_needed);
WideCharToMultiByte(codePage, 0, wstr, len, str, size_needed, NULL, NULL);
return str;
}
Unsafe CRT string function calls:
To make the code compile, replace strcpy and strcat with the following calls (note that the destination size is RESULT_SIZE; sizeof(output) would only give the size of the pointer):
strcpy_s(output, RESULT_SIZE, "");
strcat_s(output, RESULT_SIZE, buffer);
Keep the null-termination in the do-while loop:
buffer[bytesRead] = '\0';
It is still required: strcat_s null-terminates the destination, but it expects its source argument to already be a null-terminated string, and ReadFile does not terminate the buffer for you.
You could use
string command = "start /B cmd /c " + myCommand;
system(command.c_str());
Hopefully this works for you

Win32 WriteFile: Access Violation and Error 1784

Two problems with the code below. To begin, I have been scouring this and various other forums for answers to my 1784 error code and everything I've tried has failed. Two of the threads I've checked on Stack Overflow are "WriteFile returning error 1784" and "BlockWrite I/O Error 1784"; I've checked some others on this forum but I can't remember exactly what they are right now.
I'm trying to save an array of structs to an empty binary file. The first problem is that I get an access violation if my size variable (the nNumberOfBytesToWrite parameter) is anything less than about 99,000 bytes. That number jumps around: for a while when I was testing, it would hit the access violation at 99,999 bytes but not at 100,000 bytes. What I eventually want to do is set the size to the size of the entire array. The original code to handle that is now commented out so I can test with various sizes.
The second thing that happens (if I don't get an access violation) is that WriteFile fails every time with error code 1784. As other threads on this topic have stated, this is defined on MSDN as ERROR_INVALID_USER_BUFFER, described as "The supplied user buffer is not valid for the requested operation." I've looked at MSDN's own example for opening files like this (http://msdn.microsoft.com/en-us/library/windows/desktop/bb540534%28v=vs.85%29.aspx) and have tried some variations based on their code, but nothing seems to work.
This problem is probably massively noob and I'm sure I'm overlooking something ridiculously simple, but if anyone has suggestions they'd be greatly appreciated.
case IDM_SAVE1:
{
HANDLE hFile = CreateFile("MineSave.mss", GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
int test_buffer[] = {1,2,3,4,5,6,7,8,9,10};
if(hFile != INVALID_HANDLE_VALUE)
{
BOOL bSuccess;
DWORD size = 100000; //DWORD size = (((sizeof(tile)) * tiles_total));
LPDWORD bytes_written = 0;
bSuccess = WriteFile(hFile, test_buffer, size, bytes_written, NULL);
if(bSuccess)
{
MessageBox(hwnd, "File saved successfully.", "Great Job!", MB_OK);
}
else
{
DWORD error = GetLastError();
MessageBox(hwnd, "Could not write to file.", "Error", MB_OK);
}
CloseHandle(hFile);
}
else
{
MessageBox(hwnd, "Could not create file.", "Error", MB_OK);
}
}
break;
Your buffer is the size of 10 ints, which is 40 bytes on Windows. You are trying to write 100,000 bytes from that buffer. That is undefined behaviour, a buffer overrun. Hence the access violation.
You must not pass a value greater than sizeof(test_buffer), i.e. 40, to the nNumberOfBytesToWrite parameter of WriteFile.
You'll need to write this file in a loop, writing 40 bytes at a time, until you have written as much as you need. Perhaps something like this:
BOOL bSuccess = TRUE;
DWORD bytesRemaining = 100000;
while (bSuccess && bytesRemaining>0)
{
DWORD bytesToWrite = (std::min)(static_cast<DWORD>(sizeof(test_buffer)), bytesRemaining); // needs <algorithm>; the extra parentheses dodge the min macro from windows.h
DWORD bytesWritten;
bSuccess = WriteFile(hFile, test_buffer, bytesToWrite, &bytesWritten, NULL);
bytesRemaining -= bytesToWrite;
}
if (!bSuccess)
{
//handle error;
}
Writing 40 bytes at a time is pretty slow. You'll find it more efficient to write a few KB with each call to WriteFile.
Note that you aren't allowed to pass NULL to the lpNumberOfBytesWritten parameter if you also pass NULL to lpOverlapped, as you do here. From the documentation:
lpNumberOfBytesWritten [out, optional]
......
This parameter can be NULL only when the lpOverlapped parameter is not NULL.
You must provide somewhere to receive the number of bytes written: either the lpNumberOfBytesWritten parameter must be non-NULL, or the lpOverlapped parameter must be non-NULL.
You are passing NULL for both, which is illegal and causes the access violation.
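Putting both fixes together for the original goal of writing the whole array at once, a rough sketch (the tiles array name is assumed here, inferred from the question's sizeof(tile) * tiles_total expression):
DWORD bytesWritten = 0;
DWORD size = (DWORD)(sizeof(tile) * tiles_total);                    // size of the actual data
BOOL bSuccess = WriteFile(hFile, tiles, size, &bytesWritten, NULL);  // non-NULL lpNumberOfBytesWritten
if (!bSuccess || bytesWritten != size)
{
    // inspect GetLastError() and report the failure
    MessageBox(hwnd, "Could not write to file.", "Error", MB_OK);
}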