I am trying to open a file and read its content using the Win32 API:
HANDLE hFileRead = CreateFileA(FilePath,
                               GENERIC_READ,
                               0,
                               NULL,
                               OPEN_EXISTING,
                               FILE_ATTRIBUTE_NORMAL,
                               NULL);

LARGE_INTEGER fileSize = { 0 };
BOOL bSizeOk = GetFileSizeEx(hFileRead, &fileSize);    // returns BOOL, not a byte count

PBYTE buffer = (PBYTE)HeapAlloc(GetProcessHeap(), 0, fileSize.QuadPart);

DWORD dwBytesRead = 0;
BOOL bReadOk = ReadFile(hFileRead,                     // ReadFile returns BOOL, not NTSTATUS
                        buffer,
                        (DWORD)fileSize.QuadPart,
                        &dwBytesRead,
                        NULL);
std::cout << buffer << "\n"; // <<< expect to print "asdasd" but prints "asdasd"+random chars (1 or more each run)
What I want is the file content (a .txt file in this case).
What I actually get is the content of the .txt file plus some extra random characters (different on each run).
I also tried printing the buffer element by element, and it seems to print more characters than the buffer's size (?).
What am I doing wrong?
std::cout << buffer expects buffer to be null-terminated, but it is not. You need to allocate space for the terminator, e.g.:
PBYTE buffer = (PBYTE)HeapAlloc(GetProcessHeap(), 0, fileSize.QuadPart + 1);
...
buffer[dwBytesRead] = 0;
Alternatively, you can use cout.write() instead; then you don't need a terminator (note the cast, since buffer is a PBYTE rather than a char*), e.g.:
std::cout.write(reinterpret_cast<const char*>(buffer), dwBytesRead);
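Putting the pieces together, here is a minimal sketch of the corrected sequence with basic error checks (the file path is a hypothetical placeholder; treat this as illustrative, not production code):

#include <windows.h>
#include <iostream>

int main()
{
    // Sketch: read a whole file and print it, allocating one extra byte
    // for the null terminator that operator<< expects.
    const char* FilePath = "test.txt";   // hypothetical path, for illustration only

    HANDLE hFileRead = CreateFileA(FilePath, GENERIC_READ, 0, NULL,
                                   OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFileRead == INVALID_HANDLE_VALUE)
        return 1;

    LARGE_INTEGER fileSize = { 0 };
    if (!GetFileSizeEx(hFileRead, &fileSize))
    {
        CloseHandle(hFileRead);
        return 1;
    }

    PBYTE buffer = (PBYTE)HeapAlloc(GetProcessHeap(), 0, fileSize.QuadPart + 1);
    DWORD dwBytesRead = 0;
    if (buffer == NULL ||
        !ReadFile(hFileRead, buffer, (DWORD)fileSize.QuadPart, &dwBytesRead, NULL))
    {
        if (buffer) HeapFree(GetProcessHeap(), 0, buffer);
        CloseHandle(hFileRead);
        return 1;
    }

    buffer[dwBytesRead] = 0;             // terminate before printing as a C string
    std::cout << buffer << "\n";

    HeapFree(GetProcessHeap(), 0, buffer);
    CloseHandle(hFileRead);
    return 0;
}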
So I'm trying to write a sequence of zeroes from a file offset to the end of the file. Here is my code:
HANDLE hFile = CreateFileA("hello.txt", GENERIC_READ | GENERIC_WRITE, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
if (hFile == INVALID_HANDLE_VALUE) return -1;

DWORD fileSize = GetFileSize(hFile, NULL);
DWORD offset = 0x13d4;
DWORD check = 0;
DWORD pos = SetFilePointer(hFile, offset, 0, FILE_BEGIN);

BYTE* zeroes = new BYTE[fileSize-offset];
ZeroMemory((PVOID)zeroes, fileSize-offset);

WriteFile(hFile, (PVOID)&zeroes, fileSize-offset, &check, NULL);
printf("Wrote %d bytes at %x\n", check, pos);

if (check < fileSize-offset)
{
    printf("[+] An error occurred while trying to patch the file.");
    return EXIT_FAILURE;
}

CloseHandle(hFile);
Now I checked that my fileSize is correct, the file offset (pos) is the same as offset, my file handle is valid, the number of bytes written stored in check is equal to the length of the zeroes buffer, and the last error is 0. However, when I inspect the file in a hex editor, no zeroes were added at the end.
Any ideas?
Thanks in advance
The line
WriteFile(hFile, (PVOID)&zeroes, fileSize-offset, &check, NULL);
is wrong. You are writing the bytes of the pointer variable zeroes itself, not the buffer it points to. A pointer is only 4 or 8 bytes, so this can also cause an out-of-range access if the file is large enough.
Remove the & before zeroes to have WriteFile write the contents of the buffer that zeroes points to:
WriteFile(hFile, (PVOID)zeroes, fileSize-offset, &check, NULL);
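For completeness, a sketch of the corrected write, reusing hFile, fileSize and offset from the question (error handling kept to a minimum):

BYTE* zeroes = new BYTE[fileSize - offset];
ZeroMemory(zeroes, fileSize - offset);

DWORD check = 0;
// Pass the buffer pointer itself, not its address.
if (!WriteFile(hFile, zeroes, fileSize - offset, &check, NULL))
{
    printf("WriteFile failed with error %lu\n", GetLastError());
}

delete[] zeroes;
CloseHandle(hFile);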
Half of the buffer used with ReadFile is corrupt. Regardless of the size of the buffer, half of it contains the same corrupted character. I have looked for anything that could be causing the read to stop early, etc. If I increase the size of the buffer, I see more of the file, so it is not failing on a particular part of the file.
Visual Studio 2019. Windows 10.
#define MAXBUFFERSIZE 1024

DWORD bufferSize = MAXBUFFERSIZE;
__int64 fileRemaining;
HANDLE hFile;
DWORD dwBytesRead = 0;
//OVERLAPPED ol = { 0 };
LARGE_INTEGER dwPosition;
TCHAR* buffer;

hFile = CreateFile(
    inputFilePath,          // file to open
    GENERIC_READ,           // open for reading
    FILE_SHARE_READ,        // share for reading
    NULL,                   // default security
    OPEN_EXISTING,          // existing file only
    FILE_ATTRIBUTE_NORMAL,  // normal file | FILE_FLAG_OVERLAPPED
    NULL);                  // no attr. template

if (hFile == INVALID_HANDLE_VALUE)
{
    DisplayErrorBox((LPWSTR)L"CreateFile");
    return 0;
}

LARGE_INTEGER size;
GetFileSizeEx(hFile, &size);
__int64 fileSize = (__int64)size.QuadPart;
double gigabytes = fileSize * 9.3132e-10;
sendToReportWindow(L"file size: %lld bytes (%.1f gigabytes)\n", fileSize, gigabytes);

if (fileSize > MAXBUFFERSIZE)
{
    buffer = new TCHAR[MAXBUFFERSIZE];
}
else
{
    buffer = new TCHAR[fileSize];
}

fileRemaining = fileSize;
sendToReportWindow(L"file remaining: %lld bytes\n", fileRemaining);

while (fileRemaining) // outer loop: while file remains, read a chunk into the buffer
{
    sendToReportWindow(L"fileRemaining: %lld\n", fileRemaining);

    if (bufferSize > fileRemaining) // as fileRemaining shrinks, it eventually becomes smaller than the buffer
        bufferSize = fileRemaining;

    if (FALSE == ReadFile(hFile, buffer, bufferSize, &dwBytesRead, NULL))
    {
        sendToReportWindow(L"file read failed\n");
        CloseHandle(hFile);
        return 0;
    }

    fileRemaining -= bufferSize;

    // bunch of commented out code (verified that it does not cause the corruption)
}
delete [] buffer;
Debugger html view (512 byte buffer)
Debugger html view (1024 byte buffer). This shows that file is probably not the source of the corruption.
Misc notes: I have been told that memory mapping the file does not provide an advantage since I am processing the file sequentially. Another advantage of this buffered approach is that when I detect particular, recurring tags in the WARC file I can skip ahead ~500 bytes and resume processing. This improves speed.
The reason is that you declare buffer as an array of TCHAR. In a Unicode build (which the L"..." strings suggest), TCHAR is wchar_t, which is 2 bytes, so new TCHAR[MAXBUFFERSIZE] actually allocates sizeof(TCHAR) * MAXBUFFERSIZE = 2048 bytes.
ReadFile counts in bytes and only fills the first bufferSize (1024) bytes of that allocation, so the other half of the buffer you see in the debugger is never written and looks "corrupted" (it is simply uninitialized). Use a BYTE or char buffer for raw file I/O instead.
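As a sketch only (not the asker's exact program), reading in fixed-size chunks with a plain BYTE buffer could look like this, so that the allocation size and the byte count passed to ReadFile agree:

#include <windows.h>

// Minimal sketch: read a file in 1024-byte chunks into a BYTE buffer.
bool ReadInByteChunks(const wchar_t* path)
{
    HANDLE hFile = CreateFileW(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                               OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
        return false;

    const DWORD chunkSize = 1024;
    BYTE buffer[chunkSize];              // chunkSize bytes, not chunkSize TCHARs

    DWORD bytesRead = 0;
    while (ReadFile(hFile, buffer, chunkSize, &bytesRead, NULL) && bytesRead > 0)
    {
        // ... process bytesRead bytes of buffer here ...
    }

    CloseHandle(hFile);
    return true;
}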
I am learning about cryptography in C++, so I started with a very simple XOR cipher. I want to open the file 0x1.txt and encrypt it with a 3-byte XOR key, create a new file named 0x2.txt and put the encrypted data into it, then decrypt that and put the result in 0x3.txt:
encrypt 0x1.txt --> put encrypted data in 0x2.txt; decrypt 0x2.txt --> put decrypted data in 0x3.txt
Here is my code.
Encrypt code:
LPVOID Crypt(HANDLE hFile, DWORD dwFileSize) {
    // allocate buffer for file contents
    LPVOID lpFileBytes = malloc(dwFileSize);

    // read the file into the buffer
    ReadFile(hFile, lpFileBytes, dwFileSize, NULL, NULL);

    // apply XOR encryption
    int i;
    char key[3] = {'*', '~', '#'};
    for (i = 0; i < dwFileSize; i++) {
        *((LPBYTE)lpFileBytes + i) ^= key[i % sizeof(key)];
    }
    return lpFileBytes;
}
Calling the function to encrypt the file:
HANDLE hFile = CreateFile("0x1.txt", GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
// get file size
DWORD dwFileSize = GetFileSize(hFile, NULL);
CloseHandle(hFile);
// crypt and get crypted bytes
LPVOID lpFileBytes = Crypt(hFile, dwFileSize);
Then put the encrypted data in 0x2.txt:
HANDLE hCryptedFile = CreateFile("0x2.txt", GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
// write to crypted file
WriteFile(hCryptedFile, lpFileBytes, dwFileSize, NULL, NULL);
Now I want to decrypt the content of the 0x2.txt file, so I made this:
HANDLE hFile = CreateFile("0x2.txt", GENERIC_READ, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
// get file size
DWORD dwFileSize = GetFileSize(hFile, NULL);
// decrypt and obtain decrypted bytes
LPVOID lpFileBytes = Crypt(hFile, dwFileSize);
CloseHandle(hFile);
Create file 0x3.txt:
HANDLE hTempFile = CreateFile("0x3.txt", GENERIC_WRITE, 0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
// write to temporary file
WriteFile(hTempFile, lpFileBytes, dwFileSize, NULL, NULL);
// clean up
CloseHandle(hTempFile);
free(lpFileBytes);
But the file gets encrypted further instead of decrypted! So what is the problem?
Here is my full code:
https://pastebin.com/6WZX5J1K
I think you should test your encrypt/decrypt first without all the file stuff.
Beware: totally untested code.
void Encrypt(std::string& txt) {
    // apply XOR encryption
    char key[3] = {'*', '~', '#'};
    for (size_t i = 0; i < txt.length(); i++) {
        txt[i] ^= key[i % sizeof(key)];
    }
}

bool Test() {
    std::string org { "123" };
    std::string encrypted = org;
    Encrypt(encrypted);
    std::string decrypted = encrypted;
    Encrypt(decrypted);
    // std::cout << org << " " << encrypted << " " << decrypted << std::endl;
    return org == decrypted;
}
If Test returns true, your encrypt/decrypt logic works and you can concentrate on the file handling; if not, you need to start debugging the cipher first.
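For example, a small driver for that test might look like this (a sketch only; wire it into your own project alongside the functions above):

#include <iostream>

bool Test();   // the round-trip test sketched above

int main()
{
    if (Test())
        std::cout << "XOR round trip OK, look at the file handling next\n";
    else
        std::cout << "XOR round trip failed, debug Encrypt() first\n";
    return 0;
}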
This function should read a string from a file and return it, but immediately after the call to ReadFile the program hits a breakpoint in debug_heap.cpp at line 985.
char* readFile()
{
    char curDirectory[MAX_PATH];
    GetCurrentDirectory(MAX_PATH, curDirectory);

    char filePath[MAX_PATH];
    char *name = "\\data.txt";
    sprintf_s(filePath, "%s%s", curDirectory, name);

    HANDLE hFile = CreateFile(filePath, GENERIC_ALL, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
    {
        DisplayError("Can't Create File");
        return NULL;
    }

    DWORD fileSize = GetFileSize(hFile, NULL);
    char *buffer = new char[fileSize / 2 + 1];

    DWORD bytesReaded;
    if (ReadFile(hFile, buffer, fileSize, &bytesReaded, NULL) == 0)
    {
        DisplayError("Can't read File");
        return NULL;
    }

    buffer[bytesReaded] = '\0';
    CloseHandle(hFile);
    return buffer;
}
This is because your code writes beyond the end of buffer. You allocate buffer like this:
char *buffer = new char[fileSize / 2 + 1];
But then you attempt to read fileSize bytes from the file. Your allocation should instead be:
char *buffer = new char[fileSize + 1];
Some other comments:
Your call to sprintf_s risks buffer overrun.
Since you code in C++, use std::string and have that class manage the buffers. You should do that for both filePath and buffer. That will allow you to avoid the leaks that your current code has; for instance, the failure return after ReadFile leaks memory. It also avoids placing the burden of deallocating the memory on the calling code (see the sketch after these comments).
You also leak the file handle if your code takes the failure return after ReadFile.
bytesReaded should be named bytesRead, to use the correct English word.
There is no real reason to believe that the executable file is located in the current working directory.
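A hedged sketch of that std::string approach (illustrative only; error reporting is left to the caller):

#include <windows.h>
#include <string>

// Illustrative sketch: read a whole file into a std::string so the buffer is
// sized correctly and freed automatically, even on the error paths.
std::string readFileToString(const std::string& filePath)
{
    HANDLE hFile = CreateFileA(filePath.c_str(), GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
        return std::string();             // caller can check for an empty result

    DWORD fileSize = GetFileSize(hFile, NULL);
    std::string contents(fileSize, '\0'); // one char per byte in the file

    DWORD bytesRead = 0;
    if (!ReadFile(hFile, &contents[0], fileSize, &bytesRead, NULL))
        contents.clear();
    else
        contents.resize(bytesRead);       // trim in case fewer bytes came back

    CloseHandle(hFile);
    return contents;
}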
HINTERNET hInternet, hFtpSession, hFile;
hInternet = InternetOpen(NULL, INTERNET_OPEN_TYPE_DIRECT, NULL, NULL, 0);
hFtpSession = InternetConnect(hInternet, FTPHOST, INTERNET_DEFAULT_FTP_PORT,
FTPUSER, FTPPASS, INTERNET_SERVICE_FTP, INTERNET_FLAG_PASSIVE, 0);
hFile = FtpOpenFile(hFtpSession, argv[1], GENERIC_READ, FTP_TRANSFER_TYPE_ASCII, 0);
DWORD rSize;
char tmp[2048];
string buffer;
while (InternetReadFile(hFile, tmp, 2048, &rSize) && rSize > 0)
{
    buffer += (string)tmp;
}
cout << buffer;
InternetCloseHandle(hFile);
InternetCloseHandle(hFtpSession);
InternetCloseHandle(hInternet);
I have this small program to read a text file from an FTP server into a string, but at the end of the string that is read there are some extra characters. I guess the problem is with the size of the text, but I can't figure it out.
For example, when I download an encrypted text, it ends up corrupted like this:
MIICIDANBgkqhkiG9w0BAQEFAAOCAg0AMIICCAKCAgEAp2q+92EQPncY0sN6SMTC0yh05GpZ
FUEGATvUx/zcUrzdDTva5JKz0MztuCn3lnHmaUB6L97w8fuVOhJjj90ItH4FdUk4R9m50son
DSZ4ad5ZKi7WE7GApIq21vgM0zoG5sr0Xb6X41IQgvYF7i9nX4zKO2znRyD3uzBqkqkhWzbS
HI2euCdhmXfx2az0ynNKrcnQINaWowipc0LrW0Q9PWI1McCs4V5sz8GkBMpKENb3m/LBlSqz
TboC/9hiD9Yfclvk3wFeNGvsnUUDpwZipF9cBMVzmfyjA1gBDNLV8qcTXSortHaGeHdLpqIg
Qn3SpDol8gPRis7A7Hy4KjRS8Y/iZa8Nv9EmEeful6u3IHY0Qror/wOeST5WhaTynVBT0wgP
6GSMWsofwA3NttsFCw55z5c8GBEGP6Uo+jP/rdiYvednT0iV8Grp+XJ6zMFqYlVcLqAzQWLw
dfqve/lr8+OKfR9WvG6hvrVduTnoy+LBFF/QEVxAlZqymlXMm/hcO/TUoE1Kmon6FwID4Mek
nV1eb1aCmUIzxFHtPkMO0KFitmxa5EGwAFHRAjXrp2lUHIQSaWwVnsfoQgmrG9ux2I27w+WR
8kFdkqWrutFz2xn6ovVwla7Oj0iL2f9azNO2Z2KT/sBPwGmI67M9Ceih0YLD0w7Woy32H2aM
mIeK368CARE=
8
The 8 shouldn't be there at the end.
The function InternetReadFile doesn't null terminate the buffer, so the content of the array tmp is not a string, yet you treat it as such.
The behavior is undefined.
Remove this line:
buffer += (string)tmp;
Instead use the overload of the string member function append, which takes an array and its size:
buffer.append( tmp , rSize );
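Applied to the question's loop, that would look roughly like this (a sketch, keeping the original variable names):

DWORD rSize;
char tmp[2048];
string buffer;

// Append exactly the number of bytes that InternetReadFile reported,
// so no null terminator is needed.
while (InternetReadFile(hFile, tmp, sizeof(tmp), &rSize) && rSize > 0)
{
    buffer.append(tmp, rSize);
}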
As 2501 said, you aren't taking the null terminator into account, so what you would want to do is something like this:
DWORD rSize;
char tmp[2048 + 1];
string buffer;

while (InternetReadFile(hFile, tmp, 2048, &rSize) && rSize > 0)
{
    tmp[rSize] = '\0';
    buffer += (string)tmp;
}