In the following code, if I comment out the call to "GetCurrentDirectory" everything works fine, but if I don't, the code after it breaks: no child windows show up, although the program doesn't crash. The compiler doesn't give any errors.
char *iniFilePath;
int lenWritten = GetCurrentDirectory( MAX_PATH, iniFilePath );
if( lenWritten )
{
lstrcat( iniFilePath, iniFileName.c_str() );
char *buffer;
GetPrivateProfileString( iniServerSectionName.c_str(), serverIp.c_str(), "", buffer, MAX_PATH, iniFilePath );// server ip
MessageBox( 0, buffer, 0, 0 );
}
else
{
MessageBox( 0,0,0,0 );
}
iniFilePath is an uninitialised pointer that GetCurrentDirectory() attempts to write through, causing undefined behaviour. GetCurrentDirectory() does not allocate a buffer for the caller; one must be provided.
Change to:
char iniFilePath[MAX_PATH]; // or similar.
Instead of using lstrcat(), which carries a "Warning: Do not use" notice on its reference page, construct the path with a std::string to avoid potential buffer overruns:
const std::string full_file_path(std::string(iniFilePath) + "/" + iniFileName);
Note the similar issue with buffer, as pointed out by Wimmel.
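Putting both fixes together, a minimal sketch of the corrected snippet might look like this (it reuses iniFileName, iniServerSectionName and serverIp from the question and assumes an ANSI build, since the original passes char* to MessageBox):
char iniFilePath[MAX_PATH] = {};   // caller-provided buffer for GetCurrentDirectory()
if( GetCurrentDirectory( MAX_PATH, iniFilePath ) )
{
    // build the full path with std::string instead of lstrcat()
    const std::string fullIniPath = std::string( iniFilePath ) + "\\" + iniFileName;

    char value[MAX_PATH] = {};     // caller-provided buffer for GetPrivateProfileString(), too
    GetPrivateProfileString( iniServerSectionName.c_str(), serverIp.c_str(), "",
                             value, MAX_PATH, fullIniPath.c_str() ); // server ip
    MessageBox( 0, value, 0, 0 );
}
else
{
    MessageBox( 0, 0, 0, 0 );
}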
I would do this in order to get the current directory -
int pathLength = GetCurrentDirectory(0, NULL);
std::vector<char> iniFilePath(pathLength);
GetCurrentDirectory(pathLength, iniFilePath.data());
Note, however, that this is not thread safe: the current directory could be changed from another thread between the two calls. As far as I know, few programs change the current directory, so it's unlikely to be an issue.
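Building on that, converting the result to a std::string makes it easy to append the file name safely; a rough sketch (iniFileName is the variable from the question):
int pathLength = GetCurrentDirectory( 0, NULL );       // required size, including the null terminator
std::vector<char> iniFilePath( pathLength );
GetCurrentDirectory( pathLength, iniFilePath.data() );
// the buffer is null-terminated, so the std::string constructor stops at the terminator
const std::string fullIniPath = std::string( iniFilePath.data() ) + "\\" + iniFileName;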
I have never worked with SHA256 before. Recently, I've been trying to implement a SHA256 checksum in order to see if a library has been tampered with.
The funny thing is that OpenSSL SHA256 generates a different sum for the exact same library depending on its location. If it's located in another folder, the sum is different.
Is there anything I could do to get the same sum no matter where the file is located? I provided the code snippets and the sums I get.
unsigned char* getsum( char* filename ) {
std::ifstream pFile( filename, std::ios::binary );
SHA256_CTX sContext;
char pBuffer[ 1024*16 ];
unsigned char pSum[SHA256_DIGEST_LENGTH];
SHA256_Init( &sContext );
while( pFile.good() ) {
pFile.read( pBuffer, sizeof(pBuffer) );
SHA256_Update( &sContext, pBuffer, pFile.gcount() );
}
SHA256_Final( pSum, &sContext );
return pSum;
}
...
char* cl_sum = new char[256];
sprintf( cl_sum, "%02x", getsum("library.dll") );
MessageBoxA( NULL, cl_sum , NULL, NULL );
delete[] cl_sum;
exit( -1 );
I also tried using the one-shot SHA256() function instead of the whole SHA256_CTX / SHA256_Init() / SHA256_Update() / SHA256_Final() sequence, but I still get the same result.
Your code has lots of basic mistakes:
sprintf( cl_sum, "%02x", getsum("library.dll") );
The format string says "print me an int value", but getsum returns a pointer! This is wrong, and if you enable compiler warnings the compiler will point out the issue.
Return value is wrong too:
unsigned char* getsum( char* filename ) {
...
unsigned char pSum[SHA256_DIGEST_LENGTH];
...
return pSum;
}
You are returning a pointer to a local object whose lifetime ends when the function returns.
Here is my tweak to your code:
using Sha256Hash = std::array<unsigned char, SHA256_DIGEST_LENGTH>;
Sha256Hash calcSha256Hash(std::istream& in)
{
SHA256_CTX sContext;
SHA256_Init(&sContext);
std::array<char, 1024> data;
while (in.good()) {
in.read(data.data(), data.size());
SHA256_Update(&sContext, data.data(), in.gcount());
}
Sha256Hash hash;
SHA256_Final(hash.data(), &sContext);
return hash;
}
https://godbolt.org/z/Y3nY13jao
Side note: I'm surprised that std::istream::read sets the fail bit when end of file is reached even though some data has been read. This is inconsistent with the behavior of the stream extraction operator.
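For completeness, a call site could look roughly like this; the hex formatting and the MessageBoxA output are my additions for illustration (std::snprintf needs <cstdio>), not part of the answer above:
std::ifstream file( "library.dll", std::ios::binary );
Sha256Hash hash = calcSha256Hash( file );

std::string hex;
for ( unsigned char byte : hash ) {
    char buf[3];
    std::snprintf( buf, sizeof(buf), "%02x", byte );   // two hex digits per byte
    hex += buf;
}
MessageBoxA( NULL, hex.c_str(), NULL, MB_OK );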
You are returning a pointer to the local variable pSum; that is undefined behaviour (UB).
An explanation for the differing outputs might be that, since the array is located on the stack, later calls to functions like sprintf or MessageBoxA overwrite it with their own variables. But anything can happen after UB.
Use and return std::vector<unsigned char> instead.
Also, do not use new; a std::array or another std::vector is much safer.
Lastly, I would strongly advise turning on compiler warnings; the compiler should warn about the above issue.
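A minimal sketch of that suggestion, keeping the same OpenSSL calls as the question but returning a std::vector<unsigned char>:
std::vector<unsigned char> getsum( const char* filename ) {
    std::ifstream file( filename, std::ios::binary );
    SHA256_CTX context;
    SHA256_Init( &context );
    std::vector<char> buffer( 1024 * 16 );
    while ( file.good() ) {
        file.read( buffer.data(), buffer.size() );
        SHA256_Update( &context, buffer.data(), file.gcount() );
    }
    std::vector<unsigned char> sum( SHA256_DIGEST_LENGTH );
    SHA256_Final( sum.data(), &context );
    return sum;   // the digest outlives the function, unlike the original pSum array
}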
Your code's output depends on what value getsum returns. The value getsum returns is an address on the stack. So your code's output depends on where the stack is located.
Somewhere in your code, you should save the result of the SHA256 operation in a buffer of some kind and you should output the contents of that buffer. Your code does neither of those two things.
If you think you save the output of the SHA256 operation in some kind of buffer, where do you think that buffer is allocated? The buffer that receives the digest is pSum, and that's allocated on the stack in getsum, so it no longer exists when getsum returns.
If you think you actually look in the buffer somewhere, where do you think that code is? Your sprintf call never looks in the buffer; it just outputs a pointer value in hex.
There are several problems with your code:
your function is returning a pointer to a local variable. When your function exits, the variable is destroyed, leaving the returned pointer dangling.
you are printing the value of the returned pointer itself (i.e., the memory address it holds), not the SHA data that it points at. That is why you are seeing the same value being displayed in the message box. You are seeing the memory address of the local variable, and multiple invocations of your function may reuse the same memory.
your reading loop is not making sure that read() is successful before calling SHA256_Update().
Try something more like this instead:
using Sha256Digest = std::array<unsigned char, SHA256_DIGEST_LENGTH>;
Sha256Digest getsum( const char* filename ) {
std::ifstream file( filename, std::ios::binary );
SHA256_CTX sContext;
char buffer[ 1024*16 ];
Sha256Digest sum;
SHA256_Init( &sContext );
while( file.read( buffer, sizeof(buffer) ) ) {
SHA256_Update( &sContext, buffer, file.gcount() );
}
// hash the partial block, if any, left behind by the final (failed) read
if( file.gcount() > 0 ) {
SHA256_Update( &sContext, buffer, file.gcount() );
}
SHA256_Final( sum.data(), &sContext );
return sum;
}
...
Sha256Digest sum = getsum("library.dll");
char cl_sum[256] = {}, *ptr = cl_sum;
for (auto ch : sum) {
ptr += sprintf( ptr, "%02x", ch );
}
MessageBoxA( NULL, cl_sum, NULL, NULL );
/* Alternatively:
std::ostringstream cl_sum;
for (auto ch : sum) {
cl_sum << std::hex << std::setw(2) << std::setfill('0') << static_cast<int>(ch); // cast so the byte prints as a number, not a character
}
MessageBoxA( NULL, cl_sum.str().c_str(), NULL, NULL );
*/
I have a timer, after which a local HTML file should be opened, but I hit the following error:
int delay = 120;
delay *= CLOCKS_PER_SEC;
clock_t now = clock();
while (clock() - now < delay);
string strWebPage = "file:///D:/project/site/scam.html";
strWebPage = "file:///" + strWebPage;
ShellExecute(NULL, NULL, NULL, strWebPage, NULL, SW_SHOWNORMAL);
return 0;
E0413 no suitable conversion function from "std::string" to "LPCWSTR" exists
I'm new to C++, so it might be an obvious solution.
Could anyone point me to how I can fix it?
You have two problems.
But first, you should always take the time to read the documentation. For Win32 functions, you can find the documentation for a known function by typing something like “msdn ShellExecute” into your favorite search engine and clicking the “Lucky” button.
Problem One
ShellExecute() is a C function. It does not take a std::string as an argument; it needs a pointer to characters. Hence:
std::string filename = "birds.html";
INT_PTR ok = (INT_PTR)ShellExecute(
NULL, // no window
NULL, // use default operation
filename.c_str(), // file to open
NULL, // no args to executable files
NULL, // no start directory
SW_SHOWNORMAL );
if (ok <= 32)
fooey();
Notice that we pass a const char * to the function as the file to <default verb>.
Problem Two
From your error message it would appear that your application is declared as a Unicode application. In other words, somewhere there is a #define UNICODE.
This makes ShellExecute() expect a wide character string (const wchar_t *) as its argument, not a narrow string (const char *).
You can still use a narrow string by simply specifying that you want the narrow version:
INT_PTR ok = (INT_PTR)ShellExecuteA(
...
I recommend you look at how your project is set up to figure out why the compiler thinks you are using wide strings instead of narrow strings.
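Alternatively, if you keep the project as a Unicode build, pass a wide string and call the wide version explicitly; a rough sketch with the same birds.html example:
std::wstring filename = L"birds.html";
INT_PTR ok = (INT_PTR)ShellExecuteW(
    NULL,               // no window
    NULL,               // use default operation
    filename.c_str(),   // const wchar_t* file to open
    NULL,               // no args to executable files
    NULL,               // no start directory
    SW_SHOWNORMAL );
if (ok <= 32)
    fooey();            // same placeholder error handler as above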
So, code I can't change calls an executable, and I need to give it different commandline args than what the black box code calls. I was figuring I could make an executable to serve as a proxy. proc.exe sits where the black box section is expecting, takes the commandline arguments, modifies them, then calls procReal.exe, the original file.
Unfortunately CreateProcess seems to fail to start, returning status 183. I've looked up everything I can and can't find out much about this. Have tried flipping things around, setting the handle inheritance to true, manually specifying the working directory, not doing either of those things. No luck. I assume there's something going on here with inheriting the proper security context of the calling application so the wrapper works as a proper passthrough, but I can't figure out how to do it...
Code below, irrelevant sections pruned.
EDIT Put the full code here after request. This isn't making any sense anymore. It now will partially work, but only if the fopen section for traceFile isn't there. Not even the fprintfs removed, specifically the whole section has to be cut out.
I've tried to respond to everyone's comments and I think I've ruled out most of those things as an issue, but am left with the current anomalous behavior. What little more I could read up on this says that some forms of copying the strings around could lead to memory overflows, is that possible at all?
#include <iostream>
#include <fstream>
#include <windows.h>
#include <string>
#include <vector>
#include <stdio.h>
#include <tchar.h>
#include <algorithm>
#include <string>
using namespace std;
bool caseInsensitiveStringCompare( const std::string& str1, const std::string& str2 );
int main(int argc, char* argv[]) {
const string path = "E:\\util\\bin\\";
const string procName = "procReal.exe";
const string argToFilter = "-t";
string origValue;
string passedValue;
for(int i = 1; i < argc; i++)
{
origValue.append(" ").append(argv[i]);
}
for(int i = 1; i < argc; i++)
{
if (!caseInsensitiveStringCompare(argv[i],argToFilter))
{
passedValue.append(" ").append(argv[i]);
}
else
{
i++; // skip over the argument and its value
}
}
const LPCTSTR exeModule = (path+procName).c_str();
std::vector<char> chars(passedValue.c_str(), passedValue.c_str() + passedValue.size() + 1u);
LPTSTR exeArgs = &chars[0];
STARTUPINFO si;
PROCESS_INFORMATION pi;
ZeroMemory( &si, sizeof(si) );
si.cb = sizeof(si);
ZeroMemory( &pi, sizeof(pi) );
GetStartupInfo(&si);
FILE* traceFile;
traceFile = fopen ((path+"lastRun.txt").c_str(), "w"); // This causes exeModule to change value for unknown reasons???
fprintf(traceFile, "orig: %s%s\n", exeModule, origValue.c_str());
fprintf(traceFile, "%s%s\n", exeModule, exeArgs);
SetLastError(0);
// Start the child process.
if( !CreateProcess( exeModule, // use module name with args for exeArgs
exeArgs, // Command line
NULL, // Process handle not inheritable
NULL, // Thread handle not inheritable
TRUE, // Set handle inheritance to FALSE
0, // No creation flags
NULL, // Use parent's environment block
NULL, // use parent's starting directory
&si, // Pointer to STARTUPINFO structure
&pi ) // Pointer to PROCESS_INFORMATION structure
)
{
FILE* myfile;
myfile = fopen ((path+"error.txt").c_str(), "w");
fprintf(myfile, "CreateProcess failed (%d).\n", int(GetLastError()));
fclose(myfile);
}
// Wait until child process exits.
WaitForSingleObject( pi.hProcess, INFINITE );
DWORD exit_code;
GetExitCodeProcess(pi.hProcess, &exit_code);
fprintf(traceFile, "Exit Code: %d\n", int(exit_code));
fclose(traceFile);
// Close process and thread handles.
CloseHandle( pi.hProcess );
CloseHandle( pi.hThread );
return exit_code;
}
bool caseInsensitiveStringCompare( const std::string& str1, const std::string& str2 ) {
std::string str1Cpy( str1 );
std::string str2Cpy( str2 );
std::transform( str1Cpy.begin(), str1Cpy.end(), str1Cpy.begin(), ::tolower );
std::transform( str2Cpy.begin(), str2Cpy.end(), str2Cpy.begin(), ::tolower );
return ( str1Cpy == str2Cpy );
}
The most obvious problem is that you can't say (path+procName).c_str() because the temporary string object it builds is destroyed as soon as the statement finishes, invalidating the returned pointer. (The std::vector used for exeArgs is less of a worry: the standard does guarantee that vector elements are stored contiguously, but the code below avoids relying on that anyway.)
The corrected string handling code should look something like this:
string passedValue(procName); // First element of command line MUST be module name
...
const string exeModuleString(path + procName);
const LPCTSTR exeModule = exeModuleString.c_str();
LPTSTR exeArgs = new char[passedValue.size() + 1];
passedValue.copy(exeArgs, passedValue.size());
exeArgs[passedValue.size()] = '\0';
(That might not be the best way to do it; I don't use C++ often. But it should work correctly.)
The corrected error handling code, ensuring that the last error code is read immediately, should look something like this:
{
DWORD err = GetLastError();
FILE* myfile;
myfile = fopen ((path+"error.txt").c_str(), "w");
fprintf(myfile, "CreateProcess failed (%d).\n", int(err));
fclose(myfile);
}
Your code was reporting the wrong error code, because calling fopen() changes it. (When a new file is created that overwrites an existing file, the last error code is set to ERROR_ALREADY_EXISTS, which is the 183 you were seeing.)
There are two broader issues, which may or may not matter in your context. Firstly, you're using argv[] to build the command line for the new process; that means that the command line parsing (as described in Parsing C Command-Line Arguments) will be applied twice (once by your process and once by the child) which may cause trouble if the command line contains any special characters such as quote marks or backslashes. Ideally, in the general case, you would call GetCommandLine() instead. (Granted, this makes parsing the string to remove the extra argument quite a bit harder.)
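As a rough illustration of that idea (nothing here comes from the question; the actual filtering is deliberately left out):
// Raw, unparsed command line exactly as the parent passed it,
// including the program name and any quoting.
LPSTR rawCmdLine = GetCommandLineA();
// Removing the "-t <value>" pair would have to be done on this string
// directly, preserving quotes and backslashes, before handing the
// result to CreateProcess() as the lpCommandLine argument.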
Secondly, you're obviously building the code in ANSI mode. This may cause problems if the command line ever contains any wide ("Unicode") characters. It is generally accepted that best practice is to always build in Unicode mode. The only major change you'll need to make to the code is to replace string with wstring, so it should be straightforward enough.
I have the same problem.
The problem seems to occur when I call executables with no window.
I found two workarounds:
1.
Create a .bat file with the name of that exe, followed by the arguments,
and then execute the .bat file:
CreateProcess( "temp.bat" , NULL , ....etc
2.
Use _spawnl().
bool sendMessageToGraphics(char* msg)
{
//char ea[] = "SSS";
char* chRequest = msg; // Client -> Server
DWORD cbBytesWritten, cbRequestBytes;
// Send one message to the pipe.
cbRequestBytes = sizeof(TCHAR) * (lstrlen(chRequest) + 1);
if (*msg - '8' == 0)
{
char new_msg[1024] = { 0 };
string answer = "0" + '\0';
copy(answer.begin(), answer.end(), new_msg);
char *request = new_msg;
WriteFile(hPipe, request, cbRequestBytes, &cbRequestBytes, NULL);
}
BOOL bResult = WriteFile( // Write to the pipe.
hPipe, // Handle of the pipe
chRequest, // Message to be written
cbRequestBytes, // Number of bytes to writ
&cbBytesWritten, // Number of bytes written
NULL); // Not overlapped
if (!bResult/*Failed*/ || cbRequestBytes != cbBytesWritten/*Failed*/)
{
_tprintf(_T("WriteFile failed w/err 0x%08lx\n"), GetLastError());
return false;
}
_tprintf(_T("Sends %ld bytes; Message: \"%s\"\n"),
cbBytesWritten, chRequest);
return true;
}
After the first WriteFile runs (in the case of '8'), the second WriteFile call doesn't work right. Can someone see why?
The function sendMessageToGraphics needs to send a move to a chess board.
There are 2 problems in your code:
First of all, there's a (minor) problem with how you initialize a string in your conditional statement. You initialize it like so:
string answer = "0" + '\0';
This does not do what you think it does. It applies the built-in + to a const char* and a char, which performs pointer arithmetic: it adds the value of '\0' to the address of the string literal. Since '\0' converts to the integer value 0, nothing is added, and answer ends up as just "0" without the embedded '\0' character you intended. You could solve this by changing the statement to:
string answer = std::string("0") + '\0';
But the real problem lies in the way you use your size variables. You first initialize the size variable to the string length of your input variable (including the terminating '\0' character). Then in your conditional statement you create a new string which you pass to WriteFile, yet you still use the original size. This may cause a buffer overrun, which is undefined behavior. You also set your size variable to however many bytes you wrote to the file. Then later on you use this same value again in the next call. You never actually check this value, so this could cause problems.
The easiest way to change this, is to make sure your sizes are set up correctly. For example, instead of the first call, you could do this:
WriteFile(hPipe, request, answer.size(), &cbBytesWritten, NULL);
Then check the return value of WriteFile and the value of cbBytesWritten before you make the next call to WriteFile; that way you know your first call succeeded too.
Also, drop the sizeof(TCHAR) factor from your size calculation. You are never using TCHAR in your code; your input is a plain char*, and so is the string you build in your conditional, so the byte count is simply the character count. (WriteFile itself is byte-oriented and has no ANSI/Unicode variants.)
Last of all, make sure your server is actually reading bytes from the handle you write to. If the server does not read from the pipe, WriteFile will block until the data can be written.
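A rough sketch of the corrected branch along those lines, reusing the names from the question (hPipe and msg) and treating everything as plain bytes:
if (*msg == '8')
{
    std::string answer = std::string("0") + '\0';          // "0" plus an embedded terminator, 2 bytes
    DWORD bytesWritten = 0;
    BOOL ok = WriteFile(hPipe, answer.c_str(),
                        static_cast<DWORD>(answer.size()),  // size of *this* message, not the original one
                        &bytesWritten, NULL);
    if (!ok || bytesWritten != answer.size())
        return false;   // first write failed; don't fall through to the second WriteFile
}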
I have the following function which I intend to load shaders with (error checking removed for brevity):
unsigned int readFile(const char* file, char** buffer)
{
FILE* fp;
fopen_s(&fp, file, "rb");
fseek(fp, 0, SEEK_END);
size_t size = ftell(fp);
fseek(fp, 0, SEEK_SET);
*buffer = new char[size + 1];
fread(*buffer, 1, size, fp);
*buffer[size] = 0; // BAD LINE, only [0] is fine.
fclose(fp);
return 0;
}
It is called with:
char* fileContents = nullptr;
readAllFile("test.txt", &fileContents);
I cannot figure out how to fix the bad line. When I use char*& buffer as the out parameter it works fine, and a reference is, in large part, functionally the same as a pointer, right?
The error is:
Exception thrown at 0x011919D4 in My World_Win32_Debug.exe:
0xC0000005: Access violation writing location 0xCCCCCCCC.
How should I set the last element of the buffer to 0 (null terminator)? I've looked through the debugger and the contents of buffer are valid, and set properly until reaching the bad line despite buffer being referenced the same way every time.
With only [0] working fine, it seems to me that I'm addressing only the pointer itself, not its data, but I don't know how to index it otherwise. Every other way I've tried gives a compile error.
I'm aware that references are preferred in many cases, and there's other problems here, but I do need to know why I have the problem above first.
You want this:
(*buffer)[size] = 0;
instead of:
*buffer[size] = 0; // same as *(buffer[size]) = 0;
Out of desperation, I tried putting stars and brackets everywhere and I realized the problem: operator precedence means the expression attempts to dereference buffer[size], not buffer. Using (*buffer)[size] fixes the problem.
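For reference, a sketch of the function with the precedence fix applied (plus a null check on fp, since error checking was stripped from the question):
unsigned int readFile(const char* file, char** buffer)
{
    FILE* fp = nullptr;
    fopen_s(&fp, file, "rb");
    if (!fp)
        return 1;                   // report failure instead of crashing on a null FILE*

    fseek(fp, 0, SEEK_END);
    size_t size = ftell(fp);
    fseek(fp, 0, SEEK_SET);

    *buffer = new char[size + 1];
    fread(*buffer, 1, size, fp);
    (*buffer)[size] = 0;            // parenthesised: dereference buffer first, then index the array
    fclose(fp);
    return 0;
}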