C++ conversion from char * to unsigned char?

Hello, I'm trying to take a program's path and put it in a registry value, but I keep getting an error. Here is the code:
#include <iostream>
#include <windows.h>
#include <winuser.h>
#include <tchar.h>
#include <limits>
using namespace std;
void reg() {
    char buffer[MAX_PATH];
    GetModuleFileName(NULL, buffer, sizeof(buffer));
    const unsigned char Path[ MAX_PATH ] = {buffer};
    ::HKEY Handle_Key = 0;
    ::RegSetValueEx( Handle_Key, "My Directory", 0, 1, Path, sizeof Path );
}
The error I'm getting says
invalid conversion from 'char*' to 'unsigned char' [-fpermissive]
I have spent hours looking for a solution, but I can't find one.

The problem, I'm guessing, is this line:
const unsigned char Path[ MAX_PATH ] = {buffer};
The problem here is that you try to initialize an array of single characters with a character pointer.
You only use that variable as a temporary for the RegSetValueEx call, so you don't really need it. Instead, call that function with buffer directly.
Also, you should not use sizeof here, since that would put the entire buffer in the registry, not just the actual string. Use strlen (plus one, so the terminating null is stored, as REG_SZ expects).
Like:
::RegSetValueEx( Handle_Key, "My Directory", 0, REG_SZ,
                 reinterpret_cast<unsigned char*>(buffer),
                 static_cast<DWORD>(strlen(buffer) + 1) ); // +1 keeps the terminating null
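Note that even once this compiles, the call will still fail at runtime, because Handle_Key is never opened. A minimal sketch of the full sequence, assuming a placeholder subkey name "Software\\MyApp" (substitute your own):
#include <windows.h>
#include <cstring>

void reg()
{
    char buffer[MAX_PATH];
    // Explicit ...A functions, so this compiles regardless of the UNICODE setting.
    ::GetModuleFileNameA(NULL, buffer, sizeof(buffer));

    ::HKEY key = 0;
    // "Software\\MyApp" is a placeholder subkey, used here for illustration only.
    if (::RegCreateKeyExA(HKEY_CURRENT_USER, "Software\\MyApp", 0, NULL, 0,
                          KEY_SET_VALUE, NULL, &key, NULL) == ERROR_SUCCESS)
    {
        ::RegSetValueExA(key, "My Directory", 0, REG_SZ,
                         reinterpret_cast<const BYTE*>(buffer),
                         static_cast<DWORD>(strlen(buffer) + 1));
        ::RegCloseKey(key);
    }
}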

Strange behavior with C++ IO

I am using zlib to compress data for a game I am making. Here is the code I have been using:
#include <SFML/Graphics.hpp>
#include <Windows.h>
#include <fstream>
#include <iostream>
#include "zlib.h"
#include "zconf.h"
using namespace std;

void compress(Bytef* toWrite, int bufferSize, const char* filename)
{
    uLongf comprLen = compressBound(bufferSize);
    Bytef* data = new Bytef[comprLen];
    compress(data, &comprLen, &toWrite[0], bufferSize); // zlib's 4-argument compress()
    ofstream file(filename);
    file.write((char*) data, comprLen);
    file.close();
    cout << comprLen;
    delete[] data;
}

int main()
{
    const int X_BLOCKS = 1700;
    const int Y_BLOCKS = 19;
    int bufferSize = X_BLOCKS * Y_BLOCKS;
    Bytef world[X_BLOCKS][Y_BLOCKS];
    //fill world with integer values
    compress(&world[0][0], bufferSize, "Level.lvl");
    while(2); // spin so the console window stays open
    return EXIT_SUCCESS;
}
Now I would have expected the program to simply compress the array world and save it to a file. However, I noticed some weird behavior: when I printed the value of comprLen, it was a different length than the created file. I couldn't understand where the extra bytes in the file were coming from.
You need to open the file in binary mode:
std::ofstream file(filename, std::ios_base::binary);
Without the std::ios_base::binary flag the system will replace end-of-line characters (\n) with end-of-line sequences (\r\n). Suppressing this conversion is the only purpose of the std::ios_base::binary flag.
Note that the conversion is made on the bytes written to the stream, i.e. the number of bytes actually written can be larger than the second argument to write(). Also note that you need to make sure you are using the "C" locale rather than some locale with a non-trivial code-conversion facet (since you don't explicitly set the global std::locale in your code, you get the default, which is the "C" locale).
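For completeness, here is a sketch of the question's write path with that one fix applied, plus a check of compress()'s return value (worth doing, although not what causes the symptom); it drops into the question's file as-is:
void compress(Bytef* toWrite, int bufferSize, const char* filename)
{
    uLongf comprLen = compressBound(bufferSize);
    Bytef* data = new Bytef[comprLen];
    if (compress(data, &comprLen, &toWrite[0], bufferSize) == Z_OK)
    {
        ofstream file(filename, std::ios_base::binary); // no \n translation
        file.write((char*) data, comprLen);
        cout << comprLen; // now matches the size of the file on disk
    }
    delete[] data;
}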

strtoull was not declared in this scope while converting?

I am working with C++ in Eclipse CDT and I am trying to convert a string to uint64_t using strtoull, but every time I get the error message below:
..\src\HelloTest.cpp:39:42: error: strtoull was not declared in this scope
Below is my C++ example:
#include <iostream>
#include <cstring>
#include <string>
using namespace std;
int main() {
    string str = "1234567";
    uint64_t hashing = strtoull(str, 0, 0);
    cout << hashing << endl;
}
    return 0;
}
Is there anything wrong in what I am doing?
Why your solution doesn't work has already been pointed out by others. But there hasn't been a good alternative suggested yet.
Try this classic strtoull usage instead (strtoull comes from C99, so many compilers expose it even in C++03 mode):
#include <string>
#include <cstdlib>
int main()
{
    std::string str = "1234";
    // Using NULL for the second parameter makes the call easier,
    // but reduces your chances to recover from error. Check
    // the docs for details.
    unsigned long long ul = std::strtoull( str.c_str(), NULL, 0 );
}
Or, since C++11, do it directly from std::string via stoull (which is just a wrapper for the above, but saves on one include and one function call in your code):
#include <string>
int main()
{
    std::string str = "1234";
    // See comment above.
    unsigned long long ul = std::stoull( str, nullptr, 0 );
}
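Note that unlike strtoull, std::stoull reports failure by throwing, so error handling becomes a try/catch. A minimal sketch:
#include <iostream>
#include <stdexcept>
#include <string>

int main()
{
    try
    {
        unsigned long long ul = std::stoull( "not a number", nullptr, 0 );
        std::cout << ul << '\n';
    }
    catch ( const std::invalid_argument& )
    {
        std::cerr << "no conversion could be performed\n";
    }
    catch ( const std::out_of_range& )
    {
        std::cerr << "value out of range for unsigned long long\n";
    }
}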
Never use char[] or pointers if you have a working alternative. The dark side of C++, they are. Quicker, easier, more seductive. If once you start down the dark path, forever will it dominate your destiny, consume you it will. ;-)
The signature of strtoull is: strtoull(const char *, char **, int)
You have given it a std::string, as pointed out by @juanchopanza.
This is the solution I came up with:
#include <iostream>
#include <cstring>
#include <string>
#include <cstdlib>
using namespace std;

int main() {
    char str[] = "1234567";
    unsigned long long ul;
    char* new_pos = 0; // receives the position just past the parsed digits
    ul = strtoull(str, &new_pos, 0);
    cout << ul << endl;
    return 0;
}
The output I got was 1234567, straight from the Eclipse console.
Also, at the end of your program you have a return 0 out of scope, followed by an extra curly brace.

'GetKeyNameTextW' : cannot convert parameter 2 from 'char [255]' to 'LPWSTR' [duplicate]

When I compile this code in Visual C++, I get the error below. Can anyone help me solve this issue?
DWORD nBufferLength = MAX_PATH;
char szCurrentDirectory[MAX_PATH + 1];
GetCurrentDirectory(nBufferLength, szCurrentDirectory);
szCurrentDirectory[MAX_PATH] = '\0'; // the last valid index is MAX_PATH
Error message:
Error 5 error C2664: 'GetCurrentDirectoryW' : cannot convert parameter 2 from 'char [261]' to 'LPWSTR' c:\car.cpp
Your program is configured to be compiled as Unicode. That's why GetCurrentDirectory is GetCurrentDirectoryW, which expects an LPWSTR (wchar_t*).
GetCurrentDirectoryW expects a wchar_t array instead of a char array. You can handle this with TCHAR, which, like GetCurrentDirectory, depends on the Unicode setting and always represents the appropriate character type.
Don't forget to prepend your '\0' with an L in order to make the character literal wide, too!
It seems you have defined the UNICODE and _UNICODE compiler flags. In that case, you need to change the type of szCurrentDirectory from char to TCHAR.
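Putting both suggestions together, a minimal sketch of the TCHAR version (the _T() macro from <tchar.h> makes the literal wide or narrow to match the build):
#include <windows.h>
#include <tchar.h>

int main()
{
    TCHAR szCurrentDirectory[MAX_PATH + 1];
    if (GetCurrentDirectory(MAX_PATH + 1, szCurrentDirectory) == 0)
        return 1; // the call failed
    szCurrentDirectory[MAX_PATH] = _T('\0'); // the last valid index is MAX_PATH
    return 0;
}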
Headers:
#include <iostream>
#include <fstream>
#include <direct.h>
#include <string.h>
#include <windows.h> //not sure
Function to get current directory:
std::string getCurrentDirectoryOnWindows()
{
    const unsigned long maxDir = 260;
    wchar_t currentDir[maxDir];
    GetCurrentDirectory(maxDir, currentDir);
    std::wstring ws(currentDir);
    // Narrowing copy: this only round-trips correctly for ASCII paths.
    std::string current_dir(ws.begin(), ws.end());
    return current_dir;
}
To call function:
std::string path = getCurrentDirectoryOnWindows(); //Output like: C:\Users\NameUser\Documents\Programming\MFC Program 5
To make dir (Folder) in current directory:
std::string FolderName = "NewFolder";
std::string Dir1 = getCurrentDirectoryOnWindows() + "\\" + FolderName;
_mkdir(Dir1.c_str());
This works for me in MFC C++.

Converting 'const char*' to 'LPCTSTR' for CreateDirectory

#include "stdafx.h"
#include <string>
#include <windows.h>
using namespace std;
int main()
{
string FilePath = "C:\\Documents and Settings\\whatever";
CreateDirectory(FilePath, NULL);
return 0;
}
Error: error C2664: 'CreateDirectory' : cannot convert parameter 1 from 'const char *' to 'LPCTSTR'
How do I make this conversion?
The next step is to set today's date as a string or char and concatenate it with the filepath. Will this change how I do step 1?
I am terrible at data types and conversions; is there a good explanation for five-year-olds out there?
std::string is a class that holds char-based data. To pass std::string data to API functions, you have to use its c_str() method to get a const char* pointer to the string's actual data.
CreateDirectory() takes a TCHAR* as input. If UNICODE is defined, TCHAR maps to wchar_t, otherwise it maps to char instead. If you need to stick with std::string but do not want to make your code UNICODE-aware, then use CreateDirectoryA() instead, e.g.:
#include "stdafx.h"
#include <string>
#include <windows.h>
int main()
{
std::string FilePath = "C:\\Documents and Settings\\whatever";
CreateDirectoryA(FilePath.c_str(), NULL);
return 0;
}
To make this code TCHAR-aware, you can do this instead:
#include "stdafx.h"
#include <string>
#include <windows.h>
int main()
{
std::basic_string<TCHAR> FilePath = TEXT("C:\\Documents and Settings\\whatever");
CreateDirectory(FilePath.c_str(), NULL);
return 0;
}
However, ANSI-based OS versions are long dead; everything is Unicode nowadays. TCHAR should not be used in new code anymore:
#include "stdafx.h"
#include <string>
#include <windows.h>
int main()
{
std::wstring FilePath = L"C:\\Documents and Settings\\whatever";
CreateDirectoryW(FilePath.c_str(), NULL);
return 0;
}
If you're not building a Unicode executable, calling c_str() on the std::string will result in a const char* (aka non-Unicode LPCTSTR) that you can pass into CreateDirectory().
The code would look like this:
CreateDirectory(FilePath.c_str(), NULL);
Please note that this will result in a compile error if you're trying to build a Unicode executable.
If you have to append to FilePath, I would recommend that you either continue to use std::string or use Microsoft's CString to do the string manipulation, as that's less painful than doing it the C way and juggling raw char*. Personally, I would use std::string unless you are already in an MFC application that uses CString.
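As for the follow-up about appending today's date: that doesn't change the approach, since std::string concatenation handles it. A sketch using the C time functions and MSVC's localtime_s (the date format here is just an example):
#include <ctime>
#include <string>
#include <windows.h>

int main()
{
    char date[32];
    std::time_t now = std::time(NULL);
    std::tm local = {};
    localtime_s(&local, &now); // MSVC's checked variant of localtime
    std::strftime(date, sizeof(date), "%Y-%m-%d", &local);

    std::string FilePath = "C:\\Documents and Settings\\whatever\\";
    FilePath += date; // e.g. ...\whatever\2014-01-31
    CreateDirectoryA(FilePath.c_str(), NULL);
    return 0;
}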

Why is compiling this code producing an error?

I believe this is the right header:
#include <cstdio>
Note, there is a difference between the above directive and this one:
#include <stdio.h>
The first one puts everything in the std namespace; the second one doesn't. So I am using the first one.
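For example, the qualified name is the one <cstdio> guarantees:
#include <cstdio>

int main()
{
    std::printf("qualified\n");  // guaranteed by <cstdio>
    printf("unqualified\n");     // guaranteed only by <stdio.h>, though most
                                 // implementations accept it here as well
}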
Below is the code, which I am compiling with g++ 4.4.6 on AIX 6.1:
#include <cstdarg> //< va_list
#include <cstdio> //< vsnprintf()
#include "virtual_utils.h"
namespace VS
{
    const char* format_str( const char* str, ... ) throw()
    {
        static char buf[10][1024];
        static unsigned long buf_no = 0;
        char* cur_buf = buf[ ++buf_no % 10 ];
        buf_no %= 10;
        va_list vl;
        va_start( vl, str );
#ifdef _MSC_VER
        std::_vsnprintf( cur_buf, sizeof(buf[0]), str, vl ); // sizeof(buf[0]): one 1024-byte slot, not the whole 2D array
#else
        std::vsnprintf( cur_buf, sizeof(buf[0]), str, vl );
#endif
        va_end( vl ); // every va_start needs a matching va_end
        return cur_buf;
    }
} //< namespace VS
These are the errors I am getting:
virtual_utils.C: In function 'const char* VS::format_str(const char*, ...)':
virtual_utils.C:28: error: 'vsnprintf' is not a member of 'std'
Edit:
Modifying the above code to remove the #include "virtual_utils.h" and to add a main(), it compiles with a warning under gcc 4.3.4 on Ideone, and cleanly under gcc 4.5.1.
Compile with -save-temps, and examine the .ii file it produces. That should make it clear what's declared in what namespace, and what isn't.
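For what it's worth, the usual root cause here is that vsnprintf comes from C99: before C++11, libstdc++ only declares it in namespace std when it was built with C99 support enabled, and on that AIX toolchain it evidently wasn't. A hedged workaround is to call the function unqualified, so the compiler finds it in whichever namespace the platform actually declares it:
// Workaround sketch: rely on an unqualified call, found in std:: or in the
// global namespace, whichever this toolchain provides.
#include <cstdarg>
#include <stdio.h> // the C header: declares ::vsnprintf on C99/POSIX platforms

const char* format_str_sketch( const char* str, ... ) throw()
{
    static char buf[10][1024];
    static unsigned long buf_no = 0;
    char* cur_buf = buf[ ++buf_no % 10 ];
    buf_no %= 10;
    va_list vl;
    va_start( vl, str );
    vsnprintf( cur_buf, sizeof(buf[0]), str, vl ); // unqualified on purpose
    va_end( vl );
    return cur_buf;
}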