Convert unsigned long long to wchar_t * and concatenate - c++

There are tons of questions on this issue and I have been attempting the various solutions. There seem to be dozens of ways to do this, yet none of them are working for me. I am very new to C++ and Visual Studio (about a month in), and I am trying to write an Excel automation program in VC++. I am stuck trying to concatenate a wchar_t * and an unsigned long long. I assume the first step is to "convert" the unsigned long long to a wchar_t *. I apologize for posting the whole code, but I think it helps show what I am aiming for, and you may spot other weaknesses in it.
wchar_t * ex(wchar_t * dest, unsigned long long num);

int main()
{
    unsigned long long num = 10;
    wchar_t *dest = L"A2:B";
    wchar_t *Path = ex(dest, num);

    VARIANT param;
    param.vt = VT_BSTR;
    // param.bstrVal = SysAllocString(L"A2:B10");
    param.bstrVal = SysAllocString(Path);

    getchar();
    return 0;
}
wchar_t * ex(wchar_t * dest, unsigned long long num)
{
    // Convert num to wchar_t *
    wchar_t *rangeMax = (wchar_t *)num;

    // I think this is used to eliminate extra space in other solutions
    // but not here. It could be useful.
    const int MAX_CHARS = 50;
    size_t count = wcsnlen_s(dest, MAX_CHARS);
    wprintf(L"The length of the string is %ld characters\n", count);

    // Throw dest into buf
    wchar_t buf[25] = { 0 };
    int r = wcscpy_s(buf, 25, dest);
    if (r != 0) {
        wprintf(L"wcscpy_s() failed %ld", r);
    }
    r = wcscat_s(buf, 25, rangeMax);
    if (r != 0) {
        wprintf(L"wcscat_s() failed %ld", r);
    }
    wprintf_s(buf);
    return buf;
}
ex is an edited example from zetcode. I think it is close to being the solution; however, when combining buf and rangeMax, the code throws all sorts of memory exceptions and fails.
As you can see, the final destination for the concatenated wchar_t * is a BSTR in a VARIANT, via SysAllocString.
I appreciate any suggestions for code improvement, as well as for how to make the code actually run!

As suggested, using std::wstring worked as intended. Thank you for pointing out that I was returning a pointer to a local variable! Once back in main, the type was converted to wchar_t *, which passed nicely to SysAllocString() for use with my main program.
std::wstring ex(wchar_t * dest, unsigned long long num);

int main()
{
    unsigned long long num = 10;
    wchar_t *dest = L"A2:B";
    std::wstring PathString = ex(dest, num);
    wchar_t *wPath = (WCHAR *)PathString.c_str();

    std::wcout << L"In main\n";
    std::wcout << wPath << L'\n';

    VARIANT param;
    param.vt = VT_BSTR;
    // param.bstrVal = SysAllocString(L"A2:B10");
    param.bstrVal = SysAllocString(wPath);

    getchar();
    return 0;
}
std::wstring ex(wchar_t * dest, unsigned long long num)
{
    std::wstring rangeMax = std::to_wstring(num);
    std::wstring string(dest);
    string += rangeMax;

    std::wcout << L"In function\n";
    std::wcout << string << L'\n';
    return string;
}
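Incidentally, the cast in main should not be needed: SysAllocString takes a const OLECHAR *, so you can pass the wstring's c_str() result directly:

param.bstrVal = SysAllocString(PathString.c_str());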


Convert HEX to printable string/char

I'm using CNG to generate a hash.
The result of the BCryptFinishHash call is the MD5 of the input as raw bytes.
Example:
char *outHash = "\x02\x34\x75\x01...";
I want to convert it to a printable string: 02347501...
How can I do that?
To encode a byte array in hex and write the encoded data to a std::string, do this:
#include <cstdlib>
#include <string>

static inline char
hex_digit(unsigned int n)
{
    if (n < 10) return '0' + n;
    if (n < 16) return 'a' + (n - 10);
    abort();
}

std::string
encode_bytes(const unsigned char *bytes, size_t len)
{
    std::string rv;
    rv.reserve(len * 2);
    for (size_t i = 0; i < len; i++) {
        rv.push_back(hex_digit((bytes[i] & 0xF0) >> 4));
        rv.push_back(hex_digit((bytes[i] & 0x0F) >> 0));
    }
    return rv;
}
Note that you must know the length of the byte array. It is not safe to treat it as a NUL-terminated "C string", because binary data can contain internal zero bytes. To know the length of a hash generated by CNG, call BCryptGetProperty to get the BCRYPT_HASH_LENGTH property.
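For example, a minimal sketch of that length query, assuming hAlg is a BCRYPT_ALG_HANDLE you opened earlier with BCryptOpenAlgorithmProvider:

#include <windows.h>
#include <bcrypt.h> // link with bcrypt.lib

DWORD hashLen = 0;
ULONG cbResult = 0;
// Ask CNG how many bytes BCryptFinishHash will produce (16 for MD5).
if (BCRYPT_SUCCESS(BCryptGetProperty(hAlg, BCRYPT_HASH_LENGTH,
                                     (PUCHAR)&hashLen, sizeof(hashLen),
                                     &cbResult, 0)))
{
    // Pass hashLen as `len` to encode_bytes() above.
}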
We can use CryptBinaryToString here with CRYPT_STRING_HEXASCII, CRYPT_STRING_HEX, CRYPT_STRING_HEXRAW, CRYPT_STRING_HEX | CRYPT_STRING_NOCRLF, or CRYPT_STRING_HEXRAW | CRYPT_STRING_NOCRLF, depending on how you want the string formatted. For example:
void print(PUCHAR pbHash, ULONG cbHash, DWORD dwFlags = CRYPT_STRING_HEXRAW | CRYPT_STRING_NOCRLF)
{
    ULONG cch = 0;
    if (CryptBinaryToStringW(pbHash, cbHash, dwFlags, 0, &cch))
    {
        if (PWSTR sz = (PWSTR)_malloca(cch * sizeof(WCHAR)))
        {
            if (CryptBinaryToStringW(pbHash, cbHash, dwFlags, sz, &cch))
            {
                DbgPrint("%S\n", sz);
            }
            _freea(sz);
        }
    }
}
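A hedged usage sketch (the digest bytes here are made up, and in user mode you would print with printf rather than DbgPrint):

UCHAR hash[16] = { 0x02, 0x34, 0x75, 0x01 }; // hypothetical digest; trailing bytes are zero-initialized
print(hash, sizeof(hash)); // default flags give "02347501" followed by twelve "00" pairs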
If you need an easy, one-time solution, this is a useful tool:
https://codebeautify.org/hex-string-converter
However, if you're looking to do this within your code itself, I found this in an earlier thread (i.e., this is not my work but that of KEINE LUST, from here):
#include <stdio.h>

int main(void)
{
    unsigned char readingreg[4];
    readingreg[0] = 0x4a;
    readingreg[1] = 0xaa;
    readingreg[2] = 0xaa;
    readingreg[3] = 0xa0;
    char temp[4];
    sprintf(temp, "%x", readingreg[0]);
    printf("This is element 0: %s\n", temp);
    return 0;
}
You can print it like this:

for (const char *wsk = outHash; *wsk; ++wsk) {
    printf("%02hhx", *wsk);
}

Edit, since a C string can contain 0x00 bytes:
C:
const char outHash[] = "\x02\x34\x75";
const int size = sizeof(outHash)/sizeof(char) - 1;
for (int i = 0; i < size; ++i) {
    printf("%02hhx", outHash[i]);
}
C++:
std::string outHash = "\x02\x34\x75";
for (size_t i = 0; i < outHash.size(); ++i) {
    printf("%02hhx", outHash[i]);
}
Loop over the characters and print the numerical value (in hex).
#include <iostream>
#include <iomanip>
#include <string_view>

int main()
{
    const char* outHash = "\x02\x34\x75\x01\x23\xff"; // Get from your hash function.
    int sizeOfHash = 6; // Use the appropriate size from BCryptFinishHash().

    // Set up the characteristics of the stream.
    // setfill('0'): If the value needs fewer than 2 chars, pad it with '0'.
    // hex: Print numbers in hex.
    // Note: setw() applies only to the next output, so it goes inside the loop.
    std::cout << std::setfill('0') << std::hex;

    // Create a view of the object.
    // Makes it simpler to loop over.
    std::string_view view(outHash, sizeOfHash);

    // Loop over the string.
    for (unsigned char val : view) {
        // Convert to `unsigned char` to make sure you don't print
        // negative numbers, then to `int` so that std::hex kicks in
        // and formats the value as hex.
        std::cout << std::setw(2) << static_cast<int>(val);
    }
    std::cout << "\n";
}
I am working on a C++ wrapper around the Windows Crypto API & CNG, which I am using in my projects. I plan to move all of it to GitHub, but for now it is just a work in progress. You may find it useful for crypto basics like HEX / Base64 encode / decode, etc.
https://github.com/m4x1m1l14n/Crypto
You can use Crypto::Hex::Encode() method to achieve what you want.
#include <Crypto\Hex.hpp>
#include <Crypto\Random.hpp>

using namespace m4x1m1l14n;

char arr[] = { 0xaa, 0xbb, 0xcc, 0xdd, 0x99, 0x00 };
auto encoded = Crypto::Hex::Encode(arr, sizeof(arr));
/* encoded = "aabbccdd9900" */
Also, you can use the wrapper for MD5, which is located in the Hash namespace, like this (if you are not hashing a large amount of data):
#include <Crypto\Hex.hpp>
#include <Crypto\Hash.hpp>

using namespace m4x1m1l14n;

auto encoded = Crypto::Hex::Encode(Crypto::Hash::MD5("Whatever you want to hash"));

Why is my driver only reading part of the string?

I'm attempting to read a string from a dummy program with a kernel driver, but only the first 4 chars are being read, and I can't figure out why.
Part of the IOCTL code for reading the string:
else if (ControlCode == IO_READ_STRING_REQUEST)
{
    PREAD_REQUEST Values = (PREAD_REQUEST)buffer;
    PREAD_REQUEST ValuesOutput = (PREAD_REQUEST)buffer;
    PEPROCESS process;
    if (NT_SUCCESS(PsLookupProcessByProcessId(PID, &process)))
    {
        KeReadProcessMemory(process, Values->Address, &ValuesOutput->buffer, Values->Size);
        DbgPrint((PCSTR)Values->buffer);
        status = STATUS_SUCCESS;
    }
    else
        status = STATUS_INVALID_PARAMETER;
    BytesIO = sizeof(READ_REQUEST);
}
This is the read struct:
typedef struct ReadStruct
{
    ULONGLONG Address;
    ULONGLONG Response;
    ULONGLONG Size;
    char buffer[128];
} READ_REQUEST, *PREAD_REQUEST;
The DbgPrint always prints stri when it's supposed to print stringChar, and stri is what is returned to usermode.
This is how it's called from usermode:
void ReadString(std::string *string, DWORD64 address)
{
    ReadValues Values;
    std::memset(Values.buffer, '\0', 128);
    Values.Address = address;
    Values.Response = 0;
    Values.Size = sizeof(128);

    if (!(DeviceIoControl(hDriver, IO_READ_STRING_REQUEST, &Values, sizeof(Values), &Values, sizeof(Values), 0, 0)))
    {
        std::cout << "RPM Failed!\n";
        exit(1);
    }
    *string = (std::string)Values.buffer;
}
The struct is the same:
struct ReadValues
{
    ULONGLONG Address;
    ULONGLONG Response;
    ULONGLONG Size;
    char buffer[128];
};
I thought it was the size, but when I set the size to 11 (10 + \0) it still read only 4 chars.
The problem is here:
Values.Size = sizeof(128);
^^^^^^^^^^^
This is the same as sizeof(int) (which, I would guess, is 4 on your platform).
Either use 128 or sizeof(buffer) (the latter is arguably better since you won't be hard-coding the same constant in several places).
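Applied to the usermode ReadString function above, the one-line fix would be:

Values.Size = sizeof(Values.buffer); // 128, and it stays correct if the buffer size ever changes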

c++ sqlite3 reading blob won't work

Can someone guide me, please?
I can't seem to read my blob correctly.
I don't know what's wrong; can somebody help?
This is my function.
What I'm trying to do is:
read the blob as binary and store the bytes in a char *data.
Can someone please help?
int baramdb::dbreadblob(int pid)
{
    sqlite3_stmt *res;
    const char *tail;
    int count = 0;

    this->dbopen(this->dbfile);
    if (sqlite3_prepare_v2(this->db, "SELECT * FROM Packet_Send_Queue", 128, &res, &tail) != SQLITE_OK)
    {
        printf("[Baram] Can't retrieve data: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return(1);
    }
    while (sqlite3_step(res) == SQLITE_ROW)
    {
        int *plength = 0;
        *plength = sqlite3_column_bytes(res, 2);
        unsigned char **pbuffer = (unsigned char **)malloc(*plength);
        memcpy(*pbuffer, sqlite3_column_blob(res, 0), *plength);
        count++;
    }
    sqlite3_close(this->db);
    this->lastresult = count;
    return count;
}
It seems you don't understand what a "pointer" really is and how to use one.
First, sqlite3_column_bytes returns int, not int*:

int length = sqlite3_column_bytes(res, 2);

This is absolutely incorrect in the current case:

unsigned char **pbuffer = (unsigned char **)malloc(*plength);

If you're using C++, try not to use malloc/new explicitly; use smart pointers or STL containers instead:

std::vector<char> data(length);
const char *pBuffer = reinterpret_cast<const char *>(sqlite3_column_blob(res, 2));
std::copy(pBuffer, pBuffer + data.size(), &data[0]);
This is it.
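Putting it together, a minimal sketch of the corrected row loop, assuming (as the answer does) that the blob is in column 2:

#include <algorithm>
#include <vector>

while (sqlite3_step(res) == SQLITE_ROW)
{
    // sqlite3_column_bytes returns the blob's size as a plain int.
    int length = sqlite3_column_bytes(res, 2);
    std::vector<char> data(length);
    const char *pBuffer = reinterpret_cast<const char *>(sqlite3_column_blob(res, 2));
    std::copy(pBuffer, pBuffer + data.size(), data.begin());
    // ... use data here; the vector frees itself when it goes out of scope ...
    count++;
}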

Why does my OpenSSL C++ code create binary encryption output?

I'm trying to encrypt a file using AES from OpenSSL and then write the output to a file. But I'm getting messy outputs, sometimes decipherable and sometimes not.
The main code is based on this example: https://github.com/shanet/Crypto-Example/blob/master/crypto-example.cpp
Here's the code:
int Crypt::__aesEncrypt(const unsigned char *msg, size_t msgLen, unsigned char **encMsg) {
    EVP_CIPHER_CTX *aesEncryptCtx = (EVP_CIPHER_CTX*)malloc(sizeof(EVP_CIPHER_CTX));
    EVP_CIPHER_CTX_init(aesEncryptCtx);

    unsigned char *aesKey = (unsigned char*)malloc(AES_KEYLEN/8);
    unsigned char *aesIV = (unsigned char*)malloc(AES_KEYLEN/8);
    unsigned char *aesPass = (unsigned char*)malloc(AES_KEYLEN/8);
    unsigned char *aesSalt = (unsigned char*)malloc(8);

    if(RAND_bytes(aesPass, AES_KEYLEN/8) == 0) {
        return FAILURE;
    }
    if(RAND_bytes(aesSalt, 8) == 0) {
        return FAILURE;
    }
    if(EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), aesSalt, aesPass, AES_KEYLEN/8, AES_ROUNDS, aesKey, aesIV) == 0) {
        return FAILURE;
    }

    strncpy((char*)aesKey, (const char*)"B374A26A71490437AA024E4FADD5B4AA", AES_KEYLEN/8);
    strncpy((char*)aesIV, (const char*)"7E892875A52C59A3B588306B13C31FBD", AES_KEYLEN/16);

    size_t blockLen = 0;
    size_t encMsgLen = 0;

    *encMsg = (unsigned char*)malloc(msgLen + AES_BLOCK_SIZE);
    if(encMsg == NULL) return FAILURE;

    if(!EVP_EncryptInit_ex(aesEncryptCtx, EVP_aes_256_cbc(), NULL, aesKey, aesIV)) {
        return FAILURE;
    }
    if(!EVP_EncryptUpdate(aesEncryptCtx, *encMsg, (int*)&blockLen, (unsigned char*)msg, msgLen)) {
        return FAILURE;
    }
    encMsgLen += blockLen;

    if(!EVP_EncryptFinal_ex(aesEncryptCtx, *encMsg + encMsgLen, (int*)&blockLen)) {
        return FAILURE;
    }

    EVP_CIPHER_CTX_cleanup(aesEncryptCtx);
    free(aesEncryptCtx);
    free(aesKey);
    free(aesIV);

    return encMsgLen + blockLen;
}
I'm calling it like this:

unsigned char *encMsg = NULL;
__aesEncrypt((const unsigned char*)decrypted_string.c_str(), decrypted_string.size(), &encMsg);

std::stringstream ss;
ss << encMsg;
// write ss to file...
Thanks.
I'm actually the author of the example you've based your code off of. As WhozCraig pointed out in the comments above, you are using a stringstream to write the encrypted message to a file. The problem with this is that encrypted messages are not regular ASCII strings. They are binary data (values greater than 127, hence the need for an unsigned char array) and binary data cannot be treated the same as ASCII strings.
I'm not much of a C++ person, so I would write the data to a file the C way with fwrite, but if you want to do it the C++ way, I think you're looking for an ofstream opened in binary mode rather than a stringstream.
Side note, I'm betting this is just for debugging, but I'll point it out anyway just to make sure: hardcoding your AES key and IV (strncpy((char*)aesKey, (const char*)"B374A26A71490437AA024E4FADD5B4AA", AES_KEYLEN/8)) completely defeats the purpose of encryption. If you want to avoid the PBKDF (EVP_BytesToKey), you can just use RAND_bytes to get random data for your AES key.
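For reference, a minimal sketch of the binary file write, assuming decrypted_string and encMsg from the question and a hypothetical output file name message.enc; the key point is writing by length with write(), since operator<< treats the buffer as a C string and stops at the first embedded zero byte:

#include <fstream>

// Capture the return value of __aesEncrypt: it is the ciphertext length.
int encMsgLen = __aesEncrypt((const unsigned char*)decrypted_string.c_str(),
                             decrypted_string.size(), &encMsg);

std::ofstream out("message.enc", std::ios::binary);
out.write(reinterpret_cast<const char*>(encMsg), encMsgLen);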

simulate ulltoa() with a radix/base of 36

I need to convert an unsigned 64-bit integer into a string, in base 36, i.e. using the characters 0-Z. ulltoa does not exist in the Linux manpages, but sprintf DOES. How do I use sprintf to achieve the desired result, i.e. what format specifier do I need?
Or, if snprintf cannot do it, how else do I do this?
You can always just write your own conversion function. The following idea is heavily inspired by this fine answer:
char * int2base36(unsigned long long n, char * buf, size_t buflen)
{
    static const char digits[] = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    if (buflen < 1) return NULL; // buffer too small!
    char * b = buf + buflen;
    *--b = 0;
    do {
        if (b == buf) return NULL; // buffer too small!
        *--b = digits[n % 36];
        n /= 36;
    } while (n);
    return b;
}
This will return a pointer to a null-terminated string containing the base-36 representation of n, placed in a buffer that you provide. Usage:
char buf[100];
std::cout << int2base36(37, buf, 100);
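Since the function takes an unsigned long long, the full 64-bit range works; the longest possible result is 13 digits plus the terminator (36^12 < 2^64 <= 36^13), so a 14-byte buffer is enough:

char buf[14];
std::cout << int2base36(18446744073709551615ULL, buf, sizeof buf); // UINT64_MAX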
If you want, and you're single-threaded, you can also make the char buffer static; I guess you can figure out a suitable maximal length:

char * int2base36_not_threadsafe(unsigned long long n)
{
    static char buf[128];
    static const size_t buflen = 128;
    // rest as above