STL/Boost replacement for the WideCharToMultiByte function - C++

In the following code snippet, WideCharToMultiByte is a Windows-specific function.
Is there a suitable replacement for the function using the STL or Boost?
// in function parameters: (..., WCHAR* szNameOfDll, ...)
char szSourceTemp[MAX_PATH + 1] = {0};
WideCharToMultiByte(CP_ACP, 0, szNameOfDll, -1, szSourceTemp, MAX_PATH, NULL, NULL);
Any help appreciated!

std::wctomb/std::wcstombs
std::mbtowc/std::mbstowcs
or
#include <boost/locale.hpp>
#include <iostream>
using namespace boost::locale::conv;
// latin1_string is an existing std::string holding Latin-1 text
std::string utf8_string = to_utf<char>(latin1_string, "Latin1");
std::wstring wide_string = to_utf<wchar_t>(latin1_string, "Latin1");
std::string latin1_round_trip = from_utf(wide_string, "Latin1");
std::string utf8_string2 = utf_to_utf<char>(wide_string);
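As a rough sketch of how either approach could replace the snippet from the question (the function names are just illustrative; it assumes szNameOfDll is null-terminated, and note that wcstombs converts via the current C locale, while Boost's utf_to_utf always produces UTF-8 rather than the ANSI code page):
#include <cstdlib>
#include <string>
#include <boost/locale.hpp>

// STL variant: locale-dependent narrowing, roughly analogous to CP_ACP.
std::string narrow_stl(const wchar_t* szNameOfDll)
{
    char szSourceTemp[260 + 1] = {0};              // MAX_PATH-sized buffer
    std::wcstombs(szSourceTemp, szNameOfDll, 260);
    return std::string(szSourceTemp);
}

// Boost.Locale variant: always yields UTF-8, portable across platforms.
std::string narrow_boost(const wchar_t* szNameOfDll)
{
    return boost::locale::conv::utf_to_utf<char>(szNameOfDll);
}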

Related

Xcode C++ MD5 hash

I would like to hash a simple string using MD5 in Xcode C++. I searched a lot but I couldn't find a tutorial. I have to #import <CommonCrypto/CommonDigest.h>. Is that all? How can I call MD5 after that?
I have found this code but it gives me an error. How will I get my hashed value? Is it updated in the string variable?
unsigned char digest[16];
const char* string = "Hello World";
struct MD5Context context; // error: variable has incomplete type
MD5Init(&context);
MD5Update(&context, string, strlen(string));
MD5Final(digest, &context);
I'm just using a simple command-line application with no extra headers, just the basic main.cpp.
I really appreciate any help!!!!
You're using the wrong API. I'm not sure where you're getting those from (they look like OpenSSL calls), but it should look like this:
#include <stdio.h>
#include <string.h>
#include <CommonCrypto/CommonDigest.h>
int main()
{
    unsigned char digest[CC_MD5_DIGEST_LENGTH];
    const char string[] = "Hello World";

    CC_MD5_CTX context;
    CC_MD5_Init(&context);
    CC_MD5_Update(&context, string, (CC_LONG)strlen(string));
    CC_MD5_Final(digest, &context);

    for (size_t i = 0; i < CC_MD5_DIGEST_LENGTH; ++i)
        printf("%.2x", digest[i]);
    fputc('\n', stdout);

    return 0;
}
Output
b10a8db164e0754105b7a99be72e3fe5
There is a one-shot version:
#include <CommonCrypto/CommonDigest.h>

unsigned char digest[CC_MD5_DIGEST_LENGTH];
const char* string = "Hello World";

CC_MD5(string, (CC_LONG)strlen(string), digest);
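To answer the "how will I get my hashed value" part: the bytes land in digest, not in the string variable. A small sketch (the helper name is just illustrative) that renders the digest as a hex std::string:
#include <cstdio>
#include <string>

// Render a 16-byte MD5 digest as a 32-character lowercase hex string.
std::string digestToHex(const unsigned char digest[16])
{
    char buf[33];
    for (int i = 0; i < 16; ++i)
        snprintf(buf + i * 2, 3, "%02x", digest[i]);
    return std::string(buf);
}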
You will need to link against the Security.framework (or at least the applicable library file).

How to convert a string encoded in utf16 to a string encoded in UTF-8?

I have a string or a char[], but it is encoded in UTF-16.
Now I want to convert it to UTF-8 in a new string. Please help me! I already tried (defining u16_str as a std::string and calling cvt.to_bytes on it), but the compiler tells me there is a problem. How do I solve this?
The problem is evident: you define u16_str as a std::string, whereas cvt.to_bytes() expects a std::u16string (as the name of the variable suggests).
The following code works for me:
#include <locale>
#include <codecvt>
#include <string>
#include <iostream>

int main()
{
    std::u16string u16_str { u"aeiuoàèìòùAEIOU" };

    std::wstring_convert<std::codecvt_utf8<char16_t>, char16_t> cvt;
    std::string u8_str = cvt.to_bytes(u16_str);

    std::cout << u8_str << std::endl;

    return 0;
}
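For what it's worth, the same facet also handles the reverse direction; continuing inside the main() above, from_bytes() returns a std::u16string:
// Round-trip: convert the UTF-8 string back to UTF-16.
std::u16string u16_round_trip = cvt.from_bytes(u8_str);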

Can not convert LPTSTR to std::string in release mode

Just like the title says, I cannot convert LPTSTR to std::string in release mode. In other words, when I do this:
LPTSTR lpt;
std::string str = lpt;
This only works when I'm in debug mode. The compiler says that no matching constructor could be found. Did I forget to include something?
I tried this function:
#include <string>
#include <windows.h>
using namespace std;

// Note: this truncates each wide character to a char, so it only works
// correctly for ASCII content. It also assumes a UNICODE build, where
// LPTSTR is wchar_t*.
string LPTSTRToString(LPTSTR Input)
{
    wstring Wide(Input);   // build the wstring once, not on every iteration
    string Output;
    for (size_t i = 0; i < Wide.length(); i++)
        Output += static_cast<char>(Wide[i]);
    return Output;
}
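That loop drops anything outside ASCII. If a lossless conversion is wanted, one option is a sketch along these lines using the Win32 WideCharToMultiByte API to produce UTF-8 (the function name is just illustrative; it also assumes a UNICODE build where LPTSTR is wchar_t*):
#include <string>
#include <cwchar>
#include <windows.h>

// Convert a wide (UTF-16) string to a UTF-8 encoded std::string.
std::string WideToUtf8(LPCWSTR input)
{
    if (input == NULL || *input == L'\0')
        return std::string();
    int len = (int)wcslen(input);
    // First call computes the required buffer size in bytes.
    int size = WideCharToMultiByte(CP_UTF8, 0, input, len, NULL, 0, NULL, NULL);
    std::string output(size, '\0');
    WideCharToMultiByte(CP_UTF8, 0, input, len, &output[0], size, NULL, NULL);
    return output;
}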

Converting 'const char*' to 'LPCTSTR' for CreateDirectory

#include "stdafx.h"
#include <string>
#include <windows.h>
using namespace std;
int main()
{
string FilePath = "C:\\Documents and Settings\\whatever";
CreateDirectory(FilePath, NULL);
return 0;
}
Error: error C2664: 'CreateDirectory' : cannot convert parameter 1 from 'const char *' to 'LPCTSTR'
How do I make this conversion?
The next step is to set today's date as a string or char and concatenate it with the filepath. Will this change how I do step 1?
I am terrible at data types and conversions; is there a good explanation for 5-year-olds out there?
std::string is a class that holds char-based data. To pass a std::string's data to API functions, you have to use its c_str() method to get a const char* pointer to the string's actual data.
CreateDirectory() takes a TCHAR* as input. If UNICODE is defined, TCHAR maps to wchar_t, otherwise it maps to char instead. If you need to stick with std::string but do not want to make your code UNICODE-aware, then use CreateDirectoryA() instead, e.g.:
#include "stdafx.h"
#include <string>
#include <windows.h>
int main()
{
std::string FilePath = "C:\\Documents and Settings\\whatever";
CreateDirectoryA(FilePath.c_str(), NULL);
return 0;
}
To make this code TCHAR-aware, you can do this instead:
#include "stdafx.h"
#include <string>
#include <windows.h>
int main()
{
std::basic_string<TCHAR> FilePath = TEXT("C:\\Documents and Settings\\whatever");
CreateDirectory(FilePath.c_str(), NULL);
return 0;
}
However, ANSI-based OS versions are long dead; everything is Unicode nowadays. TCHAR should not be used in new code anymore:
#include "stdafx.h"
#include <string>
#include <windows.h>
int main()
{
std::wstring FilePath = L"C:\\Documents and Settings\\whatever";
CreateDirectoryW(FilePath.c_str(), NULL);
return 0;
}
If you're not building a Unicode executable, calling c_str() on the std::string will result in a const char* (aka a non-Unicode LPCTSTR) that you can pass into CreateDirectory().
The code would look like this:
CreateDirectory(FilePath.c_str(), NULL);
Please note that this will result in a compile error if you're trying to build a Unicode executable.
If you have to append to FilePath, I would recommend that you either continue to use std::string or use Microsoft's CString to do the string manipulation, as that's less painful than doing it the C way and juggling raw char*. Personally I would use std::string unless you are already in an MFC application that uses CString.
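For the follow-up about concatenating today's date onto the path, a minimal sketch using std::wcsftime with the wide-string version (the date format and directory name are just examples):
#include "stdafx.h"
#include <string>
#include <ctime>
#include <cwchar>
#include <windows.h>

int main()
{
    // Format today's date, e.g. L"2013-05-17" (MSVC-style localtime_s).
    time_t now = time(NULL);
    tm local;
    localtime_s(&local, &now);
    wchar_t date[16] = {0};
    wcsftime(date, 16, L"%Y-%m-%d", &local);

    // std::wstring supports operator+= directly, so concatenation is easy.
    std::wstring FilePath = L"C:\\Documents and Settings\\whatever\\";
    FilePath += date;
    CreateDirectoryW(FilePath.c_str(), NULL);
    return 0;
}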

clang: converting const char16_t* (UTF-16) to wstring (UCS-4)

I'm trying to convert UTF-16 encoded strings to UCS-4.
If I understand correctly, C++11 provides this conversion through codecvt_utf16.
My code is something like:
#include <iostream>
#include <locale>
#include <memory>
#include <codecvt>
#include <string>
using namespace std;

int main()
{
    u16string s;
    s.push_back('h');
    s.push_back('e');
    s.push_back('l');
    s.push_back('l');
    s.push_back('o');

    wstring_convert<codecvt_utf16<wchar_t>, wchar_t> conv;
    wstring ws = conv.from_bytes(reinterpret_cast<const char*>(s.c_str()));

    wcout << ws << endl;

    return 0;
}
Note: the explicit push_backs are there to get around the fact that my version of clang (Xcode 4.2) doesn't have Unicode string literals.
When the code is run, I get a terminate exception. Am I doing something illegal here? I was thinking it should work because the const char* that I passed to wstring_convert is UTF-16 encoded, right? I have also considered endianness being the issue, but I have checked that it's not the case.
Two errors:
1) The from_bytes() overload that takes a single const char* expects a null-terminated byte string, but your very second byte is '\0'.
2) Your system is likely little-endian, so you need to convert from UTF-16LE to UCS-4:
#include <iostream>
#include <locale>
#include <memory>
#include <codecvt>
#include <string>
using namespace std;

int main()
{
    u16string s;
    s.push_back('h');
    s.push_back('e');
    s.push_back('l');
    s.push_back('l');
    s.push_back('o');

    wstring_convert<codecvt_utf16<wchar_t, 0x10ffff, little_endian>,
                    wchar_t> conv;
    wstring ws = conv.from_bytes(
        reinterpret_cast<const char*>(&s[0]),
        reinterpret_cast<const char*>(&s[0] + s.size()));

    wcout << ws << endl;

    return 0;
}
Tested with Visual Studio 2010 SP1 on Windows and CLang++/libc++-svn on Linux.