How to convert _bstr_t to CString - c++

I have a _bstr_t variable bstrErr and a CString variable csError. How do I set csError to the value contained in bstrErr?

Is it not possible just to cast it:
_bstr_t b("Steve");
CString cs;
cs = (LPCTSTR) b;
I think this should work when the project is Unicode.

CString has constructors and assignment operators for both LPCSTR and LPCWSTR, so there is never a need to call WideCharToMultiByte, and you can't get the casting wrong in Unicode or non-Unicode mode.
You can just assign the string this way:
csError = bstrErr.GetBSTR();
Or use the constructor
CString csError( bstrErr.GetBSTR() );
I'm using GetBSTR. It's the same as casting bstrErr to (LPCWSTR), but I prefer it for legibility.

If you compile for Unicode, just assign the encapsulated BSTR to the CString. If you compile for ANSI, you'll have to use WideCharToMultiByte() for the conversion.
Also beware that the encapsulated BSTR can be null, which corresponds to an empty string. If you don't take care of this, your program will run into undefined behaviour.
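A minimal null-safe sketch of that advice (variable names taken from the question; a default-constructed _bstr_t wraps a null BSTR):
_bstr_t bstrErr; // may wrap a null BSTR
CString csError;
if (bstrErr.GetBSTR() != NULL)
    csError = bstrErr.GetBSTR(); // Unicode build: direct assignment
else
    csError.Empty(); // treat the null BSTR as an empty string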

BSTR myBSTRVal;
CString BSTRasCString("")
char szValue[MAX_PATH] = "";
// This will map the BSTR to a new character string (szValue)
WideCharToMultiByte(CP_ACP, 0, myBSTRVal, -1, szValue, sizeof(szValue), NULL,
NULL);
BSTRasCString.Format("%s", szValue);
BSTRasCString.TrimLeft();
BSTRasCString.TrimRight();
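A variant of the same idea that asks WideCharToMultiByte() for the required length first, instead of assuming the result fits in MAX_PATH (a sketch, assuming an ANSI build and a non-null myBSTRVal):
#include <vector>
int len = WideCharToMultiByte(CP_ACP, 0, myBSTRVal, -1, NULL, 0, NULL, NULL); // length in bytes, incl. terminator
std::vector<char> buf(len);
WideCharToMultiByte(CP_ACP, 0, myBSTRVal, -1, &buf[0], len, NULL, NULL);
CString BSTRasCString(&buf[0]);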

CStringT, CString, CStringA, and CStringW:
CStringT is a class template based on an arbitrary character type, with helper class templates for managing the storage and the features.
The class CString is a typedef of the template that uses the TCHAR character type. TCHAR is a generic type that resolves to wchar_t if the macro _UNICODE is set, else to char.
The class CStringA is a typedef of the template that internally uses the narrow character type char.
The class CStringW is a typedef of the template that internally uses the wide character type wchar_t.
I never use CString in code; instead I always use the explicit classes CStringA or CStringW.
The CString* classes have constructors that accept both narrow and wide strings. The same is true for _bstr_t. Strings of type BSTR must be allocated by the function SysAllocString(), which expects an OLECHAR string, hence on Win32/64 a wide string. If you want to copy a _bstr_t that contains Unicode into a CStringA, you must convert it to UTF-8. I use the classes CW2A and CA2W for these conversions.
In the following event function of a Word add-in, I show the use of these types:
STDMETHODIMP CConnect::TestButtonClicked(IDispatch* Command)
{
BSTR smi = SysAllocString(L"Two smileys 😊 in a row: ");
_bstr_t ley = L"😊\U0001F60A";
/* Either using CStringA, UTF16 -> UTF8 conversion needed */
CStringA smiley(CW2A(smi, CP_UTF8));
smiley += CW2A(ley.GetBSTR(), CP_UTF8);
MessageBoxW(NULL, CA2W(smiley, CP_UTF8), L"Example", MB_OK | MB_TASKMODAL);
/* Or using CStringW, use ctor and += operator directly
CStringW smiley = smi;
smiley += ley.GetBSTR();
MessageBoxW(NULL, smiley, L"Example", MB_OK | MB_TASKMODAL);
*/
SysFreeString(smi);
return S_OK;
}

Related

cannot convert CString to const char * [duplicate]


How would you convert a std::string to BSTR*?

STDMETHODIMP CMyRESTApp::rest(BSTR data, BSTR* restr)
{
RESTClient restclient;
RESTClient::response resp = restclient.get(data);
Log("Response Status code: %s", resp.code);
Log("Response Body: %s", resp.body);
*restr = // here
return S_OK;
}
I need to convert resp.body, which then needs to be returned via *restr here.
An ATL-based approach is to use ATL::CComBSTR and then Detach() (or CopyTo(...)) the resulting CComBSTR into the BSTR*.
Something like:
CComBSTR temp(stlstr.c_str());
*restr = temp.Detach();
Otherwise, in general, for std::basic_string you can use the Win32 Sys* family of functions, such as SysAllocStringByteLen and SysAllocString:
// For the `const char*` data type (`LPCSTR`);
*restr = SysAllocStringByteLen(stlstr.c_str(), stlstr.size());
// More suitable for OLECHAR
*restr = SysAllocString(stlwstr.c_str());
OLECHAR depends on the target platform, but generally it is wchar_t.
Given your code, the shortest snippet could just be;
*restr = SysAllocStringByteLen(resp.body.c_str(), resp.body.size());
Note that no code page conversion is performed here: SysAllocStringByteLen copies the bytes as-is. See the MSDN documentation, and the other answers here, if you need an actual encoding conversion first.
std::string is made of chars; a BSTR is usually a Unicode UTF-16 wchar_t-based string, with a length prefix.
Even though a BSTR could serve as a simple way to marshal a byte array (it is length-prefixed, so it can store embedded NULs), and so could potentially store non-UTF-16 text as well, the usual "natural" behavior of a BSTR is to contain a Unicode UTF-16 wchar_t string.
So, the first problem is to clarify what kind of encoding the std::string uses (for example: Unicode UTF-8? Or some other code page?). Then you have to convert that string to Unicode UTF-16, and create a BSTR containing that UTF-16 string.
To convert from UTF-8 (or some other code page) to UTF-16, you can use the MultiByteToWideChar() function. If the source std::string contains a UTF-8 string, you can use the CP_UTF8 code page value with the aforementioned API.
Once you have the UTF-16 converted string, you can create a BSTR using it, and pass that as the output BSTR* parameter.
The main Win32 API to create a BSTR is SysAllocString(). There are also some variants in which you can specify the string length.
Or, as a more convenient alternative, you can use the ATL's CComBSTR class to wrap a BSTR in safe RAII boundaries, and use its Detach() method to pass the BSTR as an output BSTR* parameter.
CComBSTR bstrResult( /* UTF-16 string from std::string */ );
*restr = bstrResult.Detach();
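Putting the pieces together, a minimal sketch of the whole conversion, assuming resp.body really is UTF-8 (the helper name Utf8ToBstr is mine):
BSTR Utf8ToBstr(const std::string& s)
{
    if (s.empty())
        return ::SysAllocString(L"");
    // First call computes the required UTF-16 length, second call converts
    int len = ::MultiByteToWideChar(CP_UTF8, 0, s.c_str(), static_cast<int>(s.size()), NULL, 0);
    BSTR b = ::SysAllocStringLen(NULL, len); // allocates len characters plus the terminator
    ::MultiByteToWideChar(CP_UTF8, 0, s.c_str(), static_cast<int>(s.size()), b, len);
    return b;
}
// Usage: *restr = Utf8ToBstr(resp.body);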
Bonus reading:
Eric's Complete Guide To BSTR Semantics
This is very much possible:
std::string singer("happy new year 2016");
_bstr_t sa_1(singer.c_str()); //std::string to _bstr_t
_bstr_t sa_2("Goodbye 2015");
std::string kapa(sa_2); //_bstr_t to std::string
size_t sztBuffer = (resp.body.length() + 1) * sizeof(wchar_t);
wchar_t* pBuffer = new wchar_t[resp.body.length() + 1];
ZeroMemory(pBuffer, sztBuffer); // guarantees NUL termination after the conversion
MultiByteToWideChar(CP_ACP, 0, resp.body.c_str(), (int)resp.body.length(), pBuffer, (int)resp.body.length());
*restr = SysAllocString(pBuffer);
delete[] pBuffer;
Do not forget to deallocate it afterward.

How to concatenate char* and LPWSTR string?

I want to use the MoveFile function. This function takes two LPWSTR arguments, but I have one char* and one LPWSTR. How do I concatenate them?
//move file
LPWSTR latestFile = L"test.SPL";
char* spoolFolder = "C:\\Windows\\System32\\spool\PRINTERS\\";
LPWSTR fileToMove = spoolFolder + latestFile;
BOOL moved = MoveFile(latestFile, L"C:\\UnprocessedFiles\\" + latestFile);
Just for clarification, LPWSTR is a typedef for wchar_t*. You can use wcscat_s to concatenate strings of this form (see the sketch after the code below). Your one char* string should just be changed to the same type, since you have it there as a simple literal (prefix the literal with L and change the declared type). Since you tagged this as C++, however, you can do all of this more simply with the std::wstring class.
std::wstring latestFile = L"test.SPL";
std::wstring spoolFolder = L"C:\\Windows\\System32\\spool\\PRINTERS\\";
std::wstring fileToMove = spoolFolder + latestFile;
BOOL moved = MoveFile(latestFile.c_str(), fileToMove.c_str());
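And if you prefer the raw wcscat_s() route mentioned above, a sketch with a fixed buffer:
wchar_t fileToMove[MAX_PATH] = L"C:\\Windows\\System32\\spool\\PRINTERS\\";
wcscat_s(fileToMove, MAX_PATH, L"test.SPL"); // appends, checking the buffer size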
Indeed, LPWSTR is just a typedef for wchar_t*. If you consult MSDN you will see:
typedef wchar_t* LPWSTR;
Here wchar_t* means that your string will be encoded with the Unicode scheme, not the ANSI one. Under Windows, a Unicode string is UTF-16 (2 bytes per code unit).
std::wstring is likewise a typedef for std::basic_string<wchar_t>, so declaring your inputs as std::wstring and calling c_str() on them will do the job.

Convert CString to const char*

How do I convert from CString to const char* in my Unicode MFC application?
To convert a TCHAR CString to ASCII, use the CT2A macro - this will also allow you to convert the string to UTF8 (or any other Windows code page):
// Convert using the local code page
CString str(_T("Hello, world!"));
CT2A ascii(str);
TRACE(_T("ASCII: %S\n"), ascii.m_psz);
// Convert to UTF8
CString str(_T("Some Unicode goodness"));
CT2A ascii(str, CP_UTF8);
TRACE(_T("UTF8: %S\n"), ascii.m_psz);
// Convert to Thai code page
CString str(_T("Some Thai text"));
CT2A ascii(str, 874);
TRACE(_T("Thai: %S\n"), ascii.m_psz);
There is also a macro to convert from ASCII -> Unicode (CA2T) and you can use these in ATL/WTL apps as long as you have VS2003 or greater.
See the MSDN for more info.
If your CString is Unicode, you'll need to do a conversion to multi-byte characters. Fortunately there is a version of CString which will do this automatically.
CString unicodestr = _T("Testing");
CStringA charstr(unicodestr);
DoMyStuff((const char *) charstr);
Note: This answer predates the Unicode requirement; see the comments.
Just cast it:
CString s;
const TCHAR* x = (LPCTSTR) s;
It works because CString has a cast operator to do exactly this.
Using TCHAR makes your code Unicode-independent; if you're not concerned about Unicode you can simply use char instead of TCHAR.
There is an explicit cast on CString to LPCTSTR, so you can do (provided Unicode is not specified):
CString str;
// ....
const char* cstr = (LPCTSTR)str;
I had a similar problem. I had a char* buffer with the .so name in it.
I could not convert the char* variable to LPCTSTR. Here's how I got around it...
char *fNam;
...
LPCSTR nam = fNam;
dll = LoadLibraryA(nam);
I recommend using TtoC from ConvUnicode.h
const CString word= "hello";
const char* myFile = TtoC(path.GetString());
It is a macro that performs the conversion according to the Unicode setting.
Generic Conversion Macros (TN059 Other Considerations section is important):
A2CW (LPCSTR) -> (LPCWSTR)
A2W (LPCSTR) -> (LPWSTR)
W2CA (LPCWSTR) -> (LPCSTR)
W2A (LPCWSTR) -> (LPSTR)
TN059: Using MFC MBCS/Unicode Conversion Macros
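A minimal usage sketch of these macros (they require USES_CONVERSION in scope, per TN059; the converted string lives on the stack until the function returns):
USES_CONVERSION;
LPCWSTR wide = L"hello";
LPCSTR narrow = W2A(wide); // wide -> ANSI
LPCWSTR back = A2W(narrow); // ANSI -> wide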
I used this conversion:
CString cs = "TEST";
char* c = cs.GetBuffer(cs.GetLength()); // ANSI build; call cs.ReleaseBuffer() when done with c
I hope this is useful.

How do you convert CString and std::string std::wstring to each other?

CString is quite handy, while std::string is more compatible with STL containers. I am using hash_map. However, hash_map does not support CString as a key, so I want to convert the CString into a std::string.
Writing a CString hash function seems to take a lot of time.
CString -----> std::string
How can I do this?
std::string -----> CString:
inline CString toCString(std::string const& str)
{
return CString(str.c_str());
}
Am I right?
EDIT:
Here are more questions:
How can I convert from wstring to CString and vice versa?
// wstring -> CString
std::wstring src;
CString result(src.c_str());
// CString -> wstring
CString src;
std::wstring des(src.GetString());
Is there any problem with this?
Additionally, how can I convert from std::wstring to std::string and vice versa?
According to CodeGuru:
CString to std::string:
CString cs("Hello");
std::string s((LPCTSTR)cs);
BUT: std::string cannot always be constructed from an LPCTSTR, i.e. the code will fail for UNICODE builds.
As std::string can construct only from LPSTR / LPCSTR, a programmer who uses VC++ 7.x or better can utilize conversion classes such as CT2CA as an intermediary.
CString cs ("Hello");
// Convert a TCHAR string to a LPCSTR
CT2CA pszConvertedAnsiString (cs);
// construct a std::string using the LPCSTR input
std::string strStd (pszConvertedAnsiString);
std::string to CString: (From Visual Studio's CString FAQs...)
std::string s("Hello");
CString cs(s.c_str());
CStringT can be constructed from both narrow and wide character strings, i.e. it can convert from char* (LPSTR) or from wchar_t* (LPWSTR).
In other words, the char specialization (CStringA), the wchar_t specialization (CStringW), and the TCHAR specialization (CString) can each be constructed from either char or wide-character, null-terminated (null-termination is very important here) string sources.
Although IInspectable amends the "null-termination" part in the comments:
NUL-termination is not required.
CStringT has conversion constructors that take an explicit length argument. This also means that you can construct CStringT objects from std::string objects with embedded NUL characters.
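A sketch of that length-taking constructor with an embedded NUL (using CStringA for brevity):
std::string s("abc\0def", 7); // 7 chars, one of them an embedded NUL
CStringA cs(s.c_str(), static_cast<int>(s.size())); // copies all 7 characters
// cs.GetLength() == 7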
Solve that by using std::basic_string<TCHAR> instead of std::string and it should work fine regardless of your character setting.
If you want something more C++-like, this is what I use. Although it depends on Boost, that's just for exceptions. You can easily remove those leaving it to depend only on the STL and the WideCharToMultiByte() Win32 API call.
#include <string>
#include <vector>
#include <cassert>
#include <exception>
#include <boost/system/system_error.hpp>
#include <boost/integer_traits.hpp>
/**
* Convert a Windows wide string to a UTF-8 (multi-byte) string.
*/
std::string WideStringToUtf8String(const std::wstring& wide)
{
if (wide.size() > boost::integer_traits<int>::const_max)
throw std::length_error(
"Wide string cannot be more than INT_MAX characters long.");
if (wide.size() == 0)
return "";
// Calculate necessary buffer size
int len = ::WideCharToMultiByte(
CP_UTF8, 0, wide.c_str(), static_cast<int>(wide.size()),
NULL, 0, NULL, NULL);
// Perform actual conversion
if (len > 0)
{
std::vector<char> buffer(len);
len = ::WideCharToMultiByte(
CP_UTF8, 0, wide.c_str(), static_cast<int>(wide.size()),
&buffer[0], static_cast<int>(buffer.size()), NULL, NULL);
if (len > 0)
{
assert(len == static_cast<int>(buffer.size()));
return std::string(&buffer[0], buffer.size());
}
}
throw boost::system::system_error(
::GetLastError(), boost::system::system_category());
}
It is more efficient to convert CString to std::string using the conversion where the length is specified.
CString someStr("Hello how are you");
std::string std(someStr, someStr.GetLength());
In a tight loop, this makes a significant performance improvement.
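Note this relies on CString's implicit LPCTSTR conversion, so it only compiles when TCHAR is char (MBCS builds). A sketch of the same length-aware idea for a Unicode build, going through CStringA:
CString someStr(_T("Hello how are you"));
CStringA narrow(someStr); // ANSI conversion when built for Unicode
std::string s(narrow.GetString(), narrow.GetLength());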
Is there any problem?
There are several issues:
CString is a template specialization of CStringT. Depending on the BaseType describing the character type, there are two concrete specializations: CStringA (using char) and CStringW (using wchar_t).
While wchar_t on Windows is ubiquitously used to store UTF-16 encoded code units, using char is ambiguous. The latter commonly stores ANSI encoded characters, but can also store ASCII, UTF-8, or even binary data.
We don't know the character encoding (or even character type) of CString (which is controlled through the _UNICODE preprocessor symbol), making the question ambiguous. We also don't know the desired character encoding of std::string.
Converting between Unicode and ANSI is inherently lossy: ANSI encoding can only represent a subset of the Unicode character set.
To address these issues, I'm going to assume that wchar_t will store UTF-16 encoded code units, and char will hold UTF-8 octet sequences. That's the only reasonable choice you can make to ensure that source and destination strings retain the same information, without limiting the solution to a subset of the source or destination domains.
The following implementations convert between CStringA/CStringW and std::wstring/std::string mapping from UTF-8 to UTF-16 and vice versa:
#include <string>
#include <atlconv.h>
std::string to_utf8(CStringW const& src_utf16)
{
return { CW2A(src_utf16.GetString(), CP_UTF8).m_psz };
}
std::wstring to_utf16(CStringA const& src_utf8)
{
return { CA2W(src_utf8.GetString(), CP_UTF8).m_psz };
}
The remaining two functions construct C++ string objects from MFC strings, leaving the encoding unchanged. Note that while the previous functions cannot cope with embedded NUL characters, these functions are immune to that.
#include <string>
#include <atlconv.h>
std::string to_std_string(CStringA const& src)
{
return { src.GetString(), src.GetString() + src.GetLength() };
}
std::wstring to_std_wstring(CStringW const& src)
{
return { src.GetString(), src.GetString() + src.GetLength() };
}
(Since VS2012 ...and at least until VS2017 v15.8.1)
Since it's an MFC project and CString is an MFC class, MS provides a Technical Note, TN059: Using MFC MBCS/Unicode Conversion Macros, and Generic Conversion Macros:
A2CW (LPCSTR) -> (LPCWSTR)
A2W (LPCSTR) -> (LPWSTR)
W2CA (LPCWSTR) -> (LPCSTR)
W2A (LPCWSTR) -> (LPSTR)
Use:
void Example() // ** UNICODE case **
{
USES_CONVERSION; // (1)
// CString to std::string / std::wstring
CString strMfc{ "Test" }; // strMfc = L"Test"
std::string strStd = W2A(strMfc); // ** Conversion Macro: strStd = "Test" **
std::wstring wstrStd = strMfc.GetString(); // wstrStd = L"Test"
// std::string to CString / std::wstring
strStd = "Test 2";
strMfc = strStd.c_str(); // strMfc = L"Test 2"
wstrStd = A2W(strStd.c_str()); // ** Conversion Macro: wstrStd = L"Test 2" **
// std::wstring to CString / std::string
wstrStd = L"Test 3";
strMfc = wstrStd.c_str(); // strMfc = L"Test 3"
strStd = W2A(wstrStd.c_str()); // ** Conversion Macro: strStd = "Test 3" **
}
--
Footnotes:
(1) In order for the conversion macros to have space to store the temporary length, it is necessary to declare a local variable called _convert in each function that uses them. This is done by invoking the USES_CONVERSION macro. In VS2017 MFC code (atlconv.h) it looks like this:
#ifndef _DEBUG
#define USES_CONVERSION int _convert; (_convert); UINT _acp = ATL::_AtlGetConversionACP() /*CP_THREAD_ACP*/; (_acp); LPCWSTR _lpw; (_lpw); LPCSTR _lpa; (_lpa)
#else
#define USES_CONVERSION int _convert = 0; (_convert); UINT _acp = ATL::_AtlGetConversionACP() /*CP_THREAD_ACP*/; (_acp); LPCWSTR _lpw = NULL; (_lpw); LPCSTR _lpa = NULL; (_lpa)
#endif
This works fine:
//Convert CString to std::string
inline std::string to_string(const CString& cst)
{
return CT2A(cst.GetString());
}
to convert CString to std::string. You can use this format:
std::string sText(CW2A(CSText.GetString(), CP_UTF8 ));
CString has a method, GetString(), that returns an LPCWSTR if you are using Unicode, or an LPCSTR otherwise.
In the Unicode case, you must pass it through a wstring:
CString cs("Hello");
wstring ws = wstring(cs.GetString());
string s = string(ws.begin(), ws.end()); // keeps only the low byte of each wchar_t, safe for ASCII only
Else you can simply convert the string directly:
CString cs("Hello");
string s = string(cs.GetString());
From this post (thank you, Mark Ransom):
Convert CString to string (VC6)
I have tested this and it works fine.
std::string Utils::CString2String(const CString& cString)
{
std::string strStd;
for (int i = 0; i < cString.GetLength(); ++i)
{
if (cString[i] <= 0x7f)
strStd.append(1, static_cast<char>(cString[i]));
else
strStd.append(1, '?');
}
return strStd;
}
This is a follow-up to Sal's answer, where he/she provided the solution:
CString someStr("Hello how are you");
std::string std(someStr, someStr.GetLength());
This is also useful when converting a non-typical C string to a std::string.
A use case for me was having a pre-allocated char array (like a C string) that is not NUL-terminated (e.g. a SHA digest).
The above syntax allows me to specify the length of the SHA digest of the char array so that std::string doesn't have to look for the terminating NUL char, which may or may not be there.
Such as:
unsigned char hashResult[SHA_DIGEST_LENGTH];
auto value = std::string(reinterpret_cast<char*>(hashResult), SHA_DIGEST_LENGTH);
You can use CT2CA
CString datasetPath;
CT2CA st(datasetPath);
string dataset(st);
None of the other answers quite addressed what I was looking for, which was to convert a CString on the fly, as opposed to storing the result in a variable.
The solution is similar to the above, but we need one more step to instantiate a nameless object. I am illustrating with an example. Here is my function, which needs a std::string, but I have a CString.
void CStringsPlayDlg::writeLog(const std::string &text)
{
std::string filename = "c:\\test\\test.txt";
std::ofstream log_file(filename.c_str(), std::ios_base::out | std::ios_base::app);
log_file << text << std::endl;
}
How to call it when you have a CString?
std::string firstName = "First";
CString lastName = _T("Last");
writeLog( firstName + ", " + std::string( CT2A( lastName ) ) );
Note that the last line is not a direct typecast; we are creating a nameless std::string object and supplying the CString via its constructor.
One interesting approach is to cast the CString to CStringA inside a string constructor. Unlike std::string s((LPCTSTR)cs);, this will work even if _UNICODE is defined. However, in that case it performs a conversion from Unicode to ANSI, so it is unsafe for Unicode values beyond the ASCII character set. Such a conversion is subject to the _CSTRING_DISABLE_NARROW_WIDE_CONVERSION preprocessor definition. https://msdn.microsoft.com/en-us/library/5bzxfsea.aspx
CString s1("SomeString");
string s2((CStringA)s1);
If you're looking to convert easily between other strings types, perhaps the _bstr_t class would be more appropriate? It supports converstion between char, wchar_t and BSTR.
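A sketch of that idea, with _bstr_t as the middle man (comdef.h; the narrow conversion uses the current code page):
#include <comdef.h>
CString cs(_T("Hello"));
_bstr_t b(cs.GetString()); // CString -> _bstr_t (either width is accepted)
std::string s((const char*)b); // _bstr_t -> narrow string
std::wstring ws((const wchar_t*)b); // _bstr_t -> wide string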
Works for me (note: these helpers rely on the narrow LPCTSTR conversion, so they only compile in ANSI/MBCS builds, and the byte-wise copies are safe only for ASCII text):
std::wstring CStringToWString(const CString& s)
{
std::string s2;
s2 = std::string((LPCTSTR)s);
return std::wstring(s2.begin(), s2.end());
}
CString WStringToCString(std::wstring s)
{
std::string s2;
s2 = std::string(s.begin(), s.end()); // keeps only the low byte of each wchar_t
return s2.c_str();
}
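For non-ASCII text, a lossless sketch of the same helpers using the ATL conversion classes shown earlier in this thread (assuming atlconv.h is available):
#include <atlconv.h>
std::wstring CStringToWString(const CString& s)
{
return std::wstring(CT2W(s.GetString())); // converts in MBCS builds, pass-through in Unicode builds
}
CString WStringToCString(const std::wstring& s)
{
return CString(s.c_str()); // CString accepts wide input in any build
}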