How would you convert a std::string to BSTR*?
STDMETHODIMP CMyRESTApp::rest(BSTR data, BSTR* restr)
{
    RESTClient restclient;
    RESTClient::response resp = restclient.get(data);
    Log("Response Status code: %s", resp.code);
    Log("Response Body: %s", resp.body);
    *restr = // here
    return S_OK;
}
I need to convert resp.body so that it can be returned via *restr here.
An ATL-based approach is to use ATL::CComBSTR and then Detach() (or CopyTo(...)) the resulting CComBSTR into the BSTR*.
Something like:
CComBSTR temp(stlstr.c_str());
*restr = temp.Detach();
Otherwise, for std::basic_string in general, you can use the Win32 Sys* family of functions, such as SysAllocStringByteLen and SysAllocString:
// For the `const char*` data type (`LPCSTR`):
*restr = SysAllocStringByteLen(stlstr.c_str(), stlstr.size());
// More suitable for OLECHAR
*restr = SysAllocString(stlwstr.c_str());
OLECHAR depends on the target platform, but generally it is wchar_t.
Given your code, the shortest snippet could just be;
*restr = SysAllocStringByteLen(resp.body.c_str(), resp.body.size());
Note that SysAllocStringByteLen copies the bytes as-is, without any code-page conversion, so the resulting BSTR holds raw byte data rather than UTF-16 text; see the MSDN documentation on the Sys* functions if you need a genuine character-set conversion.
std::string is made by chars; BSTR is usually a Unicode UTF-16 wchar_t-based string, with a length prefix.
Even though a BSTR can be used as a simple way to marshal a byte array (being length-prefixed, it can store embedded NULs), and so could in principle hold non-UTF-16 text, the usual "natural" behavior of a BSTR is to contain a Unicode UTF-16 wchar_t string.
So, the first problem is to clarify what kind of encoding the std::string uses (for example: Unicode UTF-8? Or some other code page?). Then you have to convert that string to Unicode UTF-16, and create a BSTR containing that UTF-16 string.
To convert from UTF-8 (or some other code page) to UTF-16, you can use the MultiByteToWideChar() function. If the source std::string contains a UTF-8 string, you can use the CP_UTF8 code page value with the aforementioned API.
Once you have the UTF-16 converted string, you can create a BSTR using it, and pass that as the output BSTR* parameter.
The main Win32 API to create a BSTR is SysAllocString(). There are also some variants in which you can specify the string length.
Or, as a more convenient alternative, you can use the ATL's CComBSTR class to wrap a BSTR in safe RAII boundaries, and use its Detach() method to pass the BSTR as an output BSTR* parameter.
CComBSTR bstrResult( /* UTF-16 string from std::string */ );
*restr = bstrResult.Detach();
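The narrow-to-wide step itself follows a measure-then-convert pattern: one call to learn the required length, a second call to do the conversion. A minimal portable sketch using std::mbstowcs (a Win32 version would make the same two calls with MultiByteToWideChar and, for UTF-8 input, CP_UTF8; the helper name is illustrative):

```cpp
#include <cassert>
#include <cstdlib>
#include <string>

// Hypothetical helper illustrating the measure-then-convert pattern with the
// portable std::mbstowcs; interpretation of the bytes follows the current locale.
std::wstring to_wide(const std::string& narrow) {
    // First call with a null destination: only the required length is reported.
    std::size_t needed = std::mbstowcs(nullptr, narrow.c_str(), 0);
    if (needed == static_cast<std::size_t>(-1))
        return std::wstring(); // invalid multibyte sequence in the current locale
    std::wstring wide(needed, L'\0');
    // Second call converts into the exactly-sized buffer; no terminator is
    // written or needed, since std::wstring tracks its own length.
    std::mbstowcs(&wide[0], narrow.c_str(), needed);
    return wide;
}
```

With such a helper, the handler's output line could then be `*restr = SysAllocString(to_wide(resp.body).c_str());` (assuming resp.body is encoded per the current locale).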
Bonus reading:
Eric's Complete Guide To BSTR Semantics
This is very much possible:
std::string singer("happy new year 2016");
_bstr_t sa_1(singer.c_str()); //std::string to _bstr_t
_bstr_t sa_2("Goodbye 2015");
std::string kapa(sa_2); //_bstr_t to std::string
size_t sztBuffer = (resp.body.length() + 1) * sizeof(wchar_t);
wchar_t* pBuffer = new wchar_t[resp.body.length() + 1];
ZeroMemory(pBuffer, sztBuffer);
MultiByteToWideChar(CP_ACP, 0, resp.body.c_str(), static_cast<int>(resp.body.length()), pBuffer, static_cast<int>(resp.body.length()));
*restr = SysAllocString(pBuffer);
delete[] pBuffer;
Do not forget to delete[] the temporary buffer afterward, as shown.
Is there an ICU function to create a std::wstring from an ICU UnicodeString? I have been searching the ICU manual but haven't been able to find one.
(I know I can convert the UnicodeString to UTF-8 and then convert to a platform-dependent wchar_t*, but I am looking for a single function in UnicodeString that can do this conversion.)
The C++ standard doesn't dictate any specific encoding for std::wstring. On Windows systems, wchar_t is 16-bit, and on Linux, macOS, and several other platforms, wchar_t is 32-bit. As far as C++'s std::wstring is concerned, it is just an arbitrary sequence of wchar_t in much the same way that std::string is just an arbitrary sequence of char.
It seems that icu::UnicodeString has no in-built way of creating a std::wstring, but if you really want to create a std::wstring anyway, you can use the C-based API u_strToWCS() like this:
icu::UnicodeString ustr = /* get from somewhere */;
std::wstring wstr;
int32_t requiredSize;
UErrorCode error = U_ZERO_ERROR;
// obtain the size of string we need
u_strToWCS(nullptr, 0, &requiredSize, ustr.getBuffer(), ustr.length(), &error);
// the preflight call reports U_BUFFER_OVERFLOW_ERROR, so reset before converting
error = U_ZERO_ERROR;
// resize accordingly (this will not include any terminating null character, but it doesn't need to)
wstr.resize(requiredSize);
// copy the UnicodeString buffer to the std::wstring
u_strToWCS(wstr.data(), wstr.size(), nullptr, ustr.getBuffer(), ustr.length(), &error);
Supposedly, u_strToWCS() will use the most efficient method for converting from UChar to wchar_t (if they are the same size, it is presumably just a straightforward copy).
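The two-call preflight idiom above (null destination to learn the size, then a second call into a buffer of exactly that size) is common across C APIs; here is a portable illustration of the same pattern using std::snprintf, with an illustrative helper name:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hypothetical example of the preflight idiom: ask for the size first,
// then call again with a buffer sized to fit.
std::string format_pair(const char* name, int value) {
    // First call with a null buffer: snprintf returns the length it would need.
    int needed = std::snprintf(nullptr, 0, "%s-%d", name, value);
    std::string out(static_cast<std::size_t>(needed) + 1, '\0');
    // Second call writes for real; the +1 slot holds snprintf's terminator.
    std::snprintf(&out[0], out.size(), "%s-%d", name, value);
    out.resize(static_cast<std::size_t>(needed)); // drop the terminator slot
    return out;
}
```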
I am making use of the EAGetMail Library but instead of hard coding the username and password, I am trying to pass two CString values as the credentials but it doesn't seem to like that.
CString username;
pObject->GetDlgItemText(IDC_EDIT1, username);
CString password;
pObject->GetDlgItemText(IDC_EDIT2, password);
IMailServerPtr oServer = NULL;
oServer.CreateInstance(__uuidof(EAGetMailObjLib::MailServer));
oServer->User = _T("myusername"); //THIS WORKS HARD CODED
Doesn't work:
oServer->User = username; // Error: cannot be called with given argument list
I also tried:
oServer->User = _T(username); // error: identifier "Lusername" is undefined
I guess I need to convert the CString somehow?
Tried the following:
//Get Email Credentials
CString username;
pObject->GetDlgItemText(IDC_EDIT1, username);
CString password;
pObject->GetDlgItemText(IDC_EDIT2, password);
_bstr_t usernamea(pObject->GetDlgItemText(IDC_EDIT1, username));
_bstr_t passworda(pObject->GetDlgItemText(IDC_EDIT2, password));
This worked:
//Get Email Credentials
CString username;
pObject->GetDlgItemText(IDC_EDIT1, username);
CString password;
pObject->GetDlgItemText(IDC_EDIT2, password);
CComBSTR bstrUsername;
bstrUsername.Attach(username.AllocSysString()); // Attach, so the freshly allocated BSTR isn't copied and leaked
_bstr_t usernamea(bstrUsername);
CComBSTR bstrPassword;
bstrPassword.Attach(password.AllocSysString());
_bstr_t passworda(bstrPassword);
oServer->User = usernamea;
oServer->Password = passworda;
Using C-style character pointers is the price we pay for making lower-level system calls to the OS. Microsoft makes it more complicated because it provides pairs of system calls, one taking char * and the other wchar_t *. Typically you want to be consistent with one set or the other, so Microsoft provides a host of macros that let you refer to the typical character type, TCHAR, in the abstract.
_T is a Microsoft-specific macro that you should use only on string literals. It prepends an L to the literal when compiling a Unicode program (when TCHAR is wchar_t) and does nothing for other programs (where TCHAR is char).
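For illustration, the mechanics of _T can be re-created with hypothetical names (the real definitions live in <tchar.h> and key off the _UNICODE macro; MY_UNICODE and MY_T below are stand-ins, not the real API):

```cpp
#include <cassert>
#include <string>

// Hypothetical re-creation of the TCHAR/_T mechanics; MY_UNICODE stands in
// for the real _UNICODE build flag.
#ifdef MY_UNICODE
typedef wchar_t MY_TCHAR;      // Unicode build: wide characters
#define MY_T(x) L##x           // _T prepends L to the literal
#else
typedef char MY_TCHAR;         // ANSI build: narrow characters
#define MY_T(x) x              // _T is a no-op
#endif
```

This also shows why _T(username) fails: token-pasting L onto a variable name produces the undefined identifier Lusername, which is why the macro only works on literals.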
You've demonstrated that IMailServer::User will accept a string literal of type TCHAR const *, but you store your data in an MFC CString.
From the documentation of the MFC CString object:
A CString object keeps character data in a CStringData object. CString accepts NULL-terminated C-style strings. CString tracks the string length for faster performance, but it also retains the NULL character in the stored character data to support conversion to LPCWSTR. CString includes the null terminator when it exports a C-style string.
You can convert a CString to a TCHAR * but for this you have to do it explicitly:
oServer->User = (LPCTSTR)username;
or the C++ way
oServer->User = static_cast<TCHAR const *>(username);
this will call CStringT<TCHAR>::operator PCXSTR to get the raw character pointer.
(Personally I'd use a std::basic_string<TCHAR> rather than a CString but that's just me).
Now I am doing a Windows Metro App job, and I am developing a C++ component to get the input method's candidate List.
An issue occurs at the last step, outputting the result list. I can get each member of the candidate list, but its type is BSTR; to use them in the Metro app I have to convert them to Platform::String.
How do I convert a BSTR to a Platform::String?
I really appreciate any help you can provide.
The Platform::String constructor overloads make this very easy:
BSTR comstr = SysAllocString(L"Hello world");
auto wrtstr = ref new Platform::String(comstr);
To be perfectly complete, both BSTR and a WinRT string can contain an embedded 0. If you want to handle that corner case then:
BSTR comstr = SysAllocStringLen(L"Hello\0world", 11);
auto wrtstr = ref new Platform::String(comstr, SysStringLen(comstr));
According to MSDN ([1],[2]) we have these definitions:
#if !defined(_NATIVE_WCHAR_T_DEFINED)
typedef unsigned short WCHAR;
#else
typedef wchar_t WCHAR;
#endif
typedef WCHAR OLECHAR;
typedef OLECHAR* BSTR;
So a BSTR is of type wchar_t* or unsigned short*, i.e. a null-terminated 16-bit string.
From what I see from the documentation of Platform::String the constructor accepts a null-terminated 16-bit string as const char16*
While char16 is guaranteed to represent UTF-16, wchar_t is not.
UPDATE: As Cheers and hth. - Alf pointed out, the line above isn't true for Windows/MSVC. I couldn't find any information on whether it is safe to cast between both types in this case. Since wchar_t and char16_t are distinct built-in types, I would still recommend using std::codecvt to avoid problems.
So you should use std::codecvt to convert from wchar_t* to char16_t* and pass the result to the constructor of your Platform::String.
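A minimal sketch of such a conversion, assuming the input is limited to BMP code points so each wchar_t maps to exactly one char16_t (std::codecvt is the general-purpose route, though it was deprecated in C++17; the helper name is illustrative):

```cpp
#include <cassert>
#include <string>

// Hypothetical helper: convert each unit individually. Safe only for code
// points in the Basic Multilingual Plane; surrogate-pair handling (needed for
// supplementary characters on platforms with 32-bit wchar_t) is deliberately
// omitted here.
std::u16string to_u16_bmp(const std::wstring& in) {
    std::u16string out;
    out.reserve(in.size());
    for (wchar_t c : in)
        out.push_back(static_cast<char16_t>(c));
    return out;
}
```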
When creating a BSTR (using _bstr_t as a wrapper class) I have to use one of the constructors of _bstr_t. Since a BSTR is a length-prefixed string that may contain null characters, there must be a possibility to create such a string from a native string without relying on the null termination of the given native string.
To give an example:
const wchar_t* pwNativeString = L"abc\0def\0\0ghi\0\0\0"; // + automatic "\0"
// Now I want to create a BSTR using _bstr_t by this string.
_bstr_t spBSTR = _bstr_t(pwNativeString);
The problem is that the constructor relies on the null-termination of pwNativeString. So the resulting BSTR is just "abc" and nothing more. So my question is: How to create a BSTR or _bstr_t and deliver a pointer to an array with a specific length? In the following a pseudo-code example:
_bstr_t spBSTR = _bstr_t(pwNativeString, 15);
Use SysAllocStringLen to allocate the BSTR, and then use the two-argument _bstr_t constructor to create a _bstr_t object from it. If you set the second parameter to true, then you'll need to call SysFreeString afterward. Otherwise, the _bstr_t object owns the string and will free it for you.
BSTR bstrIn = SysAllocStringLen(L"abc\0def\0\0ghi\0\0\0", 15);
_bstr_t spBSTR(bstrIn, false);
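The need for an explicit length can be seen without COM at all: any null-terminated API stops at the first embedded null, which is exactly why SysAllocStringLen takes a count. A portable sketch (terminated_length is an illustrative name):

```cpp
#include <cassert>
#include <cstddef>
#include <cwchar>

// The 15-unit payload from the question, embedded nulls included; sizeof sees
// all of it, but wcslen does not.
const wchar_t kPayload[] = L"abc\0def\0\0ghi\0\0\0";
const std::size_t kPayloadUnits = sizeof(kPayload) / sizeof(kPayload[0]) - 1; // 15

// Length as any null-terminated API (or the single-argument _bstr_t
// constructor) perceives it: it stops at the first embedded null.
std::size_t terminated_length(const wchar_t* s) {
    return std::wcslen(s);
}
```

Here terminated_length(kPayload) is 3 even though the payload is 15 units long, mirroring why the single-argument _bstr_t constructor truncates to "abc".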
I have a _bstr_t variable bstrErr and a CString variable csError. How do I assign the value in bstrErr to csError?
Is it not possible just to cast it:
_bstr_t b("Steve");
CString cs;
cs = (LPCTSTR) b;
I think this should work when the project is Unicode.
CString has constructors and assignment operators for both LPCSTR and LPCWSTR, so there is never a need to call WideCharToMultiByte, and you can't get the casting wrong in Unicode or non-Unicode mode.
You can just assign the string this way:
csError = bstrErr.GetBSTR();
Or use the constructor
CString csError( bstrErr.GetBSTR() );
I'm using GetBSTR. It's the same thing as casting bstrErr with (LPCWSTR), but I prefer it for legibility.
If you compile for Unicode - just assign the encapsulated BSTR to the CString. If you compile for ANSI you'll have to use WideCharToMultiByte() for conversion.
Also beware that the encapsulated BSTR can be null which corresponds to an empty string. If you don't take care of this your program will run into undefined behaviour.
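A minimal guard, sketched portably (on Windows a BSTR is a wchar_t*, and a null BSTR conventionally denotes the empty string; from_bstr_like is an illustrative name):

```cpp
#include <cassert>
#include <string>

// Hypothetical helper: treat a null pointer as the empty string, which is the
// documented convention for a null BSTR, before constructing the target string.
std::wstring from_bstr_like(const wchar_t* p) {
    return p ? std::wstring(p) : std::wstring();
}
```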
BSTR myBSTRVal;
CString BSTRasCString("");
char szValue[MAX_PATH] = "";
// This will map the BSTR to a new character string (szValue)
WideCharToMultiByte(CP_ACP, 0, myBSTRVal, -1, szValue, sizeof(szValue), NULL,
NULL);
BSTRasCString.Format("%s", szValue);
BSTRasCString.TrimLeft();
BSTRasCString.TrimRight();
CStringT, CString, CStringA, and CStringW:
CStringT is a complicated class template based on an arbitrary character type and helper class templates for managing the storage and the features.
The class CString is a typedef of the template class that uses the TCHAR character type. TCHAR is a generic type that resolves to wchar_t if the macro UNICODE is set, else to char.
The class CStringA is a typedef of the template class that uses internally the narrow character type char.
The class CStringW is a typedef of the template class that uses internally the wide character type wchar_t.
I never use CString in code, instead I always use the explicit classes CStringA or CStringW.
The classes CString* have constructors that accept narrow and wide strings. The same is true for _bstr_t. Strings of type BSTR must be allocated by the function SysAllocString() that expects an OLECHAR string, hence in Win32/64 a wide string. If you want to copy a _bstr_t that contains Unicode to a CStringA you must convert it to UTF8. I use the classes CW2A and CA2W for conversion.
In the following event function of a Word add-in, I show the use of these types:
STDMETHODIMP CConnect::TestButtonClicked(IDispatch* Command)
{
    BSTR smi = SysAllocString(L"Two smileys 😊 in a row: ");
    _bstr_t ley = L"😊\U0001F60A";

    /* Either using CStringA, UTF16 -> UTF8 conversion needed */
    CStringA smiley(CW2A(smi, CP_UTF8));
    smiley += CW2A(ley.GetBSTR(), CP_UTF8);
    MessageBoxW(NULL, CA2W(smiley, CP_UTF8), L"Example", MB_OK | MB_TASKMODAL);

    /* Or using CStringW, use ctor and += operator directly
    CStringW smiley = smi;
    smiley += ley.GetBSTR();
    MessageBoxW(NULL, smiley, L"Example", MB_OK | MB_TASKMODAL);
    */

    SysFreeString(smi);
    return S_OK;
}