I am trying to port a C++ library that makes heavy use of DWORD, CString and BYTE from the Win32 API to a Linux program.
I am using openSUSE 12.3 and the Anjuta IDE for this. Which types should I use in place of the ones mentioned?
I think I should use unsigned int for DWORD, string for CString and unsigned char instead of BYTE. Is that right?
CString will not convert directly to std::string, but std::string is a rough equivalent.
BYTE is indeed unsigned char and DWORD is unsigned int. WORD is unsigned short.
You should definitely use typedef actual_type WINDOWS_NAME; to fix up the code rather than going through it everywhere and replacing the types. I would add a new header file called something like "wintypes.h", and include it everywhere "windows.h" is currently used.
Edit for comment:
With CString, it really depends on how it is used (and whether the code uses what MS calls "Unicode" or "ASCII" strings). I would be tempted to create a class CString that wraps a std::string. Most of it can probably be implemented by simply calling the equivalent std::string function, but some functions may need a bit more programming. Again, it depends on which member functions of CString are actually being used.
For LP<type>, that is just a pointer to <type>, so typedef BYTE* LPBYTE; and typedef DWORD* LPDWORD; will do.
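A minimal sketch of such a "wintypes.h" (the exact set of names is an assumption; extend it with whatever types your code actually uses):

```cpp
// wintypes.h -- minimal Linux stand-ins for the Win32 types used here.
#include <cstdint>

typedef uint8_t  BYTE;   // 8-bit unsigned
typedef uint16_t WORD;   // 16-bit unsigned
typedef uint32_t DWORD;  // 32-bit unsigned
typedef BYTE*  LPBYTE;   // LP<type> is just <type>*
typedef DWORD* LPDWORD;

// Sanity checks matching the sizes Win32 guarantees.
static_assert(sizeof(BYTE)  == 1, "BYTE must be 1 byte");
static_assert(sizeof(WORD)  == 2, "WORD must be 2 bytes");
static_assert(sizeof(DWORD) == 4, "DWORD must be 4 bytes");
```

Using the fixed-width <cstdint> types rather than unsigned int/short keeps the sizes identical on both platforms regardless of the compiler's data model.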
DWORD = uint32_t
BYTE = uint8_t
These types are not OS-specific and were added in C++11. You need to include <cstdint> to get them. If you have an old compiler, you can use Boost's <boost/cstdint.hpp>, which is header-only.
Use std::string instead of CString, but you will need to change some code.
With these changes your code should compile on both Windows and Linux.
I would suggest using uint32_t and uint8_t from <stdint.h> for DWORD and BYTE, and plain char * or const char * for strings (or the std::string class in C++).
For existing code, though, typedefs are probably the best approach:
typedef unsigned char BYTE;
These can be changed easily.
If you rewrite code, use char, int and long where they fit, and the (u)intX_t types where you need a defined size.
typedef unsigned long DWORD;
typedef unsigned char BYTE;
CString -> maybe basic_string<TCHAR> ?
I am using SQLITE3 and can successfully read data from a SQLITE database
table and display it in C++ like so:
cout << sqlite3_column_text(dbResult, 1);
However, I need to convert the column result into a string.
Is there perhaps an easy way in C++ to convert the returned character pointer into a std::string?
Have been trying to find a solution, but to no avail.
Any suggestion would be much appreciated.
According to the docs, sqlite3_column_text() is declared as:
const unsigned char *sqlite3_column_text(sqlite3_stmt*, int iCol);
For some reason, it returns a const unsigned char*. (This might be for historical reasons, to emphasize the fact that the returned string is UTF-8 encoded.)
Thus, for assignment to a std::string (which can be assigned with const char* expressions among others), a small dirty trick does the job:
std::string myResult = (const char*)sqlite3_column_text(dbResult, 1);
This reinterprets the sequence of unsigned chars as a sequence of chars.
Please, note that the signedness of char is left to the compiler implementation and may be signed or unsigned. (In the major compilers MSVC, g++, clang, it's in fact signed.) Hence, it's accompanied by types signed char and unsigned char to make the signedness explicit (and independent of the used compiler) when necessary. The conversion in the above snippet doesn't change any contents of the returned string — it just makes it compatible for the assignment to std::string.
Googling a bit, I found another Q/A where the answer explains that the "small dirty trick" is legal according to the C++ standard:
Can I turn unsigned char into char and vice versa?
Now I am working on a Windows Metro app, and I am developing a C++ component to get the input method's candidate list.
An issue occurs at the last step, outputting the result list. I can get each member of the candidate list, but its type is BSTR; to use them in the Metro app, I have to convert them to Platform::String.
How do I convert a BSTR to a Platform::String?
I really appreciate any help you can provide.
The Platform::String constructor overloads make this very easy:
BSTR comstr = SysAllocString(L"Hello world");
auto wrtstr = ref new Platform::String(comstr);
To be perfectly complete, both BSTR and a WinRT string can contain an embedded 0. If you want to handle that corner case then:
BSTR comstr = SysAllocStringLen(L"Hello\0world", 11);
auto wrtstr = ref new Platform::String(comstr, SysStringLen(comstr));
According to MSDN ([1], [2]) we have these definitions:
#if !defined(_NATIVE_WCHAR_T_DEFINED)
typedef unsigned short WCHAR;
#else
typedef wchar_t WCHAR;
#endif
typedef WCHAR OLECHAR;
typedef OLECHAR* BSTR;
So BSTR is of type wchar_t* or unsigned short*, i.e. a null-terminated 16-bit string.
From what I can see in the documentation of Platform::String, the constructor accepts a null-terminated 16-bit string as const char16*.
While char16 is guaranteed to represent UTF-16, wchar is not.
UPDATE: As Cheers and hth. - Alf pointed out, the line above isn't true for Windows/MSVC. I couldn't find any information on whether it is safe to cast between both types in this case. Since wchar_t and char16_t are distinct integral types, I would still recommend using std::codecvt to avoid problems.
So you should use std::codecvt to convert from wchar_t* to char16_t* and pass the result to the constructor of your Platform::String.
Is there any C++ class that can be used like a string, one that has all the needed machinery such as comparators?
I want something like the string class that works on an array of bytes instead of chars. I'm asking because I don't want to rewrite something that already exists. I will use this class in std::map etc.
That's exactly what a std::string is. A char is essentially a byte: it takes up one byte of space and it accepts all the bitwise operators (bit shifting: <<, >>; bitwise AND/OR: &, |; etc.).
If for some reason you need something like an std::string but for a different datatype, simply use std::basic_string<DATATYPE>. In the STL, string itself is a typedef for basic_string<char>.
There is no built-in byte type in C++ (std::byte only arrived in C++17). You can use std::vector with unsigned char, which has a similar effect to a byte[] in Java, for example.
typedef unsigned char BYTE;
typedef std::vector<BYTE> ByteString;
Here's an interesting one. I'm writing an AES encryption algorithm, and have managed to get it making accurate encryptions. The trouble comes when I attempt to write the result to a file. I was getting files with incorrect output. Hex values would be mangled and it was just generally nonsensical (even by encrypted standards).
I did some debugging by sampling my encryption output before sending it to the file. What I found was that I was getting some type of overflow somewhere. When the correct hex value was supposed to be 9e, I would get ffffff9e. It would do this only to hex values above 7F, i.e. characters in the "extended" character set weren't being handled properly. This had happened to me earlier in my project as well, and the problem then had been using a char[][] container instead of an unsigned char[][] container.
My code uses strings to pass the encrypted data between the user interface and AES encryption class. I'm guessing that std::strings don't support the extended character set. So my question is: is there a way to instantiate an unsigned string, or will I have to find a way to replace all of my usage of strings?
std::string is really just a typedef, something like:
namespace std {
typedef basic_string<char> string;
}
It's fairly easy to create a variant for unsigned char:
typedef basic_string<unsigned char> ustring;
You will, however, have to change your code to use ustring (or whatever name you prefer) instead of std::string.
Depending on how you've written your code, that may not require editing all the code though. In particular, if you have something like:
namespace crypto {
using std::string;
class AES {
string data;
// ..
};
}
You can change the string type by changing only the using declaration:
namespace unsigned_types {
typedef std::basic_string<unsigned char> string;
}
// ...
namespace crypto {
using unsigned_types::string;
class AES {
string data;
};
}
Also note that different instantiations of a template are entirely separate types, even when the types over which they're instantiated are related, so the fact that you can convert implicitly between char and unsigned char doesn't mean you'll get a matching implicit conversion between basic_string<char> and basic_string<unsigned char>.
std::string is nothing more or less than a specialization of the std::basic_string<> template, so you can simply do a
typedef std::basic_string<unsigned char> ustring;
to get what you want.
Note that the C and C++ standards do not define whether plain char is signed or unsigned, so any program that converts a char directly to a larger type gets implementation-defined results for byte values above 0x7F.
Cast your value to unsigned char first:
char input = 250; // just an example
unsigned int n = static_cast<unsigned char>(input); // NOT: "unsigned int n = input;"
// ^^^^^^^^^^^^^^^^^^^^^^^^^^
The problem is that your char happens to be signed, and so its value is not the "byte value" that you want -- you have to convert to unsigned char to get that.
I'm working with a Microsoft Kinect SDK where functions return BSTR. I need to get a QString or std::string.
Here's what I've tried:
BSTR bstr = s->NuiUniqueId();
// QString qs((QChar*)bstr, SysStringLen(bstr));
std::wstring ws(bstr);
ui->lblDetails->setText(QString::fromStdWString(ws));
With this solution the program crashes. With the line that is commented out I get "unresolved external symbol SysStringLen".
Is SysStringLen the way to go, but I need to add some additional libraries (wouldn't the API include it already) or is there another solution?
Additional question: why does Microsoft do it? I mean:
#if !defined(_NATIVE_WCHAR_T_DEFINED)
typedef unsigned short WCHAR;
#else
typedef wchar_t WCHAR;
#endif
typedef WCHAR OLECHAR;
typedef OLECHAR* BSTR;
typedef BSTR* LPBSTR;
What's the reason behind stuff like this? And even if they find it beneficial to use internally, couldn't they just use a normal char array or std::(w)string in the API to make others' lives easier?
You can convert the BSTR to a char *, then convert that to a QString. Note that _com_util::ConvertBSTRToString allocates the char array with new[], so it must be freed to avoid a leak:
QString getQStringFromBstr(BSTR bstrVal){
    char *p = _com_util::ConvertBSTRToString(bstrVal);
    QString result(p);
    delete[] p;
    return result;
}
COM was designed to be a language-agnostic binary standard, which means I could use a VB function in C++ and a C++ function in, say, C# (with COM interop). This is the reason most of the strings and a few functions were changed to language-neutral types, IIRC.