explain this macro - c++

#define __T(x) L ## x
Found in code from one of the MFC source header files. It is mostly used for converting strings to ........ (I don't know what). If I am correct it converts strings to LPCTSTR... I don't know what that type is either...
I can't seem to convert a char* into an LPCTSTR. When doing MFC file handling, the following code always returns an error when trying to open the file:
char* filepath = "C:\\Program Files\\Microsoft Office\\Office12\\BITMAPS\\STYLES\\GLOBE.WMF";
if( !file.Open((LPCTSTR)filepath , CFile::modeRead, &fexp) )
{
fexp.ReportError();
return 1;
}
But if instead I write it this way, it doesn't give an error:
if( !file.Open( _T("C:\\Program Files\\Microsoft Office\\Office12\\BITMAPS\\STYLES\\GLOBE.WMF") , CFile::modeRead, &fexp) )
{
fexp.ReportError();
return 1;
}
I am looking at passing a variable as the first argument to the CFile::Open() method.

The ## operator is a preprocessor concatenation operator. That is, this is valid code:
#define DECLARE_PTR(X) typedef std::auto_ptr<X> X##Ptr
DECLARE_PTR(int); // gets expanded to typedef std::auto_ptr<int> intPtr
intPtr i(new int(1));
In your case, the _T macro prepends the wide-character literal prefix L to its argument. This only works with string literals. That means you can't write
char* str = "ABC";
wchar_t* wstr = _T(str); // error: Lstr is undefined
but you can safely write
char* str = "ABC";
LPTSTR wstr = _T("ABC"); // OK, gets expanded to wchar_t * wstr = L"ABC";
// when UNICODE is defined
// and char * wstr = "ABC"; when unicode is not defined
The L prefix turns char and string literals into their wide-character representation (from byte-wide char to sizeof(wchar_t)-wide wchar_t).
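To address the original question of passing a variable to CFile::Open: since _T only works on literals, a minimal sketch (assuming an MFC project, where CString's conversion constructor performs the narrow-to-wide conversion) looks like this:
#include <afx.h>   // MFC core: CString, CFile, CFileException
// Sketch: CString's conversion constructor widens a narrow char* string,
// so the resulting object can be passed wherever an LPCTSTR is expected.
const char* filepath = "C:\\Program Files\\Microsoft Office\\Office12\\BITMAPS\\STYLES\\GLOBE.WMF";
CString path(filepath);   // char* -> CString (TCHAR) conversion happens here

CFile file;
CFileException fexp;
if (!file.Open(path, CFile::modeRead, &fexp))   // operator LPCTSTR() supplies the pointer
{
    fexp.ReportError();
}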

The macro simply pastes L onto the front of its argument, so that:
_T("xyz")
becomes:
L"xyz"
This is the way to make a wide string literal, but in the non-Unicode builds _T maps to nothing, so you'll get regular narrow strings there.

_T() allows you to set up your string literals so that you can build as either Unicode or non-unicode.
In non-Unicode builds it evaluates to nothing, so a string literal is represented as "XYZ", a normal narrow string. In a Unicode build it evaluates to L (L"XYZ"), which tells the compiler that the string literal is a wide-character string. This, together with the various "T" string typedefs such as LPCTSTR, allows you to write code that builds correctly for both Unicode and non-Unicode configurations.
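As a small sketch of what that means in practice (assuming <windows.h> and <tchar.h> are available):
#include <windows.h>
#include <tchar.h>
// The same source line yields a narrow or a wide literal depending on
// whether UNICODE/_UNICODE are defined for the build:
LPCTSTR caption = _T("XYZ");
// Unicode build:     const wchar_t* caption = L"XYZ";
// non-Unicode build: const char*    caption = "XYZ";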
Note that Google is your friend; simply typing _T into Google gives several useful results...

Related

Conversion (const char*) var goes wrong

I need to convert from CString to double in Embedded Visual C++, which supports only old style C++. I am using the following code
CString str = "4.5";
double var = atof( (const char*) (LPCTSTR) str );
and the result is var=4.0, so I am losing decimal digits.
I have made another test
LPCTSTR str = "4.5";
const char* var = (const char*) str;
and the result is again 4.0.
Can anyone help me to get a correct result?
The issue here is that you are lying to the compiler, and the compiler trusts you. Since you are using Embedded Visual C++, I'm going to assume that you are targeting Windows CE. Windows CE exposes a Unicode-only API surface, so your project is very likely set to use Unicode (UTF-16 LE encoding).
In that case, CString expands to CStringW, which stores code units as wchar_t. When doing (const char*) (LPCTSTR) str you are then casting from a wchar_t const* to a char const*. Given the input, the first byte has the value 52 (the ASCII encoding for the character 4). The second byte has the value 0. That is interpreted as the terminator of the C-style string. In other words, you are passing the string "4" to your call to atof. Naturally, you'll get the value 4.0 as the result.
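A minimal sketch of what that cast does (assuming a Windows build, where wchar_t is 2 bytes and wide strings are UTF-16 LE):
#include <cstdio>
#include <cstdlib>
int main()
{
    // In UTF-16 LE, L"4.5" is stored as the bytes 34 00 2E 00 35 00 00 00.
    const wchar_t* wide = L"4.5";
    const char* narrow = reinterpret_cast<const char*>(wide);  // the "lie"
    // Read as a char*, the string ends at the first 00 byte, i.e. it is just "4".
    std::printf("%s -> %f\n", narrow, std::atof(narrow));      // prints: 4 -> 4.000000
}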
To fix the code, use something like the following:
CStringW str = L"4.5";
double var = _wtof( str.GetString() );
_wtof is a Microsoft-specific extension to its CRT.
Note two things in particular:
The code uses a CString variant with an explicit character encoding (CStringW). Always be explicit about your string types. This helps readers follow your code and catches bugs before they happen (although all those C-style casts in the original code defeat that entirely).
The code calls the CString::GetString member to retrieve a pointer to the immutable buffer. This, too, makes the code easier to read, by not using what looks to be a C-style cast (but is an operator instead).
Also consider defining the _CSTRING_DISABLE_NARROW_WIDE_CONVERSION macro to prevent inadvertent character set conversions from happening (e.g. CString str = "4.5";). This, too, helps you catch bugs early (unless you defeat that with C-style casts as well).
CString is not const char*. To convert a TCHAR CString to ASCII, use the CT2A macro - this will also allow you to convert the string to UTF-8 (or any other Windows code page):
// Convert using the local code page
CString str(_T("Hello, world!"));
CT2A ascii(str);
TRACE(_T("ASCII: %S\n"), ascii.m_psz);
// Convert to UTF8
CString str(_T("Some Unicode goodness"));
CT2A ascii(str, CP_UTF8);
TRACE(_T("UTF8: %S\n"), ascii.m_psz);
Found a solution using scanf
CString str = "4.5";
double var=0.0;
_stscanf( str, _T("%lf"), &var );
This gives a correct result var=4.5
Thanks everyone for comments and help.

C++ Convert string to TCHAR*

Below is my code :
string szDllPath = "12345";
size_t size = szDllPath.length();
TCHAR* wArr = new TCHAR[size];
for (size_t i = 0; i < size; ++i)
wArr[i] = _T(szDllPath[i]);
cout<<szDllPath<<endl;
cout<<wArr<<endl;
And the output of wArr shows extra garbage characters after the expected "12345".
I don't understand why I get the extra characters, and how do I solve it?
You forgot to provide space for and add a terminating zero.
Preface:
New code should not be using Generic-Text Mappings. They were invented in the 90s to ease porting code from ANSI systems (Win9x) to Unicode systems (Windows NT). TCHAR expands to either char or wchar_t, controlled by the UNICODE and _UNICODE preprocessor symbols.
Don't use TCHAR, unless you have to support Win9x.
Analysis:
You are making this way more complicated than it is. Since this code
TCHAR* wArr = ...;
cout << wArr << endl;
writes a character string to cout (vs. a pointer value), you haven't defined the preprocessor symbols UNICODE or _UNICODE. In other words: You are using MBCS character encoding (see Unicode and MBCS), and the following would produce identical output:
string szDllPath = "12345";
cout << szDllPath.c_str() << endl;
If that is really what you want (and I doubt it is), then simply calling std::basic_string::c_str() is sufficient (use std::basic_string::data() if you need a mutable string).
Recommended solution:
Use Unicode throughout. The following solves your immediate problem if you are using a character string literal.
wstring szDllPath = L"12345";
wcout << szDllPath.c_str() << endl;
If you aren't using a character string literal and do need to convert from a std::string to a wide character string, you can use MFC's CString class to perform the conversion:
CStringW wideString(szDllPath.c_str());        // conversion c'tor, taking a const char*
const wchar_t* pszWide = (LPCWSTR)wideString;  // invoke operator LPCWSTR()
You need to be careful when working with char*-style strings in C and C++. They're required to have a terminating 0 at the end, and you didn't provide space for it:
TCHAR* wArr = new TCHAR[size+1];
...
/*add a terminating zero after the loop*/
wArr[size] = 0;
Then it will work as expected.
Also, avoid such bare char* operations if you can.
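For instance, here is a sketch of the same conversion without manual allocation (assuming the narrow string contains plain ASCII):
#include <iostream>
#include <string>
int main()
{
    std::string szDllPath = "12345";
    // Widen character by character; adequate for ASCII content only.
    std::wstring wide(szDllPath.begin(), szDllPath.end());
    std::wcout << wide << std::endl;
}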

Converting string to wchar_t (wide character) C++ [duplicate]

Is there any method?
My computer is AMD64.
::std::string str;
BOOL loadU(const wchar_t* lpszPathName, int flag = 0);
When I used:
loadU(&str);
the VS2005 compiler says:
Error 7 error C2664:: cannot convert parameter 1 from 'std::string *__w64 ' to 'const wchar_t *'
How can I do it?
First convert it to std::wstring:
std::wstring widestr = std::wstring(str.begin(), str.end());
Then get the C string:
const wchar_t* widecstr = widestr.c_str();
This only works for strings containing ASCII characters; it will not work if the underlying string is UTF-8 encoded. Using a conversion routine like MultiByteToWideChar() ensures that this scenario is handled properly.
If you have a std::wstring object, you can call c_str() on it to get a wchar_t*:
std::wstring name( L"Steve Nash" );
const wchar_t* szName = name.c_str();
Since you are operating on a narrow string, however, you would first need to widen it. There are various options here; one is to use Windows' built-in MultiByteToWideChar routine. That will give you an LPWSTR, which is equivalent to wchar_t*.
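A minimal sketch of that approach (Windows-only; the helper name Utf8ToWide is just for illustration), assuming the std::string holds UTF-8 text:
#include <windows.h>
#include <string>
// Widen a UTF-8 encoded std::string to std::wstring using MultiByteToWideChar.
std::wstring Utf8ToWide(const std::string& utf8)
{
    if (utf8.empty())
        return std::wstring();
    // First call: ask how many wchar_t units the converted text needs.
    int len = MultiByteToWideChar(CP_UTF8, 0, utf8.data(),
                                  static_cast<int>(utf8.size()), NULL, 0);
    std::wstring wide(len, L'\0');
    // Second call: perform the actual conversion into the buffer.
    MultiByteToWideChar(CP_UTF8, 0, utf8.data(),
                        static_cast<int>(utf8.size()), &wide[0], len);
    return wide;
}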
You can use the ATL text conversion macros to convert a narrow (char) string to a wide (wchar_t) one. For example, to convert a std::string:
#include <atlconv.h>
...
std::string str = "Hello, world!";
CA2W pszWide(str.c_str());
loadU(pszWide);
You can also specify a code page, so if your std::string contains UTF-8 chars you can use:
CA2W pszWide(str.c_str(), CP_UTF8);
Very useful but Windows only.
If you are on Linux/Unix, have a look at mbstowcs() and wcstombs(), which are part of the C standard library (ISO C90) and documented by glibc.
mbs stands for "multibyte string" and is basically the usual zero-terminated C string.
wcs stands for "wide-character string" and is an array of wchar_t.
For more background on wide characters, have a look at the glibc documentation.
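A minimal portable sketch using mbstowcs (the current locale determines the multibyte encoding):
#include <clocale>
#include <cstdio>
#include <cstdlib>
#include <cwchar>
int main()
{
    std::setlocale(LC_ALL, "");                    // pick up the environment's encoding
    const char* mbs = "hello";
    wchar_t wcs[64];
    std::size_t n = std::mbstowcs(wcs, mbs, 64);   // returns the count converted, or (size_t)-1 on error
    if (n != (std::size_t)-1)
        std::wprintf(L"%ls (%zu characters)\n", wcs, n);
    return 0;
}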
I need to pass a wchar_t string to a function, and first need to be able to create the string from a literal string concatenated with an integer variable.
The original string looks like this, where 4 is the physical drive number, but I want that number to be changeable to match whatever drive I want to pass to the function:
auto TargetDrive = L"\\\\.\\PhysicalDrive4";
The following works
int a = 4;
std::string stddrivestring = "\\\\.\\PhysicalDrive" + std::to_string(a);
std::wstring widedrivestring = std::wstring(stddrivestring.begin(), stddrivestring.end());
const wchar_t* TargetDrive = widedrivestring.c_str(); // valid only while widedrivestring is alive

convert string to _T in cpp

I want to convert string or char* to the _T but not able to do.
if i write
_tcscpy(cmdline,_T ("hello world"));
it works perfectly, but if i write
char* msg="hello world";
_tcscpy(cmdline,_T (msg));
it shows an error like: error C2065: 'Lmsg' : undeclared identifier
Please give me a solution.
Thanx in advance.
_T is a macro, defined (when _UNICODE is defined) as:
#define _T(a) L ## a
which can work only with string literals. So when you write _T("hi") it becomes L"hi", which is valid, as expected. But when you write _T(msg) it becomes Lmsg, which is an undeclared identifier, and that is not what you intended.
All you need is this function mbstowcs as:
const char* msg="hello world"; //use const char*, instead of char*
wchar_t *wmsg = new wchar_t[strlen(msg)+1]; //memory allocation
mbstowcs(wmsg, msg, strlen(msg)+1);
//then use wmsg instead of msg
_tcscpy(cmdline, wmsg);
//memory deallocation - must do to avoid memory leak!
delete []wmsg;
_T only works with string literals. All it does is turn the literal into an L"" string if the code's being compiled with Unicode support, or leave it alone otherwise.
Take a look at http://msdn.microsoft.com/en-us/library/dybsewaf(v=vs.80).aspx
You need to use the mbstowcs function.
You should also look at this article.
_T is a macro that makes string literals into wide-char string literals by prepending an L before the literal in UNICODE builds.
In other words, when you write _T("Hello") it is as if you had written "Hello" on an ANSI build or L"Hello" on a UNICODE build. The type of the resulting expression is char* or wchar_t* respectively.
_T cannot convert a string variable (std::string or char*) to a wchar_t* -- for this, you have to use a function like mbstowcs or MultiByteToWideChar.
Suggestion: It will be much easier for you (and in no way worse) to always make a UNICODE build and forget about _T, TCHAR and all other T-derivatives. Just use wide-character strings everywhere.
_T is not an actual type. It's a macro that prepends string literals with L so that they become wchar_t* strings instead of char*. If you need to convert a char* string to a wchar_t* one at runtime, you need mbstowcs, for example.
The _T macro simply tells the compiler that a string literal should be compiled as a wide (UTF-16 on Windows) literal. The reason it doesn't work on the variable is that the variable's contents are already plain narrow (char) data.
As already mentioned the mbstowcs function is what you need to perform the conversion of a char data into utf-16 (wide char) data.
