I'm trying to compile the MongoDB C++ driver into my project and I've run across an interesting error.
In util/text.h you can find this code:
/* like toWideString but UNICODE macro sensitive */
# if !defined(_UNICODE)
#error temp error
inline std::string toNativeString(const char *s) { return s; }
# else
inline std::wstring toNativeString(const char *s) { return toWideString(s); }
# endif
It looks like you should be able to compile it without the _UNICODE define, yet there is this seemingly arbitrary line #error temp error which causes the failure. On GitHub, this seems to have been the case for the lifetime of the file. Does anyone know if it's safe to remove it?
Unfortunately I can't just compile this project in Unicode, because there are a number of Unicode-incompatible sources in the project as well.
Cheers
Kyle
I have the below program, which tries to print Unicode characters on the console by enabling _O_U16TEXT mode:
#include <stdio.h>   // wprintf
#include <fcntl.h>   // _O_U16TEXT
#include <io.h>      // _setmode, _fileno

int main()
{
    // Switch stdout to UTF-16 mode so the wide string reaches the console intact.
    _setmode(_fileno(stdout), _O_U16TEXT);
    wprintf(L"test\n \x263a\x263b Hello from C/C++\n");
    return 0;
}
What is unclear to me is that I have seen many C++ projects (with the same code running on Windows and Linux) using a macro called _UNICODE. I have the following questions:
Under what circumstances do I need to define the _UNICODE macro?
Does enabling the _UNICODE macro mean I need to separate the ASCII-related code using #ifdef _UNICODE? In the case of the above program, do I need to put the code under #ifdef _UNICODE and the ASCII processing in an #else?
What else do I need to enable Unicode support in C/C++ code? Do I need to link against any specific libraries on Windows and Linux?
Why does my sample program above not need to define the _UNICODE macro?
When I define the _UNICODE macro, how does the code know whether it uses UTF-8, UTF-16 or UTF-32? How do I decide between these encodings?
I have a VC++ DLL that is compiled with the charset set to 'Use Unicode Character Set'.
Now I want to use this DLL in my VC++ exe, whose charset is 'Use Multi-Byte Character Set'. I know that theoretically nothing stops me from doing this: after the DLL is compiled, all the exported function signatures take wchar_t or LPCWSTR parameters, and in my exe I just create strings of that format and call the exported functions. The problem I am facing is that the header of the Unicode DLL takes TCHAR parameters, for example:
class MYLIBRARY_EXPORT PrintableInt
{
public:
    PrintableInt(int value);
    ...
    void PrintString(TCHAR* somestr);
private:
    int m_val;
};
Now I want to use this PrintString() in my exe, so I included the header and used it like below:
#include "unicodeDllHeaders.h"
PrintableInt p(2);
wchar_t* someStr = L"Some str";
p.PrintString(someStr);
This, as expected, gives me a compiler error:
error C2664: 'PrintableInt::PrintString' : cannot convert parameter 1 from 'wchar_t *' to 'TCHAR *'
As the exe is built with MBCS, TCHAR is defined as char. So what I thought would solve this issue is:
#define _UNICODE
#include "unicodeDllHeaders.h"
#undef _UNICODE
But even after defining _UNICODE I still get the compilation error. My next guess was that tchar.h was already included before the #include "unicodeDllHeaders.h"; when I searched for the inclusion of tchar.h, it was indeed included elsewhere in the project. So I moved the inclusion to after the definition of _UNICODE. That solved the compilation error here, but it fails in the other places where TCHAR is expected to be char. So my question is:
Can I somehow make TCHAR resolve to both char and wchar_t in the same project? I tried #define TCHAR char, #undef TCHAR, and #define TCHAR wchar_t, but it fails in the standard headers like xutils.
You cannot retroactively change the binary interface of your DLL, regardless of what you do to your header file with macros. The typical solution is to first dump the whole legacy MBCS stuff. It stopped being interesting ten years ago; nowadays all targets supporting the Win32 API support the wide-character interface with full Unicode support.
If you want that retro feeling of the nineties, you can also compile your DLL twice, once with CHAR and once with WCHAR. The latter then typically gets a "u" suffix (for Unicode). In your header, you then check the charset and delegate to the matching DLL using #pragma comment(lib, ...).
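A minimal sketch of that delegation, assuming the two builds produce import libraries named MyLibrary.lib and MyLibraryu.lib (both names hypothetical):

// unicodeDllHeaders.h (sketch)
#include <tchar.h>   // TCHAR: char under MBCS, wchar_t under _UNICODE

#ifdef _UNICODE
#pragma comment(lib, "MyLibraryu.lib")   // wide-character (WCHAR) build
#else
#pragma comment(lib, "MyLibrary.lib")    // MBCS (CHAR) build
#endif

class MYLIBRARY_EXPORT PrintableInt
{
public:
    PrintableInt(int value);
    // TCHAR resolves per charset, matching whichever import library was selected above.
    void PrintString(TCHAR* somestr);
private:
    int m_val;
};

The consumer's charset setting then picks the import library, so the TCHAR signatures in the header stay consistent with the binary that actually gets linked.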
The article Unicode apps in the MinGW-w64 wiki gives the following example of a Unicode application, e.g. main.c:
#define UNICODE
#define _UNICODE
#include <tchar.h>

int _tmain(int argc, TCHAR * argv[])
{
    _tprintf(argv[1]);
    return 0;
}
The above code makes use of tchar.h mapping, which allows it to both compile in Unicode and non-Unicode mode. [...] The -municode option is still required when linking if Unicode mode is used.
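To make that mapping concrete, here is a hedged sketch of the same tchar.h program written so it compiles in either mode (using _putts, which tchar.h maps to puts or _putws, instead of passing argv[1] as a format string):

#include <tchar.h>

// ANSI build:    _tmain -> main,  TCHAR -> char,    _putts -> puts
// Unicode build: _tmain -> wmain, TCHAR -> wchar_t, _putts -> _putws
int _tmain(int argc, TCHAR * argv[])
{
    if (argc > 1)
        _putts(argv[1]);
    return 0;
}

In the Unicode build, -municode is still what supplies the wmain entry point at link time.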
So I used
C:\> i686-w64-mingw32-gcc main.c -municode -o hello
_tmain.c:1:0: warning: "UNICODE" redefined
#define UNICODE
^
<command-line>:0:0: note: this is the location of the previous definition
to compile a Unicode application. But when I run it, it returns
C:\> hello Süßer
S³▀er
So the Unicode string is wrong. I used the latest version 4.9.2 of MinGW-w64, i686 architecture, and tried the Win32 and POSIX threads variants; both result in the same error. My operating system is 32-bit German Windows 7. When I use the Unicode codepage (chcp 65001), I have to use the font "Lucida Console". With this setting I get a similar error:
C:\> hello Süßer
S��er
I want to use a parameter with "ü" or "ß" in a Windows C++ program.
Solution
nwellnhof is right: the problem is the output on the console. This is explained in Unicode part 1: Windows console i/o approaches and Unicode part 2: UTF-8 stream mode. The latter gives a solution for Visual C++; it also worked with Intel C++ 15. The blog post does "not yet consider the g++ compiler. All this code is Visual C++ specific. However, [the blog author has] done generally the same with g++, and [he] will probably discuss that in a third installment."
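A minimal sketch of the console-output fix for Visual C++, assuming the UTF-16 stream-mode approach those posts describe (a wide entry point plus _O_U16TEXT on stdout, as in the program earlier in this thread):

#include <fcntl.h>
#include <io.h>
#include <stdio.h>

int wmain(int argc, wchar_t* argv[])
{
    // UTF-16 mode: the console receives the characters themselves, not codepage bytes.
    _setmode(_fileno(stdout), _O_U16TEXT);
    if (argc > 1)
        wprintf(L"%ls\n", argv[1]);
    return 0;
}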
I want to open a file whose name is given by a parameter. This is simple, e.g. main.c:
#include <iostream>
#include <fstream>
using namespace std;
int main(int argc, char* argv[])
{
    if (argc > 1) {
        // The output will be wrong, ...
        cout << argv[1] << endl;
        // but the name of this file will be right:
        fstream fl_rsl(argv[1], ios::out);
        fl_rsl.close();
    }
    return 0;
}
and compiled it without Unicode mode:
C:\> g++ main.cpp -o hello && hello Süßer
Its console output is still wrong, but the created filename is right. That is okay for me.
I have a game that is sold on Steam, and as such uses the Steamworks SDK. This has an automatic error-collecting tool as described briefly here.
Every time my game generates an unhandled exception, it is logged on the tool's web site. I've noticed that when the crash occurs on MY development build, the logged crash includes filenames and line numbers. However, when the crash occurs on a user machine, this info is absent.
Is this probably because I have the PDBs on my machine but not the user's machine?
Are there any compilation flags that might bake limited information into the EXE, so that the error reporting tool might be able to grab it?
I realize this is a bit of a longshot question and asked in relation to a specific tool. I asked because I'm hoping there is general knowledge (about compilation flags, etc) which I can apply to my specific situation.
I don't know the Steamworks SDK, but I will at least try to explain the common usage of the preprocessor symbols NDEBUG, _DEBUG, __FILE__ and __LINE__ in the classic assert.h (taken from the Windows SDK / VC include directory):
#include <crtdefs.h>
#undef assert
#ifdef NDEBUG
#define assert(_Expression) ((void)0)
#else
#ifdef __cplusplus
extern "C" {
#endif
_CRTIMP void __cdecl _wassert(_In_z_ const wchar_t * _Message, _In_z_ const wchar_t *_File, _In_ unsigned _Line);
#ifdef __cplusplus
}
#endif
#define assert(_Expression) (void)( (!!(_Expression)) || (_wassert(_CRT_WIDE(#_Expression), _CRT_WIDE(__FILE__), __LINE__), 0) )
#endif /* NDEBUG */
A Release build usually disables asserts by defining NDEBUG, while a Debug build usually leaves NDEBUG undefined (to enable asserts) and defines _DEBUG for additional checks (a Work build may have both undefined). Look at the definition of assert:
#define assert(_Expression) (void)( (!!(_Expression)) \
|| (_wassert(_CRT_WIDE(#_Expression), \
_CRT_WIDE(__FILE__), __LINE__), 0) )
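A small example of how that convention plays out: the assert below compiles to a real check in a Debug build and to ((void)0) when NDEBUG is defined (e.g. cl /DNDEBUG for a Release-style build):

#include <assert.h>

int divide(int a, int b)
{
    // Active without NDEBUG; compiled away entirely with it.
    assert(b != 0 && "divisor must not be zero");
    return a / b;
}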
If everything else fails (defining/undefining NDEBUG / _DEBUG), you can use __FILE__ and __LINE__ yourself, to include that information in whatever message string you are passing to the engine (or to the exceptions you may throw).
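A sketch of that do-it-yourself route; ReportError here is a hypothetical stand-in for whatever reporting call your crash tool actually exposes:

#include <cstdio>

// Hypothetical reporting hook; replace with the real tool call.
inline void ReportError(const char* msg) { std::fprintf(stderr, "%s\n", msg); }

#define REPORT_ERROR(msg)                                      \
    do {                                                       \
        char buf[512];                                         \
        std::snprintf(buf, sizeof(buf), "%s (%s:%d)",          \
                      (msg), __FILE__, __LINE__);              \
        ReportError(buf);                                      \
    } while (0)

Because __FILE__ and __LINE__ expand at the call site, every REPORT_ERROR("...") carries its own file name and line number, with no PDB needed on the user's machine.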
I'm going to presume you export code in Release Mode in Visual Studio, as opposed to Debug.
Visual Studio removes (by optimizing) some debugging elements, such as memory-leak logging (_CrtDumpMemoryLeaks), but I am not an expert in what it does and doesn't remove. I would start with the link below, which covers debugging in Release mode.
http://msdn.microsoft.com/en-us/library/fsk896zz.aspx
I am trying to migrate existing 32-bit C++ code to 64-bit on Windows 7 with Visual Studio 2010. I have never done a 64-bit compilation before. With the help of internet references I did the setup for 64-bit compilation, like VS2010 with the 64-bit compiler and the other configuration changes.
In the preprocessor definitions I removed WIN32 and added WIN64. I have some other preprocessor symbols, like OS_WIN_32, which are specific to my code.
In the code, wherever WIN32 was used I added the extra condition || WIN64, just to ensure that the application compiles both as Win32 and as Win64.
When I try to compile the code I get the compilation error
fatal error C1189: #error : Only one of the WIN32 and WIN64 symbols should be defined
This error comes from our local code, where we have a check whether both WIN32 and WIN64 are defined. That code is shown below.
#if defined WIN32 && defined WIN64
# error Only one of the WIN32 and WIN64 symbols should be defined
#endif
In VS2010, if a conditional block is not active, the code inside it gets greyed out. In my code the above #error line is greyed out too, but I still get that error.
The code where I added WIN64 includes windows.h; for reference it is given below.
#if defined WIN32 || defined WIN64
#include <windows.h>
#include <process.h>
#endif
So my question is: why am I getting this error? Shouldn't we include windows.h for 64-bit compilation? I tried commenting out this inclusion, but then I get other errors with respect to HANDLE, which is used in the code.
If I go to the WIN32 definition, VS2010 points to a definition in the windef.h file. This file is in the Microsoft SDKs\windows\v7.0A\include folder, i.e. not in my local code.
For reference, that definition is given below.
#ifndef WIN32
#define WIN32
#endif
So I want to know why the compiler is seeing both the WIN32 and WIN64 preprocessor symbols.
Thanks in advance for your help.
You shouldn't define either yourself. The macros that should be used to check this are
_WIN32 // always defined for Windows apps
_WIN64 // only defined for x64 compilation
These are defined by the compiler (see here).
Often, the IDE adds the unprefixed macros to the command line so that legacy projects which use the undocumented unprefixed versions don't fail to build. The fact that they work is not a reason to use them when documented alternatives are present.
It boils down to this:
#ifdef _WIN32
  // We're on Windows, yay!
  #ifdef _WIN64
    // We're on x64! Yay!
  #else // _WIN64
    // We're on x86 (or perhaps IA64, but that one doesn't matter anymore). Yay!
  #endif // _WIN64
#else // _WIN32
  // We're not on Windows; maybe Windows CE or Windows Phone stuff, otherwise some other platform.
#endif // _WIN32