I use Visual Studio 2013 and I set my project's Character Set to "Use Multi-Byte Character Set", but if I write
LPTSTR foo = new TCHAR[1000];
foo still becomes a wchar_t*. According to https://msdn.microsoft.com/en-us/library/windows/desktop/aa383751%28v=vs.85%29.aspx, LPTSTR is defined as
#ifdef UNICODE
typedef LPWSTR LPTSTR;
#else
typedef LPSTR LPTSTR;
#endif
so I would expect that, since I am not using UNICODE, foo would be a char*.
Have I misunderstood something conceptual, or something in the IDE?
Thanks in advance!
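For what it is worth, a quick compile-time probe (a minimal sketch, not from the original post) can show what TCHAR actually resolves to in a given translation unit:

#include <windows.h>
#include <tchar.h>
#include <type_traits>

// With "Use Multi-Byte Character Set", UNICODE/_UNICODE are not defined,
// so TCHAR should be char and LPTSTR should be char*.
static_assert(std::is_same<TCHAR, char>::value,
              "TCHAR is not char: UNICODE/_UNICODE is defined somewhere");

int main()
{
    LPTSTR foo = new TCHAR[1000];
    static_assert(std::is_same<decltype(foo), char*>::value,
                  "LPTSTR is not char*");
    delete[] foo;
    return 0;
}

If the assert fires even though the project is set to Multi-Byte, something else, such as a property sheet or a forced include, is defining UNICODE/_UNICODE.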
I have the program below, which tries to print Unicode characters to the console by enabling the _O_U16TEXT mode:
#include <stdio.h>  // wprintf, _fileno
#include <fcntl.h>  // _O_U16TEXT
#include <io.h>     // _setmode

int main()
{
    // Switch stdout to UTF-16 mode so the wide characters print correctly.
    _setmode(_fileno(stdout), _O_U16TEXT);
    wprintf(L"test\n \x263a\x263b Hello from C/C++\n");
    return 0;
}
What is unclear to me: I have seen many C++ projects (with the same code running on Windows and Linux) that use a macro called _UNICODE. I have the following questions:
Under what circumstances do I need to define the _UNICODE macro?
Does enabling the _UNICODE macro mean I need to separate the ASCII-related code with #ifdef _UNICODE? In the case of the above program, do I need to wrap anything in #ifdef _UNICODE and put the ASCII processing in an #else?
What else do I need for the code to have Unicode support in C/C++? Do I need to link against any specific libraries on Windows or Linux?
Why does my sample program above not need to define the _UNICODE macro?
When I define the _UNICODE macro, how does the code know whether to use UTF-8, UTF-16 or UTF-32? How do I decide between these encodings?
I have the following code:
#define UNICODE
// so strange??
GetModuleBaseName( hProcess, hMod, szProcessName,
sizeof(szProcessName)/sizeof(TCHAR) );
But the compiler still reports an error like this:
error C2664: "DWORD K32GetModuleBaseNameA(HANDLE,HMODULE,LPSTR,DWORD)": cannot convert argument 3 from "wchar_t [260]" to "LPSTR" [E:\source\mh-gui\build\src\mhgui.vcxproj]
That is, it cannot convert argument 3 from wchar_t[260] to LPSTR. It looks like the compiler is still picking the A version of the API?
You must put
#define UNICODE
#define _UNICODE
BEFORE
#include <Windows.h>
The Windows header uses #ifdef UNICODE (et al), so if you want to make the distinction count, the #defines must occur before the #include.
Edit: because these #defines are effectively global, the most reliable place to set them is in your compiler options; then the ordering no longer matters.
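For illustration, a minimal self-contained sketch of that ordering (the EnumProcessModules call is only there to obtain a module handle; on older SDKs you also need to link psapi.lib):

#define UNICODE
#define _UNICODE
#include <windows.h>   // must come after the defines
#include <psapi.h>     // GetModuleBaseName, EnumProcessModules
#include <stdio.h>

int main()
{
    HANDLE hProcess = GetCurrentProcess();
    HMODULE hMod = NULL;
    DWORD cbNeeded = 0;
    TCHAR szProcessName[MAX_PATH] = TEXT("<unknown>");

    if (EnumProcessModules(hProcess, &hMod, sizeof(hMod), &cbNeeded))
    {
        // Resolves to the W version because UNICODE is defined before
        // <windows.h>, so the wchar_t buffer is accepted.
        GetModuleBaseName(hProcess, hMod, szProcessName,
                          sizeof(szProcessName) / sizeof(TCHAR));
    }
    wprintf(L"%ls\n", szProcessName);
    return 0;
}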
Since you are using Visual Studio, instead of defining UNICODE yourself you can enable the W versions by right-clicking the project in Solution Explorer -> Properties -> Advanced, and changing the "Character Set" option to "Use Unicode Character Set".
I am trying to create a Linux C++ project using the same header and .cpp files as a Windows C++ project built with Visual Studio. I use the function below to load a DLL dynamically on Windows:
HINSTANCE hGetProcIDDLL = LoadLibraryA(sDllPath.c_str());
GetPluginInfoList GetInfoList = (GetPluginInfoList)GetProcAddress(hGetProcIDDLL, "GetPluginInfoList");
I think these functions hail from <windows.h>.
In the Linux C++ project, those functions are not available.
For Linux C++, what is the replacement for HINSTANCE and LoadLibraryA?
I am posting my answer here. Thanks, everyone, for the support.
#ifdef _WIN32
#include <windows.h>   // LoadLibraryA, GetProcAddress
#else
#include <dlfcn.h>     // dlopen, dlsym; link with -ldl
#endif

typedef CPluginInfoList (*GetPluginInfoList)(void);

// Note: _WIN32 is defined by all Windows compilers; _WINDLL is only set
// when building a DLL, so it is the wrong macro for platform detection.
#ifdef _WIN32
HINSTANCE hGetProcIDDLL = LoadLibraryA(sDllPath.c_str());
GetPluginInfoList GetInfoList =
    (GetPluginInfoList)GetProcAddress(hGetProcIDDLL, "GetPluginInfoList");
#else
void* hGetProcIDDLL = dlopen(sDllPath.c_str(), RTLD_LAZY);
GetPluginInfoList GetInfoList =
    (GetPluginInfoList)dlsym(hGetProcIDDLL, "GetPluginInfoList");
#endif

GetInfoList(); // function call
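One caveat worth adding to this answer: on Linux, dlopen and dlsym report failures through dlerror(), so it pays to check the handle before casting. A minimal sketch, where the plugin path "./libplugin.so" is hypothetical:

#include <dlfcn.h>
#include <cstdio>

int main()
{
    void* handle = dlopen("./libplugin.so", RTLD_LAZY); // hypothetical path
    if (!handle)
    {
        std::fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }
    void* sym = dlsym(handle, "GetPluginInfoList");
    if (!sym)
        std::fprintf(stderr, "dlsym failed: %s\n", dlerror());

    dlclose(handle);
    return 0;
}
// Build with: g++ main.cpp -ldl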
I have a VC++ DLL that is compiled with the character set option 'Use Unicode Character Set'.
Now I want to use this DLL in my VC++ exe, whose character set is 'Use Multi-Byte Character Set'. In theory nothing stops me from doing this: after the DLL is compiled, all its function signatures take wchar_t or LPCWSTR, and in my exe I can simply create strings in that format and call the exported functions. The problem I am facing is that the header for the Unicode DLL takes TCHAR parameters, for example:
class MYLIBRARY_EXPORT PrintableInt
{
public:
    PrintableInt(int value);
    ...
    void PrintString(TCHAR* somestr);
private:
    int m_val;
};
Now I want to use PrintString() in my exe, so I included the header and used it like this:
#include "unicodeDllHeaders.h"
PrintableInt p(2);
wchar_t* someStr = L"Some str";
p.PrintString(someStr);
As expected, this gives me a compiler error:
error C2664: 'PrintableInt::PrintString' : cannot convert parameter 1 from 'wchar_t *' to 'TCHAR *'
Since the exe is built with MBCS, TCHAR is defined as char. So I thought this would solve the issue:
#define _UNICODE
#include "unicodeDllHeaders.h"
#undef _UNICODE
But even after defining _UNICODE I still get the compilation error. My next guess was that TCHAR.h had already been included before #include "unicodeDllHeaders.h"; when I searched for the inclusion of TCHAR.h, it was indeed included elsewhere in the project. Moving that inclusion after the definition of _UNICODE fixed the compilation error here, but it then failed in the other places where TCHAR is expected to be char. So my question is:
Can I somehow make TCHAR resolve to both char and wchar_t in the same project? I tried #define TCHAR char, #undef TCHAR, and #define TCHAR wchar_t, but that breaks in the C headers like xutils.
You cannot retroactively change the binary interface of your DLL, regardless of what you do to your header file with macros. The typical solution is to drop the legacy MBCS stuff entirely; it stopped being interesting ten years ago, and nowadays all targets supporting the Win32 API support the wide-character interface with full Unicode support.
If you want that retro feeling of the nineties, you can also compile your DLL twice, once with CHAR and once with WCHAR. The latter then typically gets a "u" suffix (for Unicode). In your header you then check the character set and delegate to the matching import library using #pragma comment(lib, ...).
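To illustrate that last suggestion, here is a hypothetical dispatch header; the import-library names mylibrary.lib/mylibraryu.lib and the "u" suffix are assumptions for the sketch, not something the DLL ships with:

// Hypothetical dispatch header: the consumer's character set selects the
// matching import library, so TCHAR below resolves consistently with the
// DLL build that actually gets linked.
#include <tchar.h>

#ifdef _UNICODE
#pragma comment(lib, "mylibraryu.lib")  // wchar_t build of the DLL
#else
#pragma comment(lib, "mylibrary.lib")   // char build of the DLL
#endif

class MYLIBRARY_EXPORT PrintableInt
{
public:
    PrintableInt(int value);
    void PrintString(TCHAR* somestr);   // matches the linked DLL's ABI
private:
    int m_val;
};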
In C:\Program Files\Microsoft SDKs\Windows\v7.0A\Include\WinCrypt.h, the definition of CERT_CHAIN_ENGINE_CONFIG is:
typedef struct _CERT_CHAIN_ENGINE_CONFIG {
    DWORD       cbSize;
    HCERTSTORE  hRestrictedRoot;
    HCERTSTORE  hRestrictedTrust;
    HCERTSTORE  hRestrictedOther;
    DWORD       cAdditionalStore;
    HCERTSTORE* rghAdditionalStore;
    DWORD       dwFlags;
    DWORD       dwUrlRetrievalTimeout; // milliseconds
    DWORD       MaximumCachedCertificates;
    DWORD       CycleDetectionModulus;
#if (NTDDI_VERSION >= NTDDI_WIN7)
    HCERTSTORE  hExclusiveRoot;
    HCERTSTORE  hExclusiveTrustedPeople;
#endif
} CERT_CHAIN_ENGINE_CONFIG, *PCERT_CHAIN_ENGINE_CONFIG;
I am using Visual Studio 2010 on an XP SP3 machine, so I expect the following two members of the above structure to be greyed out, but this is not happening:
#if (NTDDI_VERSION >= NTDDI_WIN7)
    HCERTSTORE hExclusiveRoot;
    HCERTSTORE hExclusiveTrustedPeople;
#endif
NTDDI_VERSION is in turn defined in sdkddkver.h as follows, and _WIN32_WINNT somehow ends up at the Windows 7 value of NTDDI_WIN7, which seems incorrect in my case since mine is an XP SP3 machine.
#if !defined(_WIN32_WINNT) && !defined(_CHICAGO_)
#define _WIN32_WINNT 0x0601
#endif
#ifndef NTDDI_VERSION
#ifdef _WIN32_WINNT
// set NTDDI_VERSION based on _WIN32_WINNT
#define NTDDI_VERSION NTDDI_VERSION_FROM_WIN32_WINNT(_WIN32_WINNT)
#else
#define NTDDI_VERSION 0x06010000
#endif
#endif
The two members in question are not present in C:\Program Files\Microsoft SDKs\Windows\v6.0A\Include\WinCrypt.h, but my Visual Studio 2010 project automatically pulls in the header and lib files from C:\Program Files\Microsoft SDKs\Windows\v7.0A. Because of the conflicting structure definitions, I am getting "The parameter is incorrect".
Please advise how I can overcome this issue.
Do I have to install Visual Studio 2010 SP1?
I found one reference on the web saying that initialising the structure will resolve the issue, but it does not: the two members in question are not greyed out and are still compiled into the build.
UPDATE1:
Settings of my project:
$(VCInstallDir) -> C:\Program Files\Microsoft Visual Studio 10.0\VC
$(WindowsSdkDir) -> C:\Program Files\Microsoft SDKs\Windows\v7.0A
$(FrameworkSdkDir) -> C:\Program Files\Microsoft SDKs\Windows\v7.0A
Library file settings,
$(VCInstallDir)lib
$(VCInstallDir)atlmfc\lib
$(WindowsSdkDir)lib
$(FrameworkSDKDir)\lib
UPDATE 2:
My preprocessor definitions are
WIN32;_DEBUG;_WINDOWS;_USRDLL;MY_DLL_EXPORTS;%(PreprocessorDefinitions)
%(PreprocessorDefinitions) inherited values as follows
_WINDLL
_MBCS
Thanks
The problem you have is easily explained. If you use SDK v7.0A or v7.1, you can compile your project so that it runs under Windows 7, so the default value of _WIN32_WINNT is 0x0601.
If you want to compile the program so that it runs on Windows XP, you can define WINVER and _WIN32_WINNT explicitly. Typically one does this in the Visual Studio project settings, in the preprocessor definitions. If you do this, the corresponding part of the CERT_CHAIN_ENGINE_CONFIG structure will be greyed out as you expect.
In most cases, including that of CERT_CHAIN_ENGINE_CONFIG, this is not really needed. The Windows API is mostly designed so that you will have no problem using a CERT_CHAIN_ENGINE_CONFIG defined for Windows 7 when the program is started on Windows XP. If you define
#define WINVER 0x0500
#define _WIN32_WINNT 0x0500
(or 0x0501 instead of 0x0500), you will still be able to run your program on Windows 7, but you will not be able to use the hExclusiveRoot and hExclusiveTrustedPeople members. The reason is the cbSize field, which you should initialize to sizeof(CERT_CHAIN_ENGINE_CONFIG): it tells the CertCreateCertificateChainEngine function the size of the input structure. With the smaller cbSize, the trailing HCERTSTORE members hExclusiveRoot and hExclusiveTrustedPeople are simply not used.
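A minimal sketch of the initialization the answer describes, assuming the XP-level target and linking against crypt32.lib:

#define WINVER       0x0501   // target Windows XP
#define _WIN32_WINNT 0x0501
#include <windows.h>
#include <wincrypt.h>
#pragma comment(lib, "crypt32.lib")

int main()
{
    // Zero-initialize, then set cbSize to the XP-era size of the struct;
    // the Win7-only members are not even compiled in at this target level.
    CERT_CHAIN_ENGINE_CONFIG config = {0};
    config.cbSize = sizeof(config);

    HCERTCHAINENGINE hEngine = NULL;
    if (CertCreateCertificateChainEngine(&config, &hEngine))
    {
        // ... use hEngine ...
        CertFreeCertificateChainEngine(hEngine);
    }
    return 0;
}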
the value of NTDDI_WIN7 which in my case is incorrect as mine is a XP SP3 machine.
As I understand it, these macros are set according to the system you are targeting, not the system you are compiling on. So look at your project settings and see what your target platform is, which headers are referenced, and so on.