In VS2010, I'm working on updating an application to a new version of a third-party library that requires _WIN32_WINNT to be at least 0x501, but another third-party library that ships binary shared libraries defines it as 0x500 in a header that is included in the application.
If this is modified, can there be a binary incompatibility, or is this an insignificant change? Will I have to request new binaries from the library that defines it as 0x500? I'm not sure how to tell whether this requires new binaries -- I would think that if any classes/structs change in size or naming, or any method/function signatures change, then a recompile is necessary.
Short answer: Probably not, but if it does you're in a pretty pickle.
Long answer:
_WIN32_WINNT controls the version of the WinAPI (and related libraries such as MFC) that your code is going to use. The intent is to ensure that compiler errors are generated if you use Windows features that were introduced after the Windows version you're targeting.
Mostly this controls which functions, structs etc. are visible to you. This part cannot cause binary incompatibilities except with the Windows versions you're not targeting. However...
There are some structs in the WinAPI that were extended over the life of Windows. Take a look, for example, at the definition of OPENFILENAME:
typedef struct tagOFN {
  DWORD         lStructSize;
  HWND          hwndOwner;
  HINSTANCE     hInstance;
  LPCTSTR       lpstrFilter;
  LPTSTR        lpstrCustomFilter;
  DWORD         nMaxCustFilter;
  DWORD         nFilterIndex;
  LPTSTR        lpstrFile;
  DWORD         nMaxFile;
  LPTSTR        lpstrFileTitle;
  DWORD         nMaxFileTitle;
  LPCTSTR       lpstrInitialDir;
  LPCTSTR       lpstrTitle;
  DWORD         Flags;
  WORD          nFileOffset;
  WORD          nFileExtension;
  LPCTSTR       lpstrDefExt;
  LPARAM        lCustData;
  LPOFNHOOKPROC lpfnHook;
  LPCTSTR       lpTemplateName;
#if (_WIN32_WINNT >= 0x0500)
  void          *pvReserved;
  DWORD         dwReserved;
  DWORD         FlagsEx;
#endif
} OPENFILENAME, *LPOPENFILENAME;
See that last bit at the end? That spells potential trouble: if one part of your code is compiled with _WIN32_WINNT set to 0x400 and another part with 0x500, the two will disagree about the size of this struct.
The WinAPI designers did think about this problem. You will notice that the first member of OPENFILENAME is lStructSize; you are supposed to initialize it with sizeof(OPENFILENAME). For you, sizeof(OPENFILENAME) is a compile-time constant; for the functions in the Windows runtime library, it is the tag by which they decide which version of the OPENFILENAME struct you're passing in.
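To make that concrete, here is a minimal sketch of the convention (error handling trimmed); OPENFILENAMEW and GetOpenFileNameW are standard WinAPI, while the surrounding function is purely illustrative:

#include <windows.h>
#include <commdlg.h>
#pragma comment(lib, "comdlg32.lib")

void ShowOpenDialog(HWND hwndOwner)
{
    wchar_t fileName[MAX_PATH] = L"";

    OPENFILENAMEW ofn = {0};
    ofn.lStructSize = sizeof(ofn);   // compile-time size; depends on _WIN32_WINNT
    ofn.hwndOwner   = hwndOwner;
    ofn.lpstrFile   = fileName;
    ofn.nMaxFile    = MAX_PATH;

    if (GetOpenFileNameW(&ofn)) {
        // The dialog function reads lStructSize at runtime to decide
        // which layout of the struct the caller was compiled against.
    }
}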
Struct extension spells potential trouble in one scenario: if the binary library and the rest of your code exchange WinAPI types or pointers to such types, and if those types were extended between 0x500 and 0x501, then things are going to explode. Happily, I don't expect there to be many such structs, because the version range is very narrow. If this is a worry, however, then you should definitely request new binaries, because working around it would be difficult and tedious, with many opportunities for mistakes.
Other than that, I think you're (probably) safe.
I'm reading some game-fix code that deals with memory manipulation to fix a game's "issue". I found out that the code uses two macros, WINAPI and STDMETHODCALLTYPE. These macros both evaluate to __stdcall, which specifies the calling convention for a function. I also found out that APIENTRY is another macro alias for WINAPI. So are there any differences between these macros? It seems to me that they are just aliases. Why are there so many of them?
All data types and calling conventions in the Windows API are defined as aliases (preprocessor macros or typedefs). This allows for a stable ABI, irrespective of the compiler or toolchain. It also allows the ABI to change without breaking existing code (e.g. when 64-bit Windows was introduced).
Both WINAPI and STDMETHODCALLTYPE expand to the same thing: __stdcall for x86 and nothing for everything else. So why have 2 aliases for the same thing? Because they control the calling convention for two different subsets of the system:
WINAPI specifies the calling convention for the flat C-based Windows API.
STDMETHODCALLTYPE, on the other hand, controls the calling convention for COM (Component Object Model).
COM and the Windows API are independent. Having 2 aliases to control the calling convention for either makes perfect sense. You wouldn't want to break all COM, just because you're moving to a new ABI for Win128.
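A small sketch of where each alias typically shows up (declarations only; MyThreadProc and IMyInterface are made-up names for illustration):

#include <windows.h>
#include <unknwn.h>

// Flat C-style Windows API callback: declared with WINAPI.
DWORD WINAPI MyThreadProc(LPVOID param);

// COM-style interface: its methods are declared with STDMETHODCALLTYPE.
struct IMyInterface : public IUnknown
{
    virtual HRESULT STDMETHODCALLTYPE DoWork(int value) = 0;
};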
Because back in the 16-bit days these really were different conventions. Since 32-bit x86 flat mode, everything Windows-related is __stdcall (arguments pushed right to left, callee clears the stack). __cdecl and __fastcall also exist.
Since x64, there is effectively only one calling convention, and all these annotations are ignored.
The same applies to many other Windows types, such as WPARAM and LPARAM. WPARAM was once 16-bit; on x86 both are 32-bit, and on x64 both are 64-bit.
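A quick check of that on a Windows toolchain (just a sketch): both types track pointer size, so this prints 4 on a 32-bit build and 8 on a 64-bit build.

#include <windows.h>
#include <iostream>

int main()
{
    std::cout << "sizeof(WPARAM) = " << sizeof(WPARAM) << '\n';
    std::cout << "sizeof(LPARAM) = " << sizeof(LPARAM) << '\n';
}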
In MS Visual Studio, when you do not set the character set, the likes of AfxMessageBox() (and countless other API functions) will happily accept a CStringA argument. But the moment you set the character set to Unicode, what appear to be the very same functions will only accept CStringW arguments.
Now this is precisely what the documentation says should happen... but...
Where exactly did those non-Unicode API functions go? Are they still there to be linked to under other names (AfxMessageBoxA(), perhaps)? By what magic does one API disappear and another one appear in its place... or alternatively... by what mischievous hacker trick can one make them reappear? And if it is possible to make them reappear in the presence of Unicode, should one (judiciously) use such hacker mischief?
The declaration of AfxMessageBox() in afxwin.h is:
int AFXAPI AfxMessageBox(LPCTSTR lpszText, UINT nType = MB_OK,
                         UINT nIDHelp = 0);
It is LPCTSTR that adapts the string type. If you compile with UNICODE in effect, it is an alias for const wchar_t*; without it, const char*. There is no AfxMessageBoxA() version.
This is very different from the way the WinAPI functions work, necessarily so, since this is a C++ function that mangles differently. Technically they could have provided another overload of the function, but they didn't. You also get a different link requirement: you need to link the non-Unicode version of the MFC library to keep the linker happy. Notably, that library is deprecated and no longer ships with recent VS editions, though it is still available (right now) as a separate download.
This should answer your question: it doesn't go anywhere, it simply doesn't exist. Mixing cannot work; you'll need A2W() to convert the string. You could of course simply write your own overload if necessary.
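A hedged sketch of that last suggestion, for a Unicode MFC build; the name AfxMessageBoxA is just an illustrative choice (MFC does not provide it), and it uses ATL's CA2W helper instead of the older A2W() macro so no USES_CONVERSION setup is needed:

#include <afxwin.h>
#include <atlconv.h>   // CA2W

// Narrow-string convenience overload: converts to wide and forwards.
int AfxMessageBoxA(const char* text, UINT nType = MB_OK, UINT nIDHelp = 0)
{
    return AfxMessageBox(CA2W(text), nType, nIDHelp);
}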
I've been puzzling over this for quite a while, and as of yet I haven't managed to find a suitable rationale.
The Win32 API provides a function for "logical string comparison" for which the prototype is:
int StrCmpLogicalW( _In_ PCWSTR psz1, _In_ PCWSTR psz2 );
This function treats digits as numbers rather than as plain text, and thus provides a more 'logical' comparison of two strings.
However, most functions in the Win32 API are exposed as macros that resolve to either a multibyte or a Unicode variant; for instance, SendMessage is a macro which expands to SendMessageW for Unicode or SendMessageA for ANSI encodings (depending on which macro switch is enabled). So why does this function only have a wide-string version? I've searched the internet, but have been unable to find anything that explains this, so I'd be grateful if anyone can enlighten me.
Thanks in advance!
The documentation says "Behavior of this function, and therefore the results it returns, can change from release to release. It should not be used for canonical sorting applications." so it does not seem meant for general usage.
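As a small illustration of the difference it makes (assuming shlwapi.h and shlwapi.lib are available), a plain lexicographic compare puts "file10" before "file2", while StrCmpLogicalW orders them numerically:

#include <windows.h>
#include <shlwapi.h>
#include <cwchar>
#include <iostream>
#pragma comment(lib, "shlwapi.lib")

int main()
{
    std::cout << wcscmp(L"file2", L"file10") << '\n';          // > 0: "file2" after "file10"
    std::cout << StrCmpLogicalW(L"file2", L"file10") << '\n';  // < 0: "file2" before "file10"
}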
For example:
int WINAPI WinMain ( HINSTANCE instance, HINSTANCE prev_instance, PSTR cmd_line, int cmd_show )
WINAPI is a define that looks like this:
#define WINAPI __stdcall
why can't you just do:
int __stdcall WinMain ( HINSTANCE instance, HINSTANCE prev_instance, PSTR cmd_line, int cmd_show )
Actually, I think my problem is that I'm sort of confusing defines with typedefs. Can someone explain this to me? What does the define do, and why can't you just write __stdcall in its place?
Because the WINAPI calling convention is not guaranteed to be __stdcall. Code that uses WINAPI will still be correct even when it isn't.
You can write the function as in your latter example, and it'd work fine - it's just not good practice and would not be portable to a platform where the calling convention is something else.
This was originally done during the switchover from 16-bit to 32-bit code. In the 16-bit version of <windows.h> it was:
#define WINAPI __pascal
WINAPI let you compile for either without modifying the source code. Of course, 16-bit Windows is no longer a factor (at least for most people), but it's still not worth changing all the source code to use __stdcall directly (especially since it could change again someday).
You can just write __stdcall in its place, but don't. They've seen fit to #define WINAPI to __stdcall to make that opaque, which is just good programming practice.
Also, back in the day, some Windows libraries used Pascal calling convention and other libraries used C convention. A preprocessor define helped gloss over that.
Because WINAPI is a macro (well, a #define anyway), it can be "pre-processed" to mean something else or even nothing at all.
That means you can write more portable code: you write WINAPI, which Win32 requires to mean __stdcall, but which another environment may require to mean something else or nothing at all.
I am wondering how to normalize strings (containing UTF-8/UTF-16) in C/C++.
In .NET there is a function, String.Normalize.
I used UTF8-CPP in the past but it does not provide such a function.
ICU and Qt provide string normalization but I prefer lightweight solutions.
Is there any "lightweight" solution for this?
As I wrote in another question, utf8proc is a very nice, lightweight library for basic Unicode functionality, including Unicode string normalization.
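A minimal sketch of what that looks like (assuming utf8proc is installed and linked); utf8proc_NFC returns a freshly allocated, NUL-terminated, NFC-normalized copy of a UTF-8 string, which the caller frees:

#include <cstdio>
#include <cstdlib>
#include <utf8proc.h>

int main()
{
    // "é" written in decomposed form: 'e' followed by U+0301 (combining acute accent)
    const char* decomposed = "e\xCC\x81";

    utf8proc_uint8_t* nfc = utf8proc_NFC(
        reinterpret_cast<const utf8proc_uint8_t*>(decomposed));
    if (nfc) {
        std::printf("NFC form: %s\n", reinterpret_cast<const char*>(nfc));
        std::free(nfc);
    }
}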
For Windows, there is the NormalizeString() function (unfortunately for Vista and later only - as far as I see on MSDN):
http://msdn.microsoft.com/en-us/library/windows/desktop/dd319093%28v=vs.85%29.aspx
It's the simplest way to go that I have found so far.
I guess it's quite lightweight too.
int NormalizeString(
  _In_      NORM_FORM NormForm,
  _In_      LPCWSTR   lpSrcString,
  _In_      int       cwSrcLength,
  _Out_opt_ LPWSTR    lpDstString,
  _In_      int       cwDstLength
);
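A hedged usage sketch (Vista and later; link with Normaliz.lib): the first call asks for a size estimate, the second performs the actual NFC normalization. Real code should also handle ERROR_INSUFFICIENT_BUFFER and retry, which is omitted here.

#include <windows.h>
#include <string>
#pragma comment(lib, "Normaliz.lib")

std::wstring NormalizeNfc(const std::wstring& input)
{
    // Passing -1 tells the API the source string is NUL-terminated.
    int needed = NormalizeString(NormalizationC, input.c_str(), -1, NULL, 0);
    if (needed <= 0)
        return input;                       // give up on error for brevity

    std::wstring output(needed, L'\0');
    int written = NormalizeString(NormalizationC, input.c_str(), -1,
                                  &output[0], needed);
    if (written <= 0)
        return input;

    output.resize(written - 1);             // the count includes the terminating NUL
    return output;
}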
You could build ICU with minimal (or possibly no) other data - I think all of the normalization data is now internal - and then statically link. I haven't tried this recently, but I believe the total size is pretty small in that case.
A good UTF-8 solution is glib's g_utf8_normalize() function. It would require converting std::wstring to std::string (UTF-16 to UTF-8) if you need this for wstring too, which would make it quite an expensive solution; hence I'm still looking for a better one myself, if possible with pure C++(11) means.
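A short sketch of the glib route (assuming the glib headers and pkg-config flags are available); g_utf8_normalize allocates a new UTF-8 string that must be released with g_free:

#include <glib.h>
#include <cstdio>

int main()
{
    const char* decomposed = "e\xCC\x81";   // 'e' + U+0301 combining acute accent

    gchar* nfc = g_utf8_normalize(decomposed, -1, G_NORMALIZE_NFC);
    if (nfc) {
        std::printf("NFC form: %s\n", nfc);
        g_free(nfc);
    }
}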
"Lightweight" in your context means "with limited functionality". I would use ICU source as an example, and reference http://unicode.org/reports/tr15/ to implement this "lightweight" functionality.