I'm new to C++ and still learning it (I know a lot of Java), and the following code confuses me...
I know this code fragment is dealing with a pointer-to-function (it's a callback, that makes sense) but what is throwing me off is the argument between the return type and the function name. What the bloody hell is that?
It looks like a type of function, but I have never heard of such a thing, and even after searching and reading about pointers to functions I was not able to find anything mentioning that functions could have a type.
If this is true, how does one define a function type?
Thanks, -Cody
GLFWCALL is not a type; it's a macro which expands to a platform-specific calling convention, or to nothing at all. Here's a trimmed fragment of glfw.h:
#if defined(_WIN32) && defined(GLFW_BUILD_DLL)
#define GLFWCALL __stdcall
#elif defined(_WIN32) && defined(GLFW_DLL)
#define GLFWCALL __stdcall
#else
/* We are either building/calling a static lib or we are non-win32 */
#define GLFWCALL
#endif
Using the correct calling convention is important on x86/win32, since some conventions expect the stack to be cleaned by the callee and others by the caller. There can also be differences in the order in which arguments are passed.
On Windows, GLFWCALL is a macro for __stdcall, and on other platforms, it's a macro for nothing.
__stdcall implements a particular calling convention, and is a compiler extension on top of normal C or C++.
Macros are textual replacements applied by the preprocessor before the lexer and parser of your compiler ever see the code.
The GLFWCALL is a macro that can expand to a calling convention if one is needed. Because this function will be called by external code, it has to use the calling convention that external code expects. For example, if the function puts its return value on the stack and the external code expects it in a register, boom.
The question marked part of the function signature is a preprocessor macro that is defined somewhere else in the header. Certain features on certain platforms have extra requirements.
For example, functions in DLL files on the Windows platform often use the __declspec(dllexport) modifier, but when the same header is included in a user's project they need __declspec(dllimport) instead. Using a preprocessor macro for this means the library author can apply one macro to all relevant functions and simply define it differently when compiling their own DLL versus a user's project; on platforms where __declspec is irrelevant, it can be defined to nothing. There are many other reasons for macros like that one.
In this particular case you can effectively pretend that macro is blank and ignore it entirely.
Related
I have a member function that returns a reference to a CString member of the class. It looks something like this:
const CString& GetDateFormat() const { return MyString; }
I am using Visual Studio 2015 and whenever I type this function, the IDE automatically changes it to GetDateFormatA(). When I go to the function definition and hover over the function name I see this:
#define GetDateFormat GetDateFormatA
So it's like VS automatically created this macro.
The function works fine but I have already seen this multiple times (I mean a function — written by others — with a macro renaming it by appending an A) and I am quite curious and a bit confused. What's the purpose? Is it something related to character encoding or at least to strings?
GetDateFormat is a WinAPI function which takes at least one parameter whose type involves TCHAR (in this case, it's the parameters LPCTSTR lpFormat and LPTSTR lpDateStr). All such WinAPI functions actually exist in two forms: one with an A appended, and one with W appended. The A variant is used when TCHAR means char, and the W one is used when TCHAR means wchar_t. To support this, <windows.h> actually defines a macro for each such function, resolving to one or the other based on whether _UNICODE is defined. In your case, there's a definition somewhere in WinAPI headers similar to this:
#ifdef _UNICODE
# define GetDateFormat GetDateFormatW
#else
# define GetDateFormat GetDateFormatA
#endif
This is where your program's occasional reference to GetDateFormatA comes from. You can read more about working with TCHARs on MSDN.
How to solve this depends on whether you need to call WinAPI's GetDateFormat and use its char/wchar_t distinction. If so, you will have to rename your function. However, my impression is that you're not interested in the WinAPI function. In that case, a solution would be to add the following lines to your header which declares your GetDateFormat:
#include <windows.h>
#undef GetDateFormat
That way, nobody consuming your header will see the macro and GetDateFormat will remain GetDateFormat.
These are defined in minwindef.h (which is often located at Program Files (x86)\Windows Kits\8.1\Include\shared\minwindef.h)
#ifndef IN
#define IN
#endif
#ifndef OUT
#define OUT
#endif
And I often see parameters decorated with these macros like this:
void SomeFunction(IN const MyClass& obj)
What is the significance of these macros and why one should decorate parameters with it?
These macros may be defined as nothing, for compatibility with Standard C and Standard C++, or they can be defined as Microsoft-specific SAL (source code annotation language) annotations for annotating function parameters and return values, e.g.
#define IN _In_
#define OUT _Out_
with the documented meanings:
_In_
Annotates input parameters that are scalars, structures, pointers to structures and the like. Explicitly may be used on simple scalars. The parameter must be valid in pre-state and will not be modified.
_Out_
Annotates output parameters that are scalars, structures, pointers to structures and the like. Do not apply this to an object that cannot return a value—for example, a scalar that's passed by value. The parameter does not have to be valid in pre-state but must be valid in post-state.
SAL annotations are parsed by MS compilers, of course. The MSDN Windows API documentation also employs the SAL annotations throughout its function reference pages.
As you can see from their definitions, these macros have no functional use whatsoever.
They are used there only for documentation purposes, to indicate that the function's logical semantics expect, for example, obj to be an "in" parameter, as opposed to an "out" parameter.
C# has actual keywords in and out, so it's possible that the Windows code includes these equivalents for "consistency" … though, personally, I think having "equivalents" that actually don't do anything, likely does more harm than good.
But, hey, you're the one who works for Microsoft, so perhaps you can tell us. :)
I am looking for the most foolproof way to pass a VB6 Boolean variable to a function (written in C++, stdcall).
The C++ function will set the "bool" variable of a struct using this VB6 variable.
I have tried declaring it like this in C++:
extern "C" __declspec(dllexport) int SetParameter(BOOL bErrorView)
{
    DLL_sSettings nSet;
    nSet.bErrorView = (bErrorView != FALSE);
    int ret = stSetParameter(sizeof(DLL_sSettings), nSet);
    return ret;
}
stSetParameter is declared as
extern "C" int ST_COMDLL_API stSetParameter(int DataLen, DLL_sSettings Settings);
DLL_sSettings is declared as
typedef struct
{
    bool bErrorView;  // true: Show
                      // false: Don't show
    (...)
} DLL_sSettings;
However, I am unable to get it to work.
I call it in VB6 using
Private Declare Function SetParameter Lib "MyDLL.dll" Alias "_SetParameter@4" (ByVal bErrorView As Boolean) As Long
But it does not work as expected, I guess somewhere the VB6 Boolean gets lost or is being incorrectly converted.
I am currently using VB6 Boolean, C++ BOOL and C++ bool.
I know that is not so nice, but I don't see any other way.
Does anybody spot something wrong in my code?
VB6 uses the StdCall calling convention by default (cdecl convention is supported if you create a type library with a module section describing your imports, instead of using Declare Function). And C++ supports a whole host of calling conventions: stdcall, fastcall, cdecl, thiscall.
It is important to note that calling stdcall functions in another library is not enough to make your own functions stdcall. You can use a command-line switch to the compiler, but the most robust way is to include the __stdcall keyword in your source code. It is applied to the name of the function, like so:
int __stdcall functionname(int args);
Since you will also want to export those functions for VB6 to find them, you'll want extern "C" to reduce name mangling, and __declspec(dllexport) to place them in the exports table.
It's pretty common to use a macro to do all of the above at once. It looks like the library you are wrapping does this with ST_COMDLL_API. Inside the library, that will expand to __declspec(dllexport) __stdcall. For consumers, it will use dllimport instead.
Any time you are defining an API to be used across different compilers (or even different languages), it's a good idea to be very explicit about calling convention and structure packing (#pragma pack). Otherwise you are at the mercy of options specified in the project file and other headers files. Even if you are happy with the defaults the compiler uses, your public headers should be explicit, because eventually someone will try to use two libraries in the same program, and the other library may demand changes to the compile options.
BOOL is a typedef for int.
It is declared in windef.h as follows:
typedef int BOOL;
#ifndef FALSE
#define FALSE 0
#endif
#ifndef TRUE
#define TRUE 1
#endif
bool is a C++ type with no guaranteed equivalent in C, so it shouldn't appear in the prototype of a function declared with extern "C" that other languages are expected to call.
So VB should treat BOOL as Long (a 32-bit integer), not as Boolean: 0 means false; anything else (usually 1) means true.
A lot of functions in OpenNI are declared like this:
XN_C_API XnInt32 XN_C_DECL xnVersionCompare (const XnVersion *pVersion1, const XnVersion *pVersion2)
I'd like to know what XN_C_API and XN_C_DECL are.
Just curiosity!
It's not anything to do with the return value. Without looking at the source, I suspect that when compiling for Windows:
XN_C_API gets defined to either __declspec(dllexport) or __declspec(dllimport) depending on whether you're compiling the DLL or importing functions from it.
XN_C_DECL gets defined to the desired calling convention for the function (probably __stdcall or __cdecl)
Without having looked at the definitions, I'd guess:
XN_C_API is effectively extern "C" for a C++ compiler so that a C function can be called from C++.
XN_C_DECL deals with the calling conventions and export/import issues imposed by Windows. That might be __declspec(dllexport) or __declspec(dllimport), and it might include __stdcall etc, and might also worry about FAR etc (but probably doesn't need to any more).
After browsing some old code, I noticed that some classes are defined in this manner:
MIDL_INTERFACE("XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX")
Classname: public IUnknown {
/* classmembers ... */
};
However, the macro MIDL_INTERFACE is defined as:
#define MIDL_INTERFACE(x) struct
in C:/MinGW/include/rpcndr.h (somewhere around line 17). The macro itself is rather obviously entirely pointless, so what's the true purpose of this macro?
In the Windows SDK version that macro expands to
struct __declspec(uuid(x)) __declspec(novtable)
The first one allows use of the __uuidof keyword, which is a nice way to get the GUID of an interface from the typename. The second one suppresses generation of the code that initializes the v-table pointer, letting the linker discard a v-table that is never used for a pure interface. A space optimization.
This is because MinGW does not support COM (or rather, supports it extremely poorly). MIDL_INTERFACE is used when defining a COM component, and it is generated by the IDL compiler, which generates COM type libraries and class definitions for you.
On MSVC, this macro typically expands to more complicated initialization and annotations to expose the given C++ class to COM.
If I had to guess, it's for one of two use cases:
It's possible that there's an external tool that parses the files looking for declarations like these. The idea is that by having the macro evaluate to something harmless, the code itself compiles just fine, but the external tool can still look at the source code and extract information out of it.
Another option might be that the code uses something like the X Macro Trick to selectively redefine what this preprocessor directive means so that some other piece of the code can interpret the data in some other way. Depending on where the #define is this may or may not be possible, but it seems reasonable that this might be the use case. This is essentially a special-case of the first option.