Exporting functions from a DLL using __declspec(dllexport) - c++

I'm working on a project (a VS 2012 solution) that includes joystic.cpp. The project was created as a DLL so that the library can be called from another project. The application receives immediate joypad data in exclusive mode via a dialog timer and displays it in a dialog box.
I edited the resource.h file to add this macro:
#ifdef LIBRARY_EXPORTS
# define LIBRARY_API __declspec(dllexport)
#else
# define LIBRARY_API __declspec(dllimport)
#endif
so that I can export functions by declaring:
LIBRARY_API function();
However, should I export all functions of the program, or not?

Should I export all functions of the program, or not?
Generally when designing the interface to a library, you should export just the functions that are needed by consumers of the library.
If you exported every function then you would need to document every function. However, many functions may not be needed by library consumers. Many functions will be private to the implementation of the library.
What's more, by exporting all functions you would be making future development of the library more difficult. What if you wanted to change the interface to one of these private functions that you exported? You would have to either introduce another function with a different name, or use the existing name and so break binary backwards compatibility. Neither of these options is particularly appealing.
In essence, the larger the surface area of your library's public interface, the larger your maintenance task. So you generally aim to provide the smallest possible public interface that provides the required functionality to library consumers.
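As a concrete sketch of a small export surface, a header can expose one exported entry point while keeping its helpers unexported. The names (`MATHLIB_API`, `MATHLIB_EXPORTS`, `mathlib_double`) are invented for illustration, and the attribute collapses to nothing outside MSVC so the sketch compiles on any compiler:

```cpp
#include <cassert>

#define MATHLIB_EXPORTS 1 // pretend this translation unit is building the DLL

// On MSVC this expands to dllexport/dllimport as appropriate; elsewhere it
// expands to nothing.
#if defined(_MSC_VER) && defined(MATHLIB_EXPORTS)
#  define MATHLIB_API __declspec(dllexport)
#elif defined(_MSC_VER)
#  define MATHLIB_API __declspec(dllimport)
#else
#  define MATHLIB_API
#endif

// Public interface: the one function consumers are meant to call.
MATHLIB_API int mathlib_double(int x);

// Private helper: internal linkage, never exported, free to change
// without breaking binary compatibility for consumers.
static int add_impl(int a, int b) { return a + b; }

int mathlib_double(int x) { return add_impl(x, x); }
```

If `add_impl` later needs a different signature, nothing outside the DLL can notice, which is exactly the maintenance benefit described above.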

Related

Header only library as a module?

I'm authoring a templated header only library. It has no state, no global variables, no .cpp that needs to be compiled.
Is it possible to export/consume this as a module? How? What are the benefits? What are the pitfalls?
There are some convenience macros that I probably want the user to have. What about those?
I have found an example using #ifdef ... to cater for both module and old-school cases. I think I want to avoid that.
I have found an example using #ifdef ... to cater for both module and old-school cases. I think I want to avoid that.
Broadly speaking, that is not the best way to do this. You can do it by just importing the header as a header unit with import <header_file.hpp>; but you'll be losing some important aspects of modules. You're going to need to do a limited amount of #defineing if you want your module version to work well.
Just about every header-only library will have some declarations that are considered part of the library, but it will also have some which are not. These are usually put into a detail namespace or other attempts are made to hide them from users.
Modules have a mechanism for doing that: export. Or more to the point, they have a way to not put something in your interface: don't export it. But this requires explicitly tagging your interfaces with the export keyword... which won't work in non-module builds. So you need a macro to say if the build is a module build or not, so that you can have a #define that resolves to export or to nothing.
You could do an export namespace my_namespace {}; to try to export everything, but that can have... unpleasant side effects.
You also may need to explicitly inline certain class members of exported classes, as modules do not implicitly inline defined members of non-template classes the way a non-module build does. Fortunately, adding inline will work fine on non-module builds.
Writing a header-only library and a module interface file for it, such that a user can include whichever they prefer, isn't too difficult. But it does require some care, and to best take advantage of module features, you should use some macros.
The primary module interface unit would look like this:
module; //Begin global module fragment.
<external header includes>
export module MyModuleName; //Begin the actual module purview
#define MY_MODULE_NAME_EXPORT export
#include "my_header1.hpp"
#include "my_header2.hpp"
The <external header includes> is very important. You need to #include every file that your library explicitly includes. If you include parts of the C++ standard library, then include them here.
The point of this is to keep those headers' contents out of the module's purview. You want your library's code to be the only code that ends up inside the module. So you include those external headers in the global module fragment, and their include guards prevent the later inclusions (from inside your library headers) from injecting anything into the module.
And yes, this does mean you have to keep two separate lists of these headers. You can write a tool to look through all of your headers and build that list for you, but one way or another it's still a thing you need to have.
At the top of your library headers, after any include guards, you need to have this:
#ifndef MY_MODULE_NAME_EXPORT
#define MY_MODULE_NAME_EXPORT
#endif
This allows you to decorate anything you want to export with MY_MODULE_NAME_EXPORT:
MY_MODULE_NAME_EXPORT void some_function()
{
...
}
If you're in a module build, that will export the function. If you aren't, then it won't.
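As a self-contained illustration of the fallback path (the header contents are hypothetical), in a non-module build the guard defines the macro to nothing, so the header compiles as ordinary old-school code:

```cpp
#include <cassert>

// --- what sits at the top of my_header.hpp, after its include guard ---
#ifndef MY_MODULE_NAME_EXPORT
#define MY_MODULE_NAME_EXPORT   // non-module build: macro expands to nothing
#endif

// In a module build, the interface unit defines MY_MODULE_NAME_EXPORT as
// `export` before including this header, so this function gets exported.
// Here it is marked inline so the header can be included from several
// translation units either way.
MY_MODULE_NAME_EXPORT inline int answer() { return 42; }
```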

UWP - Suggested way for reusing string parsing functions across WinRT modules

I have a number of C++ projects in my solution that were written by other teams before I started working on my UWP app, and all of these projects use std::strings. So, to ease the communication between the other projects and my WinRT modules, I wrote some string conversion functions to go from std::strings to Platform::Strings and vice versa.
I'm in the process of converting my UWP codebase into WinRT modules and I'm coming across a recurring problem: because WinRT modules don't allow you to have classes or functions with public native types, I am unable to have my string functions publicly accessible. A private, protected, or internal declaration is fine for passing around native types, just not public.
Many of my modules need to communicate down into the native C++ code and I don't want to have to redefine my string functions again and again for each individual file that needs a std::string.
Is there anything I can do so I can reuse my string functions across WinRT modules? Has anyone else had a similar problem? Any suggestions are greatly appreciated!
Thank you
You have two options.
Make those functions inline, and define all of them in a header file. Then, include the header file everywhere you want to consume them. This is a more straightforward solution without requiring you to mess with your build system.
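A sketch of this first option (the function names are hypothetical, and `std::wstring` stands in for `Platform::String^` so the sketch compiles outside a C++/CX project):

```cpp
#include <cassert>
#include <string>

// string_conversions.h -- every function is inline, so including this header
// from several translation units causes no multiple-definition link errors.
inline std::wstring ToWide(const std::string& s) {
    // Naive, ASCII-only widening; real code would do proper UTF-8 conversion.
    return std::wstring(s.begin(), s.end());
}

inline std::string ToNarrow(const std::wstring& w) {
    // Naive narrowing, for the sketch only.
    return std::string(w.begin(), w.end());
}
```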
You can compile those functions into one of your DLLs and import them into the others. Let's call the DLL where you put your functions "StringModule.dll". You'll need to put those functions in a .cpp/.h file pair, then compile the .cpp file into StringModule.dll. Then, annotate the functions with a define that evaluates to __declspec(dllexport) when building StringModule.dll, and __declspec(dllimport) when building all the other DLLs. For instance:
#ifndef BUILDING_STRING_CONVERSIONS_DLL // This should be defined to 1 when building StringModule.dll
#define BUILDING_STRING_CONVERSIONS_DLL 0
#endif
#if BUILDING_STRING_CONVERSIONS_DLL
#define MY_STRING_API __declspec(dllexport)
#else
#define MY_STRING_API __declspec(dllimport)
#endif
namespace MyStringFunctions
{
MY_STRING_API Platform::String^ ConvertStdStringToPlatformString(const std::string& str);
MY_STRING_API std::string ConvertPlatformStringToStdString(Platform::String^ str);
}
When you build StringModule.dll, a StringModule.lib file will be created next to it. You'll have to pass its path to the linker when building every DLL that consumes your string functions. In all the places where you want to use the functions, just include the header file and call them as usual.

C++ API - what is the right approach

I have to build an API for a C++ framework that does some simulation. I have already created a new class with __declspec(dllexport) functions and built the framework into a DLL.
This works fine and I can use the framework within a C# application.
But is there another or a better approach to create an API with C++?
If you want to create a C++-API, exporting a set of classes from a DLL/shared library is the way to go. Many libraries written in C++ decide to offer a C interface though, because pure C interfaces are much easier to bind to foreign languages. To bind foreign languages to C++, a wrapper generator such as SWIG is typically required.
C++-APIs also have the problem that, due to C++ name-mangling, the same compiler/linker needs to be used to build the framework and the application.
It is important to note that the __declspec(dllexport)-mechanism of telling the compiler that a class should be exported is specific to the Microsoft Compiler. It is common practice to put it into a preprocessor macro to be able to use the same code on other compilers:
#ifdef _MSC_VER
# define MY_APP_API __declspec(dllexport)
#else
# define MY_APP_API
#endif
class MY_APP_API MyClass {};
The solution of exporting classes has some serious drawbacks. You won't be able to write clients in other languages, because they don't understand C++ name mangling. For the same reason, you won't be able to use compilers other than VS. You may not even be able to use another version of VS, because MS doesn't guarantee that the mangling scheme stays the same across compiler versions.
I'd suggest using a flattened C-style interface. E.g., for a method
MyClass::Method(int i, float f);
export:
MyClass_Method(MyClass* instance, int i, float f);
You can wrap it inside C# to make it a class again.
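A sketch of such a flattened facade (the names are illustrative). The `extern "C"` block suppresses C++ name mangling so the functions can be called from C# via P/Invoke; on MSVC each of them would additionally carry the dllexport macro:

```cpp
#include <cassert>

class MyClass {
public:
    int Method(int i, float f) { return i + static_cast<int>(f); }
};

// Flattened C facade: create/destroy plus one wrapper per method. The opaque
// MyClass* plays the role of the `this` pointer on the C side.
extern "C" {
    MyClass* MyClass_Create() { return new MyClass(); }
    void MyClass_Destroy(MyClass* instance) { delete instance; }
    int MyClass_Method(MyClass* instance, int i, float f) {
        return instance->Method(i, f);
    }
}
```

On the C# side, each of these becomes a `[DllImport]` entry point, and a thin wrapper class can hold the `IntPtr` handle to restore the object-oriented feel.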

How can I design a config utility to work both statically and dynamically linked?

I feel this issue may have a simple solution that's just not obvious to me. I have a config class that stores various configuration options loaded from an ini file, among other places. My application has a library and a client, and two configurations: build the library as a DLL and have the client link dynamically, or build them both together as a single binary. So how can I have/use my config object in both the library and the client? If I include the config class definition in both, I assume it will give me link errors due to redefinitions.
If I include the config class definition in both, I assume it will give me link errors due to redefinitions.
No, it won't. Windows DLLs do not respect the one definition rule. Basically, in Windows, ODR stops at the module boundary, i.e., ODR is respected within an executable and within a DLL, but not across them. Whether that's a good thing or not is not relevant, that's just how it is. So, you can include the definition of your config class in both the DLL and the executable. However, there will be two separate instances of the singleton (as I assume it is a singleton class, as config classes usually are), one in each module. In that sense, it won't be a true singleton, at least, not across the modules.
If you want a true singleton across the modules, you'll have to do a bit more work. You have two choices: master-slave or merging (or a "daisy chain").
The first option is to designate one module (e.g., the executable) as the module that instantiates (and keeps) the singleton object, and then you pass a pointer to that instance to all the "slave" modules which can then use it via a common interface (so both modules have the same declaration for the config class, but only one module creates it and passes it to the others). It would look something like this:
In the header file "config_class.h":
class ConfigClass {
// a bunch of declarations...
public:
static ConfigClass& getInstance(); // the access-point for the singleton.
};
#ifdef MY_LIB_NOW_BUILDING_MASTER
extern "C" __declspec(dllimport) void setConfigClassInstance(ConfigClass* pobj);
#else
extern "C" __declspec(dllexport) void setConfigClassInstance(ConfigClass* pobj);
#endif
In the cpp file "config_class.cpp":
#include "config_class.h"
// a bunch of definitions for the config_class member functions.
#ifdef MY_LIB_NOW_BUILDING_MASTER
ConfigClass& ConfigClass::getInstance() {
static ConfigClass instance( /* */ );
return instance;
}
#else
static ConfigClass* masterInstance;
void setConfigClassInstance(ConfigClass* pobj) {
masterInstance = pobj;
}
ConfigClass& ConfigClass::getInstance() {
return *masterInstance;
}
#endif
where, in the above example, you would call setConfigClassInstance from the master module (most likely the main executable) to set the config object for the DLLs, but make sure the DLLs don't require the config class during their static initialization (loading).
The second option is to either merge or daisy-chain your singletons. In this case, each module creates its own singleton instance, but then, using a similar scheme as above, they pass pointers to each other's instances, allowing them to be either merged into one instance (cross-linked), or you chain them together (e.g., like a circular linked-list or ring-list) and you dispatch the calls to the appropriate instance.
I think that for your application, the first option is probably the easiest.
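A compressed single-file sketch of that master-slave hand-off (in a real build the two halves live in the executable and the DLL respectively, and the `verbosity` field is invented for illustration):

```cpp
#include <cassert>

struct ConfigClass {
    int verbosity = 1;                  // invented example setting
    // Master side: owns the one true instance.
    static ConfigClass& getInstance() {
        static ConfigClass instance;
        return instance;
    }
};

// Slave side: stores the pointer it is handed by the master, instead of
// creating its own instance.
static ConfigClass* masterInstance = nullptr;
void setConfigClassInstance(ConfigClass* pobj) { masterInstance = pobj; }
```

The master calls `setConfigClassInstance(&ConfigClass::getInstance())` once per loaded DLL at startup, after which both modules observe the same object.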
N.B.: In non-Windows environments, the situation is entirely different, and none of the above applies.

Macro variable after class keyword

I found this in Ogre Framework
class _OgreSampleClassExport Sample_Character : public SdkSample {
...
...
and it's defined like this:
#define _OgreSampleClassExport
Why would we want to have this macro?
Presumably so a special qualifier, such as __declspec(dllexport), could be added to such classes by modifying (or conditionally defining) the define:
#define _OgreSampleClassExport __declspec(dllexport)
It's to allow for future exports. Ogre may just strictly be a statically linked library at the moment, but if the authors ever decide to support dynamically-linked libraries (aka shared libraries on some platforms), they will need to write code like:
class
#ifdef EXPORTING
__declspec(dllexport)
#else
__declspec(dllimport)
#endif
Sample_Character [...]
... and that's just for MSVC. Normally they'd have to go through the effort to do this to Sample_Character and all other classes they provide through their library. Making a single macro to be defined later is much easier as they'll only need to do this in one place.
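Put together, the single-macro pattern might look like the following sketch. The build-flag names `OGRE_SAMPLES_EXPORTING` and `OGRE_SAMPLES_DLL` are invented; in a static or non-MSVC build the macro expands to nothing, which is why the sketch compiles anywhere:

```cpp
#include <cassert>
#include <cstring>

#if defined(_MSC_VER) && defined(OGRE_SAMPLES_EXPORTING)
#  define _OgreSampleClassExport __declspec(dllexport)  // building the DLL
#elif defined(_MSC_VER) && defined(OGRE_SAMPLES_DLL)
#  define _OgreSampleClassExport __declspec(dllimport)  // consuming the DLL
#else
#  define _OgreSampleClassExport                        // static build: nothing
#endif

// Every class in the library uses the same macro, so switching the whole
// library between static and DLL builds is a one-line change.
class _OgreSampleClassExport Sample_Character {
public:
    const char* name() const { return "Sample_Character"; }
};
```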