How to dynamically load DLLs using macros - C++

Sorry if the title is confusing. Anyway, I'm working on a small game engine in C++ and I want to modularize it. I want there to be a core module, and then you can install other modules based on what you're using (2D graphics, 3D graphics, audio, physics, etc.). I was thinking the best (maybe only) way to do this is to make each module another project in my VS solution that compiles to a DLL, while the core is a lib. The core lib could check which modules are installed by checking for a macro like ENGINE_2D_GRAPHICS_MODULE that could be defined in the DLL. How would I do this, and is there a better way? I'm pretty sure there's a better way, but I don't know what it is.

Macros are compile-time constants, so you can't influence your core module this way when new modules are installed later. You probably want to detect whether additional modules are present in the filesystem and then load them with LoadLibrary (on Windows) or dlopen (on POSIX-conforming systems).
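A rough sketch of that run-time approach is below; the module file names and the `engine_module_init` entry point are made up for illustration, not part of any real API:

```cpp
// Probe for optional module libraries at run time and load any that exist.
// Hypothetical names throughout; only LoadLibrary/dlopen usage is the point.
#include <filesystem>
#include <iostream>
#include <string>
#include <vector>

#ifdef _WIN32
  #include <windows.h>
  using ModuleHandle = HMODULE;
  static ModuleHandle openModule(const std::string& path) { return LoadLibraryA(path.c_str()); }
  static void* getSymbol(ModuleHandle h, const char* name) {
      return reinterpret_cast<void*>(GetProcAddress(h, name));
  }
#else
  #include <dlfcn.h>
  using ModuleHandle = void*;
  static ModuleHandle openModule(const std::string& path) { return dlopen(path.c_str(), RTLD_NOW); }
  static void* getSymbol(ModuleHandle h, const char* name) { return dlsym(h, name); }
#endif

int main() {
    // Optional modules the core knows about; they are only loaded if present.
    const std::vector<std::string> candidates = {
#ifdef _WIN32
        "engine_2d_graphics.dll", "engine_audio.dll"
#else
        "./libengine_2d_graphics.so", "./libengine_audio.so"
#endif
    };

    for (const auto& path : candidates) {
        if (!std::filesystem::exists(path))
            continue;                               // module not installed
        if (ModuleHandle handle = openModule(path)) {
            // Each module is assumed to export a C entry point the core calls.
            using InitFn = void (*)();
            if (auto init = reinterpret_cast<InitFn>(getSymbol(handle, "engine_module_init")))
                init();
            std::cout << "Loaded module: " << path << "\n";
        }
    }
}
```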
A different option would be to let the game developer (i.e. the engine user) link your library statically and compile only the parts that are needed.

Related

Distribution of C++ application with dependencies in Visual Studio

I'm a junior programmer. I have developed a Visual Studio C++ project with a fair number of dependencies: Boost, a fingerprint recognition library, and the Windows Biometric Framework. As of today, I know the Windows Biometric Framework can be downloaded through standard Windows Update, so I am not concerned about it; to my knowledge, the application is able to find and link the WBF dependencies on the computer by itself.
My concern is: what is the easiest way (not necessarily the most efficient; I need to get this done quickly) to pack the executable file with all the resources and dependencies this .exe needs (Boost and the fingerprint recognition SDK), so that I can minimize distribution troubles, e.g. "this DLL is missing, please reinstall the application" and things like that, without having to compile everything on the client's computer?
I've been able to see a couple of ways here: copying the DLLs listed in the project config, or changing to static linking... but I don't know if that is the simplest way. I have little to no trust in my abilities for this, and those methods seem quite manual, so I'm wondering if there might be an automatic way to do these things.
I'm not familiar with the fingerprint library or WBF, but most of Boost resides in headers, so it's compiled in when you compile your application. Some parts, like the threading library and system-specific calls (e.g. getting the CPU core count), are separate libraries that you link to statically.
In what form is the fingerprint library provided? If it's dynamic, there will be at least a .dll with a corresponding import .lib file: your application links statically against the import library when it's built and binds to the DLL at run time. Alternatively, the library may come as one large, single .lib that's linked into your application after it's compiled. If you have both options available and you only want to distribute the binary file, use static linking.
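To illustrate the dynamic case, such an SDK typically ships a header that marks the exported symbols, while the import .lib next to the DLL is what your build actually links against. The FINGERPRINT_BUILD_DLL/FP_API macros and the fp_match function below are placeholders, not the real SDK:

```cpp
// Hypothetical header for a DLL-based SDK (Windows-specific __declspec shown).
// The import .lib is linked at build time; the DLL is bound in at process start.
#ifdef FINGERPRINT_BUILD_DLL
  #define FP_API __declspec(dllexport)   // defined only while building the DLL itself
#else
  #define FP_API __declspec(dllimport)   // consumers of the DLL import the symbols
#endif

// Placeholder function; the real SDK's names and signatures will differ.
extern "C" FP_API int fp_match(const unsigned char* probe, const unsigned char* candidate);
```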
As on any system, you will need to include every .dll your app links against and every external resource (images, config files, ...) your app uses. I usually make my Windows distributions with Inno Setup (http://www.jrsoftware.org/isinfo.php).
Very easy to use.

Releasing a program

So I made a C++ console game. Now I'd like to "release" the game. I want to give out only the .exe file and not the code. How do I go about this? I'd also like to make sure it will run on all Windows devices.
I used the following headers-
iostream
windows.h
MMSystem.h
conio.h
fstream
ctime
string
string.h
*I used namespace std
*I used Code::Blocks 13.12 with MinGW
& I used the following library-
libwinmm.a
Thank you in advance
EDIT
There are many different ways of installing applications. You could go with an installer like Inno Setup, or just go with a regular ZIP file. Some programs can even be standalone by packaging all resources within the executable, but to my knowledge this is not easy to do in C++.
I suppose the most basic way is to create different builds for different architectures with static libraries, then find any other DLLs specific to each architecture and bundle everything together in one folder. Supporting x86/x86-64/ARM should be enough for most purposes. LLVM/Clang and GCC have extensive support for many architectures, and if need be, you should be able to download the source code of the libraries you use and compile them for each architecture you plan to support, with whatever compilation options each one needs.
A virtual machine can also be helpful for this cross-compilation and compatibility testing.
tldr; Get all the libraries you need in either static or dynamic (DLL) format. Check that they are of the right architecture (x86 programs/code will not run on MIPS and vice versa). Get all your resources. Get a virtual machine, and then test your program on it. Keep testing until all the dependency problems go away.
Note: when I did this, I actually had some compatibility issues with, of all things, MinGW-w64. Just a note: you may need some DLLs from MinGW, or, if you're using Cygwin, you will of course need the Cygwin DLL. I don't know much about MSVC, but I would assume that even it needs some runtime DLLs at some level if you decide to support an outdated Windows OS.

Running Qt program without IDE

How can I run a program which has already been built and compiled in the Qt IDE, so that I can take that program and run it on any computer I want without recompiling it on that computer? I am a beginner, so bear with me in answering this question. :)
Thanks
There are a few parts to your problem.
1) You need to compile it for each architecture you want it to be used on.
2) Each architecture will have a set of Qt dynamic libraries associated with it too that need to be available.
3) Some architectures have an easy-to-deploy mechanism. E.g., on a Mac you can run "macdeployqt" to get the libraries into the application directory. For Nokia phones (Symbian, Harmattan (N9), etc.) Qt Creator has a deploy step that will build a package for the phone and even include an icon.
4) For systems without such a feature, like Linux and Windows, you'll either need to distribute the binary and require the user to have Qt available, or package up a directory/zip containing the needed Qt libraries and distribute that.
It doesn't launch because it cannot find its dependencies. As you are on Windows, these libraries can be placed in the same directory as your application. To find which library is missing, use Dependency Walker.
I am pretty sure these libraries are not found:
The Qt dynamic libraries (they can be found in the Qt bin directory; take the DLLs)
The C runtime libraries used for compilation. If you are on Qt Creator and use the default setting, it will be mingw-xxx (it can be found in the Qt installation directory; I don't know exactly where).
Every architecture has its own set of CPU instructions.
It's like hearing a language you don't understand, like when I speak Arabic to someone who doesn't know the language.
The compiler converts your code into instructions understood only by the architecture your CPU uses.
That's why Python and most high-level languages use an interpreter instead of a compiler.
But there are cross compilers, like MinGW, that support cross-compiling to Windows (.exe files).
Simply put, Qt has some libraries that need to be in the working directory of your project.

How can I strip down the Qt libraries to remove stuff not used by my application?

I'm shipping a stand-alone Linux application with Qt libraries compiled-in.
Is there a tool which would scan my source code, see which classes/methods my app uses, then it would pluck the unnecessary/unused stuff out of the Qt source code and compile Qt libraries tailor-made for my application without any extra bloat? This is the best case scenario, of course.
But what is the closest existing solution that would allow me to make my Linux stand-alone app with compiled-in qt libs as slim as possible?
Is there a tool which would scan my source code, see which classes/methods my app uses, then it would pluck the unnecessary/unused stuff out of the Qt source code and compile Qt libraries tailor-made for my application without any extra bloat?
The linker already does this for you. If you're statically linking to the Qt libraries, then only the code for the functions that you're calling will be embedded into the executable.
You don't need an external piece of software to do this. It doesn't matter how big the Qt libraries are on your development machine.
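A toy sketch of that behaviour, assuming a GCC/Clang toolchain and made-up file names (Qt itself would of course have to be built as static libraries for this to apply):

```cpp
// widgets.cpp is compiled into a static archive, e.g. libwidgets.a;
// main.cpp only ever calls used_widget(). A typical build might be:
//
//   g++ -c -ffunction-sections widgets.cpp && ar rcs libwidgets.a widgets.o
//   g++ -ffunction-sections main.cpp -L. -lwidgets -Wl,--gc-sections -o app
//
// The linker only pulls in archive members needed to resolve referenced
// symbols, and --gc-sections additionally drops unreferenced functions,
// so unused_widget() never ends up in the final executable.

// widgets.cpp
#include <cstdio>
void used_widget()   { std::puts("used");   }   // referenced by the application
void unused_widget() { std::puts("unused"); }   // never referenced, gets stripped

// main.cpp
void used_widget();
int main() { used_widget(); }
```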
For additional size reduction of your program, try UPX; it will make your application even smaller.
what is the closest existing solution […] to make my Linux stand-alone app with compiled-in Qt libs as slim as possible?
Specifically for Qt, since early 2019 there is the build configuration option -ltcg to enable link-time code generation. According to the Qt Company blog, it allows a 15% size reduction for statically linked Qt, and a smaller but still noticeable effect for dynamically linked Qt libraries.

Where should I be using a static library in C++

What are the use cases for static libraries in C++? I have seen that some people create DLLs instead, while others use only static libraries. What's your recommendation?
I'm a big fan of static libraries pretty much everywhere. The one big thing that DLLs get you that static libs cannot do is the ability to dynamically load and unload library functionality. So if your application is going to support some sort of hot swapping plugins, you need to use dynamic libs. Otherwise you can probably use static libs.
Static libs open the door to a lot of optimizations that you can't do with dynamic libs, because they are performed at link time. In the Microsoft world, Link Time Code Generation (LTCG) gives you the ability to do whole-program optimization and dead-code stripping through not only your application, but also your libraries (in GCC this is called Link Time Optimization [LTO]).
Additionally, static libs tend to make your program easier to distribute because you aren't forced to pass around a lot of library files, and you can completely avoid DLL hell if you ever version your library.
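For the hot-swapping case mentioned above, one common pattern is a small C-compatible factory around an abstract interface; the names below (IPlugin, create_plugin, destroy_plugin) are illustrative, not a standard API:

```cpp
// plugin_api.h -- shared between the host application and every plugin DLL.
// Illustrative sketch: an abstract interface plus extern "C" create/destroy
// functions the host resolves at run time with GetProcAddress (Windows)
// or dlsym (POSIX).
struct IPlugin {
    virtual ~IPlugin() = default;
    virtual const char* name() const = 0;   // human-readable plugin name
    virtual void update(float dt) = 0;      // called by the host each frame
};

extern "C" {
    // Each plugin DLL exports these; the host never sees the plugin's classes.
    IPlugin* create_plugin();
    void destroy_plugin(IPlugin* plugin);
}
```

Because the host only holds IPlugin pointers obtained through that factory, it can unload a plugin DLL (FreeLibrary/dlclose) and load a newer build without relinking, which is exactly what static libraries cannot offer.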
You should use shared libraries (DLLs) if you have significant functionality that needs to be shared between applications, AND this functionality may be improved independently of those applications and shipped as separate updates.
The 'AND' part is the hardest to fulfill: usually you ship your application with any new functionality added and rarely update the library without updating the application at the same time (I am not saying that never happens), so usually the two ship in lockstep.
Otherwise it is easier to just build normal libs and ship the application.
An example of a good use (I use the term loosely, for example purposes) is DirectX. When a new version of DirectX ships (and the interface has not changed), you just need to update the DLL, and all applications that use DirectX get the benefit of the new version of the library. In reality it is not quite that simple, but you get the idea.
In general, although there are always exceptions to the rule, I would say:
Advantages of DLLs
Less physical memory usage when running multiple instances of an application. (Copy on write optimisation of memory usage.)
Faster link times.
Smaller executables.
Better modularity.
Advantages of static libraries
Less virtual memory usage (and probably less physical memory usage) when running a single instance of an application.
Performance. Approximately 10% (more or less) improvement over DLLs, depending on your application.
Reliability. You tested your application against a specific version (or specific versions) of a library. An upgrade to a DLL could potentially break your application.
There is the advantage of not having to recompile your entire program if you make a change to a dynamically linked library. @Chris makes a good point about DLL hell, but if it's a minor bug fix that doesn't affect the API, this can save you the recompilation.
There is an SO post that talks about Windows not being able to apply updates to your program if you statically link its libraries (link to come). Although I think you are talking more about statically linking your own modules.
Use static version of your libraries where you can. Use dynamic libraries where you need to (license, availability or plugin system).
I use static libraries to implement UML's "package" concept. All modules belonging to a package get put into their own subdirectory, and I create an IDE subproject or makefile for that directory which builds a static library (*.a file). Modern IDEs make it possible to work with your top-level package along with sub-packages within the same "workspace".
If a package (or a group of packages) can be deployed separately from the main executable, then I compile it into a shared library (*.so or *.dll) instead and consider it a "component" in UML jargon.
Well, a static library would be for holding huge libraries and also for writing what I like to call multi-OS code, so it can be built to run on Linux, Windows, ...