Combine multiple DLLs into one - C++

I'm wondering if it's possible to combine multiple DLLs into one. I'm currently working on a C++ project that depends on many dynamic-link libraries, so would it be possible to combine them into one DLL file, and if so, how would I do that?

I do have the source code for these DLLs, yes.
Just combine all the source files from all the DLL projects into a single DLL project.
And if you have multiple *.def files (one for each project) then combine them into a single *.def file.
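For example, a merged module-definition file might look like this (the library name and export names here are just placeholders):

; Hypothetical merged .def: one LIBRARY statement, all exports combined
LIBRARY CombinedLib
EXPORTS
    InitGraphics        ; formerly in graphics.def
    DrawFrame           ; formerly in graphics.def
    InitAudio           ; formerly in audio.def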

Realistically, no. In theory, if you wanted to badly enough you could do something like disassembling all of them, then re-assembling all the separate files into object files, then re-linking those object files into one big DLL. Getting this to actually work would usually be non-trivial though -- there are likely to be things like conflicting symbol names that would require considerable work to get around.
A rather cleaner possibility would be to package all the DLLs into a zip file (or whatever you prefer) and have a small program to unzip them to a temporary directory, run the main program, and then erase the DLLs from that directory. This has a few problems of its own though (e.g., leaving copies of the files if the machine crashes/loses power/whatever during a run).
Edit: Since you have the source code, using it to build all the code into a single DLL is much more reasonable. For the most part, it's just a matter of adding all the source files to a single project that creates one DLL as its output. You may (easily) run into some symbol conflicts. Given access to the source code, the obvious way to deal with this would be by putting things into namespaces.
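For instance (all names hypothetical), if two of the DLLs both defined an Init function, wrapping each DLL's sources in its own namespace resolves the clash:

// Hypothetical sketch: each former DLL's code goes into its own namespace
namespace audio {
    void Init() { /* body that used to live in audio.dll */ }
}
namespace graphics {
    void Init() { /* body that used to live in graphics.dll */ }
}

int main() {
    audio::Init();      // callers now disambiguate explicitly
    graphics::Init();
    return 0;
}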

It's certainly not infeasible. The DLL format contains all the information you need to merge the code and data from multiple DLLs into one and rebase the resulting code.
This is not a standard feature of any toolchain I can think of, though.

Is it possible to link an .exe file as a .lib file to another project?

The question is in the title.
I have coded an .exe project, and I would like to use one of the functions of this project in another project.
Maybe it is a silly question, but if it is possible this would limit the number of projects in my solution...
I gave it a simple try and got an LNK1107 error.
I would say it is not possible, but it is hard to find a clear answer on the net.
No, it is not possible.
An executable is a standalone entity. It is the result of linking object files together to produce a self-contained, well, executable.
Linking two executables together will, at best, result in duplicate definitions of main (in reality it's a little more complicated, but…).
What you want to do is share the object files before they become an executable, and this is typically accomplished by moving your shared/common code into a "library", then linking the library into both projects.
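Here's a minimal sketch of that layout in CMake terms (all names hypothetical); the same structure maps onto a VS solution as one static-library project plus two application projects:

# Hypothetical sketch: move the shared code into a static library,
# then link it into both executables.
add_library(common STATIC shared_code.cpp)
add_executable(original_tool original_main.cpp)
add_executable(new_tool new_main.cpp)
target_link_libraries(original_tool PRIVATE common)
target_link_libraries(new_tool PRIVATE common)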
Alternatively, you could keep the executables all separate, but share the code at the version-control level, e.g. with SVN externals.

CMake way to generate multiple projects from common source tree

My C++ project is very large and results in 5 different binaries to be generated. In VStudio for example, my single solution has 5 different "projects". In XCode for example, my single project has 5 different targets.
The code is organized in a very deep "src" folder with many levels of nested subfolders. This src folder is common to all 5 binaries because there is heavy reuse of much of the source, but each binary requires only a subset of it.
I'd like to know how to efficiently create a CMakeLists.txt that can create what I need here.
Notes:
Reorganizing the code into a different structure is not an option, nor is making the code a bunch of static libraries.
A CMakeLists.txt for each subfolder is not an option. There are too many of them, and maintenance would be a nightmare.
A file(GLOB_RECURSE) is not a great option either, because it's going to pick up a ton of source files for each binary that are unnecessary to compile for that particular binary.
Ideally, ONE XCode project (with 5 targets) or ONE VStudio Solution (with 5 projects) would be generated. I don't want 5 different projects to open.
I would be completely content with having to manually add/remove source files from a giant list somewhere...ideally in an external file that could be sucked up by CMake, e.g. SourceFilesForBinary1.txt, SourceFilesForBinary2.txt, etc., but I'm not sure how to do that or if that's insane.
Any advice would be appreciated.
CMake has an include function. You can use that to implement your "giant manually managed list somewhere" solution.
You know that GLOB_RECURSE can be given a pattern, right, so that it excludes uninteresting files? Even if not, everywhere you can use a GLOB_RECURSE you could also use an include and an evil manually-managed list.
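A minimal sketch of the include approach (file, variable, and target names are all hypothetical):

# SourceFilesForBinary1.cmake -- one manually maintained list per binary
set(BINARY1_SOURCES
    src/core/engine.cpp
    src/gui/main_window.cpp
)

# CMakeLists.txt
include(${CMAKE_CURRENT_SOURCE_DIR}/SourceFilesForBinary1.cmake)
add_executable(Binary1 ${BINARY1_SOURCES})
# ...repeat for Binary2 through Binary5...

# Or, if a filename pattern can isolate what each binary needs:
# file(GLOB_RECURSE BINARY1_SOURCES "src/*_binary1.cpp")

Since all five targets live in one CMakeLists.txt, CMake generates a single VS solution with five projects, or a single Xcode project with five targets.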
I'm not sure why you don't want static libraries. Those are a good solution to this problem. For a large pile of shared code like this, if you compile it once into a static library and then link that with LTO into your various uses, you avoid recompiling the source many times. If your use is a shared library (so the static-library approach would make all your unused symbols disappear), you can use the --whole-archive linker switch to preserve them.
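A rough sketch of that build with g++ (file names hypothetical; gcc-ar is used so the archive's symbol index understands LTO objects, and --whole-archive is passed through to the linker with -Wl):

# Compile the shared code once into a static library, with LTO enabled
g++ -c -O2 -flto core.cpp -o core.o
gcc-ar rcs libcore.a core.o
# Executable link: unused symbols from the archive are simply dropped
g++ -O2 -flto app.cpp libcore.a -o app
# Shared-library link: keep every symbol with --whole-archive
g++ -shared -fPIC -O2 -flto plugin.cpp \
    -Wl,--whole-archive libcore.a -Wl,--no-whole-archive -o libplugin.so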

How would I go about compiling an OpenCV program WITHOUT linking?

I've taken the "edge" sample file and moved the appropriate source files into its directory, changing #includes where needed to account for the directory structure, since I'm not set up with the library and all that. The goal is to make a more portable batch of code to try some things out. Given the list of linker errors (lots of undefined this and that): would it (a) be possible to take the source and include it all in such a way that I won't need linking? And if so, (b) what would be the suggested route to find which source files contain the code behind all the undefined symbols I get while linking?
I understand this is a general question, but it calls for a general answer, and I haven't seen anyone answer it here or anywhere else. I would think it's entirely possible, though: OpenCV is BSD-licensed, and all the source used to compile it into the library is available, so I would imagine you could skip the step of linking to an external library if you had the library's source in your project. Thanks a million to whoever can help me out or point me in the right direction; it's much appreciated.
If your project requires fully open source code, you can do what you want. Of course, isolating what you need from OpenCV will be a demanding task. To do that, you need to manually locate the files containing the missing symbols. In Windows Explorer you can search file contents; on a Linux console you can use a find/grep combination.
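For example, on Linux (the symbol name below is just a stand-in for whatever your linker reports as undefined):

# List the OpenCV source files that mention a missing symbol
# ("cvCanny" is just an example name taken from a linker error)
grep -rl "cvCanny" opencv/modules/ --include="*.cpp"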
I sometimes move source files (opencv/modules/*/src) locally into my projects to customize some functions. I also keep the linked libraries; the compiler gives them second priority, so they become inactive, but they still exist in their original form, occupying a few negligible MBs.

GNU make: generate list of source files

Is it normal to generate the list of source files (.c, .cpp, etc., not headers) automatically?
How would you do it? I am thinking to use find and sed.
EDIT: The project is a library. All source files are inside the project directory. No unrelated source files are there. All object files are generated using the same compiler options. I am thinking of generating the list of all source files and then a dependency file for each source file, following Tromey's way. Is that a viable approach?
MORE EDIT: It is a fairly large C++ library. The project is being developed. Minimising recompilation is highly desired.
Thanks.
With GNU make you can use wildcards. See this question for an example.
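A minimal sketch (paths and the library name are hypothetical; note that recipe lines must begin with a tab):

# Collect every .cpp in src/ and derive the object list from it
SRCS := $(wildcard src/*.cpp)
OBJS := $(SRCS:.cpp=.o)

libmylib.a: $(OBJS)
	$(AR) rcs $@ $^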
Normal? It is common, but not wise.
Many people use Make wildcards or find or something similar to generate a list of all the source files that exist in a certain directory tree, then feed them to the compiler and link the objects together. This is a brittle solution that will get you into trouble. If a conflict appears among the source files (e.g. two separate definitions of void foo()) the linker will complain and it may not be obvious how to fix the problem. You may find yourself with a forest of source files, many of them unnecessary to your project, slowing down your builds and causing conflicts. And if you want to make use of some of these sources (but not all) in another executable, you'll have to resort to symbolic links or some other kludgery.
A better approach is to specify in the makefile which objects are necessary for a given target, then let Make figure out which sources to use. This is what Make is good at. There is no reliable way to maintain the object lists automatically; you just have to do it by hand, but it's not that much work. If you're changing them often enough that this is a real chore, then you're doing something wrong.
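For example (target and object names hypothetical), letting Make's built-in rules work out the compilation:

# Each target names exactly the objects it needs; Make's built-in
# rules take care of compiling foo.o from foo.cpp.
viewer_OBJS := viewer_main.o render.o common.o
server_OBJS := server_main.o net.o common.o

viewer: $(viewer_OBJS)
	$(CXX) $(LDFLAGS) $^ -o $@

server: $(server_OBJS)
	$(CXX) $(LDFLAGS) $^ -o $@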
EDIT:
If the project is a library as you describe, then yes, this is a viable method, and a pretty good one. And Tromey's method will work quite nicely to prevent unnecessary recompilation.
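For what it's worth, here is a sketch of the modern spelling of that technique, using the compiler's -MMD/-MP flags to emit a .d dependency fragment next to each object (library and path names hypothetical):

CXXFLAGS += -MMD -MP              # emit foo.d alongside foo.o
SRCS := $(wildcard src/*.cpp)     # or an explicit, manually maintained list
OBJS := $(SRCS:.cpp=.o)

libproject.a: $(OBJS)
	$(AR) rcs $@ $^

# Pull in the generated dependency files; the leading '-' ignores
# ones that don't exist yet, so the first build still works.
-include $(OBJS:.o=.d)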

Sharing Pre-compiled Headers efficiently

I have a framework which is being used by several projects (which includes several samples to show how the framework works). The framework has components such as the core, graphics, physics, GUI, etc. Each is a separate library. There are several configurations as well.
A main solution file compiles the complete project with all the possible configurations so that the projects can use the libraries. Since the framework is rarely recompiled, especially by someone (including me) working on a project that utilizes the framework, it makes sense to pre-compile the many headers.
Initially I had each project/sample use its own pre-compiled header for the whole project. Each time I would have to rebuild the same PCH (for example, Debug), so I decided that a shared PCH would reduce the redundant PCH compilation. So far so good: I have a project that compiles the PCH along with the libraries, and all the subsequent projects/samples now use the same PCH. This has worked wonderfully.
The only problem is that I have seen an increase in file size. This is not a roadblock: if a project that uses the framework is intended for release, it can sever itself from the shared PCH and make its own. I have done this for the sake of rapid development (I have actually created a tool which creates the VS project files and source files for a new project/sample ready to be built, and which helps upgrade a previous project that was using an older version of the framework).
Anyway, I am presuming that the increase in file size is because the independent VS project that creates the shared PCH includes all the headers from all the libraries. My question is whether I can use conditional compilation (#ifndef) to reduce the size of the final executable. Or maybe I can share multiple PCH files somehow (as far as I know, that is not possible, but I may be wrong). If I am not making sense, please say so (in kind words :) ), as my knowledge of PCH files is very limited.
Thanks!
Note: To reiterate and make it clear: so far, I have one solution file that compiles all the libraries, including the shared PCH. Now if I recompile all the samples and projects, they each compile in a couple of seconds at most. Before, each project would recreate a PCH file. Also, I initially wanted a PCH for each library, but then I found out that a source file cannot use multiple PCH files, so this option was not feasible. Another option is to compile all possible combinations of PCH files, but that is too time-consuming, cumbersome, and error-prone.
It sounds like the size problem comes from using headers you don't actually need, but that it still makes sense to use these headers when developing because of the faster turnaround.
On using #ifndefs: precompilation is crude. You lose the ability to share the precompilation work at the point where there is a difference. If you use #ifndefs to make different variants of what you include, i.e. if you have
#ifndef FOO
then the precompiled header must stop before the point where FOO is defined differently in two files that use that precompiled header. So #ifndef is not going to solve the problem: the net result is that FOO must be the same, or you're back to separate PCH files for the different projects. Neither solves things.
As to sharing multiple .pch files: A fundamental limitation of .pch files is that each .obj can only use one. Of course .pch files can have arbitrary combinations of headers. You could have a .pch for core+graphics, a .pch for core+physics, core+ai etc. This would work just dandy if none of the source files needed to 'talk' to more than core+one-module at a time. That does not sound realistic to me. Such a scheme and variants on it sound like a lot of restructuring work for no real gain. You don't want to be building zillions of combinations and keeping track of them all. It's possible, but it is not going to save you time.
In my view you're doing exactly the right thing by sacrificing executable size for fast turn-around during development/debugging, and then having a slower but leaner way of building for the actual release.
In the past I've found that you quite quickly run into diminishing returns as you put more into the precompiled headers, so if you're trying to put more in to make them useful to a larger number of projects, you will hit a point where it slows things down. On our projects the PCH files take longer than most source files to compile, but still only a few seconds at most. I would suggest making the PCH files specific to each project you are using. You are right in saying that a source file can only refer to a single PCH file, but one way of getting around this is to use the 'force include' option (in the Advanced tab, I think) to ensure that all files include the PCH file for that project.
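On the command line, the flags involved look roughly like this (header and file names hypothetical; in the IDE the setting is "Forced Include File" under C/C++ -> Advanced):

rem Build the PCH once from a stub translation unit
cl /c /Yc"pch.h" /Fp"project.pch" pch.cpp
rem Compile everything else against it, force-including the header
cl /c /Yu"pch.h" /Fp"project.pch" /FI"pch.h" foo.cpp bar.cpp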