Handling binary dependencies across platforms - C++

I've got a C++ project with a large number of dependencies. The project should work on Linux and Windows, so we've ported it to CMake. Most dependencies are now included directly in the source tree and built alongside the project, so those cause no problems.
However, we have one binary which depends on Fortran code and is really complicated to build. On Linux it's not available as a package, only as precompiled binaries or as full source (which needs a BLAS library and several other dependencies installed). For Windows the same library is available as a binary, and building it for Windows looks even more complicated.
The question is: how do you handle such dependencies? Just check in the binaries for the supported platforms and require the user to set up their build environment otherwise (that is, manually point to the binary location)? Or would you really try to get them compiled along with the project (even if it requires installing some ten libraries -- BLAS libraries are the biggest pain here)? Or is there some other recommended way to handle this?

If the binary is independent of the rest of your build process, you should definitely check it in. But since you cannot check in every version of the binary (I mean one for every platform and set of compile flags a user might use), building from source seems mandatory as a fallback.
I have done something similar. I checked in the source archives of the libraries/binaries I needed. Then I wrote makefiles/scripts to build them for the targeted platform/flags into a specific location (not a standard OS location) and made my main build process point to that location. I did that to be able to control the exact versions and options of the libraries/binaries I needed. It's quite hard work to make things work on different platforms, but it's worth the time!
Oh, and of course it's easier if you use cross-platform build tools :)
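To make the pattern above concrete in CMake terms, here is a minimal sketch using ExternalProject; the archive name, library name, and configure step are all hypothetical:

    include(ExternalProject)
    # Build a checked-in source archive into a private prefix inside the build tree
    ExternalProject_Add(foolib_external
        URL ${CMAKE_SOURCE_DIR}/thirdparty/foolib-1.0.tar.gz  # vendored archive (made-up name)
        PREFIX ${CMAKE_BINARY_DIR}/foolib
        CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=<INSTALL_DIR>
        BUILD_COMMAND make
        INSTALL_COMMAND make install
    )
    # Point the main build at the private install location, not a standard OS path
    include_directories(${CMAKE_BINARY_DIR}/foolib/include)
    link_directories(${CMAKE_BINARY_DIR}/foolib/lib)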

One question for you: do the users need to modify this binary, or are they just happy it's there so they can use/access it? If they don't need to modify it, check in the binaries.

I would agree, check in the binaries for each platform if they are not going to be modified very often. Not only will this reduce build times, but it will also reduce frustration from unnecessary compilations.
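In CMake, selecting a checked-in binary per platform takes only a few lines; the directory layout and library names below are invented for illustration:

    # Pick the prebuilt binary that matches the current platform
    if(WIN32)
        set(FOO_LIBRARY ${CMAKE_SOURCE_DIR}/prebuilt/win64/foo.lib)
    else()
        set(FOO_LIBRARY ${CMAKE_SOURCE_DIR}/prebuilt/linux-x86_64/libfoo.a)
    endif()
    # Wrap it in an imported target so the rest of the build can simply link "foo"
    add_library(foo STATIC IMPORTED)
    set_target_properties(foo PROPERTIES IMPORTED_LOCATION ${FOO_LIBRARY})
    # target_link_libraries(myapp foo)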

Qt moc.exe - difference between 32- and 64-bit versions?

I am trying to find out what the difference is between the moc.exe (Qt meta-object compiler) in the 32-bit and the 64-bit subfolders of Qt5.
Does it make any difference if I let my application with a 64-bit target architecture be built (and processed) by the 32-bit or the 64-bit moc.exe version?
I couldn't find any info on this. If anybody has a clue what the difference is (besides being compiled for the corresponding architecture and having a different file size, of course), or whether this makes a difference at all (as it only generates .cpp files), I'd be very interested to know.
Does it make any difference if I let my application with a 64-bit target architecture be built (and processed) by the 32-bit or the 64-bit moc.exe version?
Yes. Such mixing-and-matching is not supported. It might work, but it might break since nobody tests it to ensure that it works. Qt has an extensive test suite that runs in the continuous integration process. Something like this being untested is a big hint that you're on your own if you depend on it. Don't complain if you run into strange runtime bugs. Don't do it.
what the difference is
Anything and everything. All that Qt guarantees as a contract with you is that the moc output from a previous binary-compatible Qt version retains forward binary compatibility. E.g. if you have moc output from Qt 5.7, and you build a shared binary of your application, then replace Qt with binary-compatible 5.8, the old moc output is valid and Qt 5.8 knows how to use it safely. That's all.
Since, obviously, 32- and 64-bit Qt versions are not binary-compatible, you should not expect the moc output to be either. If it happens to be, it's a coincidence and nothing in the design guarantees it. It can break at any time.
You shouldn't be facing this problem since qmake/qbs/cmake will use the correct moc for you. It seems like perhaps you're trying to preprocess the sources using moc so that further use of moc won't be necessary while building the project. This strategy will not work. You need to learn how to leverage the build tools to build your project using the code generators it needs.
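For instance, with CMake you would normally just turn on automoc and let the build run whichever moc ships with the Qt it finds; a minimal sketch (project and source names invented):

    cmake_minimum_required(VERSION 3.5)
    project(myapp)
    set(CMAKE_AUTOMOC ON)  # CMake invokes the moc belonging to the Qt found below
    find_package(Qt5 REQUIRED COMPONENTS Core Widgets)
    add_executable(myapp main.cpp mainwindow.cpp)
    target_link_libraries(myapp Qt5::Core Qt5::Widgets)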
moc is not the only build tool provided by Qt that you might end up using in a Qt-based project. Furthermore, projects of any significant size should use many other code generators as well to make you more productive. Avoiding code generation as a policy is counter-productive. As in: it will cost you more.

Better way to provide paths of libraries while compiling in C++

I'm pretty new to C++. I was wondering what is generally considered a neat way to provide paths for various files/libraries when compiling or executing C++ code.
For example:
I have the Boost libraries installed in some location on my system. Let's call it X.
In order to compile anything I have to type:
c++ -I LongpathWhichisX/to/boost_1_60_0 example.cpp -o example
Similarly, I have to type a long path for the input file when executing the code.
Is there a better way to address this? Is it possible to create an environment variable, say Y, which refers to path X, so that we can compile with the following command?
c++ -I Y/to/boost_1_60_0 example.cpp -o example
The best way is to use a build tool. For example, you can use Make. You define all your include paths (and other options) in the Makefile. On the console you then just call make to build your project, or something like make run to run it.
The usual way is to write a Makefile in which you specify all needed paths and compile options in dedicated variables.
If you don't want/need a Makefile and would rather run the compiler from the command line, you can use the CPATH environment variable to specify a colon-separated list of paths to include files.
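As a sketch of both suggestions -- the variable name BOOST_ROOT is made up, and the paths are the ones from the question:

    # Makefile: keep the long path in one variable
    BOOST_ROOT = LongpathWhichisX/to/boost_1_60_0
    CXXFLAGS   = -I$(BOOST_ROOT)

    example: example.cpp
        $(CXX) $(CXXFLAGS) example.cpp -o example  # recipe line must start with a tab

Or, without any Makefile, using CPATH (supported by gcc/clang):

    export CPATH=LongpathWhichisX/to/boost_1_60_0
    c++ example.cpp -o example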
This is a broad question but the other answers highlight the most important step. It is essential to learn build tools like make because they will make it easier to build your projects during development and for others to build it later. In the modern programming age though this is not enough. If you are using something like Boost (which targets many platforms) you will probably want to make your build cross-platform as well. For this you would use either cmake or autotools which both have scripts that make it much easier to locate the Boost libraries (and others).
Any other build systems, in my opinion, are a pain and the bane of maintainers of Linux distributions. CMake used to be in that category, but it has gained wide acceptance now. CMake targets building cross-platform projects across operating systems (Windows and Unixes) better (again, in my opinion) because it attempts to drive the native build system on each platform (for example: Visual Studio on Windows, Make on all Unixes, Xcode on Mac). The autotools instead target the Unix environment with much greater depth (you have a bit of a harder time on Windows, but you can target anything from embedded Unix systems to high-end Unix servers with much more flexibility).
Note: Autotools support for cross-compiling is superior in almost every way to the other solutions. I always cringe when I download something that needs to be cross-compiled for ARM Linux and it uses some weird build system. Funnily enough, Boost is one of these.
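For comparison, cross-compiling a well-behaved autotools project usually comes down to a single flag; the target triplet below is just an example:

    ./configure --host=arm-linux-gnueabihf
    make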
This is a bit of a long-winded answer. In summary, it is essential that you learn a build system for native development. It is part of your skill set, and until you have that skill you can't really contribute to open-source projects or even to your employer's closed-source projects.

Releasing a program

So I made a C++ console game. Now I'd like to "release" the game. I want to give out only the .exe file and not the code. How do I go about this? I'd like to make sure it will run on all Windows devices.
I used the following headers:
iostream
windows.h
MMSystem.h
conio.h
fstream
ctime
string
string.h
* I used namespace std
* I used Code::Blocks 13.12 with MinGW
And I used the following library:
libwinmm.a
There are many different ways of distributing applications. You could go with an installer like Inno or just a regular ZIP file. Some programs can even be standalone by packaging all resources within the executable, but to my knowledge this is not an easy option for C++.
I suppose the most basic way is to create different builds for different architectures with static libraries, find any other DLLs specific to each architecture, and bundle them together in one folder. Supporting x86/x86-64/ARM should be enough for most purposes. LLVM/Clang and GCC have extensive support for many architectures, and if need be you should be able to download the source code of the libraries you use and compile them for each architecture you plan to support, with whatever compilation options each one requires.
A virtual machine can also be helpful for this cross-compilation and compatibility testing.
tl;dr: Get all the libraries you need in either static or dynamic (DLL) form. Check that they are of the right architecture (x86 programs/code will not run on MIPS and vice versa). Get all your resources. Get a virtual machine, and then test your program on it. Keep testing until all the dependency problems go away.
Note: when I did this, I actually had some compatibility issues with, of all things, MinGW-w64. You may need some DLLs from MinGW, or, if you're using Cygwin, you will of course need the Cygwin DLL. I don't know much about MSVC, but I would assume even it needs some DLLs at some level if you decide to support an outdated Windows OS.
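As one hedged example for the MinGW case above, linking the GCC runtime statically removes the most common missing-DLL problems; the source file name is invented, and -lwinmm matches the libwinmm.a mentioned in the question:

    g++ -static-libgcc -static-libstdc++ game.cpp -o game.exe -lwinmm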

C++ Compile on different platforms

I am currently developing a C++ command-line utility to be distributed as an open-source utility on GitHub. However, I want people who download the program to be able to easily compile and run it on any platform (specifically Mac, Linux, and Windows) in as few steps as possible. Assuming only small changes have to be made to the code to make it compatible with the various compilers (g++ and win32), how can I do this? Are makefiles relevant?
My advice is: do not use plain makefiles. Maintaining them for big enough projects is tedious, and errors sometimes creep in that you don't catch immediately (because a stale *.o file is still there).
See this question here
Makefiles are indeed highly relevant. You may find that you need (at least) two different makefiles to compensate for the fact that you have different compilers.
It's hard to be specific about how you solve this, since it depends on how complex the project is. It may be easiest to write a script/batch file and just document "Use the command build.sh on Linux/Unix, and build.bat on Windows" - and then let the respective files deal with, for example, setting up the name of the compiler and flags, etc.
Or you can have an include into the makefile, which is determined by the architecture. Or different makefiles.
If the project is REALLY simple, it may be enough to provide a basic makefile - but it's unlikely, as compiling x.cpp on Linux/MacOS produces an object file called x.o, while on Windows the object file is called x.obj. Libraries have different names, DLLs have different names, and on Linux/MacOS the final executable (typically) has no extension, so it's called "myprog", whereas the executable under Windows is called "myprog.exe".
These sorts of differences mean that the makefile needs to be different.
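A sketch of the single-makefile-with-conditionals approach under GNU Make; Windows sets OS=Windows_NT in the environment, which the conditional below relies on:

    # Pick platform-specific file names in one Makefile
    ifeq ($(OS),Windows_NT)
        TARGET = myprog.exe
        OBJEXT = obj
    else
        TARGET = myprog
        OBJEXT = o
    endif

    $(TARGET): x.$(OBJEXT)
        $(CXX) x.$(OBJEXT) -o $(TARGET)  # recipe line must start with a tab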

Include only certain libraries on an operating system

When writing an app that one wants to compile on Mac, Linux, and Windows, what is the best way of managing the different libraries that need to be included on the various operating systems? For example, using the GLUT OpenGL toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered - find the platform-specific defines and use them to figure out what platform you're on.
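A minimal sketch of that technique; the macros below are commonly predefined by the major compilers:

    // Select platform-specific code with predefined compiler macros
    #if defined(_WIN32)
        // Windows (defined by MSVC and MinGW)
    #elif defined(__APPLE__)
        // macOS
    #elif defined(__linux__)
        // Linux
    #else
    #   error "Unsupported platform"
    #endif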
Automake or SCons are quite complex, and are worth your time only if you intend to release your code to a wide audience. In the case of in-house code, a "generic" makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows, or use nmake and limit yourself to the subset of syntax common to all platforms.
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries you'll need a tool to handle it.
Some options include:
The Autotools
Scons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and SCons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to makefiles.
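To give a feel for the overhead on a small project, a minimal CMakeLists.txt might look like this (names invented); the same file can drive makefile, Visual Studio, or Xcode generation:

    cmake_minimum_required(VERSION 3.5)
    project(mytool CXX)
    add_executable(mytool main.cpp)
    # then e.g.:  cmake -G "Unix Makefiles" .   or   cmake -G Xcode .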
There is a good article on macros. One of the answers describes how to use conditional compilation based on OS/compiler (it's near the top).
The use of autoconfiguration tools is a nice addition on top of this, but it is not needed for small projects, where it may be easier to detect the OS explicitly. For larger projects that may need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, which is why you can build all of them from source with a simple:
./configure
make
make install
Scons has a configuring mechanism that will do a lot of what autotools do without as much complexity, and is pretty darn portable (although not as portable as autotools).
The compiler provides a set of predefined preprocessor symbols you can use: for example, gcc defines __linux__ on a Linux system, and VC++ defines _WIN32. If you need something more complex, look at autoconf, but that works best for Unix-based code.
I'd recommend checking out how some of the larger OpenSource projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That 'platform-independent' include file is then included by your client code instead of cluttering it with numerous, ugly-reading preprocessor commands; the ugliness and clutter stay contained in the proxy include.
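For the GLUT example from the question, such a proxy header might look like this (the file name is invented; the Apple include location is the one that genuinely differs):

    // portable_glut.h - the "proxy" include; client code includes only this file
    #ifndef PORTABLE_GLUT_H
    #define PORTABLE_GLUT_H

    #if defined(__APPLE__)
    #include <GLUT/glut.h>  // GLUT ships as a framework on macOS
    #else
    #include <GL/glut.h>    // usual location on Linux and Windows
    #endif

    #endif // PORTABLE_GLUT_H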
If the API differs across platforms, you will need to create your own abstraction.
Perhaps this is a cop-out answer, but have you looked at how Boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system - bjam - that probably handles some of the same situations. They also do a nice auto-linking trick on Windows that automatically selects the right version of the libraries for linking depending on the version of the MSVC compiler. Based on your initial description, it sounds like macro defs checking for various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.