Better way to provide library paths when compiling in C++

I'm pretty new to C++. I was wondering what is generally considered a neat way to provide paths for various files/libraries when compiling or executing C++ code.
For example:
I have the Boost libraries installed in some location on my system. Let's call it X.
In order to compile anything I have to type
c++ -I LongpathWhichisX/to/boost_1_60_0 example.cpp -o example
Similarly, I have to type a long path for the input file when executing the code.
Is there a better way to address this? Is it possible to create an environment variable, say Y, which refers to path 'X', so that we can use the following command to compile the code:
c++ -I Y/to/boost_1_60_0 example.cpp -o example

The best way is to use build tools. For example, you can use Make. You define all your include paths (and other options) in the Makefile. In the console you then just call make to build your project, or something like make run to run it.
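For instance, a minimal Makefile might look like this (the Boost path is the one from the question; the target layout is just an illustration):

CXX = c++
CXXFLAGS = -I/LongpathWhichisX/to/boost_1_60_0

example: example.cpp
	$(CXX) $(CXXFLAGS) example.cpp -o example

.PHONY: run
run: example
	./example

Note that recipe lines in a Makefile must be indented with a tab, not spaces. The long path now lives in one variable, and you only ever type make or make run.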

The usual way is to write a Makefile where you specify all needed paths and compile options in appropriate variables.
If you don't want/need a Makefile and would rather run the compiler from the command line, you can use the CPATH environment variable to specify a colon-separated list of paths to search for include files.
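For example, in a POSIX shell (using the path from the question):

export CPATH=/LongpathWhichisX/to/boost_1_60_0
c++ example.cpp -o example

GCC and Clang search the directories listed in CPATH as if they had been passed with -I.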

This is a broad question, but the other answers highlight the most important step. It is essential to learn build tools like make, because they make it easier to build your projects during development and for others to build them later. In the modern programming age, though, this is not enough. If you are using something like Boost (which targets many platforms), you will probably want to make your build cross-platform as well. For this you would use either CMake or the autotools, both of which have scripts that make it much easier to locate the Boost libraries (and others).
Any other build systems, in my opinion, are a pain and are the bane of maintainers of Linux distributions. CMake used to be in that category, but it has gained wide acceptance now. CMake targets building cross-platform projects across operating systems (Windows and Unix) better (again, in my opinion) because it attempts to provide the native build system on each platform (for example, Visual Studio on Windows, Make on all Unices, Xcode on Mac). The autotools instead target the Unix environment in much greater depth (you have a somewhat harder time on Windows, but you can target everything from embedded Unix systems to high-end Unix servers with much more flexibility).
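As a sketch of what locating Boost looks like with CMake, here is a minimal CMakeLists.txt (the project and target names are invented; the FindBoost module ships with CMake):

cmake_minimum_required(VERSION 3.5)
project(example CXX)
# locate an installed Boost instead of hard-coding its path
find_package(Boost 1.60 REQUIRED)
add_executable(example example.cpp)
target_include_directories(example PRIVATE ${Boost_INCLUDE_DIRS})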
Note: autotools support for cross-compiling is superior in almost every way to the other solutions. I always cringe when I download something that needs to be cross-compiled for ARM Linux and it uses some weird build system. Funnily enough, Boost is one of these.
This is a bit of a long-winded answer. In summary, it is essential that you learn a build system for native development. It is part of your skill set, and until you have that skill you can't really contribute to open-source projects, or to your employer's closed-source projects.

Related

Releasing a program

So I made a C++ console game. Now I'd like to "release" the game. I want to give out only the .exe file and not the code. How do I go about this? I'd like to make sure it will run on all Windows devices.
I used the following headers-
iostream
windows.h
MMSystem.h
conio.h
fstream
ctime
string
string.h
*I used namespace std
*I used Code::Blocks 13.12 with MinGW
& I used the following library-
libwinmm.a
Thank you in advance
There are many different ways of installing applications. You could go with an installer like Inno Setup, or just ship a regular ZIP file. Some programs can even be standalone, with all resources packaged inside the executable, but to my knowledge this is not an easy option for C++.
I suppose the most basic way is to create different builds for different architectures, linking the libraries statically, and then find any other DLLs specific to each architecture and bundle them together in one folder. Supporting x86/x86-64/ARM should be enough for most purposes. LLVM/Clang and GCC have extensive support for many architectures; if need be, you should be able to download the source code of the libraries you use and compile them for each architecture you plan to support, with whatever compilation options each one needs.
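As an illustration, with MinGW you would typically link the runtime statically so users don't also need the compiler's DLLs (these are standard GCC flags; game.cpp is a stand-in for your sources, and -lwinmm links the libwinmm.a you mention):

g++ -static-libgcc -static-libstdc++ game.cpp -o game.exe -lwinmm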
A virtual machine can also be helpful for this cross-compilation and compatibility testing.
tl;dr: Get all the libraries you need in either static or dynamic (DLL) form. Check that they are of the right architecture (x86 code will not run on MIPS, and vice versa). Get all your resources. Get a virtual machine and test your program on it. Keep testing until all the dependency problems go away.
Note: when I did this, I actually had some compatibility issues with, of all things, MinGW-w64. You may need some DLLs from MinGW, or, if you're using Cygwin, you will of course need the Cygwin DLL. I don't know much about MSVC, but I would assume even it needs runtime DLLs at some level if you decide to support an outdated Windows OS.

What is the difference between using a Makefile and CMake to compile the code?

I code in C/C++ and use a (GNU) Makefile to compile the code. I can do the same with CMake and get a Makefile. However, what is the difference between using a Makefile and CMake to compile the code?
Make (or rather a Makefile) is a buildsystem - it drives the compiler and other build tools to build your code.
CMake is a generator of buildsystems. It can produce Makefiles, it can produce Ninja build files, it can produce KDEvelop or Xcode projects, it can produce Visual Studio solutions. From the same starting point, the same CMakeLists.txt file. So if you have a platform-independent project, CMake is a way to make it buildsystem-independent as well.
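For example, run from an empty build directory, each of these consumes the same CMakeLists.txt (path/to/source is a placeholder; the generator names are as CMake spells them):

cmake -G "Unix Makefiles" path/to/source   # produces a Makefile
cmake -G Ninja path/to/source              # produces build.ninja
cmake -G Xcode path/to/source              # produces an Xcode project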
If you have Windows developers used to Visual Studio and Unix developers who swear by GNU Make, CMake is (one of) the way(s) to go.
I would always recommend using CMake (or another buildsystem generator, but CMake is my personal preference) if you intend your project to be multi-platform or widely usable. CMake itself also provides some nice features like dependency detection, library interface management, or integration with CTest, CDash and CPack.
Using a buildsystem generator makes your project more future-proof. Even if you're GNU-Make-only now, what if you later decide to expand to other platforms (be it Windows or something embedded), or just want to use an IDE?
The statement about CMake being a "build generator" is a common misconception.
It's not technically wrong; it just describes HOW it works, but not WHAT it does.
In the context of the question, they do the same thing: take a bunch of C/C++ files and turn them into a binary.
So, what is the real difference?
CMake is much more high-level. It's tailored to compiling C++, for which you write much less build code, but it can also be used for general-purpose builds. make has some built-in C/C++ rules as well, but they are useless at best.
CMake does a two-step build: it generates a low-level build script for ninja or make (or one of many other generators), and then you run it. All the shell-script pieces that are normally piled into a Makefile are executed only at the generation stage. Thus, a CMake build can be orders of magnitude faster.
CMake's grammar is much easier for external tools to support than make's.
Once make builds an artifact, it forgets how it was built: what sources it was built from, what compiler flags were used. CMake tracks this; make leaves it up to you. If one of the library's sources was removed since the previous version of the Makefile, make won't rebuild the artifact.
Modern CMake (starting with version 3.something) works in terms of dependencies between "targets". A target is still a single output file, but it can have transitive ("public"/"interface" in CMake terms) dependencies.
These transitive dependencies can be exposed to or hidden from dependent packages, and CMake will manage directories for you. With make, you're stuck at a file-by-file, manage-directories-by-hand level (see the sketch after the list below).
You could code up something in make using intermediate files to cover the last two gaps, but you're on your own. make does contain a Turing-complete language (even two, sometimes three counting Guile); the first two are horrible, and the Guile one is practically never used.
To be honest, this is what CMake and make have in common: their languages are pretty horrible. Here's what comes to mind:
They have no user-defined types;
CMake has three data types: string, list, and target with properties; make has one: string;
you normally pass arguments to functions by setting global variables;
this is partially dealt with in modern CMake, where you can set properties on a target: set_property(TARGET helloworld APPEND PROPERTY INCLUDE_DIRECTORIES "${CMAKE_CURRENT_SOURCE_DIR}");
referring to an undefined variable is silently ignored by default;
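To illustrate the target-based model described above, here is a small sketch in modern CMake (the library and executable names are invented):

# mylib's headers are PUBLIC, so they propagate to anything that links it
add_library(mylib mylib.cpp)
target_include_directories(mylib PUBLIC ${CMAKE_CURRENT_SOURCE_DIR}/include)

# app inherits mylib's include directory transitively; no manual -I needed
add_executable(app main.cpp)
target_link_libraries(app PRIVATE mylib)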
As mentioned in the other answers, CMake can generate other project files. It refers to these project formats as generators.
This lets users write/describe their build in a domain-specific language and use a generator to compile the project. It often results in simpler/better code than writing those project files directly.
A big advantage is that users can use the tool they are most comfortable with (Makefiles, Visual Studio, Xcode, Ninja, etc.). This is nice, but arguably introduces complexity. Why not just use Ninja?
The answer is history. (As is the norm in C/C++)
Build systems like Visual Studio have tools that will only accept those project files.
For example, Microsoft has a feature called "Static Driver Verifier", a tool to analyze the code of kernel-mode Windows drivers. However, this tool only works on Visual Studio projects, since it works alongside msbuild:
msbuild /t:sdv /p:Inputs="Parameters" ProjectFile /p:Configuration=configuration /p:Platform=platform
If your build system can't generate Visual Studio project files, then you can't use the tool. This can be a very big deal for some projects/companies.

C++ Compile on different platforms

I am currently developing a C++ command-line utility to be distributed as an open-source utility on GitHub. However, I want people who download the program to be able to easily compile and run it on any platform (specifically Mac, Linux, and Windows) in as few steps as possible. Assuming only small changes have to be made to the code to make it compatible with the various compilers (g++ and Win32), how can I do this? Are makefiles relevant?
My advice is: do not use makefiles. Maintaining the files for big enough projects is tedious, and errors sometimes happen that you don't catch immediately (because the *.o file is still there).
See this question here
Makefiles are indeed highly relevant. You may find that you need (at least) two different makefiles to compensate for the fact that you have different compilers.
It's hard to be specific about how to solve this, since it depends on how complex the project is. It may be easiest to write a script/batch file and just document it ("Use the command build.sh on Linux/Unix, and build.bat on Windows"), and then let the respective files deal with, for example, setting up the name of the compiler, the flags, etc.
Or you can have an include in the makefile that is determined by the architecture. Or different makefiles.
If the project is REALLY simple, a basic makefile may be enough, but it's unlikely: compiling x.cpp on Linux/MacOS produces an object file called x.o, while on Windows the object file is called x.obj. Libraries have different names, DLLs have different names, and on Linux/MacOS the final executable (typically) has no extension, so it's called "myprog", where the executable under Windows is called "myprog.exe".
These sorts of differences mean that the makefile needs to be different.
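For example, one common pattern is to branch inside a single makefile (a sketch in GNU Make syntax; OS=Windows_NT is an environment variable Windows sets by default, and the target names are invented):

# pick per-platform names once, use them everywhere below
ifeq ($(OS),Windows_NT)
    EXE_SUFFIX := .exe
else
    EXE_SUFFIX :=
endif

myprog$(EXE_SUFFIX): main.o
	$(CXX) -o $@ main.o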

A single Makefile - Windows/Linux/Mac OS

We have some applications written in C/C++, and makefiles for them. Currently we are using the GNU make system on Windows (Cygwin-based). The makefiles were written long ago with only Windows in mind. Now we are going to revamp everything.
I am unaware of the factors to be considered when writing makefiles so as to make them cross-platform compatible. I looked at some sources on the Internet, but they were unsatisfactory. Can someone please list the issues to consider when writing makefiles so they are compatible across various platforms?
PS: I have seen this link, but I think it isn't what I want.
Makefiles and cross platform development
You can use CMake: it's a cross-platform tool which generates makefiles or project files appropriate to your platform. So instead of writing a Makefile you write CMakeLists.txt, then you run cmake and it generates the Makefiles. When you want to compile your program on another platform, you just re-run cmake with a different target project system.
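The typical workflow is an out-of-source build (the build directory name is arbitrary):

mkdir build && cd build
cmake ..    # generates Makefiles (or other project files) for this platform
make        # builds with the generated Makefile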

Include only certain libraries on an operating system

When writing an app that one wants to compile on Mac, Linux, and Windows, what is the best way of managing the different libraries that will need to be included on the various operating systems? For example, using the GLUT OpenGL toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered: find the platform-specific defines and use them to figure out what platform you're on.
Automake or SCons are quite complex, and are worth your time only if you intend to release your code to a wide audience. In the case of in-house code, a "generic" makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows (available from here), or use nmake and limit yourself to the subset of syntax common to all platforms.
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries you'll need a tool to handle it.
Some options include:
The Autotools
Scons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and SCons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to makefiles.
There is a good article on macros; one of the answers shows how to use conditional compilation based on OS/compiler (it's near the top).
Using the autoconfiguration tools is a nice addition on top of this, but it is not needed for small projects, where it may be easier to detect the OS explicitly. For larger projects that may need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, which is why you can build all of them from source with a simple:
./configure
make
make install
SCons has a configuration mechanism that will do a lot of what the autotools do without as much complexity, and is pretty darn portable (although not as portable as the autotools).
The compiler should have a set of predefined preprocessor symbols you can use: for example, linux for gcc on a Linux system, _WIN32 for VC++. If you need something more complex, then look at autoconf, but that works best for Unix-based code.
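For example, a minimal C++ sketch using the common predefined macros (the standard-conforming spellings are __linux__ and _WIN32):

#include <iostream>

int main() {
#if defined(_WIN32)
    std::cout << "Windows\n";        // predefined by MSVC, MinGW, and clang on Windows
#elif defined(__linux__)
    std::cout << "Linux\n";          // predefined by gcc/clang on Linux
#elif defined(__APPLE__)
    std::cout << "macOS\n";
#else
    std::cout << "other platform\n";
#endif
}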
I'd recommend checking out how some of the larger open-source projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That 'platform-independent' include file is then included in your client code, instead of cluttering the client code with numerous ugly preprocessor commands; those are all contained in the one (admittedly ugly and cluttered) proxy include.
If the API differs across platforms, you will need to create your own abstraction.
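Using the GLUT case from the question, such a proxy header might look like this (a minimal sketch; the file name is invented):

// my_glut.h - platform-independent proxy include for GLUT
#ifndef MY_GLUT_H
#define MY_GLUT_H

#ifdef __APPLE__
#include <GLUT/glut.h>   // macOS ships GLUT as a framework with a different layout
#else
#include <GL/glut.h>     // Linux and Windows
#endif

#endif // MY_GLUT_H

Client code then includes "my_glut.h" everywhere and never sees the #ifdefs.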
Perhaps this is a cop-out answer, but have you looked at how Boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system, bjam, which probably handles some of the same situations. They also do a nice auto-linking trick on Windows that automatically selects the right version of the libraries for linking depending on the version of the MSVC compiler. Based on your initial description, it sounds like macro defs checking for various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.