I'm writing a small library in C++ that I need to be able to build on quite a few different platforms, including iPhone, Windows, Linux, Mac and Symbian S60. I've written most of the code so that it is platform-agnostic but there are some portions that must be written on a per-platform basis.
Currently I accomplish this by including a different header depending on the current platform, but I'm having trouble fleshing this out because I'm not sure what preprocessor definitions are defined on all platforms. For Windows I can generally rely on seeing WIN32 or _WIN32. For Linux I can rely on seeing __unix__, but I am less certain about the other platforms or their 64-bit variants. Does anyone have a list of the different definitions found on the various platforms, or will I have to resort to a config file or gcc parameter?
I have this SourceForge "Pre-defined Compiler Macros" page in my bookmarks.
The definitions are going to be purely up to your compiler vendor. If you are using the same compiler (say, gcc) on all your platforms, you will have a somewhat easier time of it.
You might also want to organize your project so that most of the .h files are not platform-dependent. Split your implementation (.cpp files) into separate files: one for the non-specific stuff and one for each platform. The platform-specific ones can include 'private' headers that only make sense for that platform. You may have to write adapter functions to get something like this to work 100% (when the system libs take slightly different arguments), but I have found it really helpful in the end, and bringing up a new platform is a whole lot easier in the future.
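As a minimal sketch of that layout (all file and function names here are hypothetical), the public header stays platform-free and each platform contributes its own .cpp with the adapter logic hidden inside:

// thread_utils.h -- platform-agnostic public header
#ifndef THREAD_UTILS_H
#define THREAD_UTILS_H

void sleep_ms(unsigned milliseconds);  // no platform types leak out

#endif

// thread_utils_win32.cpp -- compiled only in the Windows build
#include "thread_utils.h"
#include <windows.h>                   // 'private' platform header

void sleep_ms(unsigned milliseconds) {
    Sleep(milliseconds);               // adapter: Sleep() already takes milliseconds
}

// thread_utils_posix.cpp -- compiled only in the POSIX builds
#include "thread_utils.h"
#include <unistd.h>

void sleep_ms(unsigned milliseconds) {
    usleep(milliseconds * 1000u);      // adapter: usleep() takes microseconds
}

The build system then compiles exactly one of the platform .cpp files per target, so no #ifdef ever appears in the client code.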
Neither the C nor the C++ standards define such symbols, so you are going to be at the mercy of specific C or C++ implementations. A list of commonly used symbols would be a useful thing to have, but unfortunately I haven't got one.
I don't think there exists a universal list of platform defines, judging by the fact that every cross-platform library I have seen has an ad-hoc config.h full of this stuff. But you could consider looking at the ones used by fairly portable libraries like libpng, zlib, etc.
Here's the one used by libpng
If you want to look through the default preprocessor symbols for a given system on which you have GCC (e.g. Mac OS X, iOS, Linux), you can get a complete list from the command-line thus:
echo 'main(){}' | cpp -dM
These are often of limited use, however: at the stage of compilation at which the preprocessor operates, most of the symbols identify the operating system and CPU type of the system hosting the compiler, rather than the system being targeted (e.g. when cross-compiling for iOS). On Mac OS X and iOS, the right way to determine the compile-time characteristics of the system being targeted is
#include <TargetConditionals.h>
This will pick up TargetConditionals.h from the Platform and SDK currently in use, and then you can determine (e.g.) endianness and some other characteristics from some of the macros. (Look through TargetConditionals.h to see what kinds of info you can glean.)
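For instance, a small compile-time sketch (consult your SDK's TargetConditionals.h for the authoritative macro set; note that TARGET_OS_MAC is also 1 when targeting iOS, so the iOS test must come first):

#include <TargetConditionals.h>

#if TARGET_OS_IPHONE
// building for iOS (device or simulator)
#elif TARGET_OS_MAC
// building for Mac OS X
#endif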
Related
I want to detect which operating system my .exe is being run on, so that I can perform specific operations depending on the underlying operating system. I want it to be just for Windows and Mac.
I want to detect which operating system my .exe is being run on
(notice that on MacOSX executable files are traditionally not suffixed with .exe; that convention is specific to Windows)
You cannot detect that in pure standard C++11 at runtime (so, strictly speaking, your question makes no sense), because the C++11 standard n3337 does not know about operating systems (some C++ source code may be compiled for the bare metal). I would recommend conditional compilation (e.g. #if) and using operating-system-specific headers (and configuring your build automation to detect them). Some C++ frameworks (Boost, POCO, Qt...) could be helpful, but you need to decide to choose one (they try to provide some common abstractions above several OSes, but the devil is still in the details; e.g. file paths are still different on Windows and on MacOSX).
Microsoft Windows has its own API, called WinAPI (which in practice is only used by Windows; perhaps some day ReactOS will implement most of that API), documented here. MacOSX is more or less conforming to POSIX, which is a standard API for a family of OSes (but Windows doesn't care much about that standard).
If you want to learn more about OSes in general (a sensible thing to do), read for example Operating Systems: Three Easy Pieces (freely downloadable textbook).
In practice, an executable file compiled from C++ source code is specific to the operating system it is designed to run on, and its file format is also specific to some OS (e.g. PE on Windows, ELF on Linux, Mach-O on MacOSX...). So a Windows executable won't run on MacOSX and vice versa (the fat binary idea is out of fashion in 2018). You would probably adjust your build procedure (e.g. your Makefile, or with cmake, ninja, etc.) for your OS. That build procedure might (and often does) pass extra compilation flags specific to the target operating system (e.g. some -DBUILD_FOR_MACOSX preprocessor flag if building for MacOSX; then you might, in a few places, use a preprocessor directive like #if BUILD_FOR_MACOSX in your C++ source code). You could even customize your build so that on MacOSX some for-macosx.cc C++ file is compiled (and its object file linked into your software), while on Windows you'd use some other for_windows.cc C++ file. Sometimes the build procedure is flexible enough to auto-detect at build time what the operating system is (but how to do that is a different question; the GNU autoconf could be inspirational).
Once your build procedure has been configured for (or has auto-detected) your operating system, you can use preprocessor conditionals to #include appropriate system-specific headers and to compile calls to OS-specific functions (such as POSIX uname) that query further details about the OS at runtime.
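A minimal sketch of that combination, assuming a POSIX system providing uname() and a build-supplied flag (BUILD_FOR_MACOSX is the made-up name from the paragraph above):

#include <iostream>

#if defined(BUILD_FOR_MACOSX) || defined(__linux__)
#include <sys/utsname.h>               // POSIX header declaring uname()

void print_os_info() {
    struct utsname u;
    if (uname(&u) == 0)                // runtime query of the running kernel
        std::cout << u.sysname << ' ' << u.release << '\n';
}
#else
void print_os_info() {
    std::cout << "no uname() here; use OS-specific calls instead\n";
}
#endif

int main() { print_os_info(); }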
Some compilers predefine a set of preprocessor symbols, and that set depends upon the target operating system and processor and on your compiler. If you use GCC, you might create some empty file empty.cc and preprocess it with g++ -E -dM empty.cc (perhaps also with other relevant flags, e.g. -std=c++11) to find out which preprocessor symbols are predefined in your case.
Consider studying the source code (including the build procedure!) of some cross-platform free software projects (e.g. on github or somewhere else) for inspiration. You can also find many resources about multi-platform C++ builds (e.g. this one and many others).
There are various ways you could do it, like testing for the presence of a C:\Windows directory to detect Windows (edit: if the OS accepts a path like that at all then it's at least part of the DOS family), but none of them are foolproof. Here's one way (although it's cheating):
#include <boost/predef.h>

enum class OperatingSystem {
    // ...
    Windows,
    Other   // fallback added so DetectOperatingSystem() can always return
    // ...
};

OperatingSystem DetectOperatingSystem()
{
    // Boost.Predef always defines BOOST_OS_WINDOWS; it is non-zero only when
    // targeting Windows, so it must be tested with #if rather than #ifdef.
#if BOOST_OS_WINDOWS
    return OperatingSystem::Windows;
#else
    return OperatingSystem::Other;
#endif
}
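Usage is then an ordinary comparison, e.g.:

int main()
{
    if (DetectOperatingSystem() == OperatingSystem::Windows) {
        // Windows-specific behaviour goes here
    }
}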
Also, if you just want a name to show the user, then on POSIX systems you can use uname(2).
To solve this kind of problem you need to check both at compile time and at run time, and you are going to need different code for the two systems: Windows code to check the Windows version and Mac code to check the Mac version.
You are going to have to use
#if defined(_WIN32)
at compile time to check which of the two you are on. Then have separate code for each to get the version.
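Putting the two pieces together, here is a minimal sketch (the Mac branch assumes POSIX uname(), which reports the Darwin kernel version rather than the marketing version; the Windows branch is left as a stub because version-query APIs vary by SDK, e.g. VersionHelpers.h):

#include <iostream>
#include <string>

#if defined(_WIN32)
#include <windows.h>
#elif defined(__APPLE__)
#include <sys/utsname.h>
#endif

std::string os_description()
{
#if defined(_WIN32)
    return "Windows";                  // query the exact version here via the SDK of your choice
#elif defined(__APPLE__)
    struct utsname u;
    uname(&u);                         // kernel name/release, not the marketing version
    return std::string(u.sysname) + " " + u.release;
#else
    return "unknown";
#endif
}

int main() { std::cout << os_description() << '\n'; }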
I have searched Google but haven't found quite a direct answer to my queries.
I have been reading C++ Primer and I'm still quite new to the language, but as good as the book is, it discusses the use of the standard library without really describing where it is or where it comes from (it hasn't yet, anyway). So, where is the standard library? Where are the header files that let me access it? When I downloaded CodeBlocks, did the STL come with it? Or does it automatically come with my OS?
Somewhat related, but what exactly is MinGW that came with CodeBlocks? Here it says
MinGW is a C/C++ compiler suite which allows you to create Windows executables without dependency on such DLLs [i.e. third-party runtime DLLs]
So at the most basic level is it just a collection of "things" needed to let me make C++ programs?
Apologies for the quite basic question.
"When I downloaded CodeBlocks, did the STL come with it?"
Although it's not called the STL but rather the C++ standard library, it comes with your C++ compiler implementation (and is optionally packaged with the CodeBlocks IDE).
You have to differentiate IDE and compiler toolchain. CodeBlocks (the Integrated Development Environment) can be configured to use a number of different compiler toolchains (e.g. Clang or MSVC).
"Or does it automatically come with my OS?"
No, usually not; especially not on Windows.
"So, where is the standard library? Where are the header files that let me access it?"
They come with the compiler toolchain you're currently using for your CodeBlocks project.
Supposing this is the MinGW GCC toolchain and it's installed in the default directory, you'll find the libraries under a directory like (that's what I have):
C:\MinGW\lib\gcc\mingw32\4.8.1
and the header files at
C:\MinGW\lib\gcc\mingw32\4.8.1\include\c++
"So at the most basic level is it just a collection of "things" needed to let me make C++ programs?"
It's "Minimalist GNU for Windows": the GCC (GNU C/C++ compiler) toolchain, usually along with the MSYS minimalist GNU tools environment (including GNU make, a shell, etc.).
When you have installed a C++ implementation, you'll have something which implements everything necessary to turn C++ source files into something running. How that is done exactly depends on the specific C++ implementation. Most often, there is a compiler which processes individual source files and translates them into object files, which are then combined by a linker to produce an actual running program. That is by no means required, though; e.g., cling directly interprets C++ code without compiling.
All this is just to clarify that there is no one way how C++ is implemented although the majority of contemporary C++ implementations follow the approach of compiler/linker and provide libraries as a collection of files with declarations and library files providing implementations of these declarations.
Where the C++ standard library is located and where its declarations are to be found depends entirely on the C++ implementation. Notably, all C++ implementations I have encountered so far, except cling, use a compiler, and all these compilers support a -E option (although it is spelled /E for MSVC++) which preprocesses a C++ file. The typically quite large output shows the locations of included files, pointing at the location of the declarations. That is, something like this executed on a command line yields a file with the information about the locations:
compiler -E input.cpp > input.ii
What the compiler is actually called depends entirely on the C++ implementation; it is something like g++, clang++, etc. The file input.cpp is supposed to contain a suitable include directive for one of the standard C++ library headers, e.g.
#include <iostream>
Searching in the output input.ii should reveal the location of this header. Of course, it is possible that the declarations are made available by the compiler without actually including a file but just making declarations visible. There used to be a compiler like this (TenDRA) but I'm not aware of any contemporary compiler doing this (with modules being considered for standardization these may come back in the future, though).
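As a concrete instance (assuming GCC and a Unix-like shell; grep merely filters the preprocessor's line markers, which name every included file):

g++ -E input.cpp | grep iostream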
Where the actual library file with the objects implementing the various declarations is located is an entirely different question and locating these tends to be a bit more involved.
The C++ implementation is probably installed somehow when installing CodeBlocks. I think it is just one package. On systems with a package management system like dpkg on some Linuxes it would be quite reasonable to just have the IDE have a dependency on the compiler (e.g., gcc for CodeBlocks) and have the compiler have a dependency on the standard C++ library (libstdc++ for gcc) and have the package management system sort out how things are installed.
There are several implementations of the C++ standard library. Some of the more popular ones are libstdc++, which comes packaged with GCC; libc++, which can be used with Clang; and Visual Studio's implementation by Microsoft, which is a licensed version of Dinkumware's implementation. MinGW contains a port of GCC. CodeBlocks, an IDE, allows you to choose a setup which comes packaged with a version of MinGW, or one without. Either way, you can still set up the IDE to use a different compiler if you choose. Part of the standard library implementation will also be header files, not just binaries, because a lot of it is template code (which can only be implemented in header files).
I recommend you read the documentation for the respective technologies because they contain a lot of information, more than a tutorial or book would:
libstdc++ faq
MinGW faq
MSDN
Is there a way to compile a cross-platform library so that a single file works on all platforms?
So what I mainly (only?) see is that Windows uses DLLs and the other platforms each use a different file format.
Why are these not standardised? Is there a standard format that can be used instead? If not, can this be faked?
Sorry about the multiple questions, but answering one should invalidate the others.
Libraries contain compiled code, which is specific to the architecture of the platform. Since there is no agreement between the big players on machine architecture, the unfortunate result is that libraries are not portable across platforms.
The best way is to open source the code and let the users compile the code on the platform they want.
The second-best option is to go the Java way: distribute your library in the form of a jar file containing the class files, and let the users install the right JRE for their platform.
I am not aware of any other options unfortunately.
I don't know why object files aren't standardized (although you can use GCC for cross-compilation, as far as I know), but the only viable, guaranteed cross-platform solution right now is shipping source (as far as I know). For example, CImg ships as a single header file (40 KB), but it has some dependencies: it needs a backend image-processing library/toolchain. Although I'm not quite sure, maybe there are cross-platform static object formats.
C++ Buildsystem with ability to compile dependencies beforehand
Java has Maven, which is a pleasure to work with: you simply specify dependencies that are already compiled and deposited in Maven's standard directory, meaning the location of dependencies is standardized. Contrast that with the often-used C/C++ way of having multiple locations (give me a break, like anyone remembers the default install directories for particular deps).
It is massively unproductive that every individual developer has to, more often than not, find, read about, get familiar with the configure/build options, and finally compile every dependency just to make a build of a project.
What is the theoretical reason this has not been implemented?
Why would it be difficult to provide packages of the following options with a maven-like declaration format?
version
platform (windows, linux)
src/dev/bin
shared/static
equivalent set of Boost ABI options when applicable
Having to manually go to websites and search out dependencies in the year 2013 for the oldest major programming language is absurd.
There aren't any theoretical reasons. There are a great many practical reasons. There are just too many different ways of handling things in the C++ world to easily standardize on a dependency system:
Implementation differences - C++ is a complicated language, and different implementations have historically varied in how well they support it (how well they can correctly handle various moderate to advanced C++ code). So there's no guarantee that a library could be built with a particular implementation.
Platform differences - Some platforms may not support exceptions. There are different implementations of the standard library, with various pros and cons. Unlike Java's standardized library, Windows and POSIX APIs can be quite different. The filesystem isn't even a part of Standard C++.
Compilation differences - Static or shared? Debug or production build? Enable optional dependencies or not? Unlike Java, which has very stable bytecode, C++'s lack of a standard ABI means that code may not link properly, even if built for the same platform by the same compiler.
Build system differences - Makefiles? (If so, GNU Make, or something else?) Autotools? CMake? Visual Studio project files? Something else?
Historical concerns - Because of C's and C++'s age, popular libraries like zlib predate build systems like Maven by quite a bit. Why should zlib switch to some hypothetical C++ build system when what it's doing works? How can a newer, higher-level library switch to some hypothetical build system if it depends on libraries like zlib?
An additional factor complicates things:
On Linux, the distro packaging systems do provide standardized repositories of development libraries' headers and binaries, with (generally) standardized ABIs and an easy way of specifying a project's build dependencies. The existence of these (platform-specific) solutions reduces the impetus for a cross-platform solution.
With all of these complicating factors and pre-existing approaches, any attempt to establish a standard build system is going to run into the problem described in XKCD's "Standards":
Situation: There are 14 competing standards.
"14? Riculous! We need to develop one universal standard that covers everyone's use cases."
Soon: There are 15 competing standards.
With all of that said:
There is some hope for the future. For example, CMake seems to be gradually replacing other build systems. Some of the Boost developers have started Ryppl, an attempt to do what you're describing.
(also posted in linked question)
Right now I'm working on a tool able to automatically install all dependencies of a C/C++ app, with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, sqlite and cmake, all in the correct order).
The tool, named «C++ Version Manager» (inspired by the excellent Ruby Version Manager), is coded in bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.
Well, first off, a system that resolves all the dependencies doesn't make you productive by default; potentially it can make you even less productive.
Regarding the differences between languages, I would say that in Java you have packages, which are handy when you have to organize and give a limited horizon to your code; in C++ you don't have an equivalent concept.
In C++, any library that can resolve a symbol is good enough for the compiler; the only real requirements for a library are to have a certain ABI and to resolve the required symbols. There is no automated way to pick the right library, and resolving a symbol is just a matter of linking your function call to an actual implementation; even a correct linking phase doesn't guarantee that your app will work.
To this you can add important variables such as the library version, different implementations of the same library, and different libraries with the same method names.
An example is the Mesa library vs. the OpenGL lib from the official drivers, or any lib that offers multiple releases where each one can resolve all the symbols; probably one release is more mature than the others, but you can't ask the compiler to pick the right one, because they all look the same for its purposes.
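As a toy illustration of that point (every file, library, and function name here is made up): when two libraries both provide a definition matching the declaration, the link line, not the compiler, decides which one the program gets.

// render.h -- the declaration the app is compiled against
void draw_triangle();

// fast_gl.cpp   -> built into libfast_gl.a   (one implementation)
// soft_mesa.cpp -> built into libsoft_mesa.a (a rival implementation, same ABI)

// main.cpp
#include "render.h"

int main() {
    draw_triangle();  // which implementation runs is decided purely by
                      // link order, e.g.: g++ main.o -lfast_gl -lsoft_mesa
}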
When writing an app that one wants to have compile on mac, linux and windows, what is the best way of managing the different libraries that will need to be included on the various operating systems. For example, using the glut opengl toolkit requires different includes on each operating system.
Your question is actually two questions in one:
1) How do I write my C++ code to include the right include files on the right platform?
2) How do I write my Makefile to work on different platforms?
The C++ code question is already answered - find the platform-specific defines and use them to figure out what platform you're on.
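For the platforms in the question, a small sketch of those checks (these macros are the conventional ones for the major compilers, though coverage varies):

#if defined(_WIN32)
// Windows (defined for both 32- and 64-bit targets)
#elif defined(__APPLE__)
// Mac OS X (and iOS)
#elif defined(__linux__)
// Linux
#endif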
Automake or SCons are quite complex, and are worth your time only if you intend to release your code to a wide audience. For in-house code, a "generic" makefile with per-platform includes is usually sufficient. For Windows, you can get GNU Make for Windows (available from here), or use nmake and limit yourself to the subset of syntax common to all platforms.
If you just need to worry about header files, then the preprocessor will do everything you need. If you want to handle differing source files, and possibly different libraries you'll need a tool to handle it.
Some options include:
The Autotools
Scons
CMake
My personal favorite is CMake. The Autotools use a multi-stage process that's relatively easy to break, and SCons just feels weird to me. CMake will also generate project files for a variety of IDEs, in addition to makefiles.
There is a good article on macros. One of the answers shows how to use conditional compilation based on OS/compiler (it's near the top).
The use of autoconfiguration tools is a nice addition on top of this, but is not needed for small projects, where it may be easier to detect the OS explicitly; for larger projects that may need to run on many different types of OS, you should also explore the available autoconfiguration tools mentioned by Branan.
Several projects I've worked on use an autoconf-based configure script which builds a Makefile, hence the reason you can build all of them from source with a simple:
./configure
make
make install
SCons has a configuring mechanism that will do a lot of what the autotools do without as much complexity, and it's pretty darn portable (although not as portable as the autotools).
The compiler should provide a set of preprocessor symbols you can use, for example __linux__ for GCC on a Linux system, or _WIN32 for VC++. If you need something more complex, look at autoconf, but that works best for Unix-based code.
I'd recommend checking out how some of the larger OpenSource projects handle this. See AutoSense.hpp from (an old release of) Apache Xerces.
If the libraries offer the same API on the different platforms, I would create a "proxy" include file containing all the necessary #ifdefs. That 'platform-independent' include file is then included in your client code instead of cluttering it with numerous ugly-reading preprocessor commands; those are all contained in the one ugly, cluttered proxy include.
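A minimal sketch of such a proxy header, using the GLUT case from the question (the file name portable_glut.h is made up):

// portable_glut.h -- clients include this instead of the platform headers
#ifndef PORTABLE_GLUT_H
#define PORTABLE_GLUT_H

#if defined(__APPLE__)
#include <GLUT/glut.h>   // Apple ships GLUT as a framework under GLUT/
#else
#include <GL/glut.h>     // Linux and Windows use the GL/ prefix
#endif

#endif // PORTABLE_GLUT_H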
If the API differs across platforms, you will need to create your own abstraction.
Perhaps this is a cop-out answer, but have you looked at how Boost handles this? They build on quite a few platforms without autoconf, although they do have their own build system, bjam, which probably handles some of the same situations. They also do a nice auto-linking trick on Windows that automatically selects the right version of libraries for linking, depending on the version of the MSVC compiler. Based on your initial description, it sounds like macro checks for the various platforms/compilers might do the trick, but perhaps there is more to your problem that would prevent this.
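For what it's worth, the auto-linking trick rests on MSVC's #pragma comment(lib, ...); here is a hedged sketch of the idea (the library names are hypothetical, and real Boost also encodes the toolset and version into the name):

#if defined(_MSC_VER)
  #ifdef _DEBUG
    #pragma comment(lib, "mylib-debug.lib")   // ask the linker for the debug build
  #else
    #pragma comment(lib, "mylib.lib")         // ...or the release build
  #endif
#endif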