How portable IS C++? - c++

If I write a simple game like Pong in C++ on Linux, can that same code be compiled on Windows and OS X? How can I tell where it won't compile?

You have three major portability hurdles.
The first, and simplest, is writing C++ code that all the target compilers understand. Note: this is different from writing to the C++ standard. The problem with "writing to the standard" starts with: which standard? C++98, C++03, TR1, C++11, C++14, or C++17? These are all revisions of C++, and the newer the revision you use, the less likely compilers are to be fully compliant. C++ is very large, and realistically the best you can hope for is C++98 with some C++03 features.
Compilers all add their own extensions, and it's all too easy to use them unknowingly. Write to the standard, not to the compiler documentation. Some compilers have a "strict" mode in which they turn off all extensions. You would be wise to do primary development with the compiler that has the most strictures and the best standard compliance. gcc has the -Wstrict family of flags to turn on strict warnings. -ansi will remove extensions which conflict with the standard, and -std=c++98 will tell the compiler to work against the C++98 standard and disable GNU C++ extensions.
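For instance, a gcc invocation along these lines turns on the strict checking described above (the file name is just an example):
g++ -std=c++98 -pedantic-errors -Wall -Wextra -o pong pong.cpp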
With that in mind, to remain sane you must restrict yourself to a handful of compilers and only their recent versions. Even writing a relatively simple C library for multiple compilers is difficult. Fortunately, both Linux and OS X use gcc. Windows has Visual C++, but its different versions are more like a squabbling family than a single compiler when it comes to compatibility (with the standard or with each other), so you'll have to pick a version or two to support. Alternatively, you can use one of the gcc-derived compiler environments such as MinGW. Check a list of C++ compilers for compatibility information, but keep in mind that such lists usually cover only the latest versions.
Next is your graphics and sound library. It not only has to be cross-platform, it has to look good and be fast on all platforms. These days there are a lot of possibilities; Simple DirectMedia Layer is one. You'll have to choose at what level you want to code. Do you want detailed control? Or do you want an engine to take care of things? There's an existing answer for this so I won't go into details. Be sure to choose one that is dedicated to being cross-platform, not one that just happens to work. Compatibility bugs in your graphics library can sink your project fast.
Finally, there's the simple incompatibilities which exist between the operating systems. POSIX compliance has come a long way, and you're lucky that both Linux and OS X are Unix under the hood, but Windows will always be the odd man out. Things which are likely to bite you mostly have to do with the filesystem. Here's a handful:
Filesystem layout
File path syntax (i.e. C:\foo\bar vs. /foo/bar; see the sketch after this list)
Mandatory Windows file locking
Differing file permissions systems
Differing models of interprocess communication (i.e. fork, shared memory, etc.)
Differing threading models (your graphics library should smooth this out)
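To give a concrete feel for the file path point above, here is a minimal sketch of hiding the separator difference behind one small function (the names are illustrative, not from any particular library):
#include <string>

// Returns the platform's preferred path separator.
inline char pathSeparator()
{
#ifdef _WIN32
    return '\\';
#else
    return '/';
#endif
}

// Joins a directory and a file name using the platform separator.
std::string joinPath(const std::string& dir, const std::string& file)
{
    return dir + pathSeparator() + file;
}
The rest of the code calls joinPath() and never mentions the separator again.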
There you have it. What a mess, huh? Cross-platform programming is as much a state of mind and statement of purpose as it is a technique. It requires some dedication and extra time. There are some things you can do to make the process less grueling...
Turn on all strictures and warnings and fix them
Turn off all language extensions
Periodically compile and test in Windows, not just at the end
Get a programmer who likes Windows onto the project
Restrict yourself to as few compilers as you can
Choose a well maintained, well supported graphics library
Isolate platform specific code (for example, in a subclass)
Treat Windows as a first class citizen
The most important thing is to do this all from the start. Portability is not something you bolt on at the end. Not just your code, but your whole design can become unportable if you're not vigilant.

C++ is ultra-portable and has compilers available on more platforms than you can shake a stick at. Languages like Java are typically touted as being massively cross-platform; ironically, they are in fact usually implemented in C or C++.
That covers "portability". If you actually mean, how cross platform is C++, then not so much: The C++ standard only defines an IO library suitable for console IO - i.e. text based, so as soon as you want develop some kind of GUI, you are going to need to use a GUI framework - and GUI frameworks are historically very platform specific. Windows has multiple "native" GUI frameworks now - the C++ framework made available from Microsoft is still MFC - which wraps the native Win32 API which is a C API. (WPF and WinForms are available to CLR C++).
The Apple Mac's GUI framework is called Cocoa; it is an Objective-C library, but it's easy to call Objective-C from C++ in that development environment.
On Linux there are the GTK+ and Qt frameworks, both of which have been ported to Windows and the Mac, so either of these C++ frameworks can solve the problem of writing a GUI application in C++ once that builds and runs on Windows, Mac and Linux.
Of course, it's difficult to regard Qt as strictly C++ anymore: Qt defines a special markup for signals and slots that requires an extra code-generation step before compilation.

You can read the standard - if a program respects the standard, it should be compilable on all platforms that have a C++ standard-compliant compiler.
As for 3rd party libraries you might be using, the platform availability is usually specified in the documentation.
When it comes to the GUI, there are cross-platform options (such as Qt), but you should probably ask yourself: do I really want portability when it comes to the UI? Sometimes it's better to have the GUI part platform-specific.

If you are thinking of porting from Linux to Windows, using OpenGL for the graphics gives you the freedom to run your program on both operating systems, as long as you don't use any system-specific functionality.

Compared to C, C++ portability is extremely limited, if not completely nonexistent. For one, you can't disable exceptions (well, you can, with compiler switches), but the standard specifically says the result is undefined behaviour, and many devices don't even support exceptions. So in that respect, C++ is zero portable. Given that undefined behaviour, it's obviously a no-go for zero-fail, high-performance real-time systems in which exceptions are taboo: undefined behaviour has no place in a zero-fail environment. Then there's name mangling, which most, if not every, compiler does completely differently. For good portability and inter-compatibility, extern "C" has to be used to export symbols, yet this renders any and all namespace information void, resulting in duplicate symbols. One can, of course, choose not to use namespaces and use unique symbol names; yet another C++ feature rendered void. Then there's the complexity of the language, which results in implementation difficulties in the various compilers for various architectures. Due to these difficulties, true portability becomes a problem. One can solve this with a large chain of compiler directives/#ifdefs/macros. Templates? Not even supported by most compilers.
What portability? You mean the semi-portability between a couple of mainstream build targets like MSVC for Windows and GCC for Linux? Even there, in that mainstream segment, all the above problems and limitations exist. It's absurd to even think C++ is portable.

Related

Are C++ applications cross-platform?

One of the first things I learned as a student was that C++ applications don't run on different operating systems. Recently, I read that Qt based C++ applications run everywhere. So, what is going on? Are C++ applications cross-platform or not?
Source code compatible. If I compile the source code, will it run everywhere?
API/ABI compatibility. Does the OS provide the interface to its components in a way that the code will understand?
Binary compatibility. Is the code capable of running on the target host?
Source code compatible
C++ is a standard which defines how structures, memory, and files can be read and written.
#include <iostream>

int main( int argc, char ** argv )
{
    std::cout << "Hello World" << std::endl;
}
Code written to process data (e.g. grep, awk, sed) is generally cross-platform.
When you want to interact with the user, modern operating systems have a GUI; these are not cross-platform and cause code to be written for a specific platform.
Libraries such as Qt or wxWidgets have implementations for multiple platforms and allow you to program against Qt instead of Windows or iOS, with the result being compatible with both.
The problem with these anonymizing libraries is that they take some of the specific benefits of platform X away in the interest of uniformity across platforms.
Examples of this would be on Windows using the WaitForMultipleObjects function, which allows you to wait for different types of events to occur, or the fork function on UNIX, which allows two copies of your process to be running with significant shared state. In the UI, the forms look and behave slightly different (e.g. color-picker, maximize, minimize, the ability to track mouse outside of your window, the behaviour of gestures).
When the work you need to be done is important to you, then you may end up wanting to write platform specific code to leverage the advantages of the specific application.
The C library sqlite is broadly cross-platform code, but its low-level IO is platform specific, so it can make guarantees for database integrity (that the data is really written to disk).
So libraries such as Qt do work, but they may produce results which are unsatisfactory, and you may end up having to write native code anyway.
API/ABI compatibility
Different releases of UNIX and Windows have some form of compatibility with each other. These allow a binary built for one version of the OS to run on other versions of the OS.
In UNIX the choice of your build machine defines the compatibility. The lowest OS revision you wish to support should be your build machine, and it will produce binaries compatible with subsequent minor versions until they make a breaking change (deprecate a library).
On Windows and Mac OS X, you choose an SDK which allows you to target a set of OS's with the same issues with breaking changes.
On Linux, each kernel revision is ABI incompatible with any other, and kernel modules need to be re-compiled for each kernel revision.
Binary compatibility
This is the ability of the CPU to understand the code. This is more complex than you might think, as x64 chips can be capable (depending on OS support) of running x86 code.
Typically a C++ program is packaged inside a container (PE executable, ELF format) which is used by the operating system to unpack the sections of code and data and to load libraries. This makes the final program have both binary (type of code) and API (format of the container) forms of incompatibilities.
Also, today, if you compile an x86 Windows application (targeting Windows 7 on Visual Studio 2015), the code may fail to execute if the processor does not have SSE2 instructions (a CPU roughly ten years old).
Finally when Apple changed from PowerPC to x86, they provided an emulation layer which allowed the old PowerPC code to run in an emulator on the x86 platform.
So in general, binary incompatibility is a murky area. It would be possible to produce an OS which identified invalid instructions (e.g. SSE2) and, on such a fault, emulated the behaviour; this could be updated as new features come out and would keep your code running even though it is binary incompatible.
Even if your platform is incapable of running a form of instruction set, it could be emulated and behave compatibly.
Standard C++ is cross platform in the "write once, compile anywhere" sense, but not in the "compile once, run anywhere" sense.
That means that if you write a program in standard C++, you can compile and then run it on any target environment that has a standard conforming implementation of C++.
You can however not compile your program on your machine, ship the binary and then expect it to work on other targets. (At least not in general. One can of course distribute binaries from C++ code under certain conditions, but those depend on the actual target. This is a broad field.)
Of course, if you use extra, non-standard features like gcc's variable length arrays or third party libraries, you can only compile on systems that provide those extensions and libraries.
Some libraries like Qt and Boost are available on many systems (those two on Linux, Mac and Windows at least I believe), so your code will stay cross platform if you use those.
You can achieve that your source compiles on various platforms, giving you various binaries from the same source base.
This is not "compile once, run anywhere with an appropriate VM" as Java or C# do it, but "write once, compile anywhere with an appropriate environment" the way C has done it all the time.
Since the standard library does not provide everything you might need, you have to look for third-party libraries to provide that functionality. Certain frameworks -- like Boost, Qt, GTK+, wxWidgets etc. -- can provide that. Since these frameworks are written in a way that they compile on different platforms, you can achieve cross-platform functionality in the aforementioned sense.
There are various things to be aware of if you want your C++ code to be cross-platform.
The obvious thing is source that makes assumptions about data types. Your long might be 32 bit here and 64 bit there. Data type alignment and struct padding might differ. There are ways to "play it safe" here, like size_t / size_type / uint16_t typedefs etc., and ways to get it wrong, like wchar_t and std::wstring. It takes discipline and some experience to "get it right".
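As a small illustration of playing it safe (assuming a C++11 compiler with <cstdint>; the struct is purely hypothetical):
#include <cstdint>

struct PacketHeader {
    std::uint16_t type;    // exactly 16 bits on every conforming platform
    std::uint16_t length;
    std::uint32_t crc;     // exactly 32 bits, unlike long, which varies
};

// Catch padding/alignment surprises at compile time; mainstream compilers
// lay this struct out as 8 bytes with no padding.
static_assert(sizeof(PacketHeader) == 8, "unexpected padding in PacketHeader");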
Not all compilers are created equal. You cannot use all the latest C++ language features, or use libraries that rely on those features, if you require your source to compile on other C++ compilers. Check the compatibility chart first.
Another thing is endianness. Just one example: when you write a stream of integers to a file on one platform (say, x86 or x86_64) and then read it back on a different platform (say, POWER), you can run into problems. Why would you write integers to a file? Well, UTF-16 is integers... again, discipline and some experience go a long way toward making this rather painless.
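One way to sidestep the endianness problem when writing integers to a file is to pick one byte order and serialize to it explicitly; a minimal sketch (the function name is illustrative):
#include <cstdio>

// Writes a 32-bit value in little-endian byte order regardless of the host CPU.
void write_u32_le(std::FILE* f, unsigned long v)
{
    unsigned char b[4];
    b[0] = static_cast<unsigned char>( v        & 0xFF);
    b[1] = static_cast<unsigned char>((v >> 8)  & 0xFF);
    b[2] = static_cast<unsigned char>((v >> 16) & 0xFF);
    b[3] = static_cast<unsigned char>((v >> 24) & 0xFF);
    std::fwrite(b, 1, 4, f);
}
The matching reader reassembles the value with shifts, so both sides agree on the layout no matter which platform they run on.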
Once you've checked all those boxes, you need to make sure of the availability of the libraries you base your code on. While std:: is safe (but see "not all compilers are created equal" above), something as innocent as boost:: can become a problem if you're looking beyond the mainstream. (I helped the Boost guys to fix one or two showstoppers regarding AIX / Visual Age in past years simply because they didn't have access to that platform for testing new releases...)
Oh, and watch out for the various licensing schemes out there. Some frameworks that improve your cross-platform capabilities -- like Qt or Cygwin -- have their strings attached. That is not to say they are not a big help in the right circumstances, just that you need to be aware of copyleft / proprietary licensing requirements.
All that being said, there is Wine ("Wine is not emulation"), which makes executables compiled for Windows run on a variety of Unix-alike systems (Linux, OS X, *BSD, Solaris). There are certain limits to its capabilities, but it's getting better all the time.
Yes. No. Maybe. What is cross-platform C++ code? Cross-platform C++ code is code that can be compiled under different operating systems without needing to be modified.
That means that if you explicitly use any platform-dependent headers, your code is no longer cross-platform. Qt solves this problem in the following way: it provides wrappers for everything that is platform-specific. For example, imagine that you are using QFile to open/read/write a file. Your code looks like
#include <QFile>

QFile file(filename);
file.open(QFile::ReadOnly);
// other stuff
You can compile this code under any OS as long as you have a suitable compiler and Qt libraries for that OS. The code hidden under QFile will use the OS-appropriate file-handling functions, but that shouldn't concern you.
Also, if you only use the standard library, your code can be compiled anywhere where a C++ compiler is present.
The already-compiled applications, however, are not cross-platform in the way that, say, Java applications are. For instance, you can't compile an app for Windows and then run it in Linux; you will have to recompile your code under Linux instead.
C++ is cross-platform. You can use it to build applications that will run on many different operating systems.
What is not cross-platform is the compilers that translate C++ into object code. No single compiler, to my knowledge, has all the necessary features so that when you use it to compile a C++ program, it will automatically run on Windows, Linux and Mac OS.
Qt Creator is integrated with multiple compilers and has build automation. It makes it easy to switch between different setups and target platforms. It provides support for building, running and deploying C++ applications not only for desktop environments but also for mobile devices.
C++ is a programming language. Text. As such, it doesn't run anywhere.
Conforming standard C++ code is expected to behave equally on any platform; "cross-platform", if you want. Writing (strictly) conforming C++ code requires pedantry, because many assumptions that are commonly made actually depend on details left to the particular implementation, a consequence of the wide range of targets C++ itself aims to support.
Notice we're still talking about C++ code, not C++ programs. Indeed, once we pass to the term "program", we have no more guarantees, because we aren't talking about C++ anymore, but rather about the output of the compiler. This is where portability begins to fade away: executable format, ISA, ABI, low-level routines and so on.
Can you rely on those? If you can't, then you need to integrate your C++ program into the environment it will run on, by recompiling it or by using platform-specific elements.

How to find Boost libraries that does not contain any platform specific code

For our current project, we are thinking to use Boost framework.
However, the project should be truly cross-platform and might be shipped to some exotic platforms. Therefore, we would like to use only Boost packages (libraries) that do not contain any platform-specific code: pure C++ and that's all.
Boost has the idea of header-only packages (libraries).
Can one assume that these packages (libraries) are free from platform specific code?
In case if not, is there a way to identify these kind of packages of Boost?
All C++ code is platform-specific to some extent. On the one side, there is this ideal concept of "pure standard C++ code", and on the other side, there is reality. Most of the Boost libraries are designed to maintain the ideal situation on the user-side, meaning that you, as the user of Boost, can write platform-agnostic standard C++ code, while all the underlying platform-specific code is hidden away in the guts of those Boost libraries (for those that need them).
But at the core of this issue is the problem of how to define platform-specific code versus standard C++ code in the real world. You can, of course, look at the standard document and say that anything outside of it is platform-specific, but that's nothing more than an academic discussion.
If we start from this scenario: assume we have a platform that only has a C++ compiler and a C++ standard library implementation, and no other OS or OS-specific API to rely on for other things that aren't covered by the standard library. Well, at that point, you still have to ask yourself:
What compiler is this? What version?
Is the standard library implementation correct? Bug-free?
Are those two entirely standard-compliant?
As far as I know, there is essentially no universal answer to this and there are no realistic guarantees. Most exotic platforms rely on exotic (or old) compilers with partial or non-compliant standard library implementations, and sometimes have self-imposed restrictions (e.g., no exceptions, no RTTI, etc.). An enormous amount of "pure standard C++ code" would never compile on these platforms.
Then, there is also the reality that most platforms today, even really small embedded systems have an operating system. The vast majority of them are POSIX compliant to some level (except for Windows, but Windows doesn't support any exotic platform anyways). So, in effect, platform-specific code that relies on POSIX functions is not really that bad since it is likely that most exotic platforms have them, for the most part.
I guess what I'm really getting at here is that this pure dividing line that you have in your mind about "pure C++" versus platform-specific code is really just an imaginary one. Every platform (compiler + std-lib + OS + ext-libs) lies somewhere along a continuum of level of support for standard language features, standard library features, OS API functions, and so on. And by that measure, all C++ code is platform-specific.
The only real question is how wide of a net it casts. For example, most Boost libraries (except for recent "flimsy" ones) generally support compilers down to a reasonable level of C++98 support, and many even try to support as far back as early 90s compilers and std-libs.
To know if a library, part of Boost or not, has wide enough support for your intended applications or platforms, you have to define the boundaries of that support. Just saying "pure C++" is not enough; it means nothing in the real world. You cannot say that you will be using C++11 compilers just after you've taken Boost.Thread as an example of a library with platform-specific code. Many C++11 implementations have very flimsy support for std::thread, but others do better, and that issue is as much of a "platform-specific" issue as using Boost.Thread will ever be.
The only real way to ever be sure about your platform support envelope is to actually set up machines (e.g., virtual machines, emulators, or real hardware) that provide representative worst cases. You have to select those worst-case machines based on a realistic assessment of what your clients may be using, and you have to keep that assessment up to date. You can create a regression test suite for your particular project, one that uses the particular (Boost) libraries, and run that suite on all your worst-case test environments. Whatever doesn't pass the test doesn't pass the test; it's that simple. And yes, you might find out in the future that some Boost library won't work under some new exotic platform. If that happens, you need to either get the Boost dev team to add code to support it, or re-write your code to get around it, but that's what software maintenance is all about, and it's a cost you have to anticipate. Such problems will come not only from Boost, but from the OS and from the compiler vendors too! At least with Boost you can fix the code yourself and contribute it back, which you can't always do with OS or compiler vendors.
We had "Boost or not" discussion too. We decided not to use it.
We had some atypical hardware platforms to serve with one source code. In particular, running Boost on AVR was simply impossible, because RTTI and exceptions, which Boost requires for a lot of things, aren't available there.
There are parts of Boost which use compiler-specific "hacks" to, e.g., get information about class structure.
We tried splitting out individual packages, but the interdependencies are quite high (at least they were 3 or 4 years ago).
In the meantime, C++11 was underway and GCC started supporting more and more of it. With that, many of the reasons to use Boost faded (see "Which Boost features overlap with C++11?"). We implemented the rest of what we needed from scratch (with relatively low effort, thanks to variadic templates and other TMP features in C++11).
After a steep learning curve we have all we need without external libraries.
At the same time we pondered the future of Boost. We expected the features newly standardized in C++11 to be removed from Boost. I don't know the current roadmap for Boost, but at the time our uncertainty made us vote against it.
This is not a real answer to your question, but it may help you decide whether to use Boost. (And sorry, it was too large for a comment.)

Program portability

How to make sure that my program will be fully portable?
Continuous integration on all target platforms.
1. Test
This is a necessary but not a sufficient condition for doing anything properly. To test portability, you'll want multiple platforms and compilers.
2. Write to the standard, not to your development platform.
This means, only do something if the standard says you can do it. Only expect a particular result if the standard says you can expect it. Only use a library or API if the standard says it exists. The standard is available here (among other places):
http://openassist.googlecode.com/files/C%2B%2B%20Standard%20-%20ANSI%20ISO%20IEC%2014882%202003.pdf
It helps if you assume that:
CHAR_BIT is equal to 9.
sizeof(int) is equal to 5 and int is a 37 bit type. Or a 16 bit type.
the basic character set is EBCDIC.
The epoch began in 1721.
time_t is a double.
And so on. By which I don't mean, write code that relies on those things to be true, I mean write code that will work if they are, and will also work on a sane implementation.
3. Use the most restrictive and pedantic compiler options you can find,
This is the only practical way to give yourself a reasonable chance of achieving (2).
4. Understand that "real compilers" fail to implement the standard correctly or fully, and make some concessions to this fact.
Theoretically, there's nothing non-portable about a C++ program that uses export. If it's a perfectly good C++ program in every other respect, then it will work on any conforming C++ compiler. But hardly anyone uses a conforming C++ compiler, so there's a de facto common subset of C++ that you'll want to stick to.
5. Understand that the C++ standard provides quite a restricted programming environment
Certain things are not portable in standard C++, such as drawing graphics on a screen, since standard C++ has no graphics or GUI API. So there is no such thing as a "fully portable" GUI program written in C++. So you may or may not need to revise your goal, depending what your program is supposed to do.
If your program requires something that simply cannot be done entirely within standard C++, then you can make your program easier to port by encapsulating that behaviour within an interface which you think should be implementable on all platforms you care about. Then set about implementing it for each one. This doesn't result in a "fully portable" program, though, since to me that means a program which you can compile and run unchanged on any conforming C++ implementation. A program which can be ported to most platforms with a C++ compiler, probably, assuming they have a screen and a mouse, with some bespoke programming work, isn't the same thing.
All this can be taken too far, of course. You will probably actually want to assume that CHAR_BIT is 8 (reading files is madness otherwise), and perhaps even rely on a GUI framework like Qt. But you did say, "fully portable", and one of the main things you need to do to write portable programs is usually to work out how far you're willing to compromise on "fully".
6. Assert what you assume
At compile time if you can, or at runtime otherwise, ensure that if your program requires int to be at least 32 bits (or whatever), it will fail noisily when it isn't. OK, comprehensive test coverage would catch cases where your arithmetic silently overflows and gives the wrong answer, but it's hard to write truly comprehensive tests, and anyway the tests might make the same non-portable errors as the code, or some poor sucker who has downloaded your code might not run them all properly.
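Before C++11's static_assert, a common trick was an array typedef whose size goes negative when the assumption fails; a minimal sketch:
#include <climits>

// Compilation fails here (negative array size) if int is narrower than 32 bits.
typedef char AssertIntHasAtLeast32Bits[(sizeof(int) * CHAR_BIT >= 32) ? 1 : -1];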
When you use libraries, you are effectively doing this automatically. You'll #include some header, and if the library isn't available that will fail immediately. At least, you hope it will - it's conceivable that some other implementation could have a header of the same name which does something radically or subtly different. Radical differences usually result in compilation failures, for subtle differences you can test for preprocessor symbols to identify implementations.
Your question:
How to make sure that my program will be fully portable?
cannot be answered satisfactorily. You cannot, for any real-world application, make sure it is portable. You can only confirm your expectation by properly testing the application on the target platform, as Lou Franco has already proposed here.
In the process of developing and testing in parallel on different platforms or environments, every one of us finds his own bag of tricks and explores his share of pitfalls. You said in a comment that you work on a Windows system. That's fine. Try to get your program working with the Visual Studio compiler (and environment). Then install Cygwin with the GCC 4.x compiler suite, install the NetBeans IDE and its C++ environment, and create a project based on the same sources. NetBeans will use the Cygwin GCC 4.x. If your compiled program works with both toolchains, you have probably mastered about 90% of the real-world portability hurdles.
Regards
rbo
Avoid platform-specific libraries.
Make it standards compliant. At least a common subset of the standard that is implemented by vendors on all platforms you intend to deploy your application on.
Factor out platform specific portions from platform-independent ones. Typically, the lowest layer or two should deal with the platform.
Keep abreast of changes in:
Platform/OS APIs
Tool chains
Language features
Test, Deploy. Rinse and repeat.
Unit test it, on each platform, during development
Avoid using platform-specific libraries. If you can implement the desired functionality using only the STL and Boost, go ahead.
Develop on the most restrictive compilation environment. Use the smallest set of features from C++. Split the platform-dependent portions of code into separate files. Develop a configuration (make) environment for each platform, as part of the software package.
Making sure to only use libraries that actually exist on all target platforms would be a good start.
It is impossible. What happens when I write my operating system that has a weird C compiler?
That said, to be portable, you need to:
Avoid Win32
Avoid POSIX (which is annoying... You may want to just use Cygwin to provide Windows support)
Avoid any platform-specific library. This usually limits your choices for graphics to wxWidgets, GTK+, and Qt.
TEST. Make sure it works.
Don't assume anything. Windows is weird and uses \r\n, so be careful about that.
I think Visual C++ on Windows gives you warnings about "unsafe C functions" and asks you to use the "safe" ones, which are not standard. Don't fall for Microsoft's attempt to lock in your program.
Some things will help:
Autoconf will allow any decent system (i.e. one that includes a shell) to detect common portability issues and set up the correct headers
CMake can do this as well, but only on platforms that CMake itself is available on
Know the platforms that you intend to ship for. If some platform convention contradicts the standard, ignore the standard. I'm serious about that. For example, if you use the standard std::ifstream constructor, which takes a char* argument, you won't be able to open files with Unicode filenames on Windows; you must use the non-standard wchar_t* overload there. The functionality lost by not being able to open files that are allowed and legal on the platform severely outweighs the portability gained by using only what the standard knows; in the end, it's the functionality that matters, not adherence to a particular standard.
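A hedged sketch of that particular example (the Microsoft library does provide a non-standard wide-character overload of the std::ifstream constructor; the file names are made up):
#include <fstream>

#ifdef _WIN32
std::ifstream in(L"C:\\data\\caf\u00e9.txt");   // non-standard wchar_t* overload, MSVC only
#else
std::ifstream in("/home/user/cafe.txt");        // narrow char* paths are fine on POSIX
#endif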
This is less a direct answer to the question, than an answer in the light of other answers.
You need to balance a requirement for absolute portability against the expectations of platform users - there are different basic HCI/HIG guidelines for Windows, OS X, KDE and Gnome, and none of the portable GUI toolkits will automatically produce the right results in each (some allow you to apply different layouts, which is a start).
The inbetween approach is to have a pure portable core with multiple native GUIs.
It's not necessary (there is a lot of software that succeeds despite ignoring conventions) but it is a trade-off that needs to be considered - in particular if there is an existing strong native application.

Developing embedded software library, C or C++?

I'm in the process of developing a software library to be used for embedded systems like an ARM chip or a TI DSP (mostly embedded systems, but it would also be nice if it could be used in a PC environment). Obviously this is a pretty broad range of target systems, so being able to easily port to different systems is a priority. The library will be used for interfacing with specific hardware and running some algorithms.
I am thinking C++ is the best option, over C, because it is much easier to maintain and read. I think the additional overhead is worth it for being able to work in the object oriented paradigm. If I was writing for a very specific system, I would work in C but this is not the case.
I'm assuming that these days most compilers for popular embedded systems can handle C++. Is this correct?
Are there any other factors I should consider? Is my line of thinking correct?
If portability is very important for you, especially on an embedded system, then C is certainly a better option than C++. While C++ compilers on embedded platforms are catching up, there's simply no match for the widespread use of C, for which any self-respecting platform has a compliant compiler.
Moreover, I don't think C is inferior to C++ where it comes to interfacing hardware. The amount of abstraction is sufficiently low (i.e. no deep class hierarchies) to make C just as good an option.
There is certainly good support of C++ for ARM. ARM have their own compiler and g++ can also generate EABI compliant ARM code. When it comes to the DSPs, you will have to look at their toolchain to decide what you are going to do. Be aware that the library that comes with a DSP may well not implement the full C or C++ standard library.
C++ is suitable for low-level embedded development and is used in the SymbianOS Kernel. Having said that, you should keep things as simple as possible.
Avoid exceptions, which may demand more library support than is present (therefore use new (std::nothrow) Foo instead of new Foo; see the sketch after this list).
Avoid memory allocations as much as possible and do them as early as possible.
Avoid complex patterns.
Be aware that templates can bloat your code.
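For the first point, a minimal sketch of the nothrow form (Foo is a placeholder type):
#include <new>

struct Foo { int value; };   // placeholder type for illustration

bool makeFoo(Foo*& out)
{
    out = new (std::nothrow) Foo;   // yields a null pointer on failure instead of throwing
    return out != 0;                // callers check the result; no exception machinery needed
}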
I have seen many complaints that C++ is "bloated" and inappropriate for embedded systems.
However, in an interview with Stroustrup and Sutter, Bjarne Stroustrup mentioned that he'd seen heavily templated C++ code going into (IIRC) the braking systems of BMWs, as well as in missile guidance systems for fighter aircraft.
What I take away from this is that experts of the language can generate sophisticated, efficient code in C++ that is most certainly suitable for embedded systems. However, a "C With Classes"[1] programmer that does not know the language inside out will generate bloated code that is inappropriate.
The question boils down to, as always: in which language can your team deliver the best product?
[1] I know that sounds somewhat derogatory, but let me say that I know an awful lot of these guys, and they churn out an awful lot of relatively simple code that gets the job done.
C++ compilers for embedded platforms are much closer to 1983's C with Classes than to the C++98 standard, let alone C++0x. For instance, some platforms we use still compile with a special version of gcc built from gcc 2.95!
This means that your library interface will not be able to provide interfaces with containers/iterators, streams, or other such advanced C++ features. You'll have to stick with simple C++ classes that can easily be expressed as a C interface with a pointer to a structure as the first parameter.
This also means that within your library, you won't be able to use templates to their full power. If you want portability, you will be restricted to using templates for generic containers, which is, I'm sure you'll admit, only a very tiny part of the power of C++ templates.
C++ has little or no overhead compared to C if used properly in an embedded environment. C++ has many advantages for information hiding, OO, etc. If your embedded processor is supported by gcc in C then chances are it will also be supported with C++.
On the PC, C++ isn't a problem at all -- high quality compilers are extremely widespread and almost every C compiler is directly associated with a C++ compiler that's quite good, though there are a few exceptions such as lcc and the newly revived pcc.
Larger embedded systems like those based on the ARM are generally quite similar to desktop systems in terms of tool chain availability. In fact, many of the same tools available for desktop machines can also generate code to run on ARM-based machines (e.g., lots of them use ports of gcc/g++). There's less variety for TI DSPs (and a greater emphasis on quality of generated code than source code features), but there are still at least a couple of respectable C++ compilers available.
If you want to work with smaller embedded systems, the situation changes in a hurry. If you want to be able to target something like a PIC or an AVR, C++ isn't really much of an option. In theory, you could get (for example) Comeau to produce a custom port that generated code you could compile with that target's C compiler -- but chances are pretty good that even if you did, it wouldn't work out very well. These systems are really just too limited (especially in memory size) for C++ to fit them well.
Depending on what your intended use is for the library, I think I'd suggest implementing it first as C - but the design should keep in mind how it would be incorporated into a C++ design. Then implement C++ classes on top of and/or along side of the C implementation (there's no reason this step cannot be done concurrently with the first). If your C design is done with a C++ design in mind, it's likely to be as clean, readable and maintainable as the C++ design would be. This is somewhat more work, but I think you'll end up with a library that's useful in more situations.
While you'll find C++ used more and more on various embedded projects, there are still many that restrict themselves to C (and I'd guess this is more often the case than not) - regardless of whether or not the tools support C++. It would be a shame to have a nice library of routines that you could bring to a new project you're working on, but be unable to use them because C++ isn't being used on that particular project.
In general, it's much easier to use a well-designed C library from C++ than the other way around. I've taken this approach with several sets of code including parsing Intel Hex files, a simple command parser, manipulating synchronization objects, FSM frameworks, etc. I'm planning on doing a simple XML parser at some point.
Here's an entirely different C++-vs-C argument: stable ABIs. If your library exports a C ABI, it can be compiled with any compiler that works on the system, because C ABIs are generally platform standards. If your library exports a C++ ABI, it can only be compiled with a matching compiler -- because C++ ABIs are usually not platform standards, and often differ from compiler to compiler and even version to version.
Interestingly, one of the rare exceptions to this is ARM; there's an ARM C++ ABI specification, and all compliant ARM compilers follow it. This is not true on x86; on x86, you're lucky if a C++ library compiled with a 4.1 version of GCC will link correctly with an application compiled with GCC 4.4, and don't even ask about 3.4.6.
Even if you export a C ABI, you can have problems. If your library uses C++ internally, it will then link to libstdc++ for things in the C++ std:: namespace. If your user compiles a C++ application that uses your library, they'll also link to libstdc++ -- and so the overall application gets linked to libstdc++ twice, and their libstdc++ may not be compatible with your libstdc++, which can (or so I understand) lead to odd errors from the intersection of the two. Considerably less likely, but still possible.
All of these arguments only apply because you're writing a library, and they're not showstoppers. But they are things to be aware of.
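A common pattern for the C-ABI approach described above, sketched here with made-up names, is to keep the C++ inside and export only a flat C interface:
/* engine.h -- what users of the library see; compiles as C or C++ */
#ifdef __cplusplus
extern "C" {
#endif

typedef struct engine engine;            /* opaque handle */

engine* engine_create(void);
void    engine_step(engine* e);
void    engine_destroy(engine* e);

#ifdef __cplusplus
}
#endif

// engine.cpp -- the C++ stays hidden behind the C functions
#include "engine.h"

struct engine { int frame; /* real members here */ };

extern "C" engine* engine_create(void)       { return new engine(); }
extern "C" void    engine_step(engine* e)    { ++e->frame; }
extern "C" void    engine_destroy(engine* e) { delete e; }
Because the exported symbols use C linkage, any compiler on the platform can link against the library, which is the ABI stability discussed above.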

How do I write a C++ program that will easily compile in Linux and Windows?

I am making a C++ program.
One of my biggest annoyances with C++ is its supposed platform independence.
You all probably know that it is pretty much impossible to compile a Linux C++ program on Windows, or a Windows one on Linux, without a deluge of cryptic errors and platform-specific include files.
Of course you can always switch to some emulation layer like Cygwin or Wine, but I ask you, is there really no other way?
The language itself is cross-platform, but most libraries are not. There are three things that you should keep in mind if you want to go completely cross-platform when programming in C++.
Firstly, you need to start using some kind of cross-platform build system, like SCons. Secondly, you need to make sure that all of the libraries that you are using are built to be cross-platform.
And a minor third point: I would recommend using a compiler that exists on all of your target platforms; gcc comes to mind here (C++ is a rather complex beast and all compilers have their own specific quirks).
I have some further suggestions regarding graphical user interfaces for you. There are several of these available to use, the three most notable are:
GTK+
Qt
wxWidgets
GTK+ and Qt are two APIs that come with their own widget sets (buttons, lists, etc.), whilst wxWidgets is more of a wrapper API around the running platform's native widget set. This means that the former two might look a bit different compared to the rest of the system, whilst the latter will look just like a native program.
And if you're into games programming, there are equally many APIs to choose from, all of them cross-platform as well. The two most fully featured that I know of are:
SDL
SFML
Both of these contain everything from graphics to input and audio routines, either through plugins or built in.
Also, if you feel that the standard library in C++ is a bit lacking, check out Boost for some general purpose cross-platform sweetness.
Good Luck.
C++ is cross-platform. The problem you seem to have is that you are using platform-dependent libraries.
I assume you are really talking about UI components, in which case I suggest using something like GTK+, Qt, or wxWidgets, each of which has UI components that can be compiled for different systems.
The only solution is for you to find and use platform independent libraries.
And, on a side note, neither Cygwin nor Wine is emulation; they are 100% native implementations of the same functionality found on their respective systems.
Once you're aware of the gotchas, it's actually not that hard. All of the code I am currently working on compiles on 32 and 64-bit Windows, all flavors of Linux, as well as Unix (Sun, HP and IBM). Obviously, these are not GUI products. Also, we don't use third-party libraries, unless we're compiling them ourselves.
I have one .h file that contains all of the compiler-specific code. For example, Microsoft and gcc disagree on how to specify an 8-bit integer. So in the .h, I have
#if defined(_MSC_VER)
typedef __int8 int8_t;
#elif defined(__unix)
typedef signed char int8_t;
#endif
There's also quite a bit of code that uniformizes certain lower-level function calls, for example:
#if defined(_MSC_VER)
#define SplitPath(Path__,Drive__,Dir__,Name__,Ext__) _splitpath(Path__,Drive__,Dir__,Name__,Ext__)
#elif defined(__unix)
#define SplitPath(Path__,Drive__,Dir__,Name__,Ext__) UnixSplitPath(Path__,Drive__,Dir__,Name__,Ext__)
#endif
Now in this case, I believe I had to write a UnixSplitPath() function - there will be times when you need to. But most of the time, you just have to find the correct replacement function. In my code, I'll call SplitPath(), even though it's not a native function on either platform; the #defines will sort it out for me. It takes a while to train yourself.
Believe it or not, my .h file is only 240 lines long. There's really not much to it. And that includes handling endian issues.
Some of the lower-level stuff will need conditional compilation. For example, on Windows I use critical sections, but on Linux I need to use pthread mutexes. The critical section was encapsulated in a class, and this class has a good deal of conditional compilation. However, the upper-level program is totally unaware; the class functions exactly the same regardless of the platform.
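A minimal sketch of what such a class can look like (the class name and layout are illustrative, not the author's actual code):
#if defined(_WIN32)
#include <windows.h>
#else
#include <pthread.h>
#endif

// Calling code only ever sees lock()/unlock(); the #ifdefs stay in here.
class Mutex {
public:
#if defined(_WIN32)
    Mutex()       { InitializeCriticalSection(&cs_); }
    ~Mutex()      { DeleteCriticalSection(&cs_); }
    void lock()   { EnterCriticalSection(&cs_); }
    void unlock() { LeaveCriticalSection(&cs_); }
private:
    CRITICAL_SECTION cs_;
#else
    Mutex()       { pthread_mutex_init(&m_, 0); }
    ~Mutex()      { pthread_mutex_destroy(&m_); }
    void lock()   { pthread_mutex_lock(&m_); }
    void unlock() { pthread_mutex_unlock(&m_); }
private:
    pthread_mutex_t m_;
#endif
};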
The other secret I can give you is: build your project on all platforms often (particularly at the beginning). It is a lot easier when you nip the compiler problems in the bud. Don't wait until you're done development before you attempt to go cross-platform.
Stick to ANSI C++ and libraries that are cross-platform and you should be fine.
Create some low-level layer that will contain all the platform-specific code in your project. Implement 2 versions of this layer - one for Windows, and one for Linux - with the same interface, and build them to 2 libraries. Access all platform-specific functionality in your project through that interface.
This layer can contain general classes for file access, printing, GUI, etc.
All the (now non-platform-specific) code that uses that layer can now be compiled once on Windows and once on Linux.
Compile it on Windows and again on Linux. Unless you used platform-specific libraries, it should work. It's not like Java, where you compile it once and it runs everywhere. No one has made a virtual machine for C++, and probably never will. The code you write in C++ will work on any platform. You just have to compile it on every platform first.
Suggestions:
Use typedefs for ints, or #include <stdint.h>. Some machines think int is 8 bytes, some 4. (It used to be 2 and 4. How the times have changed.)
Use encapsulation wherever possible. My last Windows compiler thought %lld was %I64d, gave screwy return values for vsnprintf(), and had similar issues with close() and sockets, etc. (see the sketch after this list).
Watch out for stack size / buffer size limits. I've run into an 8k UDP buffer limit under Windows, amongst other problems.
For some reason, my Windows C++ compiler wouldn't accept dynamically sized allocations on the stack, e.g. void foo(int a) { int b[a]; }. Be aware of those sorts of things. Plan how you will recode.
#ifdef can be your best friend. And your worst enemy! (At the same time!)
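On the %lld versus %I64d point above, a hedged sketch using the C99/C++11 <cinttypes> macros instead of hard-coding either form:
#include <cinttypes>
#include <cstdint>
#include <cstdio>

void printOffset(std::int64_t offset)
{
    std::printf("offset = %" PRId64 "\n", offset);   // the macro expands to the right format on each platform
}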
It can certainly be done. But compile and test early and often!
Also, Linux and Windows have different data models.
See article: The forgotten problems of 64-bit programs development
Standard C++ is code that compiles without errors on any platform.
Try using Bloodshed Dev-C++ on Windows (instead of VC++ / Borland C++).
As Bloodshed Dev-C++ conforms to the C++ standards, programs compiled with it will, in most cases, compile on Linux without errors.