If a program was written in C++ to run on Windows, does it have to be completely rewritten to run on Mac OS or a mobile OS?
C++ is a standardized language, which means that the source code you write can be compiled on any platform that has an implementation of the C++ standard. There are two ways you can write C++ programs that can't be compiled on a different implementation. First, by using language extensions that are found only on a specific implementation (or set of implementations). Second, by using a library that depends on code that doesn't ship with the standard library (such as an OS API).
For the first, always try to write standard code. For the second, use cross-platform libraries like Boost, Qt, and so on.
Typically, yes, because the program will need to use OS-specific features for windowing and possibly for other features as well (networking, synchronization, etc.) However, many programs try to mitigate this as much as possible by building wrapper classes so that most of the program deals with these wrappers rather than the raw platform-specific tools. To port the program from one platform to another, you just need to reimplement the wrappers using the new platform's tools.
Many programs take this a step further by using prewritten libraries like Qt or Boost to handle some of the cross-platform silliness, but this is (essentially) the above idea at a larger scale.
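As a rough sketch of the wrapper idea (the function name and the platform split here are invented for illustration, not taken from any real codebase), the program calls one portable function, and the platform-specific definition is selected at compile time:

// A hypothetical cross-platform wrapper: one portable interface,
// with a platform-specific implementation chosen at build time.
#include <string>

void show_message(const std::string& text);   // the rest of the program calls only this

#ifdef _WIN32
#include <windows.h>
void show_message(const std::string& text) {
    MessageBoxA(NULL, text.c_str(), "Info", MB_OK);   // Win32 implementation
}
#else
#include <iostream>
void show_message(const std::string& text) {
    std::cout << text << '\n';                        // console fallback elsewhere
}
#endif

int main() {
    show_message("Hello from a portable program");
}

Porting the program to a new platform then means re-implementing only the definitions behind the wrapper, not the code that uses it.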
This depends. In general, Standard C++ is a general-purpose, portable language which can be compiled to run on any system or platform that has a standard-compliant compiler.
However, a lot of the more "interesting" features you might want to add to a typical application are not part of Standard C++. This includes GUIs, threads, sockets, and low-level OS API calls. These are generally not portable, and parts of the code which use these features will need to be implemented separately for each operating system or platform.
Fortunately, this is not as daunting as it sounds, because there are a lot of cross-platform libraries in existence that have already gone through the trouble of doing this. For example, the Boost threading library already has threading code written for different platforms or operating systems, but all of this is abstracted behind a nice uniform API that can be used portably in C++ application code.
Additionally, a lot of non-Standard C++ code still conforms to some standard, such as POSIX, which is supported across multiple platforms. For example, most UNIX-ish systems, including Linux and Mac OS X, support POSIX threads (the pthread API).
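As a small illustrative sketch (POSIX systems only, typically built with the -pthread flag), direct use of the pthread API looks roughly like this:

// POSIX threads: portable across UNIX-ish systems (Linux, Mac OS X),
// but not part of Standard C++. Build with: g++ example.cpp -pthread
#include <pthread.h>
#include <cstdio>

void* worker(void* arg) {
    std::printf("hello from thread %d\n", *static_cast<int*>(arg));
    return NULL;
}

int main() {
    int id = 1;
    pthread_t t;
    pthread_create(&t, NULL, worker, &id);   // start the thread
    pthread_join(t, NULL);                   // wait for it to finish
}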
If the code itself uses libraries that are supported on all the target platforms then you will only need the appropriate compilers to generate a valid binary for each system.
Software can be written in multiple languages and then linked together. For example, I can code the back-end logic of my application in C++ (often using Boost), and then build two separate front-ends in C# for Windows and Objective-C for Mac. I can link the C++ and C# components to ship for one platform, then link the C++ and Objective-C components to ship for another. That approach will give the most "native" look-and-feel for each platform.
As an alternative, I can code the entire front-end in C++ using Qt or wxWidgets. This will run on all platforms, albeit without 100% of a platform's bells and whistles.
One of the first things I learned as a student was that C++ applications don't run on different operating systems. Recently, I read that Qt based C++ applications run everywhere. So, what is going on? Are C++ applications cross-platform or not?
Source code compatibility. If I compile the source code, will it run everywhere?
API/ABI compatibility. Does the OS provide the interface to its components in a way that the code will understand?
Binary compatibility. Is the code capable of running on the target host?
Source code compatibility
C++ is a standard which defines how structures, memory, and files can be read and written. For example:
#include <iostream>

int main( int argc, char ** argv )
{
    std::cout << "Hello World" << std::endl;
}
Code written to process data (e.g. grep, awk, sed) is generally cross-platform.
When you want to interact with the user, modern operating systems provide a GUI; these GUIs are not cross-platform, and they force code to be written for a specific platform.
Libraries such as Qt or wxWidgets have implementations for multiple platforms and allow you to program against Qt instead of Windows or iOS, with the result being compatible with both.
The problem with these abstraction libraries is that they take some of the specific benefits of platform X away in the interest of uniformity across platforms.
Examples of this would be the WaitForMultipleObjects function on Windows, which allows you to wait for different types of events to occur, or the fork function on UNIX, which allows two copies of your process to run with significant shared state. In the UI, the forms look and behave slightly differently (e.g. colour-picker, maximize, minimize, the ability to track the mouse outside of your window, the behaviour of gestures).
When that platform-specific behaviour matters to you, you may end up wanting to write platform-specific code to leverage the advantages of the specific platform.
The C library sqlite is broadly cross-platform code, but its low-level IO is platform specific, so it can make guarantees for database integrity (that the data is really written to disk).
So libraries such as Qt do work, but they may produce results which are unsatisfactory, and you may end up having to write native code.
API/ABI compatibility
Different releases of UNIX and Windows have some form of compatibility with each other. These allow a binary built for one version of the OS to run on other versions of the OS.
In UNIX the choice of your build machine defines the compatibility. The lowest OS revision you wish to support should be your build machine, and it will produce binaries compatible with subsequent minor versions until they make a breaking change (deprecate a library).
On Windows and Mac OS X, you choose an SDK which allows you to target a set of OS versions, with similar caveats about breaking changes.
On Linux, each kernel revision is ABI incompatible with the others as far as kernel modules are concerned, so kernel modules need to be re-compiled for each kernel revision.
Binary compatibility
This is the ability of the CPU to understand the code. This is more complex than you might think, as x64 chips can be capable (depending on OS support) of running x86 code.
Typically a C++ program is packaged inside a container (PE executable, ELF format) which is used by the operating system to unpack the sections of code and data and to load libraries. This makes the final program have both binary (type of code) and API (format of the container) forms of incompatibilities.
Also, today, if you compile an x86 Windows application (targeting Windows 7 with Visual Studio 2015), the code may fail to execute if the processor does not have SSE2 instructions (roughly a ten-year-old CPU).
Finally when Apple changed from PowerPC to x86, they provided an emulation layer which allowed the old PowerPC code to run in an emulator on the x86 platform.
So in general, binary incompatibility is a murky area. It would be possible to produce an OS which identified invalid instructions (e.g. SSE2) and, on such a fault, emulated the behaviour; this could be updated as new features come out, and would keep your code running even though it is binary incompatible.
Even if your platform is incapable of running a form of instruction set, it could be emulated and behave compatibly.
Standard C++ is cross platform in the "write once, compile anywhere" sense, but not in the "compile once, run anywhere" sense.
That means that if you write a program in standard C++, you can compile and then run it on any target environment that has a standard conforming implementation of C++.
You can however not compile your program on your machine, ship the binary and then expect it to work on other targets. (At least not in general. One can of course distribute binaries from C++ code under certain conditions, but those depend on the actual target. This is a broad field.)
Of course, if you use extra, non-standard features like gcc's variable length arrays or third party libraries, you can only compile on systems that provide those extensions and libraries.
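To make that concrete, here is a tiny example that g++ accepts as an extension but that is not standard C++, so other compilers may reject it:

// Variable-length array: a GCC extension, not ISO C++.
// Compiles with g++, but e.g. MSVC will refuse it.
#include <iostream>

int main() {
    int n;
    std::cin >> n;
    int buffer[n];                // VLA: size known only at run time
    buffer[0] = 42;
    std::cout << buffer[0] << '\n';
}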
Some libraries like Qt and Boost are available on many systems (those two at least on Linux, Mac and Windows, I believe), so your code will stay cross-platform if you use those.
What you can achieve is that your source compiles on various platforms, giving you various binaries from the same source base.
This is not "compile once, run anywhere with an appropriate VM" as Java or C# do it, but "write once, compile anywhere with an appropriate environment" the way C has done it all the time.
Since the standard library does not provide everything you might need, you have to look for third-party libraries to provide that functionality. Certain frameworks -- like Boost, Qt, GTK+, wxWidgets etc. -- can provide that. Since these frameworks are written in a way that they compile on different platforms, you can achieve cross-platform functionality in the aforementioned sense.
There are various things to be aware of if you want your C++ code to be cross-platform.
The obvious thing is source that makes assumptions about data types. Your long might be 32 bit here and 64 bit there. Data type alignment and struct padding might differ. There are ways to "play it safe" here, like size_t / size_type / uint16_t typedefs etc., and ways to get it wrong, like wchar_t and std::wstring. It takes discipline and some experience to "get it right".
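As a hedged sketch of the "play it safe" approach, assuming a toolchain that provides <cstdint> (C++11; older toolchains can use <stdint.h> or Boost's equivalents):

// The built-in types vary in size between platforms; the fixed-width
// typedefs do not.
#include <cstdint>
#include <iostream>

int main() {
    long probably_varies = 0;       // 32 bits on 64-bit Windows, 64 bits on most 64-bit Unixes
    std::int32_t always_32 = 0;     // exactly 32 bits wherever it exists
    std::uint64_t always_64 = 0;    // exactly 64 bits wherever it exists

    std::cout << sizeof(probably_varies) << ' '
              << sizeof(always_32) << ' '
              << sizeof(always_64) << '\n';
}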
Not all compilers are created equal. You cannot use all the latest C++ language features, or use libraries that rely on those features, if you require your source to compile on other C++ compilers. Check the compatibility chart first.
Another thing is endianness. Just one example: when you write a stream of integers to a file on one platform (say, x86 or x86_64) and then read it back on a different platform (say, POWER), you can run into problems. Why would you write integers to a file? Well, UTF-16 is integers... Again, discipline and some experience go a long way toward making this rather painless.
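One defensive approach (a sketch, not the only option) is to serialize multi-byte integers in a fixed byte order and reassemble them explicitly when reading, so the host's endianness never matters:

// Write and read a 32-bit value in a fixed (little-endian) byte order,
// regardless of the native endianness of the machine running the code.
#include <cstdint>
#include <cstdio>

void write_u32_le(std::uint32_t v, unsigned char out[4]) {
    out[0] = v & 0xFF;
    out[1] = (v >> 8) & 0xFF;
    out[2] = (v >> 16) & 0xFF;
    out[3] = (v >> 24) & 0xFF;
}

std::uint32_t read_u32_le(const unsigned char in[4]) {
    return  static_cast<std::uint32_t>(in[0])
         | (static_cast<std::uint32_t>(in[1]) << 8)
         | (static_cast<std::uint32_t>(in[2]) << 16)
         | (static_cast<std::uint32_t>(in[3]) << 24);
}

int main() {
    unsigned char buf[4];
    write_u32_le(0x12345678u, buf);
    std::printf("%08x\n", static_cast<unsigned>(read_u32_le(buf)));  // 12345678 on x86 and POWER alike
}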
Once you've checked all those boxes, you need to make sure of the availability of the libraries you base your code on. While std:: is safe (but see "not all compilers are created equal" above), something as innocent as boost:: can become a problem if you're looking beyond the mainstream. (I helped the Boost guys to fix one or two showstoppers regarding AIX / Visual Age in past years simply because they didn't have access to that platform for testing new releases...)
Oh, and watch out for the various licensing schemes out there. Some frameworks that improve your cross-platform capabilities -- like Qt or Cygwin -- have their strings attached. That is not to say they are not a big help in the right circumstances, just that you need to be aware of copyleft / proprietary licensing requirements.
All that being said, there is Wine ("Wine is not emulation"), which makes executables compiled for Windows run on a variety of Unix-alike systems (Linux, OS X, *BSD, Solaris). There are certain limits to its capabilities, but it's getting better all the time.
Yes. No. Maybe. What is cross-platform C++ code? Cross-platform C++ code is code that can be compiled under different operating systems without the need to be modified.
That means, if you explicitly use any platform-dependent headers, your code is no longer cross-platform. Qt solves this problem in the following way: it provides wrappers for everything that is platform-specific. For example, imagine that you are using QFile to open/read/write a file. Your code looks like
#include <QFile>

QFile file(filename);
file.open(QFile::ReadOnly);
// other stuff: read and write through the same QFile interface
You can compile this code under any OS as long as you have a suitable compiler and Qt libraries for that OS. The code hidden under QFile will use the OS-appropriate file-handling functions, but that shouldn't concern you.
Also, if you only use the standard library, your code can be compiled anywhere where a C++ compiler is present.
The already-compiled applications, however, are not cross-platform in the way that, say, Java applications are - for instance, you can't compile an app for Windows and then run it on Linux; you will have to recompile your code under Linux instead.
C++ is cross-platform. You can use it to build applications that will run on many different operating systems.
What is not cross-platform are the compilers that translate C++ into object code. No single compiler, to my knowledge, has all the necessary features so that when you use it to compile a C++ program, the result will automatically run on Windows, Linux and Mac OS.
Qt Creator is integrated with multiple compilers and has build automation. It makes it easy to switch between different setups and target platforms. It provides support for building, running and deploying C++ applications not only for desktop environments but also for mobile devices.
C++ is a programming language. Text. As such, it doesn't run anywhere.
Conforming Standard C++ code is expected to behave the same on any platform; "cross-platform", if you want. Writing (strictly) conforming C++ code requires pedantry, because some assumptions that are often made actually depend on details that are left to the particular implementation, a consequence of the wide range of targets C++ itself aims at.
Notice we're still talking about C++ code, not C++ programs. Indeed, once we pass to the term "program", we no longer have guarantees, because we aren't talking about C++ anymore; rather, we're talking about the output of the compiler. This is where portability begins to fade away: executable format, ISA, ABI, low-level routines and so on.
Can you rely on that? If you can't, then you need to integrate your C++ program in the environment it will run on, by recompiling it or using platform-specific elements.
I'm a die-hard .NET developer with limited experience in C++. I'm really familiar with how happily interpreted languages (and scripting languages) work cross-platform, but what about C++?
I realize that GCC/G++ and some other compilers sort of work multi-platform with the right compiler flags, and I understand that the STL is normalized between compilers, but what else am I missing? I'll need to do audio in/out, high accuracy timers, and I'll need to do multithreading. I don't think any of these things are supported in the STL, so I'll be needing a cross-platform library of some type, right? Which one should I use?
I'm aiming to support the latest Mac and Windows platforms. This is a shared framework/sdk and won't have a UI. I plan on writing the UI in a native language such as Objective-C/Cocoa and .NET/WPF (both which have excellent native UI support).
So what should my tool chain look like? Should I be using GCC/G++ or MinGW? What other 'libraries' should I integrate that will function cross-platform? I'd like to set up my build environment to 'just work' such that I can build a Mac-compatible binary and a Windows-compatible binary (32 bit and 64 bit). How can I do that?
At this point, I'll be writing code for each platform so that I can interop with my C++ multiplatform framework/SDK/API thingy. In Windows, I think this will look like a managed DLL, is that right? Any thoughts on how I'll do this on a Mac?
Any suggestions or recommendations?
Thanks for the suggestions,
Brett
1) Your real goal here should be to write portable C++ -- you get portability by keeping your language use clean, not by using a particular compiler.
2) Some of what you want, like audio i/o, is fairly inherently platform dependent. You can, however, cleanly isolate the platform dependencies inside particular modules and conditionally compile a set of platform support files on each platform.
3) Stuff like multithreading can be done in a platform independent manner using standard libraries.
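For example, assuming a compiler with C++11 support (Boost.Thread offers a near-identical interface for older toolchains), the same threading code compiles unchanged on Windows, Linux and Mac:

// Standard-library threading: no OS-specific API in sight.
#include <thread>
#include <mutex>
#include <iostream>

std::mutex io_mutex;

void worker(int id) {
    std::lock_guard<std::mutex> lock(io_mutex);   // serialize access to std::cout
    std::cout << "worker " << id << " running\n";
}

int main() {
    std::thread a(worker, 1);
    std::thread b(worker, 2);
    a.join();
    b.join();
}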
Getting into subjective territory, but for a developer used to working with large, do-it-all frameworks like those we find in .NET, Qt would probably be your best cross-platform C++ analogy (though I am not a huge fan of it). There you have your threads, XML I/O, localization, sockets, high-accuracy timers, GUI building blocks, etc.
[...] and I'll need to do multithreading. I don't think any of these things are supported in the STL [...]
Just a small thing, but the STL strictly refers to the aggregate containers and generic algorithms of the C++ Standard Library; it is not synonymous with the whole standard library. C++11 does add concurrency support to the standard library, although popular compilers are still slow to support it fully. You also have Boost if you need threads for the time being: http://www.boost.org which is an extremely cross-platform library.
As for building an API that can happily plug in to various environments (.NET, Cocoa, etc), your best, most cross-platform option is actually to expose a C API (you're free to implement it using C++). It'll cause the least headaches this way (no issues with name mangling, trying to interop with complex, user-defined C++ types, etc.).
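A minimal sketch of what such a boundary can look like (all names here are invented for the example): C++ on the inside, a flat C API on the outside, with an opaque handle instead of exposed C++ types.

// Hypothetical library boundary: callers in C#, Objective-C, Python, etc.
// see only plain C functions, so name mangling and C++ types never leak out.
#include <string>

namespace impl {
    class Engine {
    public:
        int process(const std::string& input) { return static_cast<int>(input.size()); }
    };
}

extern "C" {
    void* engine_create()              { return new impl::Engine(); }
    void  engine_destroy(void* handle) { delete static_cast<impl::Engine*>(handle); }
    int   engine_process(void* handle, const char* input) {
        return static_cast<impl::Engine*>(handle)->process(input);
    }
}

On Windows you would additionally mark these functions for export from the DLL (e.g. __declspec(dllexport) or a .def file); on the Mac a shared library exports them by default.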
I recommend you get some practice with building DLLs/shared libraries in C++ as a way to extend .NET and Cocoa applications (ex: C++ module used in objective-C) before you start trying to develop a grand library.
For timing the standard library has <chrono>, which has nanosecond resolution on Mac. Unfortunately it's only got millisecond resolution on Windows as of the VS11 Beta. I'm hoping they fix it without too much delay.
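The usage side is already portable, for what it's worth; a minimal sketch:

// Portable elapsed-time measurement with <chrono> (C++11). The code is the
// same everywhere; only the clock's actual resolution differs per platform.
#include <chrono>
#include <iostream>

int main() {
    auto start = std::chrono::high_resolution_clock::now();

    volatile long sum = 0;                         // some work to time
    for (long i = 0; i < 1000000; ++i) sum += i;

    auto stop = std::chrono::high_resolution_clock::now();
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(stop - start);
    std::cout << "took " << us.count() << " microseconds\n";
}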
In C++, if I write a simple game like pong using Linux, can that same code be compiled on Windows and OSX? Where can I tell it won't be able to be compiled?
You have three major portability hurdles.
The first, and simplest, is writing C++ code that all the target compilers understand. Note: this is different from writing to the C++ standard. The problem with "writing to the standard" starts with: which standard? C++98, C++03, TR1, C++11, C++14 or C++17? These are all revisions of C++, and the newer the revision you use, the less likely compilers are to be fully compliant. C++ is very large, and realistically the best you can hope for is C++98 with some C++03 features.
Compilers all add their own extensions, and it's all too easy to unknowingly use them. You would be wise to write to the standard and not to the compiler documentation. Some compilers have a "strict" mode where they will turn off all extensions. You would be wise to do primary development in the compiler which has the most strictures and the best standard compliance. gcc has the -Wstrict family of flags to turn on strict warnings. -ansi will remove extensions which conflict with the standard. -std=c++98 will tell the compiler to work against the C++98 standard and remove GNU C++ extensions.
With that in mind, to remain sane you must restrict yourself to a handful of compilers and only their recent versions. Even writing a relatively simple C library for multiple compilers is difficult. Fortunately, both Linux and OS X use gcc. Windows has Visual C++, but different versions are more like a squabbling family than a single compiler when it comes to compatibility (with the standard or each other), so you'll have to pick a version or two to support. Alternatively, you can use one of the gcc-derived compiler environments such as MinGW. Check a list of C++ compilers for compatibility information, but keep in mind this is only for the latest version.
Next is your graphics and sound library. It has to be not just cross-platform; it has to look good and be fast on all platforms. These days there are a lot of possibilities; Simple DirectMedia Layer is one. You'll have to choose at what level you want to code. Do you want detailed control? Or do you want an engine to take care of things? There's an existing answer for this so I won't go into details. Be sure to choose one that is dedicated to being cross-platform, not one that just happens to work. Compatibility bugs in your graphics library can sink your project fast.
Finally, there's the simple incompatibilities which exist between the operating systems. POSIX compliance has come a long way, and you're lucky that both Linux and OS X are Unix under the hood, but Windows will always be the odd man out. Things which are likely to bite you mostly have to do with the filesystem. Here's a handful:
Filesystem layout
File path syntax (i.e. C:\foo\bar vs /foo/bar; see the sketch after this list)
Mandatory Windows file locking
Differing file permissions systems
Differing models of interprocess communication (ie. fork, shared memory, etc...)
Differing threading models (your graphics library should smooth this out)
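As referenced in the file path item above, a small sketch of papering over one such difference with conditional compilation (a crude but common approach; the helper names are invented for the example):

// Pick the path separator at compile time and build paths through a helper.
#include <string>
#include <iostream>

#ifdef _WIN32
const char kPathSep = '\\';
#else
const char kPathSep = '/';
#endif

std::string join_path(const std::string& dir, const std::string& file) {
    return dir + kPathSep + file;
}

int main() {
    std::cout << join_path("assets", "pong.wav") << '\n';   // assets\pong.wav or assets/pong.wav
}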
There you have it. What a mess, huh? Cross-platform programming is as much a state of mind and statement of purpose as it is a technique. It requires some dedication and extra time. There are some things you can do to make the process less grueling...
Turn on all strictures and warnings and fix them
Turn off all language extensions
Periodically compile and test in Windows, not just at the end
Get a programmer who likes Windows on the project
Restrict yourself to as few compilers as you can
Choose a well maintained, well supported graphics library
Isolate platform specific code (for example, in a subclass)
Treat Windows as a first class citizen
The most important thing is to do this all from the start. Portability is not something you bolt on at the end. Not just your code, but your whole design can become unportable if you're not vigilant.
C++ is ultra portable and has compilers available on more platforms than you can shake a stick at. Languages like Java are typically touted as being massively cross-platform; ironically, they are in fact usually implemented in C++ or C.
That covers "portability". If you actually mean "how cross-platform is C++?", then: not so much. The C++ standard only defines an IO library suitable for console IO - i.e. text based - so as soon as you want to develop some kind of GUI, you are going to need a GUI framework - and GUI frameworks are historically very platform specific. Windows has multiple "native" GUI frameworks now - the C++ framework made available by Microsoft is still MFC, which wraps the native Win32 API, itself a C API. (WPF and WinForms are available to C++/CLI.)
The Apple Mac's GUI framework is called Cocoa; it is an Objective-C library, but it's easy to access Objective-C from C++ in that development environment.
On Linux there are the GTK+ and Qt frameworks, both of which have actually been ported to Windows and Apple platforms, so one of these C++ frameworks can solve your "how to write a GUI application in C++ once that builds and runs on Windows, Apple Mac and Linux" problem.
Of course, it's difficult to regard Qt as strictly C++ anymore - Qt defines a special markup for signals and slots that requires an extra pre-compilation step.
You can read the standard - if a program respects the standard, it should be compilable on all platforms that have a C++ standard-compliant compiler.
As for 3rd party libraries you might be using, the platform availability is usually specified in the documentation.
When the GUI comes into question, there are cross-platform options (such as Qt), but you should probably ask yourself - do I really want portability when it comes to the UI? Sometimes, it's better to have the GUI part platform-specific.
If you are thinking of porting from Linux to Windows, using OpenGL for the graphical part gives you the freedom to run your program on both operating systems, as long as you don't use any system-specific functionality.
Compared to C, C++ portability is extremely limited, if not completely nonexistent. For one, you can't disable exceptions (well, you can), because the standard specifically says doing so is undefined behaviour. Many devices don't even support exceptions. So as far as that goes, C++ is ZERO portable. Plus, given the UB, it's obviously a no-go for zero-fail, high-performance real-time systems in which exceptions are taboo - undefined behaviour has no place in a zero-fail environment. Then there's name mangling, which most, if not every, compiler does completely differently. For good portability and inter-compatibility, extern "C" would have to be used to export symbols, yet this renders any and all namespace information completely void, resulting in duplicate symbols. One can of course choose not to use namespaces and use unique symbol names - yet another C++ feature rendered void. Then there's the complexity of the language, which results in implementation difficulties in the various compilers for various architectures. Due to these difficulties, true portability becomes a problem. One can solve this by having a large chain of compiler directives/#ifdefs/macros. Templates? Not even supported by most compilers.
What portability? You mean the semi-portability between a couple of mainstream build targets like MSVC for Windows and GCC for Linux? Even there, in that mainstream segment, all the above problems and limitations exist. It's naive to even think C++ is portable.
I am in the middle of reading The Linux Programming Interface and Linux Programming by Examples. Both are very good books and explain the Linux API very well. But quite often I find myself thinking that in real-world projects I would prefer the C++ standard library, Boost or some other good C++ library (there are many well-written and portable C++ libs) over the C API whenever possible. This naturally begs the question - why do I need to use the Linux API directly when a good C++ compiler and libs (Boost, TBB, etc.) are available on the target platforms? I guess the same could be said about the Windows API too, but I don't know much about Windows system programming.
This is not new to C++. In C, there have been two ways to open files for a long, long time:
// Only on POSIX
int fdes = open("file.txt", O_RDONLY);
Or:
// Any hosted C environment, POSIX or otherwise
FILE *fp = fopen("file.txt", "rb");
So why would anyone ever use the POSIX-specific version? The answer is simple -- there are a large number of system calls which work with POSIX file descriptors. For example, select. You can also make things other than files, like pipes and sockets, and you can pass them to other processes. There is a long tradition of using POSIX file descriptors, and we have a large number of books and references on how to do network programming with them.
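A hedged, POSIX-only sketch of the kind of thing descriptors enable and FILE* does not: waiting on a descriptor with select() before reading it.

// POSIX only: wait up to 5 seconds for a descriptor to become readable.
#include <sys/select.h>
#include <sys/time.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int fd = open("file.txt", O_RDONLY);
    if (fd < 0) { std::perror("open"); return 1; }

    fd_set readfds;
    FD_ZERO(&readfds);
    FD_SET(fd, &readfds);

    timeval timeout = {5, 0};                            // 5 seconds, 0 microseconds
    int ready = select(fd + 1, &readfds, NULL, NULL, &timeout);
    std::printf("select returned %d\n", ready);

    close(fd);
}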
So the trade-off is between the portable version and the powerful version. It always has been.
The other half of this is that any time you work with files on Linux, you are working with the POSIX interface. Libraries just hide it from you. Boost uses it, the C runtime uses it, the JRE uses it, and GHC uses it. Many (most?) language runtimes are written in C, and direct access to the system calls is preferred.
You should use the higher-level API whenever possible. It's usually faster to work with and makes it easier to port your code to another platform. However:
Due to the law of leaky abstractions, it's useful to know the underlying operating system API so you understand the various quirks and performance issues that the higher-level API was not able to hide.
Some things are not doable with the portable API, usually because it's so different between operating systems that it's not easily possible.
All portable APIs incur some overhead. In a big project it's small compared to the rest of the code, but if you are doing something small, you might want to avoid that overhead, especially if you know you'll need to use the specific API somewhere anyway.
The C++ standard is not published for a particular platform; it is platform independent. So if you are going to use some platform feature or functionality, you will have to use the platform-dependent feature/functionality, usually called the system API. So in that sense, no, the C++ library does not deprecate the Linux/Windows APIs.
I have heard that C++ offers no native support for multithreading. I assume that multithreaded C++ apps depend on managed code for multithreading; that is, for example, a Visual C++ app uses MFC or .NET or something along those lines to provide multithreading capability. I further assume that some or all of those managed-code capabilities are unavailable to unmanaged applications. But I have read about unmanaged multithreaded applications. How is this possible? Which of my assumptions is false?
It is wholly up to the operating system to provide support for multi-threading. On Windows, the necessary functionality is available via the Win32 API. Frameworks such as MFC provide wrappers over the low-level threading functions to simplify things, while of course .NET/CLR has its own managed interface for accessing Win32 multi-threading capabilities.
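For illustration, a minimal sketch of what the raw Win32 calls look like (Windows only; frameworks such as MFC wrap exactly this kind of code):

// Raw Win32 threading: no C++ language feature involved, just OS API calls.
#include <windows.h>
#include <cstdio>

DWORD WINAPI worker(LPVOID param) {
    std::printf("hello from thread, arg = %d\n", *static_cast<int*>(param));
    return 0;
}

int main() {
    int arg = 42;
    HANDLE thread = CreateThread(NULL, 0, worker, &arg, 0, NULL);
    WaitForSingleObject(thread, INFINITE);   // wait for the thread to finish
    CloseHandle(thread);
}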
A good explanation is offered in this article (Multithreading in C++).
Why Doesn't C++ Contain Built-In Support for Multithreading?
C++ does not contain any built-in support for multithreaded applications. Instead, it relies entirely upon the operating system to provide this feature. Given that both Java and C# provide built-in support for multithreading, it is natural to ask why this isn't also the case for C++. The answers are efficiency, control, and the range of applications to which C++ is applied. Let's examine each.
By not building in support for multithreading, C++ does not attempt to define a "one size fits all" solution. Instead, C++ allows you to directly utilize the multithreading features provided by the operating system. This approach means that your programs can be multithreaded in the most efficient means supported by the execution environment. Because many multitasking environments offer rich support for multithreading, being able to access that support is crucial to the creation of high-performance, multithreaded programs.
Multithreading in C++ does not require managed code.
In very much the same way that C++ does not provide native support for displaying graphics or emitting sounds or reading input from a mouse, the operating system that's being used will provide a C++ API for utilizing these features.
It's not a matter of C++ not being able to do it. It simply hasn't been written into the C++ standard yet.
Some of your assumptions are not quite right. The operating system (I'm talking about Win32 since you mention .NET) provides support for threading. There are lots of good threading libs that build on top of the OS functionality in C++ to make multithreading "easier" :) -- pthreads, for example. There is more at MSDN.
The ISO standard for the programming language C++ neither defines nor prohibits multithreading. An implementation is allowed to provide extensions if it wishes. A program is allowed to use implementation extensions if it wishes, and then the program will only run on systems that provide those extensions.
For comparison, the ISO standard for the programming language C++ neither defines nor prohibits the use of a mouse. A program is allowed to use implementation extensions and then it will only run on systems that provide those extensions. For another comparison, the ISO standard for C++ neither defines nor prohibits UTF-8, so your program can depend on Latin-1 and then your program will only run on systems that provide Latin-1.
Native C++ does not offer "built-in" multithreading support simply because it was not intended to provide it, nor, in fact, did it need to.
Your misconception is that this is a fault, while it is in fact a strength of the language. By being "oblivious" to multithreading, C++ seamlessly integrates with the MT support offered by the OS your code will compile and run on, thereby offering much more flexibility and efficiency than if it came with its own "MT baggage", so to speak.
You mention MFC and .NET as examples - be aware that these libraries/wrappers are merely a layer over the basic Win32 APIs. Using C++ as intended will provide you with efficient code that will run multithreaded on ANY OS, as long as you separate the logic from the OS-specific MT API calls (i.e. thread creation etc.), so that porting between OSes is greatly facilitated.
Unlike Java, which defined language constructs and JVM specs, the C++ standard is oblivious to threading (so is C). As far as these languages are concerned, anything thread-related consists of function calls to OS functionality. Libraries compiled for multithreading simply make sense of the same calls, but from a language perspective they are plain old code.
I think you misunderstand the definition of 'managed' code. 'Managed' code is a Microsoft-specific term meaning code that uses the .NET framework and thus is subject to the various aspects of .NET. 'Unmanaged' code means code that runs outside that and does not operate through the .NET layer. MFC code is 'unmanaged'; it's merely a spectacularly bad wrapper for the ubiquitous Win32 API (which isn't even the lowest-level API available on Windows).
The .NET libraries (including multithreading) are almost all, at some level, interfaces for the more basic system APIs used by traditional, 'unmanaged' applications. There is, generally speaking, no functionality available to 'managed' code that cannot be replicated in 'unmanaged' code with sufficient effort, though the reverse is not true (this is called the abstraction penalty, if you wish to know more). While it may be easier to do in 'managed' code, that's just because somewhere, some 'unmanaged' code is doing it for you, more or less. In the case of a threading API, it is in turn an interface to the operating system kernel, which itself accesses the processor's capabilities to allow a process to run in multiple places concurrently (if using multiple cores; if not, then it's just a pseudo-concurrency).
The C++ standard currently provides no definition of threads (the upcoming C++1x standard does). There are a number of different threading libraries available, including those provided by Win32 and MFC, the pthreads library found on POSIX systems, and Boost.Thread, which will use the platform's local threading library.
The next C++ standard (named C++0x) will have support for multithreading. It will include atomic operations.
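As a forward-looking sketch (using the C++11 syntax that eventually shipped), standard threads plus an atomic counter look like this:

// Two standard threads incrementing an atomic counter: no data race,
// and no OS-specific API anywhere in the code.
#include <atomic>
#include <thread>
#include <iostream>

std::atomic<int> counter(0);

void bump() {
    for (int i = 0; i < 100000; ++i)
        ++counter;                     // atomic increment, safe across threads
}

int main() {
    std::thread a(bump), b(bump);
    a.join();
    b.join();
    std::cout << counter << '\n';      // always 200000
}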