Is new C++ backward compatible?

I haven't coded in C++ for years. I recently discovered that during those years it has changed quite dramatically. I'm not sure I like the changes, but that's another discussion.
I still have some C++ code knocking around my hard drive. If I got it out and tried to compile it with a nice new C++ compiler, say the latest version of g++, would it compile? Without warnings (assuming it compiled without warnings before)?
I did get to mess around with a little VC++ 2010 recently and found some things I expected to work just not working, and got different messages depending on context when I tried to use NULL. But in one part of that code I used NULL without even a warning.

It depends. Usually newer compilers are more standard-compliant, so many constructs that compiled on earlier compilers don't compile now without fixing. For example:
for( int i = 0; ... );
i++;
compiled in Visual C++ 7, but not in Visual C++ 9.
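Under the standard rules the loop variable only lives inside the loop, so code like that needs the variable hoisted out. A minimal sketch of the conforming form (count_up and n are just placeholders to give the loop a home):
int count_up(int n)          // hypothetical function, just for illustration
{
    int i = 0;               // declared before the for statement...
    for (; i < n; ++i) {
        // ...
    }
    i++;                     // ...so it is still in scope here under standard rules
    return i;
}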

In general, yes it is backwards compatible. However, the devil is in the details. You likely will find things where conventions change, or specific libraries fall into or out of use.

NULL is a macro - prefer to use 0 (or nullptr in C++0x).
Not sure just how old your code is, but Visual C++ 6 suffers from limitations that result in code that simply won't compile on newer compilers. VS2005 and up are a lot better (in the sense of being more correct with respect to the contemporaneous C++ standard).
I would not expect it to be a huge amount of work to get old code compiled, though. I've done some pretty major ports from VC6 -> VS2005 and for the most part it's a matter of hours, not days. Perhaps once the culture shock has worn off this will not seem so daunting. Really, VC++ 10 is very nice.

This depends on what you're comparing to.
Visual Studio 2010 partially implements the upcoming C++0x draft (recent versions of GCC also implement a subset of this draft, which is expected to be standardized next year)
C++98/C++03 was the first standardized version of C++, and is still the official one (as C++0x is still only a draft)
and of course, there are the dialects from before the language was standardized
C++0x is pretty much backwards compatible with C++03/98. There may be a couple of obscure corner cases that have changed, but you are unlikely to encounter them.
However, a lot of changes occurred when the language was first standardized, meaning that C++98 isn't entirely (but nearly) compatible with pre-standard C++.
But more likely, what you're into isn't a question of C++ backwards compatibility, but simply that compilers have gotten stricter. They have become much better at following the standard, and no longer allow many non-standard tricks that were common earlier. Most likely, your old code was never valid C++, but worked because compilers used to deviate more from the standard.

The language itself hasn't changed since it was standardized in 1998. Idioms have changed, and compilers have both improved their support for the standard and become stricter about non-standard code, but any standard-conforming C++ code that compiled in the past should still compile today. Code that relies on non-standard compiler features or quirks might not work, but compilers typically offer command-line options to make them accept non-standard code that used to be accepted by previous versions of the same compiler.
NULL is a macro that's defined in <cstddef>. Other standard headers may include that header as an implementation detail, so it's often possible to use NULL without explicitly including <cstddef>, but relying on that has always been non-portable. It's OK to use NULL as long as you include its header, though just using 0 is preferred in idiomatic C++.
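For illustration, a minimal sketch (f is just a made-up function taking a pointer):
#include <cstddef>         // guarantees that NULL is defined

void f(int*) {}            // made-up function taking a pointer

int main()
{
    int* a = NULL;         // fine once <cstddef> (or a header that pulls it in) is included
    int* b = 0;            // the idiomatic pre-C++0x spelling
    // int* c = nullptr;   // C++0x/C++11 only: a real null-pointer constant with its own type
    f(a);
    f(b);
    return 0;
}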

All versions of C++ should be backwards compatible.
Although there might be some unusual cases where there could be a problem, e.g. noexcept on destructors in C++0x (although this has not yet been decided).

Newer C++ standards clarify things and then take decisions on which usage is "more correct" or accepted, so I would expect warnings. Some other decisions change what can or cannot be done, so I would also expect errors. I've come across the same situation, and I had to tweak the code for it to compile. It was not a great deal, but that was taking into account my knowledge of C++. In general, you should not encounter great problems.
There are also other things. For example, not all compilers implemented all the rules of the C++ standard, or they had bugs. When you get used to one compiler, some of those errors or missing features may pass unnoticed, but they may cause errors in future versions of the same compiler.

That's the major reason why we have standards. You don't have to worry about compatibility, you will just tell the compiler to compile the code as C++98 (or 2003).
MSVC is unfortunately very bad at supporting C++ standards. It is slowly getting better (and that's why old code doesn't compile, since it shouldn't compile in the first place).

Well, I know a lot of our old VS6 code started throwing tons of warnings when we upgraded to VS 2005 (with full warnings on, of course, as everyone should). Most were good things though, warning of potential information loss, warning that some types may be 64-bit instead of 32-bit like old code might expect, etc.
For your specific example of NULL, that wasn't even really standard C back in the day. Every compiler just had a #define for it that set it to 0. These days it is generally considered more correct (and clear) to just use 0.

If you have used old versions of various libraries such as Boost, expect some problems.

Slightly random, but the keyword export is being removed from the standard. So previously standards-compliant code that used export would now be illegal. Of course, very few compilers even began implementing that keyword.

Very similar to sharptooth's answer, there are bits of older C and C++ code that will need /Zc:forScope- set (i.e. don't force conformance in for loop scope). e.g.
int myfunc(int x, int y, int z)
{
    int j;
    for (int i = 0; i <= 10; i++)
    {
        if (f(i) == 0)
            break;
    }
    j = i;                        // relies on i staying in scope after the loop
    for (i = 0; i <= 10; i++)
    {
        if (g(i) == 0)
            break;
    }
    if (i > j)
        return j;
    return i;
}
This type of thing is quite common in much older code, where bytes cost money and variable re-use was commonplace.
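For reference, the conforming rewrite (keeping the hypothetical f() and g() from the snippet above, and the unused parameters as they were) just hoists the loop variable out of both for statements:
int myfunc(int x, int y, int z)
{
    int j;
    int i;                        // declared once, outside both loops
    for (i = 0; i <= 10; i++)
    {
        if (f(i) == 0)
            break;
    }
    j = i;
    for (i = 0; i <= 10; i++)
    {
        if (g(i) == 0)
            break;
    }
    if (i > j)
        return j;
    return i;
}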

Related

Using different C++ standard for different parts of the same file

Ok, I know this sounds like a broken question, but let me explain before the downvotes start: we are using C++11 as a baseline for our development (targeting several platforms). Recently a few changes needed to be done in our codebase and suddenly the need for some C++17 features crawled in.
All is fine till now, on our developer platform all compiles nicely, but on one of the continuous integration platforms (Ubuntu 18.04) we have got some errors, like:
/usr/include/ev++.h:355:46: error: ISO C++1z does not allow dynamic exception specifications dynamic_loop (unsigned int flags = AUTO) throw (bad_loop)
(c++1z dynamic exception specification error gives a good explanation why this happens and also offers some hacks on how to make it disappear)
But I started thinking (theoretically only, of course): would it be possible to specify that some portions of a source file be compiled as C++11 and other parts as C++17?
(And just to clarify: No, we don't want to upgrade, and yes, we have solved the problem, again, I am just interested in a theoretical approach)
In theory (and only in theory) it would be possible. Compiling different parts of a file with different standards is effectively splitting the file and using a different backend for each of those parts. That should not be that hard. But there is one big caveat - ABI compatibility. When you realize that linking object files compiled with different standards is often not possible, you see this whole idea is a landmine. The compiler would have to either keep the ABI stable (which would be nice, but with the amount of changes made with every standard, keeping the ABI compatible between them would be a nightmare) or somehow translate between the ABIs of different standards (which would also be nightmarish, I suppose). Even now the most popular way to keep ABI problems 'in check' is using extern "C" to avoid all the fuss.
So the answer is - it would be possible in theory but the practical implications of such choice would be so dramatic that it is not worth it.
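What does work in practice is splitting at translation-unit boundaries rather than inside one file. A rough sketch, assuming g++ and a deliberately C-compatible boundary (the file names and the legacy_compute function are made up):
// legacy_part.cpp -- compile with: g++ -std=c++11 -c legacy_part.cpp
extern "C" int legacy_compute(int x)      // C linkage keeps the boundary ABI-simple
{
    return x * 2;                         // imagine C++11-only code in here
}

// modern_part.cpp -- compile with: g++ -std=c++17 -c modern_part.cpp
#include <iostream>

extern "C" int legacy_compute(int x);     // same C-linkage declaration

int main()
{
    if (auto r = legacy_compute(21); r > 0)   // C++17 if-with-initializer
        std::cout << r << '\n';
}

// link: g++ legacy_part.o modern_part.o -o app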

C++11 equivalent to std::quoted introduced in C++14

As used in this answer, I'm looking for a C++11 compatible code for the same but the usage of std::quoted prevents me from achieving that. Can anyone suggest an alternative solution?
I give my answer assuming that you expect to find a generic approach to handle such situations. The main question that defines the guideline for me is:
"How long am I supposed to maintain this code for an older compiler version?"
If I'm certain that it will be migrated to the newer toolset along with the rest of the code base (even if only in a few years' time, it will inevitably happen), then I just copy-paste the implementation from the standard headers of the next target version of my compiler and put it into namespace std in a separate header within my code base. Even though it's a very rude hack, it ensures that I have exactly the same code version as the one I'll get after migration. As soon as I start using a newer (in this case C++14-compatible) compiler, I will just remove my own "quoted.h", and that's it.
Important caveat: Barry suggested copy-pasting gcc's implementation, and I agree as long as gcc is your main target compiler. If that's not the case, then I'd take the one from your compiler. I'm making this statement explicitly because I had trouble when I tried to copy gcc's std::nested_exception into my code base and, having switched from Visual Studio 2013 to 2017, noticed several differences. Also, in the case of gcc, pay attention to its license.
If I'm in a situation where I'll have to maintain compatibility with this older compiler for quite a while (for instance, if my product targets multiple compiler versions), then it's preferable first of all to look at whether similar functionality is available in Boost. And there is, in most cases. So check the Boost website. Even though it states
"Quoted" I/O Manipulators for Strings are not yet accepted into Boost
as public components. Thus the header file is currently located in
you are able to use it from "boost/detail". And, I strongly believe that it's still better than writing your own version (despite the advice from Synxis), even though the latter can be quite simple.
If you're obliged to maintain the old toolset and you cannot use Boost, well...then it's maybe indeed worth thinking of putting your own implementation in.
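If you do end up writing it yourself, the core of a C++11 stand-in is only a few lines. A sketch (quoted_copy is a made-up name; unlike std::quoted it returns an escaped copy rather than acting as a stream manipulator):
#include <iostream>
#include <sstream>
#include <string>

std::string quoted_copy(const std::string& s, char delim = '"', char escape = '\\')
{
    std::ostringstream out;
    out << delim;
    for (std::string::const_iterator it = s.begin(); it != s.end(); ++it)
    {
        if (*it == delim || *it == escape)
            out << escape;             // escape embedded delimiters and escape characters
        out << *it;
    }
    out << delim;
    return out.str();
}

int main()
{
    std::cout << quoted_copy("she said \"hi\"") << '\n';   // prints: "she said \"hi\""
}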

Major differences between Visual Studio 6.0 and VS 2010 Compilers

Some months ago I posted the following question
Problem with templates in VS 6.0
The ensuing discussion and your comments helped me to realize that getting my hands on a new compiler was mandatory - or basically they were the final spark which set me into motion. After one month of company-internal "lobbying" I am finally getting VS 2012 !! (thank you guys)
Several old tools which I have to use were developed with VS 6.0
My concerns are that some of these tools might not work with the new compiler. This is why I was wondering whether somebody here could point out the major differences between VS 6 and VS 2012 - or at least the ones between VS 6 and VS 2010 - the changes from 2010 to 2012 are well documented online.
Obviously the differences between VS 6.0 and VS 2012 must be enormous ... I am mostly concerned with basic things like casts etc. There is hardly any information about VS 6.0 on the web - and I am somewhat at a loss :(
I think I will have to create new projects with the same classes. In the second step I would overwrite the .h and .cpp files with the ones of the old tools. Thus at least I will be able to open the files via the new compiler. Still some casts or Class definitions might not be supported and I would like to have a general idea of what to look for while debugging :)
The language has evolved significantly since VS 6.0 came out. VS 6.0 is pre-C++98; VS 2012 is C++03, with a few features from C++11.
Most of the newer language features are upwards compatible; older code should still work. Still, VC 6.0 is pre-standard, and the committee was less concerned about breaking existing code when there was no previous standard (and implementations did vary). There are several aspects of the language (at least) which might cause problems.
The first is that VC 6.0 still used the old scoping for variables defined in a for. Thus, in VC 6.0, things like the following were legal:
int findIndex( int* array, int size, int target )
{
    for ( int i = 0; i < size && array[i] != target; ++i ) {
    }
    return i;
}
This will not compile in VC 2012 (unless there is also a global variable i, in which case it will return that, and not the local one).
IIRC, too, VC 6.0 wasn't very strict in enforcing access controls and const. This may not be a problem when migrating, however, because VC 2012 still fails to conform to C++98 in some of the more flagrant cases, at least with the default options. (You can still bind a temporary to a non-const reference, for example.)
Another major language change which isn't backwards compatible is name lookup in templates. Here too, however, even in VC 2012, Microsoft has implemented pre-standard name lookup (and I mean pre-C++98). This is a serious problem if you want to port your code to other compilers, but it does make migrating from VC 6.0 to VC 2012 a lot easier.
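A typical case (a sketch, not taken from any particular code base): a template that calls a member of a dependent base class without qualification. Pre-standard lookup, as VC implements it, finds the name at instantiation time; a conforming two-phase-lookup compiler such as g++ rejects it unless the call is written as this->helper().
template <typename T>
struct Base
{
    void helper() {}
};

template <typename T>
struct Derived : Base<T>
{
    void run()
    {
        // helper();        // accepted by VC's pre-standard lookup, rejected by g++
        this->helper();     // portable: makes the name dependent
    }
};

int main()
{
    Derived<int> d;
    d.run();
}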
With regards to the library, I can't remember whether 6.0 supported the C++98 library, or whether it was still pre-standard (or possibly it supported both). If your code has things like #include <iostream.h> in it, be prepared for some differences here: minor for straightforward use of << and >>; major if you implement some complicated streambuf. And of course, all of the library was moved from the global namespace to std::.
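The mechanical part of that change looks roughly like this (a sketch of straightforward usage):
// pre-standard style, as VC 6.0-era code often looked:
//     #include <iostream.h>
//     cout << "hello" << endl;

// standard style, expected by VC 2012 and every current compiler:
#include <iostream>

int main()
{
    std::cout << "hello" << std::endl;   // the library now lives in namespace std
}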
For the rest: your code obviously won't use any of the features introduced after VC 6.0 appeared. This won't cause migration problems (since the older features are still supported), but you'll doubtlessly want to go back and gradually upgrade the code once you've migrated. (You mentioned casts. This is a good example: C style casts are still legal, with the same semantics they've always had, but in new code you'll want to avoid them, at least when pointers or references are involved.)
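For example (a sketch with made-up types and a made-up function), the C style casts below still compile, but the named casts make the intent, and the dangerous cases, visible:
struct Base { virtual ~Base() {} };
struct Derived : Base {};

void convert(Base* b, const char* text)
{
    Derived* d1 = (Derived*)b;               // legal, same semantics as ever
    Derived* d2 = static_cast<Derived*>(b);  // preferred in new code: intent is explicit
    char* w = const_cast<char*>(text);       // casting away const now stands out
    (void)d1; (void)d2; (void)w;
}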

Is it time to say goodbye to VC6 compiler?

Of late I'm facing issues that point the finger at the VC6 compiler.
A few of them are:
A function-try-block doesn't work. Related Q
In-class constants don't work.
__FUNCTION__ (macro to get the function name) doesn't work.
The latest addition is it doesn't allow void functions to be passed as part of for_each.
The below example is not compiling with VC6 compiler. It says "error C2562: '()' : 'void' function returning a value". It looks like VC6 doesn't like void functions to be passed to for_each.
#include <algorithm>
#include <functional>
#include <iostream>
#include <vector>
using namespace std;

class Temp
{
public:
    Temp(int i) : m_ii(i)
    {}
    int getI() const
    {
        return m_ii;
    }
    void printWithVoid()
    {
        cout << "i = " << m_ii << endl;
    }
    bool printWithBool()
    {
        cout << "i = " << m_ii << endl;
        return true;
    }
private:
    int m_ii;
};

int main(void)
{
    std::vector<Temp> arrTempObjects;
    arrTempObjects.push_back(Temp(0));
    arrTempObjects.push_back(Temp(2));
    // Does not work under VC6: compiler error C2562
    std::for_each(arrTempObjects.begin(), arrTempObjects.end(), std::mem_fun_ref(&Temp::printWithVoid));
    // Works
    std::for_each(arrTempObjects.begin(), arrTempObjects.end(), std::mem_fun_ref(&Temp::printWithBool));
    return 0;
}
Have you faced any other issues related to VC6.0. Any workaround to resolve these issues ? Or is it time to change the compiler?
Quite frankly I can hardly understand why you wouldn't buy a modern computer and switch to Visual Studio 2008.
VC6 has got a deficient STL, poor C++ standard compliance and obsolete GUI.
You shouldn't let your competitors use better tools than you.
Well, here's the thing. The VC6 compiler sucks. However... the IDE is pretty good.
VS2005 has much better source control support. Otherwise, it's much slower debugging, has a crappy output pane that exponentially decays on inserting output lines (what absolute garbage coding is that?), the help system is many times slower, and edit and continue (possibly Microsoft's best feature over other IDEs) is considerably more broken.
.NET? Sure, VS20xx is the only way to go. However, for one small client that is sticking with VC6/MFC (for interfaces to embedded systems, etc) I actually enjoy working with VC6. It's just FAST.
2008? I'd like to... but it takes a while for my clients to migrate. Nobody has, yet.
Is it time to say goodbye to VC6 compiler?
Yes.
VC6 cannot do much of any kind of modern C++. I recall trying to use one of the Boost libraries ages ago (probably Graph) and it was giving "INTERNAL COMPILER ERROR" all over the place, so eventually I chucked that in.
The no-brainer answer is yes, and ASAP. You have free alternatives like VC++ Express and Code::Blocks, if cost is an issue. The pain of solving compatibility issues is IMO no reason not to upgrade, because you will have to do it some day anyway and it only gets harder.
The only reason I see for a possible obstacle is if you have MFC code that will be difficult/time-consuming to port. In that case you can't use VC++ Express (no support for MFC) and you have to make the investment in at least the VS Standard edition. That will cost you about EUR 300 (depending on where you live).
I changed from VC++ 6.0 to Code::Blocks (which is FOSS) with g++ a few months ago and haven't really looked back. I miss the VC++ debugger a bit, as the gdb implementation in CB is nowhere near as slick, but that's about all. Some things in the IDE work better (code completion, tooltips, dependency calculation) and the compiler is obviously much better.
Regarding your points, function try blocks are hardly a widely used feature, and most people think they are pretty useless. And the __FUNCTION__ macro is not part of the C++ Standard, so you shouldn't depend on it too much if portability is an issue.
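For anyone who hasn't met one, a function-try-block wraps an entire function body, including a constructor's initializer list, in a try. A minimal sketch (the Widget class is made up); this is the construct VC6 rejects:
#include <iostream>
#include <stdexcept>

struct Widget
{
    static int check(int n)
    {
        if (n < 0)
            throw std::invalid_argument("negative");
        return n;
    }

    Widget(int n)
    try : value(check(n))        // the initializer list is covered by the try
    {
        // constructor body
    }
    catch (const std::exception& e)
    {
        std::cerr << "construction failed: " << e.what() << '\n';
        // the exception is automatically rethrown when this handler ends
    }

    int value;
};

int main()
{
    try { Widget w(-1); }
    catch (...) { std::cout << "caught the rethrown exception\n"; }
}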
No, it was time to say goodbye to it a decade ago. Here are a few reasons why:
There are free, standards-compliant compilers available, both from Microsoft and others
VC6 was written before the C++ language was standardized, and it is nowhere near standards compliant. Especially templates and the standard library live in a world of their own, with no tie to how these features actually work in ISO C++. The language it compiles is not C++. It is a hybrid of pre-standard C++, Microsoft extensions, compiler limitations and bugs. None of which is desirable.
VC6 is known to generate invalid code in some cases. Not only does it compile a home-made, buggy and nonstandard language, it also makes invalid optimizations causing crashes, or in some cases actually produces bad assembly that simply cannot be executed.
It is broken, and it was always broken. It was designed to compile a language that ceased existing at about the same time as the compiler was released (when the language was standardized), and it failed even at that task, with countless bugs, some of which have been fixed in the half-dozen service packs that were released. But not all of them, and not even all the critical ones.
Of course, the downside to this is that your application is most likely just as broken. (not because you're bad programmers, but because it targets a broken compiler. It has to be broken to be accepted by VC6)
Porting that to a standards-compliant compiler is likely to be a lot of work. Don't assume that you can just import your old projects, click "build", and it'll work.
So if you're part of a big business that can't just take a month off to switch compilers, you might have to port it as a side project, while part of the team is maintaining the VC6 version. Don't scrap VC6 until you've successfully ported everything, and it works.
Unless you have a large program to maintain, yes. Switch today!
The Express versions of VC++ are a free download from Microsoft.
I guess this is why so many applications on Windows suck: because people still use VC6. Why mess with broken, never-maintained MFC or even Win32 when there are wxWidgets and Qt4 out there, way better than MFC could ever be, and you can even use the free editions of Visual Studio 2005+.
You can learn to live with VC6s foibles. It almost has a certain retro charm these days. We've been repeatedly providing "just one last VC6 release" of some libraries to a customer for years now. Hard to argue with a customer prepared to pay for the extra work backporting and maintaining a branch. But at some point the cost for us to backport newer features developed in newer VCs will exceed the cost of them upgrading at their end (especially as more boost and Intel TBB creeps into the codebase's head). Or at least I hope that's what'll happen! Worst case it'll happen just as flaky C++0x support appears and we'll be stuck supporting that for them for 10 years...
General rule seems to be that a new version is an upgrade and is thus worthwhile.
However! You have to pick the right time for it; there are so many bugs fixed, but you then need to be aware of the new bugs and variations from the standard.
Set aside time for the upgrade.
Upgrading compiler versions could well be a project in its own right, make sure you have stable code and good tests before you do an upgrade and when you finish prove that it is still working the same.
You may be forced to upgrade when you start to develop for Vista, as VC6 doesn't provide for code signing easily and the redist is not in a form that Vista likes (you'll want at least VC2005).
Are you updating the OS any time soon? When I researched moving our apps to Vista, I found that Vista doesn't officially support anything before VS 2005 (except for VB 6), and has a page-long list of possible little problems with VS 2005 that may or may not bite you. I recommended delaying until VS 2008 SP1 was available (i.e., when VS 2008 was really usable), and doing the compiler changeover first.
If the project is a special one for a few customers who run it solely on old NT machines, you may want to keep it at VS 6. If you are selling it for any sort of general consumption, you will need to make it Vista-compatible at some point (or 7-compatible, or whatever), and you will need to upgrade.

What are the practical differences between C compilers on Windows?

A program written in Visual C/C++ 2005/2008 might not compile with another compiler such as GNU C/C++ or vice versa. For example, when trying to reuse code that uses windows.h and was written for one compiler with another compiler, what are the differences to be aware of?
Is there any information about how to produce code which is compatible with either one compiler or another, e.g. with either GCC/C++ or MSVC/C++? What problems will attempting to do this cause?
What about other compilers, such as LCC and Digital Mars?
The first thing to do when trying to compile code written for MSVC to other compilers is to compile it with Microsoft-extensions switched off. (Use the /Za flag, I think). That will smoke out lots of things which GCC and other compilers will complain about.
The next step is to make sure that Windows-specific APIs (MFC, Win32, etc.) are isolated in Windows-specific files, effectively partitioning your code into "generic" and "windows-specific" modules.
Remember the argument that if you want your web page to work on different browsers, then you should write standards-compliant HTML?
Same goes for compilers.
At the language level, if your code compiles without warnings on GCC with -std=c89 (or -std=c++98 for C++), -pedantic -Wall, plus -Wextra if you're feeling brave, and as long as you haven't used any of the more blatant GNU extensions permitted by -pedantic (which are hard to do accidentally) then it has a good chance of working on most C89 compilers. C++ is a bit less certain, as you're potentially relying on how complete the target compiler's support is for the standard.
Writing correct C89 is somewhat restrictive (no // comments, declarations must precede statements in a block, no inline keyword, no stdint.h and hence no 64bit types, etc), but it's not too bad once you get used to it. If all you care about is GCC and MSVC, you can switch on some language features you know MSVC has. Otherwise you can write little "language abstraction" headers of your own. For instance, one which defines "inline" as "inline" on GCC and MSVC/C++, but "__inline" for MSVC/C. Or an MSVC stdint.h is easy enough to find or write.
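Such a header can stay very small. A sketch (the MY_ prefix and type names are made up; _MSC_VER 1600 corresponds to Visual C++ 2010, as far as I recall the first version shipping stdint.h):
/* portability.h -- tiny compiler abstraction layer */
#ifndef PORTABILITY_H
#define PORTABILITY_H

/* inline: spelled __inline by MSVC's C compiler, plain inline elsewhere */
#if defined(_MSC_VER) && !defined(__cplusplus)
  #define MY_INLINE __inline
#else
  #define MY_INLINE inline
#endif

/* 64-bit integers: older MSVC has no <stdint.h> */
#if defined(_MSC_VER) && _MSC_VER < 1600
  typedef __int64          my_int64;
  typedef unsigned __int64 my_uint64;
#else
  #include <stdint.h>
  typedef int64_t  my_int64;
  typedef uint64_t my_uint64;
#endif

#endif /* PORTABILITY_H */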
I've written portable code successfully in the past - I was working mostly on one platform for a particular product, using GCC. I also wrote code that was for use on all platforms, including Windows XP and Mobile. I never compiled it for those platforms prior to running a "test build" on the build server, and I very rarely had any problems. I think I might have written bad code that triggered the 64bit compatibility warning once or twice.
The Windows programmers going the other way caused the occasional problem, mostly because their compiler was less pedantic than ours, so we saw warnings they didn't, rather than things that GCC didn't support at all. But fixing the warnings meant that when the code was later used on systems with more primitive compilers, it still worked.
At the library level, it's much more difficult. If you #include and use Windows APIs via windows.h, then obviously it's not going to work on linux, and the same if you use KDE with GCC and then try to compile with MSVC.
Strictly speaking that's a platform issue, not a compiler issue, but it amounts to the same thing. If you want to write portable code, you need an OS abstraction API, like POSIX (or a subset thereof) that all your targets support, and you need to be thinking "portable" when you write it in the first place. Taking code which makes heavy use of windows-specific APIs, and trying to get it to work on GCC/linux, is basically a complete rewrite AFAIK. You might be better off with WINE than trying to re-compile it.
You're mixing up "compilers" and "OSs". <windows.h> is not something that the MSVC C compiler brings to the table: it's the C-specific embodiment of the Windows API. You can get it independently from Visual Studio. Any other C compiler on Windows is likely to provide it. On the Linux side, for example, you have <unistd.h>, <pthread.h> and others. They are not an essential part of GCC, and any other compiler that compiles for Linux would provide them.
So you need to answer two different questions: how can I code C in such a way that any compiler accepts it? And how do I hide my dependencies on OS?
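A tiny illustration of hiding an OS dependency behind a wrapper of your own (sleep_ms is a made-up name):
// sleep_ms.h -- one portable entry point, two platform-specific bodies
#ifdef _WIN32
  #include <windows.h>
  inline void sleep_ms(unsigned ms) { Sleep(ms); }            // Win32 API, milliseconds
#else
  #include <unistd.h>
  inline void sleep_ms(unsigned ms) { usleep(ms * 1000); }    // POSIX, microseconds
#endif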
As you can tell from the diverse answers, this topic is fairly involved. Bearing that in mind here are some of the issues I faced when recently porting some code to target three platforms (msvc 8/Windows, gcc 4.2/Linux, gcc 3.4/embedded ARM9 processor). It was originally only compiling under Visual Studio 2005.
a) Much code that's written on the Windows platforms uses types defined in windows.h. I've had to create a "windows_types.h" file with the following in it:
#ifndef _WIN32
typedef short INT16;
typedef unsigned short UINT16;
typedef int INT32;
typedef unsigned int UINT32;
typedef unsigned char UCHAR;
typedef unsigned long long UINT64;
typedef long long INT64;
typedef unsigned char BYTE;
typedef unsigned short WORD;
typedef unsigned long DWORD;
typedef void * HANDLE;
typedef long LONG;
#endif
Ugly, but much easier than modifying the code that, previously, was only targeting Windows.
b) The typename keyword was not required in templated code to declare types. MSVC is lax in this regard (though I assume certain compiler switches would have generated a warning). Had to add it in a number of places (see the sketch after this list).
c) Simple, but time-consuming: Windows is not case sensitive and many #included files were specified with incorrect case causing problems under Linux.
d) There was a fair chunk of code using the Windows API for many things. An example was CRITICAL_SECTION and InterlockedIncrement. We used the boost libraries as much as possible to replace these, but reworking code is time-consuming.
e) A lot of the code relied on headers being included in precompiled headers. We had issues with using pch on gcc3.4 so we had to ensure that all .h/cpp files correctly included all their dependencies (as they should have in the first place).
f) VS 2005 has two nasty bugs. auto_ptr's can be assigned to anything and temporary variables are allowed to be passed to reference parameters. Both fail to compile (thankfully!) under gcc but rework is required.
g) Bizarrely, we had template code that was trying to explicitly specialise class template functions. Not allowed. Again gcc refused, VS 2005 let it go. Easy enough to rework into regular overloads once the problem is understood.
h) Under VS 2005 std::exception is allowed to be constructed with a string. Not allowed under gcc or the standard. Rework your code to prefer to use one of the derived exception classes.
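The typename case from point (b), as a sketch (inspect is a made-up function):
#include <vector>

template <typename Container>
void inspect(const Container& c)
{
    // MSVC 8 accepted this without 'typename'; gcc requires it because
    // Container::const_iterator is a dependent name.
    typename Container::const_iterator it = c.begin();
    (void)it;
}

int main()
{
    std::vector<int> v(3, 1);
    inspect(v);
}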
Hopefully that's the kind of information you were looking for!
Well, this is a quite difficult question. The fact is that MSVC does not support the newest C standard; about its C++ compliance I can't tell you anything. However, "Windows" C is understood by both MSVC and gcc; you just cannot hope for the other way around. E.g. if you use ANSI C99 features then you might have a hard time "porting" from gcc to MSVC.
But as long as you go the MSVC -> gcc way, your chances will be better. The only point you have to be aware of is the libraries. Most of the libraries on Windows are supposed to work with MSVC, so you need some extra tools to make them accessible to gcc as well.
LCC is a rather old system which, AFAICT, does not support much of ANSI C99; it also needs tools from MSVC to work properly. LCC is "just" a compiler.
lcc-win32 is a C development system striving to be ANSI C99 compliant. It comes along with a linker, IDE etc.
I can not tell about the state of the Digital Mars implementation.
Then there is also Pelles C, which is a fully fledged IDE as well.
And we have OpenWatcom hanging around, which was once quite a decent system, but I can't tell how conformant it is.
All in all, the "best" you can hope for is the easier way from MSVC -> other systems; it will probably be much worse the other way round.
Regards
Friedrich
vs2008 is a lot more standards compliant than 2005.
I have had more problems going the other way, especially the 'feature' of gcc that lets you allocate an array with a variable size at run time "int array[variable]" which is just pure evil.
A program written in Visual C/C++ 2005/2008 might not compile with another compiler such as GNU C/C++ or vice-versa.
This is true if you either (1) use some sort of extension available in one compiler but not another (the C++ standard, for instance, requires the typename and template keywords in a few places, but many compilers -- including Visual C++ -- don't enforce this; gcc used to not enforce this either, but changed in 3.4) or (2) use some standard-compliant behavior implemented on one compiler but not another (right now the poster boy for this is exported templates, but only one or two compilers support it, and Visual C++ and gcc are not in that group).
For example when trying to reuse code, which uses windows.h, written for a particular compiler with another,
I've never seen a problem doing this. I have seen a problem using Microsoft's windows.h in gcc. But when I use gcc's windows.h in gcc and Microsoft's windows.h in Visual C++ I have access to all of the documented functions. That's the definition of "implements windows.h", after all.
what are the differences to be aware of?
The main one I've seen is people not knowing about the dependent template/typename thing mentioned above. I find it funny that a number of people think gcc is not smart enough to do what Visual C++ does, when in reality gcc had the feature first and then decided to remove it in the name of standards compliance.
In the near future you will run into problems using C++0x features. But both gcc and Visual C++ have implemented the easier things in that standard.