C++0x implementation guesstimates?

The C++0x standard is on its way to being complete. Until now, I've dabbled in C++, but avoided learning it thoroughly because it seems like it's missing a lot of modern features that I've been spoiled by in other languages. However, I'd be very interested in C++0x, which addresses a lot of my complaints. Any guesstimates, after the standard is ratified, as to how long it will take for major compiler vendors to provide reasonably complete, production-quality implementations? Will it happen soon enough to reverse the decline in C++'s popularity, or is it too little, too late? Do you believe that C++0x will become "the C++" within a few years, or do you believe that most people will stick to the earlier standard in practice and C++0x will be somewhat of a bastard stepchild, kind of like C99?

I see no reason why C++0x shouldn't be adopted. The C++ community is much more forward-looking than the C community. C was always meant to be a "portable assembler language", so people who use it aren't really all that interested in fancy new features. C++ spans a much wider range, and I've yet to hear of a C++ programmer who wasn't looking forward to 0x. (It's also my impression that the C++ community is much "stricter" and really doesn't want to stray outside the standard into undefined behavior, which implies you choose either C++03 or C++0x rather than a half-implemented hybrid. C programmers tend to be much more relaxed about that, and seem happy to use C89 with just a couple of C99 features and headers mixed in.)
However, it'll take a few years before Microsoft catches up, at least. Visual Studio 2010 will support a small handful of C++0x features (lambdas, decltype and a couple of others), but the vast majority will not be supported. We'll have to wait for VS2012, or whatever the next version ends up being, to get somewhat complete support.
With GCC/G++, the situation is a lot better, since most of the standard has been implemented there already (the standards committee doesn't like adopting features that haven't been implemented and tested in a real compiler, and a GCC fork is often used for that).
But it'll probably still take some time to get that stable and production-ready.
About C++'s "decline in popularity", I don't really see it. I don't think C++ has declined significantly in popularity in recent years. RAD developers have already jumped ship, of course, to .NET, Python, or other languages and platforms. But where C++ is used today, there aren't many viable alternatives, and there's no reason why it should decline in popularity.

I don't know about other vendors, but from what I've seen, Microsoft plans to include four C++0x language features in Visual C++ 2010:
rvalue references
auto
lambdas
static assert
Although this is a small set of C++0x features, they are important ones. Some will enable programmers to write much more compact (auto, lambdas) and error-free code. Some (like rvalue references) enable libraries to be more efficient. Microsoft likes lambdas as an enabler for parallel computing.
IMHO: auto alone will make it so much easier to use templates that more programmers will do so. And hopefully this will increase demand for more C++0x features from Microsoft and all vendors.
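For instance, here is a minimal sketch (with an illustrative container type, not anything from Microsoft's library) of the kind of iterator declaration that auto removes:

#include <map>
#include <string>
#include <utility>

std::map<std::string, std::pair<int, int> > table;

void walk() {
    // C++03: the full template type must be spelled out
    std::map<std::string, std::pair<int, int> >::iterator it1 = table.begin();

    // C++0x: auto deduces exactly the same type
    auto it2 = table.begin();
}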
Microsoft will also be updating their C++ Standard Library implementation, but I don't know the details. I believe they are modifying some container classes to take advantage of rvalue reference move semantics, and I believe they are including more of TR1.
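To illustrate what move semantics buy a container, here is a rough sketch (the class and its layout are hypothetical, not anyone's actual library code) of a copy constructor versus a move constructor:

#include <algorithm>
#include <cstddef>

class buffer {
    char* data_;
    std::size_t size_;
public:
    explicit buffer(std::size_t n) : data_(new char[n]), size_(n) {}
    ~buffer() { delete[] data_; }

    // Copying allocates and copies every byte: O(n)
    buffer(const buffer& other)
        : data_(new char[other.size_]), size_(other.size_) {
        std::copy(other.data_, other.data_ + size_, data_);
    }

    // Moving steals the buffer from an rvalue: O(1)
    buffer(buffer&& other)
        : data_(other.data_), size_(other.size_) {
        other.data_ = 0;
        other.size_ = 0;
    }
};

A container such as std::vector<buffer> can then relocate its elements during growth by moving rather than copying, which is where the efficiency win comes from.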

Some implementations are already on their way to C++0x (GCC, for example). My intuition says that C++0x support will be available in major compilers fairly quickly; however, a large body of legacy code still exists that must be maintained.

Newer versions of gcc already support many of the C++0x features: http://gcc.gnu.org/projects/cxx0x.html

Microsoft will be including C++0x support in Visual Studio 2010 later this year (a community technology preview is already available).
I don't think it'll become "the C++" any time soon, but rather certain people will choose to add parts of the new syntax where it makes sense in their code.
I don't do much C++ these days, but people I talk to either love it, or feel that the beauty of C++ is in the control they get, and that if they wanted a language with all those extra features they'd use C#/Java.


C++ proposal for new mathematical special functions [duplicate]

When the C++ committee publishes a new feature that will be part of the standard library in the next revision of the language, does it also release some source code or some kind of guidance on how to implement that feature?
Let's take unique_ptr as an example. Does the language committee just define an interface for that class template and let each compiler vendor implement it as it sees fit? How exactly does this process of implementing standard library features work?
Can anyone implement parts of the standard library for a platform that doesn't have support for them yet? Let's say I would like to implement some cool features of the C++ standard library to use in a microcontroller environment. How could I do that? Where should I look for information? If I decide to open-source my project, can I do that? Will I need to follow exactly what the standard says, or can I write a non-compliant version?
Usually,
every new library feature goes through a proposal.
If the proposal makes it to the C++ committee's Library Evolution Working Group, it goes through a series of iterations (tough ground, as far as I am aware).
It undergoes a series of refinements, as described here.
Should it require a Technical Specification (TS, since C++11), it goes there to be baked further. For example, #include <filesystem> lived in the Filesystem TS prior to C++17.
One thing I believe the committee likes is implementation experience.
More information can be found on the ISOCpp site.
Well, as to the implementation:
There are quite a number of "library features" that cannot be implemented purely as a library; they require compiler support. In those cases, compilers provide "intrinsics" that you can hook into. For example, Clang provides intrinsics for certain type_traits.
Most library features have some implementation experience, mostly from the Boost libraries.
You could actually look into the source code for the default standard library that ships with your compiler:
libc++ for Clang
libstdc++ for GCC
Sadly, most of the implementations use a whole bunch of leading underscores. That is because such names are reserved for the implementation (the "Standard Library"), so using them is the only way library code can avoid colliding with user-defined names and macros.
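As a concrete sketch of the intrinsic hook-in mentioned above (simplified; real library code is more elaborate), a trait like is_union cannot be detected in pure C++ and is typically defined by delegating to a compiler builtin:

// __is_union is a compiler intrinsic provided by GCC, Clang, and MSVC.
// The leading-underscore parameter name follows the reserved-identifier
// convention that standard library code must use.
template <class _Tp>
struct is_union {
    static const bool value = __is_union(_Tp);
};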
Can anyone implement parts of the standard library for a platform that doesn't have support for it yet?
Yes, you can, as long as your compiler supports that platform and the platform or operating system provides a usable API. For example, std::cout, parts of std::ifstream, and much more require platform-specific support.
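For a feel of what that platform-specific support looks like, here is a minimal sketch of routing a stream to a board's UART; uart_putc is a hypothetical board-support routine, not a standard API:

#include <ostream>
#include <streambuf>

extern "C" void uart_putc(char c); // assumed to exist in the BSP

class uart_streambuf : public std::streambuf {
protected:
    // Called whenever a character is pushed into the (unbuffered) stream
    virtual int_type overflow(int_type ch) {
        if (ch != traits_type::eof())
            uart_putc(static_cast<char>(ch));
        return traits_type::not_eof(ch);
    }
};

uart_streambuf uart_buf;
std::ostream uart_out(&uart_buf); // a std::cout stand-in for the board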
Let say I would like to implement some cool features of the C++ standard library to use it on a microcontroller environment. How could I do that?
You can look into the code of others and start from there. We learn from giants. Some open-source examples:
ETL
StandardCPlusPlus
uClibc++
How could I do that? Where should I look for information?
You could check the paper that introduced the feature into the C++ library. For example, std::optional has a stand-alone implementation here which was used as a reference implementation during the proposal stages.
You could check the standard library, and do a laborious study. :-)
Search the internet. :-)
Or write it from scratch, as specified by the standard.
Will I need to follow exactly what the standard says, or can I write a non-compliant version?
There is no compulsion to follow what the C++ standard library specifies. That would simply be your "own" library.
Formally, no. As with all standards out there, the C++ standard sets the rules and does not give an implementation. However, from a practical standpoint, it is nearly impossible to introduce a new feature into the standard library without a proposed implementation, so you can often find one attached to the proposal.
As for your question on whether you can write a non-compliant version: you can do whatever you want. Adoption might depend on your compliance, or it might not; the super-widely adopted MSVC is known to violate the C++ standard.
Typically, a new feature is not standardized, unless the committee has some solid evidence that it can be implemented, and will be useful. This very often consists of a prototype implementation in boost, a GNU library, or one of the commercial compiler vendors.
The standard itself does not contain any implementation guidance - it is purely a specification. The compiler vendors (or their subcontractors) choose how to implement that specification.
In the specific case of unique_ptr, its design was proven out in Boost before standardization, and Boost still ships a C++03-compatible equivalent (boost::movelib::unique_ptr) that you can use. If you have a compiler that will compile for your microcontroller, it is almost certain that it will be able to build enough of Boost to make that work.
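And if Boost itself is too heavy for the target, the core of such a smart pointer is small enough to sketch from scratch. This is a simplified, deliberately non-conforming illustration with a hypothetical name, not the real std::unique_ptr (which also handles custom deleters, arrays, and move-only transfer):

template <class T>
class scoped_owner {
    T* ptr_;
    scoped_owner(const scoped_owner&);            // non-copyable (C++03 style)
    scoped_owner& operator=(const scoped_owner&);
public:
    explicit scoped_owner(T* p = 0) : ptr_(p) {}
    ~scoped_owner() { delete ptr_; }              // frees on scope exit

    T& operator*() const { return *ptr_; }
    T* operator->() const { return ptr_; }
    T* get() const { return ptr_; }

    T* release() { T* p = ptr_; ptr_ = 0; return p; }
    void reset(T* p = 0) { delete ptr_; ptr_ = p; }
};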
There is nothing stopping you from writing a non-conforming implementation (apart from the trivial point that if you sold it as being standards-conforming and it wasn't, you might get your local equivalent of Trading Standards come knocking).
The committee does not release any reference implementations. In the early days, things got standardized and then the tool developers went away and implemented the standard. This has changed, and now the committee looks for features that have been implemented and tested before standardization.
Also major developments usually don't go directly into the standard. First they become experimental features called a Technical Specification or TS. These TS may then be incorporated into the main standard at a later date.
You are free to write your own implementation of the C++ standard library. Plum Hall has a test suite (commercial; I have no connection, but Plum Hall is very involved with C++ standardization).
I don't see any issue with not being conformant. Almost all implementations have some extensions. Just don't make any false claims, especially if you want to sell your product.
If you're interested in getting involved, this can be done via your 'National Body' (ANSI for the USA, BSI for the UK etc.). The isocpp web site has a section on standardization which would be a good starting place.

C++ Standard Library Portability

I work on large-scale, multi-platform, real-time networked applications. The projects I work on lack any real use of containers or the Standard Library in general: no smart pointers, nor really any "modern" C++ language features. Lots of raw, dynamically allocated arrays are commonplace.
I would very much like to start using the Standard Library and some of the C++11 spec; however, there are many people also working on my projects who are against it because "the STL / C++11 isn't as portable; we take a risk using it". We do run software on a wide variety of embedded systems as well as fully fledged Ubuntu/Windows/Mac OS systems.
So, to my question: what are the actual portability issues concerning the Standard Library and C++11? Is it just a case of having g++ past a certain version? Are there some platforms that have no support? Are compiled libraries required, and if so, are they difficult to obtain/compile? Has anyone been seriously burnt by non-portable pure C++?
Library support for the new C++11 standard is pretty complete in Visual C++ 2012, GCC >= 4.7, and Clang >= 3.1, apart from some concurrency stuff. Compiler support for the individual language features is another matter. See this link for an up-to-date overview of supported C++11 features.
For an in-depth analysis of C++ in an embedded/real-time environment, Scott Meyers's presentation materials are really great. They discuss the costs of virtual functions, exception handling, templates, and much more. In particular, you might want to look at his analysis of C++ features such as heap allocations, runtime type information, and exceptions, which have indeterminate worst-case timing guarantees, something that matters for real-time systems.
It's that kind of issue, not portability, that should be your major concern (if you care about your granny's pacemaker...).
Any compiler for C++ should support some version of the standard library. The standard library is part of C++; not supporting it means the compiler is not a C++ compiler. I would be very surprised if any of the compilers you're using at the moment don't portably support the C++03 standard library, so there's no excuse there. Of course, the compiler will have to have been updated since 2003, but unless you're compiling for some archaic system that is only supported by an archaic compiler, you'll have no problems.
As for C++11, support is pretty good at the moment. Both GCC and MSVC have a large portion of the C++11 standard library supported already. Again, if you're using the latest versions of these compilers and they support the systems you want to compile for, then there's no reason you can't use the subset of the C++11 standard library that they support - which is almost all of it.
C++ without the standard library just isn't C++. The language and library features go hand in hand.
There are lists of supported C++11 library features for GCC's libstdc++ and MSVC 2012. I can't find anything similar for LLVM's libc++, but they do have a clang c++11 support page.
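One pragmatic pattern for adopting that supported subset while staying portable is to gate C++11 usage on the compilers you actually target. The MYLIB_HAS_CXX11 macro name here is hypothetical, but __GXX_EXPERIMENTAL_CXX0X__ (defined by GCC under -std=c++0x) and _MSC_VER >= 1600 (Visual C++ 2010 or later) are real version checks:

#if defined(__GXX_EXPERIMENTAL_CXX0X__) || (defined(_MSC_VER) && _MSC_VER >= 1600)
#  define MYLIB_HAS_CXX11 1
#else
#  define MYLIB_HAS_CXX11 0
#endif

#if MYLIB_HAS_CXX11
#  include <memory>   // e.g. std::unique_ptr is available here
#endif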
The people you are talking to are confusing several different issues. C++11 isn't really portable today. I don't think any compiler supports it 100% (although I could be wrong); you can get away with using large parts of it if (and only if) you limit yourself to the most recent compilers on two or three platforms (Windows and Linux, and probably Apple). While these are the most visible platforms, they represent but a small part of all machines. (If you're working on large-scale networked applications, Solaris will probably be important, and Sun CC. Unless Sun has greatly changed since I last worked on it, that means there are even parts of C++03 that you can't count on.)
The STL is a completely different issue. It depends partially on what you mean by the STL, but there is certainly no portability problem today in using std::vector. locale might be problematic on a very few compilers (it was with Sun CC, with both the Rogue Wave and the STLport libraries), and so might some of the algorithms, but for the most part, you can pretty much count on all of C++03.
And in the end, what are the alternatives? If you don't have std::vector, you end up implementing something pretty much like it. If you're really worried about the presence of std::vector, wrap it in your own class; if it's ever not available (highly unlikely, unless you go back in a time machine), just reimplement it, exactly like we did in the pre-standard days.
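The wrapping idea from that last paragraph can be as light as a project-local typedef that you could retarget if std::vector were ever missing (names here are hypothetical):

#include <vector>

// Indirection point: code says my_vector<T>::type everywhere, so a
// homegrown replacement could be swapped in behind this one line.
template <class T>
struct my_vector {
    typedef std::vector<T> type;
};

my_vector<int>::type readings; // instead of std::vector<int> directly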
Use STLPort with your existing compiler, if it supports it. This is no more than a library of code, and you use other libraries without problem, right?
Every permitted implementation-defined behaviour is listed in the publicly available standard draft. There is next to nothing less portable in C++11 than in C++98.

What C++(98/03) features are not-so-well supported by poor compilers?

Often I read of some software cutting out some C++ features in order to be compliant with poor/old/exotic C++ compilers.
This one is just the latest I ran into: Box2D isn't using namespaces because they need to support:
poor C++ compilers where namespace support can be spotty
A bigger example I can think of is Qt, which relies upon MOC and limits template usage a lot (well, this is at least true for Qt 3 and previous versions; Qt 4 mostly does it to keep to its own conventions).
I'm wondering what compilers are that poor?
There are lots of C++ compilers out there (I never heard of most of them), but I would hope that all of them are supporting most common(/simple?) C++ features like namespaces (unless they're dead); isn't this the case?
What are the most unsupported features?
I can easily expect the lack of exported templates, maybe template partial specialization and similar features, perhaps even RTTI or exceptions, but I would never have suspected namespaces.
In my experience, people are just scared of new things, especially things that broke on them once, two decades ago. There's no valid reason against using namespaces in anything written during this century.
If you're looking for things to toss out, though: if you happened to be targeting Windows not too long ago, you had to do more than just toss features out of C++ and not use them; you had to use different syntax. Templates come to mind as one of the worst-supported features in VC. They've gotten much better but still fail sometimes.
Another one that is (still!) not supported by that particular compiler is overriding a virtual function so that it returns a pointer to a type derived from the one the base version returned (a covariant return type) when using MI. VC just plain freaks out, and you end up having to write a virtual_xxx() function and provide a non-virtual xxx() to replicate the standard behavior.
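For the record, here is a sketch of that workaround (the names are hypothetical). The standard permits the override to return Derived* directly; the non-virtual wrapper merely emulates that where the compiler chokes:

struct Base {
    Base* clone() const { return do_clone(); }   // the non-virtual "xxx()"
private:
    // the "virtual_xxx()": return type stays fixed at Base*
    virtual Base* do_clone() const { return new Base(*this); }
};

struct Derived : Base {
    // Recovers the covariant return type the compiler mishandled
    Derived* clone() const { return static_cast<Derived*>(do_clone()); }
private:
    virtual Base* do_clone() const { return new Derived(*this); }
};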
What are the most unsupported features?
Let's abstract away export, which, sadly, was an experiment that failed. Then, by count of compiler users this is probably two-phase lookup, which Visual C++ still doesn't properly implement. And VC has a lot of users.
Basically anything that was added as part of the standardization process in C++98. That is:
Templates (Not just partial specialization -- I mean all templates)
Exceptions (Often because this one requires runtime support/overhead -- they're often not available on things like microcontrollers)
Namespaces
RTTI
If you go even older, you can add
Most everything in the standard library.
Several old compilers are just plain bad; OTOH, I would strongly recommend not worrying about them. There are already several good, freely available, reasonably standards-compliant compilers for pretty much every platform of which I am aware (mostly via G++).
Namespaces have been around forever. Some of the darker places in templates, maybe. But namespaces? No way. Even with templates, the vast, vast majority of usage scenarios are fine. Some compilers aren't perfect (they're software), but flat out not supporting a C++03 feature that isn't export? That just doesn't happen. Most compilers extend the language, not reduce it, and are moving to support C++0x.
RTTI and exceptions are often accused of having inefficient implementations, but rarely of poor conformance.
If you are programming in the mainstream with a mainstream compiler, then you'll usually find everything in the C++03 standard supported (except export, as mentioned by sbi, but then again nobody used that feature, since it was barely supported anywhere). The further from the mainstream a compiler is, the fewer features it usually supports, but they are all moving forward (some more slowly than others).
It's only when you get to less-used hardware that has its own proprietary compiler that support for features starts to dwindle.
The main example would be mobile devices and their associated compilers (though because of their popularity in the main stream in recent years these compilers are getting updated more quickly than before).
A secondary example would be SoC devices. Their compilers are so specific that they are usually in-house compilers that only get as much work as the SoC company can afford, and as a result they tend to lag a long way (or at least further) behind other compilers.
Value initialization (a C++03 feature) is not properly implemented in MSVC++.
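For context, value initialization is what the empty () initializer requests. In a conforming implementation, all of the following are zeroed; the MSVC++ defect referred to here is that some versions left such objects uninitialized in some of these cases:

struct POD { int x; int y; };

POD  a = POD();       // a.x and a.y must be 0
POD* p = new POD();   // likewise zero-initialized
int  z = int();       // scalar value initialization: 0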
Qt doesn't use templates much because it is older than templates.

How important is standards-compliance?

For a language like C++, the existence of a standard is a must, and good compilers try their best (well, most of the good compilers, at least) to comply. Many compilers have language extensions, some of which are allowed by the standard, some of which are not. Two examples of the latter kind:
gcc's typeof
Microsoft's compilers allow a pure virtual function declaration to have both a pure-specifier (= 0) and an inline definition (which is prohibited by the standard; let's not discuss why, that's another topic :)
(there are many other examples)
Both examples are useful in the following sense: example 1 is a very useful feature which will be available in C++0x under a different name (decltype). Example 2 is also useful, and Microsoft has decided not to respect a ban that made no sense.
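To make example 1 concrete, the extension and its standardized successor are spelled like this (a trivial sketch):

int i = 0;
__typeof__(i + 1) a = 1;  // GCC extension (typeof in GNU mode)
decltype(i + 1)   b = 2;  // the C++0x standard equivalent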
And I am grateful that compilers provide language extensions that help us developers in our routine work. But here's a question: shouldn't there be an option which, when set, mandates that the compiler be as standards-compliant as it can be, no matter whether its authors agree with the standard or not? For example, Visual Studio has such an option, called "disable language extensions". But hey, they still allow example 2.
I want everyone to understand my question correctly. It is a GREAT thing that MSVC allows example 2, and I would very much like that feature to be in the standard. It doesn't break any compliant code; it does nothing bad. It just isn't standard.
Would you like Microsoft to disable example 2 when "disable language extensions" is set to true? Note that the words Microsoft, example 2, etc. are placeholders :)
Why?
Again, just to make sure, the crucial point is: should a compiler bother to provide a compliant version of a certain feature (optionally, via a setting, and within limits; I am not talking about export) when it provides a better alternative that is not standard and is perhaps even a superset of the standard, thus not breaking anything?
Standards compliance is important for the fundamental reason that it makes your code easier to maintain. This manifests in a number of ways:
Porting from one version of a compiler to another. I once had to port a 1.2 million-LOC app from VC6 to VC9. VC6 was notorious for being horribly non-compliant, even when it was new. It accepted, even at the highest warning levels, non-compliant code that the new compiler rejected at the lowest. If the code had been written in a more compliant way in the first place, this project wouldn't (shouldn't) have taken three months.
Porting from one platform to another. As you say, the current MS compilers have language extensions. Some of these are shared by compilers on other platforms, some are not. Even if they are shared, the behavior may be subtly different. Writing compliant code, rather than using these extensions, makes your code correct from the word go. "Porting" becomes simply pulling the tree down and doing a rebuild, rather than digging through the bowels of your app trying to figure out why 3 bits are wrong.
C++ is defined by the standard. The extensions used by compilers change the language. New programmers coming on board who know C++, but not the dialect your compiler uses, will get up to speed more quickly if you write Standard C++ rather than the dialect your compiler supports.
First, a reply to several comments. The MS VC extension in question is like this:
struct extension {
    virtual void func() = 0 { /* function body here */ }
};
The standard allows you to implement the pure virtual function, but not "in place" like this, so you have to write it something like this instead:
struct standard {
    virtual void func() = 0;
};

void standard::func() { ; }
As to the original question, yes, I think it's a good idea for the compiler to have a mode in which it follows (and enforces) the standard as accurately as possible. While most compilers have that, the result isn't necessarily as accurate a representation of the standard as you/I would like.
At least IMO, about the only answer to this is for people who care about portability to have (and use) at least a couple of compilers on a regular basis. For C++, one of those should be based on the EDG front-end; I believe it has substantially better conformance than most of the others. If you're using Intel's compiler on a regular basis anyway, that's fine. Otherwise, I'd recommend getting a copy of Comeau C++; it's only $50, and it's the closest thing to a "reference" available. You can also use Comeau online, but if you use it on a regular basis, it's worth getting a copy of your own.
Not to sound like an EDG or Comeau shill or anything, but even if you don't care much about portability, I'd recommend getting a copy anyway -- it generally produces excellent error messages. Its clean, clear error messages (all by themselves) have saved enough time over the years to pay for the compiler several times over.
Edit: Looking at this again, some of the advice is looking pretty dated, especially the recommendation for EDG/Comeau. In the three years since I originally wrote this, Clang has progressed from purely experimental to being quite reasonable for production use. Likewise, the gcc maintainers have (IMO) made great strides in conformance as well.
During the same time, Comeau hasn't released a single new version of their compiler, and there's been a new release of the C++ standard. As a result, Comeau is now fairly out of date with respect to the current standard (and the situation seems to be getting worse, not better -- the committee has already approved a committee draft of a new standard that is likely to become C++14).
As such, although I recommended Comeau at that time, I'd have difficulty (at best) doing so today. Fortunately, most of the advantages it provided are now available in more mainstream compilers -- both Clang and gcc have improved compliance (substantially) as outlined above, and their error messages have improved considerably as well (Clang has placed a strong emphasis on better error messages, almost from its inception).
Bottom line: I'd still recommend having at least two compilers installed and available, but today I'd probably choose different compilers than I did when I originally wrote this answer.
"Not breaking anything" is such a slippery slope in the long run, that it's better to avoid it altogether. My company's main product outlived several generations of compilers (first written in 1991, with RW), and combing through compiler extensions and quiet standards violations whenever it was the time to migrate to a newer dev system took a lot of effort.
But as long as there's an option to turn off or at least warn about 'non-standard extension', I'm good with it.
I would certainly want an option that disables language extensions to disable all language extensions. Why?
All options should do what they say they do.
Some people need to develop portable code, requiring a compiler that only accepts the standard form of the language.
"Better" is a subjective word. Language extensions are useful for some developers, but make things more difficult for others.
I think that it's critical that a compiler provide a standards-only mode if it wants to be the primary one used while developing. All compilers should, of course, compile standards-compliant code, but it's not critical that they refrain from extensions if they don't think of themselves as the primary compiler (for example, a cross-compiler, or a compiler for a less popular platform that is nearly always ported to rather than targeted).
Extensions are fine for any compiler, but it would be nice if I had to turn them on if I want them. By default, I'd prefer a standards-only compiler.
So, given that, I expect MSVC to be standards-only by default. The same with gcc++.
I think standards compliance is very important.
I always consider source code to be more for human readers than for the machine(s). So, to communicate the programmer's intention to the reader, abiding by the standard is like speaking the language of the lowest common denominator.
Both at home and work, I use g++, and I have aliased it with the following flags for strict standard compliance.
-Wall -Wextra -ansi -pedantic -std=c++98
Check out this page on Strict ANSI/ISO
I am not a standards expert, but this has served me well. I have written STL-style container libraries which run as-is on different platforms, e.g. 32-bit Linux, 64-bit Linux, 32-bit Solaris, and 32-bit embedded OSE.
Consider indicators on cars (known as "turn signals" in some jurisdictions); they are a reliable way to determine which direction someone's going to turn off a roundabout... until just one person doesn't use them at all. Then the whole system breaks down.
It didn't "hurt anyone" or obviously "break anything" in IE when they allowed document.someId to be used as a shortcut for document.getElementById('someId').... however, it did spawn an entire generation of coders and even books that consequently thought it was okay and right, because "it works". Then, suddenly, the ten million resulting websites were entirely non-portable.
Standards are important for interoperability, and if you don't follow them then there's little point in having them at all.
Standards-compliance hounds may get hated for "pedanticism", but really, until everybody follows suit you're going to have portability and compatibility problems forever.
How important standards-compliance is depends on what you are trying to achieve.
If you are writing a program that will never be ported outside of its current environment (especially a program that you're not planning to develop/support for a long time) then it's not very important. Whatever works, works.
If you need your program to remain relevant for a long time and be easily portable to different environments, then you will want it to be standards-compliant, since that's the only way to (more or less) guarantee that it will work everywhere.
The trick, of course, is figuring out which situation you are actually in. It's very common to start a program thinking it is a short-term hack, and later on find that it's so useful that you're still developing/maintaining it years later. In that situation your life will be much less unpleasant if you didn't make any short-sighted design decisions at the beginning of the program's lifetime.

To use or not to use C++0x features [duplicate]

Possible Duplicate:
How are you using C++0x today?
I'm working with a team on a fairly new system. We're talking about migrating to MSVC 2010 and we've already migrated to GCC 4.5. These are the only compilers we're using and we have no plans to port our code to different compilers any time soon.
I suggested that after we do it, we start taking advantage of some of the C++0x features already provided, like auto. My co-worker advised against this, proposing to wait "until C++0x actually becomes standard." I have to disagree, but I can see the appeal in the way he worded it. Nevertheless, I can't help but think that this counter-argument comes more out of fear and trepidation of learning C++0x than out of a genuine concern for standardization.
Given the new state of the system, I want us to take advantage of the new technology available. Just auto, for instance, would make our daily lives easier (e.g. writing iterator-based for loops until range-based loops come along).
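Concretely, the daily-life win looks like this (a minimal sketch; the function and container are made up):

#include <vector>

void scale(std::vector<double>& samples, double factor) {
    // C++03 would require: std::vector<double>::iterator it = samples.begin();
    for (auto it = samples.begin(); it != samples.end(); ++it)
        *it *= factor;
}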
Am I wrong to think this? It is not as though I'm proposing we radically change our budding codebase, but just start making use of C++0x features where convenient. We know what compilers we're using and have no immediate plans to port (if we ever port the code base, by then surely compilers will be available with C++0x features as well for the target platform). Otherwise it seems to me like avoiding the use of iostreams in 1997 just because the ISO C++ standard was not published yet in spite of the fact that all compilers already provided them in a portable fashion.
If you all agree, could you provide me with arguments I could use to strengthen my position? If not, could I get a bit more detail on this "until C++0x is standard" idea? BTW, does anyone know when that's going to be?
I'd make the decision on a per-feature basis.
Remember that the standard is really close to completion. All that is left is voting, bugfixing and more voting.
So a simple feature like auto is not going to go away or have its semantics changed. So why not use it?
Lambdas are complex enough that they might have their wording changed and the semantics in a few corner cases fixed up a bit, but on the whole, they're going to behave the way they do today (although VS2010 has a few bugs about the scope of captured variables, MS has stated that they are bugs, and as such may be fixed outside of a major product release).
If you want to play it safe, stay away from lambdas. Otherwise, use them where they're convenient, but avoid the super tricky cases, or just be ready to inspect your lambda usage when the standard is finalized.
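For reference, the "convenient, non-tricky" end of the lambda spectrum looks like this (a minimal sketch of the kind of usage that already worked in GCC 4.5 and VS2010):

#include <algorithm>
#include <vector>

void raise_all(std::vector<int>& v, int delta) {
    // Simple by-value capture, no nesting, no tricky lifetime issues
    std::for_each(v.begin(), v.end(),
                  [delta](int& x) { x += delta; });
}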
Most features can be categorized like this, they're either so simple and stable that their implementation in GCC/MSVC are exactly how they're going to work in the final standard, or they're tricky enough that they might get a few bugfixes applied, and so they can be used today, but you run the risk of running into a few rough edges in certain border cases.
It does sound silly to avoid C++0x features solely because they're not formalized yet. Avoid the features that you don't trust to be complete, bug-free, and stable, but use the rest.
Theoretical but not practical disadvantages of using C++0x:
Makes it harder to port to different compilers.
Not adhering to any published standard.
Practical advantages of using C++0x:
Makes your daily lives easier, hence more productive.
It's a debate between what's theoretically right, and what's practical. If your team has any intent of actually doing something with this code, the practical should outweigh the theoretical tenfold.
One thing you (mostly) don't need to worry about now is features being added or taken away, because the working draft reached "Final Committee Draft" (FCD) status back in March. Feature-wise, it should be frozen: the standards committee will not accept any more proposals for C++0x.
The downside is that it's still a draft and not finalized yet; the standards committee is in the phase of making corrections and adjustments before finalizing and publishing the ISO standard (expected release: March 2011). That could mean minor syntactic or semantic/behavioural changes which could make your code fail to compile, or not work correctly, once you compile it with a compiler that is more standards-compliant than the one you were using when you wrote it.
You'll probably have to wait some time for compilers like VC++ 10 to get updated with any corrections/adjustments that are made.
We had the exact same problem, so we compromised. We took the C++0x TR1 release and then adopted only the portions that we knew we wanted to use. Sounds like a lot of work, but so far it has worked out well. We're using the regex libraries, tuples, and a couple of others. Once the standard is ratified, we'll migrate to the full C++0x. This obviously isn't the best solution, but it is one that has worked well for us.
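As an illustration of that piecemeal approach (header spelling varies: GCC puts the TR1 headers under tr1/, while MSVC ships them in the regular headers):

#include <tr1/tuple>  // on MSVC: #include <tuple>

std::tr1::tuple<int, double> record(1, 2.0);
int id = std::tr1::get<0>(record);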
If you intend to make your system open source in the not-too-distant future, then that's an argument for not using too many bleeding-edge features. A production system running Debian or Red Hat won't necessarily have a bleeding-edge compiler installed.
You said
if we ever port the code base, by then surely compilers will be available with C++0x features as well for the target platform
but the fact that a compiler exists for a platform doesn't always mean that it's installed/used/wanted, especially on production systems.
If, on the other hand, you intend to do all the compiling yourself, this is not an issue.