Select function at compile time based on C++ version

Is there a way to do something like the following, where VERSION_C is the version of C++ the compiler is using?
#if (VERSION_C < 20)
// use FUNCTION1
#else
// use FUNCTION2
#endif
I want to use C++20's std::format, and I understand that I can just use an older way of doing it. But since I am learning to program, I am curious whether this is even possible, so that a program's source code could be written once and use the most modern features available in the local compile environment. It compiles on my system, but it will not compile on another system unless they use the most current preview version.
Thanks for the help guys! (This is my very first question on here)

There's the __cplusplus macro, which expands to a value encoding the date when the standard the compiler targets was released.
But on its own it's mostly useless, since compilers don't implement all features from a new standard at the same time.
Because of that we have a bunch of separate feature test macros. The one you're looking for is __cpp_lib_format.
It was added in C++20, then its value was changed in C++23 to indicate new improvements to std::format.
But if you want my opinion, forget about std::format for now.
Use libfmt, which works on all major compilers, and has more features.
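For instance, here is a minimal sketch of that check; the __has_include guard is there because pre-C++20 compilers may not ship the <version> header, and the fmt include path assumes a typical libfmt installation:
#if defined(__has_include)
# if __has_include(<version>)
#  include <version> // C++20 header that exposes all library feature-test macros
# endif
#endif
#ifdef __cpp_lib_format
# include <format> // std::format is available
#else
# include <fmt/core.h> // fall back to libfmt's fmt::format
#endif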

Related

How should I write my C++ to be prepared for C++ modules?

There are already two compilers that support C++ modules:
Clang: http://clang.llvm.org/docs/Modules.html
MS VS 2015: http://blogs.msdn.com/b/vcblog/archive/2015/12/03/c-modules-in-vs-2015-update-1.aspx
When starting a new project now, what should I pay attention to in order to be able to adopt the modules feature when it is eventually released in my compiler?
Is it possible to use modules and still maintain compatibility with older compilers that do not support it?
There are already two compilers that support C++ modules
clang: http://clang.llvm.org/docs/Modules.html
MS VS 2015: http://blogs.msdn.com/b/vcblog/archive/2015/12/03/c-modules-in-vs-2015-update-1.aspx
The Microsoft approach appears to be the one gaining the most traction, mainly because Microsoft is throwing a lot more resources at its implementation than any of the clang folks currently are. See https://llvm.org/bugs/buglist.cgi?list_id=100798&query_format=advanced&component=Modules&product=clang for what I mean: there are some big showstopper bugs in Modules for C++, whereas Modules for C, and especially Objective-C, look much more usable in real-world code.
Visual Studio's biggest and most important customer, Microsoft itself, is pushing hard for Modules because they solve a whole ton of internal build-scalability problems, and Microsoft's internal code is some of the hardest C++ to compile anywhere in existence, so you can't throw any compiler other than MSVC at it (e.g. good luck getting clang or GCC to compile 40k-line functions). The clang build tricks used by Google etc. therefore aren't available to Microsoft, and it has a huge pressing need to get this fixed sooner rather than later.
This isn't to say there aren't some serious design flaws in the Microsoft proposal when it is applied in practice to large real-world code bases. However, Gaby is of the view that you should refactor your code for Modules, and while I disagree, I can see where he is coming from.
When starting a new project now, what should I pay attention to in order to be able to adopt the modules feature when it is eventually released in my compiler?
In so far as Microsoft's compiler is currently expected to implement Modules, you ought to make sure your library is usable in all of these forms:
Dynamic library
Static library
Header only library
Something very surprising to many people is that C++ Modules, as currently expected to be implemented, keep those distinctions. So you get a C++ Module variant of all three of the above: the first looks most like what people expect a C++ Module to be, and the last looks most like a more useful precompiled header. The reason you ought to support those variants is that you can reuse most of the same preprocessor machinery to also support C++ Modules with very little extra work.
A later Visual Studio will allow linking of the module definition file (the .ifc file) as a resource into DLLs. This will finally eliminate the need for the .lib and .dll distinction on MSVC, you just supply a single DLL to the compiler and it all "just works" on module import, no headers or anything else needed. This of course smells a bit like COM, but without most of the benefits of COM.
Is it possible to use modules in a single codebase and still maintain compatibility with older compilers that do not support it?
I'm going to assume you meant the question as amended above (with "in a single codebase" inserted).
The answer is generally yes with even more preprocessor macro fun. #include <someheader> can turn into an import someheader within the header because the preprocessor still works as usual. You can therefore mark up individual library headers with C++ Modules support along something like these lines:
// someheader.hpp
#if MODULES_ENABLED
# ifndef EXPORTING_MODULE
import someheader; // Bring in the precompiled module from the database
// Do NOT set NEED_DEFINE so this include exits out doing nothing more
# else
// We are at the module-generation stage, so mark up the namespace for export
# define SOMEHEADER_DECL export
# define NEED_DEFINE
# endif
#else
// Modules are not turned on, so declare everything inline as per the old way
# define SOMEHEADER_DECL
# define NEED_DEFINE
#endif
#ifdef NEED_DEFINE
SOMEHEADER_DECL namespace someheader
{
// usual classes and decls here
}
#endif
Now in your main.cpp or whatever, you simply do:
#include "someheader.hpp"
... and if the compiler had /experimental:modules /DMODULES_ENABLED then your application automagically uses the C++ Modules edition of your library. If it doesn't, you get inline inclusion as we've always done.
I reckon this is the minimum possible set of changes to your source code to make it Modules-ready now. You will note I have said nothing about build systems; this is because I am still debugging the cmake tooling I've written to get all this stuff to "just work" seamlessly, and I expect to be debugging it for some months yet. Expect to see it maybe at a C++ conference next year or the year after :)
Is it possible to use modules and still maintain compatibility with older compilers that do not support it?
Not cleanly, no. You can get close using some #ifdef magic like this:
#ifdef CXX17_MODULES
...
#else
#pragma once, #include "..." etc.
#endif
but this means you still need to provide .h support and thus lose all the benefits, plus your codebase looks quite ugly now.
If you do want to follow this approach, the easiest way to detect modules support (CXX17_MODULES is a name I just made up) is to have the build system of your choice compile a small test program that uses modules, and then define the macro globally for everyone to see, telling whether the compilation succeeded or not.
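A minimal sketch of such a probe, assuming the std.core module name used by MSVC's experimental implementation at the time:
// probe_modules.cpp -- if this compiles, the build defines CXX17_MODULES globally
import std.core; // experimental MSVC module for the standard library
int main() { return 0; }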
When starting a new project now, what should I pay attention to in order to be able to adopt the modules feature when it is eventually released in my compiler?
It depends. If your project is enterprise and puts food on your plate, I'd wait a few years after modules land in stable releases so that the feature becomes widely adopted. On the other hand, if your project can afford to be bleeding-edge, by all means use modules.
Basically, it's the same story as with Python 3 and Python 2, or less relevantly, PHP 7 and PHP 5. You need to find a balance between being a good up-to-date programmer and not annoying people on Debian ;-)

Indicate C++ standard in source in a standard way

Standard compliant C++ compilers define a __cplusplus macro which may
be inspected during preprocessing to determine under what standard a
file is being compiled, e.g.:
#if __cplusplus < 201103L
#error "You need a C++11 compliant compiler."
#endif
#include <iostream>
#include <vector>
int main() {
    std::vector<int> v {1, 2, 3};
    for (auto i : v) {
        std::cout << i << " ";
    }
    std::cout << std::endl;
    return 0;
}
My question is:
Is there a standard way to indicate what standard a source
file should be compiled with?
That would allow build tools to inspect sources prior to compilation
to determine the appropriate argument for -std= (cf. shebangs, which
can indicate scripting language/version: #!/usr/bin/env python3).
A non-standard and brittle way I can think of is looking for the
preprocessor checks of __cplusplus but in the example above I could
also have written:
#if __cplusplus <= 199711L
#error "You need a C++11 compliant compiler."
#endif
hence, writing e.g. a regex to catch all variations would become quite tricky.
EDIT:
While I sympathize with the answer by @Gary, which suggests relying on a build system,
it assumes that we actually will have a build step.
But you can already today:
use an interpreter to run a C++ program using e.g. CINT
or use source-to-source translation using e.g. rosecompiler
My question is also about indicating that the source is C++ and what version
it was intended for (imagine someone digging out my code 70 years from now
when C++ might be as popular as, say, COBOL is today).
I guess the equivalent thing I would be looking for is the C++ equivalent of HTML's:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN" "http://www.w3.org/TR/html4/strict.dtd">
Developing against a C++ standard is, in a way, somewhat like developing against a library. Libraries typically evolve by slowly deprecating old functions while providing access to new ones: the typical way is the introduction of new methods or signatures while still allowing access to the old ones.
As a simple example, you might make an app for the iPhone that is backwards compatible with iOS 4 and above. You don't get the option to cherry-pick which specific versions you want to support. This is good, because otherwise you open code evolution up to a matrix of possibilities, making your code harder to understand and maintain.
Alternatively, you may introduce preprocessor instructions to build certain pieces conditionally depending on a version or flag of some sort. These are temporary measures, however, and should be removed as the code evolves.
So I think the better question to ask oneself here is: what would adding something like this actually solve, and would it add needless complexity (one of the code smells of bad design)?
In this situation, and from experience, I personally think you're better off sticking with one standard. Trying to differentiate standards by sprinkling various #ifdefs and #ifndefs around will make your code base difficult to understand and manage. Even if you had one include file defining the allowed version, included by all other files, it becomes yet another file to manage... not to mention that when you change it, you have to recompile everything that includes it.
If you're worried about someone building your code base with the wrong standard, use a build system that doesn't require developers to input that information, for instance Make, Ant, or CMake. It makes building your software simple and clearly defines how the project should be compiled in a repeatable fashion. If you go this route, you'll see that trying to protect the code from being compiled improperly becomes a non-issue.
Also, if they go out of their way and compile with the wrong standard, they'll be greeted with plenty of compiler errors =)

How can I know if my compiler supports C++11 feature XXXX? [duplicate]

Possible Duplicate:
How do I check for C++11 support?
I am writing a small library and I would like to use class enums whenever the compiler supports them. I also want to use other C++11 features, such as the final and override keywords.
So far, I have used tricks to make sure it compiled on all versions of GCC, but when I booted my Windows partition, Visual Studio 2010 started complaining too. Here is an example of the tricks I used:
#if __GNUC__ == 4 && (__GNUC_MINOR__ > 7 || \
(__GNUC_MINOR__ == 7 && __GNUC_PATCHLEVEL__ > 1))
# define TATO_OVERRIDE override
# define TATO_NO_THROW noexcept
#else
# define TATO_OVERRIDE
# define TATO_NO_THROW throw()
#endif
I know that the newest version of Visual Studio already supports a batch of new features too. What I would like to have is something like a set of macros that tells me which features are available on the compiler I am using.
#ifdef THIS_COMPILER_SUPPORTS_CLASS_ENUMS
...
#endif
Does this exist? Is there a library that does that?
The compiler’s documentation?
Let me clarify. I know how to find this information; my problem is elsewhere. I don't want to go through every possible compiler's documentation to gather it, especially since the same compiler might support different features depending on its version. This is what I have been doing so far, and what I am looking for is actually a way not to do that.
Boost actually has a wide range of such macros available. You could use that. Otherwise, the only way is to essentially check the compiler's version and use your knowledge of the features supported in that version to decide if a feature is available or not.
Essentially, what Boost does, except manually.
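For instance, a minimal sketch using Boost.Config; BOOST_NO_CXX11_SCOPED_ENUMS is one of its documented feature-detection macros:
#include <boost/config.hpp>
#ifdef BOOST_NO_CXX11_SCOPED_ENUMS
enum color { red, green };       // fall back to a traditional enum
#else
enum class color { red, green }; // the compiler supports C++11 scoped enums
#endif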
There were discussions about having some standardized feature-test mechanism, but it turns out that this doesn't make much sense: if a compiler implements the standard, all feature tests would yield true; if it doesn't, there is no reason to assume it follows the standard in terms of the feature tests!
Thus, using some sort of configuration file seems to be the most reliable approach. Personally, I would do it differently than explicitly checking for compiler versions: instead, I would use something trying whether a compiler supports a specific feature to an acceptable degree. The configuration could be run in terms of autoconf or something similar.
With respect to the resulting configuration I would try to map things to suitable constructs and not use conditional compilation outside the configuration headers. For example, I would use something like this:
#if defined(KUHL_HAS_CLASS_FINAL)
# define kuhl_class_final final
#else
# define kuhl_class_final
#endif
Specifically for class enums you might need something a bit tricky, because the enumerators of a scoped enumeration are only available within its scope, while those of a traditional enumeration are available in the enclosing scope. Thus, it may be necessary to come up with some form of extra nesting in one case but not the other, as sketched below.
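A minimal sketch of that nesting trick, using a made-up KUHL_HAS_SCOPED_ENUMS configuration macro in the spirit of the above:
#if defined(KUHL_HAS_SCOPED_ENUMS)
enum class color { red, green };
typedef color color_type;
#else
// Wrap a traditional enum in a struct so that color::red still parses;
// note that the enumeration type is now color::type, hence the typedef.
struct color { enum type { red, green }; };
typedef color::type color_type;
#endif
Either way, client code can write color_type c = color::red; and it compiles under both configurations.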
clang has some built-in macros for various feature checks: the __has_feature and __has_extension feature-check macros.
It would be nice if all compiler vendors picked these up (and more).
“What I would like to have, is something like a set of macro which tells me what features are available on the compiler I am using.”
There's no such thing in the standard.
A practical approach to compiler differences is to have a header for each compiler and compiler version you support. These headers have the same name; which one gets included depends on the include path, which is easy to customize for each compiler and tool.
I call that concept virtual headers. I've found that it works nicely for three levels: system dependency, compiler dependency and version dependency. I think the scheme doesn't scale beyond that, but on the other hand, that seems to be all one needs.
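A minimal sketch of that scheme, with made-up directory names:
// Hypothetical layout:
//   config/msvc10/compiler_config.hpp
//   config/gcc47/compiler_config.hpp
// Each toolchain's build puts its own directory on the include path,
// e.g. cl /Iconfig/msvc10 ... or g++ -Iconfig/gcc47 ...
#include "compiler_config.hpp" // resolves to the per-compiler header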

Writing code that supports new and older C++ compilers?

I have to write code that can support newer and older compilers, and I was wondering before I start: is something like this possible?
#ifndef C++11 { //some code..... }
#endif
else
#ifndef older C++ version { //some code......}
#endif
The standard requires C++11 conforming implementations to define a macro named __cplusplus to the value 201103L. Nonconforming compilers are recommended to use a value with at most five decimal digits. The same was true for C++03 where the value this should be defined to is 199711L.
However, not many compilers consider(ed) themselves standards compliant, and e.g. gcc defined this for a long time to be just 1L. Also you have to consider that it is not only the compiler version, but also the parameters to the compiler. Gcc only supports (part of) C++11 when you pass -std=c++0x or -std=gnu++0x. In these cases it will define a macro __GXX_EXPERIMENTAL_CXX0X__.
So the most portable solution is to be unportable and have your own macro that you set when C++11 support is detected, and have some header/configure script in which you use the aforementioned things, along with possibly others for other supported compilers.
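A minimal sketch of such a header; MYLIB_HAS_CXX11 is a made-up name, and the _MSC_VER cutoff (Visual Studio 2010) is an assumption you would tune per feature:
// mylib_config.hpp
#if __cplusplus >= 201103L || defined(__GXX_EXPERIMENTAL_CXX0X__) || \
    (defined(_MSC_VER) && _MSC_VER >= 1600)
# define MYLIB_HAS_CXX11 1
#else
# define MYLIB_HAS_CXX11 0
#endif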
There's no simple universal macro, and for most compilers, it's not a binary "yes or no". Most compilers implement some C++11 features, but certainly not all.
MSVC simply has a single _MSC_VER macro indicating the version of the compiler, and if you know which features are supported by which version, then you can use that.
Clang has pretty comprehensive feature-specific checks, in the form __has_feature(cxx_<feature>).
If you want to know, across all compilers, whether feature X is available, you'll have to check all these different macros and determine the answer yourself. :)
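For example, a minimal sketch of doing that yourself for the override keyword; MYLIB_OVERRIDE is a made-up name and the _MSC_VER cutoff is an assumption:
#ifndef __has_feature
# define __has_feature(x) 0 // compatibility with compilers other than clang
#endif
#if __has_feature(cxx_override_control) || \
    (defined(_MSC_VER) && _MSC_VER >= 1600) || \
    __cplusplus >= 201103L
# define MYLIB_OVERRIDE override
#else
# define MYLIB_OVERRIDE // old compilers: expands to nothing
#endif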
In MSVS you have the macro _MSC_VER, which can help you. I don't know whether there's a standard macro like that.
The C++ standards committee spent a lot of effort to make sure that any code written to the older standard is still valid in the new standard. And if you have to do without a feature on some platforms, using it on the others is a lot of work for rarely any gain. So just stick to the older version you need to support.
For the few exceptions, the most reliable way is to test the compiler and define macros to choose the version you want to use: either manually, if you know your set of compilers, or using something like autoconf or cmake if you don't. Many compilers support some C++11 features and not others, so there's little hope of finding a single test that would suffice without any work on your part. I believe all the features can be tested by just compiling; if they compile, they will generally also work.
Write your code to be compliant with the most recent compiler.
Any code which won't compile against an older version should be extracted into its own .cpp unit.
Then an alternative .cpp should be written for the old compiler.
For older builds, include the older .cpp instead.
You don't need #defines.
See #ifdef Considered Harmful
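A minimal sketch of that arrangement, with invented file and function names; the two .cpp files are alternatives, and the build selects exactly one of them:
// join.hpp -- shared interface, compiles everywhere
#include <string>
#include <vector>
std::string join(const std::vector<std::string>& parts);
// join_cxx11.cpp -- selected by the build for new compilers
#include "join.hpp"
std::string join(const std::vector<std::string>& parts) {
    std::string out;
    for (const auto& p : parts) out += p; // C++11 range-for
    return out;
}
// join_cxx03.cpp -- selected by the build for old compilers
#include "join.hpp"
std::string join(const std::vector<std::string>& parts) {
    std::string out;
    for (std::vector<std::string>::const_iterator it = parts.begin();
         it != parts.end(); ++it)
        out += *it;
    return out;
}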

Checking for availability of C++0x algorithm additions

I'm trying to figure out which of the additions to the algorithm headers are supported by a given implementation (gcc and MSVC would be enough).
The simple way would be to do it the same way as one would do it for core features: check the compiler version and define a macro if a language feature is supported. Unfortunately I cannot find a list that shows the version numbers for either compiler.
Is simply checking for a generic C++0x macro (__GXX_EXPERIMENTAL_CXX0X__ or __cplusplus) enough, or should I check the change lists for the compilers and build my macros based on those lists?
http://gcc.gnu.org/onlinedocs/libstdc++/manual/status.html#status.iso.200x
Since all compiler vendors provide a nice list of what's available in what version, and you would test the functionality anyways, I would use compiler versions to check for specific features. Or demand the user uses at least a good version, and not worry about it.
__cplusplus is not necessarily a C++0x macro; it tells you nothing. __GXX_EXPERIMENTAL_CXX0X__ has existed since GCC 4.3, so that's pretty useless too.
GCC, MSVC, and Intel each publish such a status list (the GCC one is linked above; mind you, for MSVC "partially implemented" means broken), and there are reference tables of which macros to check against for a specific version of each compiler.
As far as I could figure out, the only proper solution is to have a build script that tries to compile and run a file that uses the feature and has a runtime assertion. Depending on the outcome, have a #define CONFIG_NO_FEATURENAME or similar in a config file, and guard your uses and workarounds with #ifndef.
This way it is possible to check if
the feature is available
the feature functions properly (depending on the correctness of the assertion)
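A minimal sketch of such a probe for one of the C++0x <algorithm> additions; the CONFIG_NO_COPY_IF name the build script would define on failure is made up:
// probe_copy_if.cpp -- compiled and run at configure time
#include <algorithm>
#include <cassert>
#include <vector>
int main() {
    std::vector<int> in;
    in.push_back(1);
    in.push_back(2);
    std::vector<int> out(1);
    // std::copy_if and lambdas are both C++0x additions being probed
    std::copy_if(in.begin(), in.end(), out.begin(),
                 [](int x) { return x > 1; });
    assert(out[0] == 2); // runtime assertion: the feature must actually work
    return 0;
}
If this program fails to compile or the assertion fires, the build script adds #define CONFIG_NO_COPY_IF to the config file.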