I have a header file whose functionality relies heavily on the success of SFINAE. With the current g++ 4.6 it works as expected. Should I assume that my code will behave in the same way on all C++03 compilers?
I see this as an issue because, if something differs, it won't produce a compiler error but will silently change the code flow.
Yes, you may rely on SFINAE to exist and function properly.
If you have a compiler that fails at it, then it's terminally non-conformant and all bets are off anyway.
Since it depends on the success of the SFINAE, you should use static_assert (or BOOST_STATIC_ASSERT) to make sure the SFINAE passed successfully.
I do not know if your code will work on all compilers, but the static assert will fail the compilation if a specific compiler fails to produce the expected result for a specific SFINAE check.
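For example, a minimal sketch (the trait name and the checked type here are just illustrations): a classic C++03-style SFINAE detector plus a compile-time check that it gives the answer your header depends on:

#include <vector>

// Hypothetical trait: does T have a nested size_type? (classic sizeof-based SFINAE)
template <typename T>
struct has_size_type {
    typedef char yes[1];
    typedef char no[2];
    template <typename U> static yes& test(typename U::size_type*); // chosen if U::size_type exists
    template <typename U> static no&  test(...);                    // fallback otherwise
    static const bool value = sizeof(test<T>(0)) == sizeof(yes);
};

// Fails the build on any compiler where the SFINAE did not behave as expected.
// In C++03, use BOOST_STATIC_ASSERT((has_size_type<std::vector<int> >::value)); instead.
static_assert(has_size_type<std::vector<int> >::value, "SFINAE detection failed");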
Related
constexpr permits expressions which can be evaluated at compile time to be ... evaluated at compile time.
Why is this keyword even necessary? Why not permit or require that compilers evaluate all expressions at compile time if possible?
The standard library has an uneven application of constexpr which causes a lot of inconvenience. Making constexpr the "default" would address that and likely improve a huge amount of existing code.
It already is permitted to evaluate side-effect-free computations at compile time, under the as-if rule.
What constexpr does is provide guarantees on what data-flow analysis a compliant compiler is required to do to detect¹ compile-time-computable expressions, and also allow the programmer to express that intent so that they get a diagnostic if they accidentally do something that cannot be precomputed.
Making constexpr the default would eliminate that very useful diagnostic ability.
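For instance, a quick sketch (from_file is a hypothetical run-time-only function): marking the variable constexpr is what forces the compiler to tell you when the value cannot be precomputed:

int from_file();                      // hypothetical: value only known at run time

constexpr int ok  = 2 + 3;            // fine: evaluated at compile time
constexpr int bad = from_file();      // error: initializer is not a constant expression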
¹ In general, requiring "evaluate all expressions at compile time if possible" is a non-starter, because detecting the "if possible" requires solving the Halting Problem, and computer scientists know that this is not possible in the general case. So instead a relaxation is used where the outputs are { "Computable at compile-time", "Not computable at compile-time or couldn't decide" }. And the ability of different compilers to decide would depend on how smart their test was, which would make this feature non-portable. constexpr defines the exact test to use. A smarter compiler can still pre-compute even more expressions than the Standard test dictates, but if they fail the test, they can't be marked constexpr.
Note: despite the below, I admit to liking the idea of making constexpr the default. But you asked why it wasn't already done, so to answer that I will simply elaborate on mattnewport's last comment:
Consider the situation today. You're trying to use some function from the standard library in a context that requires a constant expression. It's not marked as constexpr, so you get a compiler error. This seems dumb, since "clearly" the ONLY thing that needs to change for this to work is to add the word constexpr to the definition.
Now consider life in the alternate universe where we adopt your proposal. Your code now compiles, yay! Next year you decide to add Windows support to whatever project you're working on. How hard can it be? You'll compile using Visual Studio for your Windows users and keep using gcc for everyone else, right?
But the first time you try to compile on Windows, you get a bunch of compiler errors: this function can't be used in a constant expression context. You look at the code of the function in question and compare it to the version that ships with gcc. It turns out that they are slightly different: the version that ships with gcc meets the technical requirements for constexpr by sheer accident, while the one that ships with Visual Studio does not meet those requirements, again by sheer accident. Now what?
No problem you say, I'll submit a bug report to Microsoft: this function should be fixed. They close your bug report: the standard never says this function must be usable in a constant expression, so we can implement however we want. So you submit a bug report to the gcc maintainers: why didn't you warn me I was using non-portable code? And they close it too: how were we supposed to know it's not portable? We can't keep track of how everyone else implements the standard library.
Now what? No one did anything really wrong. Not you, not the gcc folks, nor the Visual Studio folks. Yet you still end up with non-portable code and are not a happy camper at this point. All else being equal, a good language standard will try to make this situation as unlikely as possible.
And even though I used an example of different compilers, it could just as well happen when you try to upgrade to a newer version of the same compiler, or even try to compile with different settings. For example: the function contains an assert statement to ensure it's being called with valid arguments. If you compile with assertions disabled, the assertion "disappears" and the function meets the rules for constexpr; if you enable assertions, then it doesn't meet them. (This is less likely these days now that the rules for constexpr are very generous, but was a bigger issue under the C++11 rules. But in principle the point remains even today.)
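A sketch of that last situation (assuming C++14 or later, where a typical assert is usable inside a constexpr function):

#include <cassert>

constexpr int half(int x) {
    assert(x % 2 == 0);   // expands to nothing when NDEBUG is defined
    return x / 2;
}

int buffer[half(6)];      // OK either way; half(5) here would only compile with assertions disabled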
Lastly we get to the admittedly minor issue of error messages. In today's world, if I try to do something like stick a cout statement in a constexpr function, I get a nice simple error right away. In your world, we would have the same situation that we have with templates: deeply nested errors going all the way to the very bottom of the implementation of output streams. Not fatal, but surely annoying.
This is a year and a half late, but I still hope it helps.
As Ben Voigt points out, compilers are already allowed to evaluate anything at compile time under the as-if rule.
What constexpr also does is lay out clear rules for expressions that can be used in places where a compile time constant is required. That means I can write code like this and know it will be portable:
constexpr int square(int x) { return x * x; }
...
int a[square(4)] = {};
...
Without the keyword and clear rules in the standard I'm not sure how you could specify this portably and provide useful diagnostics on things the programmer intended to be constexpr but don't meet the requirements.
When I first heard about it, it sounded like a great feature: a C++ REPL. However, it cannot call STL functions or methods, and it has a whole lot of other issues. This question also applies to conditional breakpoints.
Is it still an experimental feature, or have the developers just dropped it?
Example:
(lldb) p iterator->aField
error: call to a function 'std::__1::__wrap_iter<aClass const*>::operator->() const' ('_ZNKSt3__111__wrap_iterIPK8aClassEptEv') that is not present in the target
error: 0 errors parsing expression
error: The expression could not be prepared to run in the target
At present, there's no good way for the debugger to generate methods of template specializations for which the compiler only emitted inlined versions. And the debugger can't call inlined methods.
Here's one limited trick (though it requires C++11) that you can use to force the compiler to generate full copies of the relevant template class so that there are functions the debugger can call. For instance, if I put:
template class std::vector<int>;
in my source code somewhere, the compiler will generate real copies of all the functions in the int specialization of std::vector. This obviously isn't a full solution, and you should only do it in debug builds or it will bloat your code. But when there are a couple of types whose methods you really need to call, it's a useful trick to know.
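For example, a sketch of keeping the instantiation out of release builds:

#include <vector>

#ifndef NDEBUG
// Explicit instantiation: emits out-of-line definitions of every member of
// std::vector<int>, giving the debugger real functions to call.
template class std::vector<int>;
#endif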
You mention a "whole lot of other issues". Please file bugs on any expression parser issues you find in lldb, either with the lldb bugzilla: https://llvm.org/bugs, or Apple's bug reporter: http://bugreporter.apple.com. The expression parser is under active development.
I think my compiler understands C++11, but maybe not. Instead of trying it on existing messes of source code, which are buggy anyway, is there some simple "hello world"-level snippet of source code I can try to compile which, if it compiles without error, proves the compiler is treating it as C++11?
Try this one,
auto f = [](){};
or write some code using rvalue references.
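Combining the two, a minimal sketch that should only compile in C++11 mode (exact behavior depends on how complete the compiler's C++11 support is):

#include <utility>

auto f = [](int&& x) { return std::move(x); };  // auto, a lambda, and an rvalue reference

int main() { return f(0); }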
Shortest thing possible:
[]{};
It's a lambda-expression without an argument list.
The problem is that compilers usually don't support a new standard completely from the start, meaning they may support one C++11 feature but not another.
However, as far as C++11 is concerned, I think VC++ is the only major compiler that doesn't fully support it, though you may have to enable C++11 mode manually. For g++, for example, you have to supply the compiler flag -std=c++11 (or -std=gnu++11); the same holds true for newer standards like C++14.
Now that "static_assert" is a keyword in C++0x, I thought it would be logical to replace the C "assert" macro with an "assert" keyword too.
static_assert is interpreted at compile time, so it has to be a keyword for the compiler to process it.
assert doesn't need to be a keyword, and it doesn't make much sense to make it one, since there are many ways a program might want to respond to assertion success or failure. Therefore, it makes more sense to implement it in a library, and it is typically implemented as a macro.
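A rough sketch of that idea, using a hypothetical MY_ASSERT (the real assert in <cstdio> and <cassert> works along these lines):

#include <cstdio>
#include <cstdlib>

#ifdef NDEBUG
#define MY_ASSERT(cond) ((void)0)
#else
#define MY_ASSERT(cond) \
    ((cond) ? (void)0 \
            : (std::fprintf(stderr, "assertion failed: %s (%s:%d)\n", \
                            #cond, __FILE__, __LINE__), std::abort()))
#endif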
assert has no compile-time meaning, except during preprocessing. The preprocessor has no knowledge of the C++ language, so a keyword makes no sense.
By contrast, static_assert is evaluated at compile time. Making it a keyword makes more sense in that regard. The compiler cares about its existence.
There are also historical reasons: it was not a keyword in C, and making it one in C++ would have caused existing assert macros to result in undefined behavior.
Basically, because it doesn't need it. Existing assertion mechanisms for run-time assertions are perfectly good and don't require language support.
The other answers give some possible answers to your question, but a recent proposal indicates that assert may indeed become a keyword in C++17: https://isocpp.org/files/papers/N4154.pdf
assert can be implemented in a library, static_assert cannot. So static_assert gets a keyword because it needs language support, and assert doesn't.
This cannot be done, for compatibility with code already written in C that uses assert as a variable name. Hence, as Oli mentioned, such code would no longer compile, since assert would no longer be a macro.
In C++0x (from here):
In C++0x, static assertions can be declared to detect and diagnose common usage errors at compile time.
This is the static_assert syntax:
static_assert ( constant-expression , string-literal ) ;
where constant-expression must be contextually convertible to bool. If it converts to false, the compiler emits an error using the string-literal as the message.
So, this is basically an extension of the language that needs a keyword. It is not a runtime mechanism.
Again from the document linked above:
The addition of static assertions to the C++ language has the following benefits:
Libraries can detect common usage errors at compile time.
Implementations of the C++ Standard Library can detect and diagnose common usage errors, improving usability.
You can use a static_assert declaration to check important program invariants at compile time.
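For example, minimal sketches of the library and invariant cases:

#include <climits>

// A program-wide invariant checked at compile time.
static_assert(CHAR_BIT == 8, "this code assumes 8-bit bytes");

// A library catching a usage error at compile time.
template <typename T>
struct small_buffer {
    static_assert(sizeof(T) <= 16, "T is too large for small_buffer");
    T value;
};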
I would like to do this for usages which may be inefficient but not necessarily incorrect.
No.
An assertion failure indicates a problem preventing the program from being completed (be that execution [run-time assertions], or compilation [static assertions]).
In truth, an implementation is allowed to do anything as long as it emits a diagnostic (including continuing execution). But in practice, mainstream toolchains will all behave pretty much the same: they will error out. You certainly can't hack them into doing something user-defined.
Attributes were introduced in C++0x for that purpose. See http://docwiki.embarcadero.com/RADStudio/en/C%2B%2B0x_attribute_deprecated for an example.
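For example, a sketch using the [[deprecated]] spelling (standardized later, in C++14; earlier compilers offered similar vendor-specific attributes): calls still compile, but the compiler warns instead of erroring out.

[[deprecated("inefficient; prefer fast_lookup")]]
int slow_lookup(int key);

int slow_lookup(int key) { return key; }   // hypothetical implementation

int main() { return slow_lookup(42); }     // warning: 'slow_lookup' is deprecated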
Not in the Standard, no. You can find #warning in many compilers, but that's really not the same thing in most situations.