Can someone explain to me why the following code compiles? Is it ignored by the compiler?
#include <stdio.h>
int main() {
1234;
return 0;
}
The Standard obliges implementers to allow statements even with no apparent effect. This is mainly because through the magic of macros and templates, they're surprisingly easy to come up with.
There is nothing wrong with this code. It's completely legal. It doesn't do anything, but it's completely legal. Your compiler -- with the right warning settings -- may warn you that it's utterly useless, but it's completely compliant.
A good compiler will give you a warning that you have a statement with no side effects (effectively a null statement); however, null statements are allowed in C/C++, so there will be no compile error.
You can think of the statement 1234; as similar to the statement getchar(); in that both statements "return" (evaluate to) a value, but nothing is done with the return value. The getchar() call has the side effect of consuming a character from standard input, so you're more likely to see that in a program than a bare number. But both are legal.
DeadMG has a good note on why it's a good idea to allow this. It's not because 1234 might be defined as a macro (because as far as I know, that's not allowed). It's because, especially with more complex macros, it's easy to end up with a macro that might reduce to some statement that doesn't do anything.
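A minimal sketch of the kind of macro I mean (my own example, not DeadMG's): a tracing macro that, in a release build, leaves its argument behind as a bare expression statement with no effect.
#include <cstdio>
#ifdef NDEBUG
#define TRACE(x) (x)                      // release build: the argument is left as a bare expression
#else
#define TRACE(x) std::printf("%d\n", (x)) // debug build: actually prints the value
#endif
int main() {
    TRACE(1234);   // with NDEBUG defined, this expands to (1234); -- essentially the statement from the question
    return 0;
}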
In C (and thus C++), an expression is a statement and is evaluated for its side effects even if the result is discarded. If it doesn't have any side effects, the compiler may notice and optimize it away (very likely in your case), but it must still compile the code.
Of course, any compiler will warn about that if that warning isn't disabled explicitly.
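As a small illustration of that rule (my own sketch, not part of the original answer): all of the following are valid expression statements, but only the ones with side effects have to be kept, and the effect-free ones are what -Wall typically flags.
int main() {
    int x = 0;
    1234;     // no side effect: typically warned about and optimized away
    x + 1;    // computed and discarded: same treatment
    ++x;      // has a side effect (modifies x), so it must be preserved
    return x;
}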
Turn on warnings.
Set warnings to be treated like errors (as they usually are).
Now it will behave as you would expect:
> cat t.cpp
int main() {
1234;
return 0;
}
> g++ t.cpp -Wall -Wextra -pedantic -Werror
cc1plus: warnings being treated as errors
t.cpp: In function ‘int main()’:
t.cpp:2: warning: statement has no effect
It's just that the default compiler settings are lax.
Because 1234 is a constant, the compiler lets you get away with it. Replacing it with x (without declaring a variable x) or with arbitrary text like This doesn't compile would cause it to fail.
Essentially it is a statement with no effect, so no harm, no foul: the compiler can simply discard it and keep going.
Suppose we have code like this:
int check(){
int x = 5;
++x; /* line 1.*/
return 0;
}
int main(){
return check();
}
If line 1 is commented out and the compiler is started with all warnings enabled, it emits:
warning: unused variable ‘x’ [-Wunused-variable]
However, if we un-comment line 1, i.e. increment x, then no warning is emitted.
Why is that? Incrementing the variable is not really using it.
This happens in both GCC and Clang, for both C and C++.
Yes.
x++ is the same as x = x + 1;, an assignment. When you assign to something, you cannot avoid using it; the result is not discarded.
Also, from the online GCC manual, regarding the -Wunused-variable option:
Warn whenever a local or static variable is unused aside from its declaration.
So, when you comment out the x++;, it satisfies the condition for generating and emitting the warning message. When you uncomment it, the usage is visible to the compiler (the "usefulness" of this particular "usage" is questionable, but it's a usage nonetheless) and no warning is emitted.
With the preincrement you are incrementing and assigning the value to the variable again. It is like:
x = x + 1;
As the gcc documentation says:
-Wunused-variable:
Warn whenever a local or static variable is unused aside from its declaration.
If you comment out that line, you are not using the variable aside from the line in which you declare it.
Incrementing the variable is not really using it.
Sure this is using it. It's doing a read and a write access on the stored object. This operation doesn't have any effect in your simple toy code, and the optimizer might notice that and remove the variable altogether. But the logic behind the warning is much simpler: warn iff the variable is never used.
This has actually the benefit that you can silence that warning in cases where it makes sense:
void someCallback(void *data)
{
(void)data; // <- this "uses" data
// [...] handler code that doesn't need data
}
Why is that? Incrementing the variable is not really using it.
Yes, it is really using it. At least from the language point of view. I would hope that an optimizer removes all trace of the variable.
Sure, that particular use has no effect on the rest of the program, so the variable is indeed redundant. I would agree that a warning in this case would be helpful. But that is not the purpose of the unused-variable warning you mention.
However, consider that analyzing whether a particular variable has any effect on the execution of the program in general is quite difficult. There has to be a point where the compiler stops checking whether a variable is actually useful. It appears that the warning stages of the compilers you tested only check whether the variable is used at least once. That once was the increment operation.
I think there is a misconception about the word 'using' and what the compiler means by it. When you have ++i you are not only accessing the variable, you are also modifying it, and AFAIK this counts as a 'use'.
There are limitations to what the compiler can identify about 'how' variables are being used and whether the statements make any sense. In fact, both Clang and GCC try to remove unnecessary statements, depending on the -O flag (sometimes too aggressively). But these optimizations happen without warnings.
Detecting a variable that is never accessed at all (no further statement mentioning that variable) is rather easy, though.
I agree with you: it could generate a warning about this. I think it doesn't, because the developers of the compilers simply haven't bothered handling this case (yet). Maybe it is too complicated to do. But maybe they will do it in the future (hint: you can suggest this warning to them).
Compilers are getting more and more warnings. For example, there is -Wunused-but-set-variable in GCC (a "new" warning, introduced in GCC 4.6 in 2011), which warns about this:
void fn() {
int a;
a = 2;
}
So it is completely reasonable to expect that this emits a warning too (there is nothing different here; neither piece of code does anything useful):
void fn() {
int a = 1;
a++;
}
Maybe they could add a new warning, like -Wmeaningless-variable
As per the C standard ISO/IEC 9899:201x, expression evaluation is always carried out so that the expression's side effects are produced, unless the compiler can deduce that removing it does not alter the program's execution.
5.1.2.3 Program execution
In the abstract machine, all expressions are evaluated as specified by the semantics. An actual implementation need not evaluate part of an expression if it can deduce that its value is not used and that no needed side effects are produced (including any caused by calling a function or accessing a volatile object).
When removing the line
++x;
The compiler can deduce that the local variable x is defined and initialized, but not used.
When you add it, the expression itself can be considered a void expression, that must be evaluated for side effects, as stated in:
6.8.3 Expression and null statements
The expression in an expression statement is evaluated as a void expression for its side effects.
On the other hand, to silence compiler warnings about unused variables, it is very common to cast the expression to void. E.g. for an unused parameter in a function you can write:
int MyFunc(int unused)
{
(void)unused;
...
return a;
}
In this case we have a void expression that references the symbol unused.
I found this in one of my libraries this morning:
static tvec4 Min(const tvec4& a, const tvec4& b, tvec4& out)
{
tvec3::Min(a,b,out);
out.w = min(a.w,b.w);
}
I'd expect a compiler error because this method doesn't return anything, and the return type is not void.
The only two things that come to mind are
In the only place where this method is called, the return value isn't being used or stored. (This method was supposed to be void - the tvec4 return type is a copy-and-paste error)
a default constructed tvec4 is being created, which seems a bit unlike, oh, everything else in C++.
I haven't found the part of the C++ spec that addresses this. References (ha) are appreciated.
Update
In some circumstances, this generates an error in VS2012. I haven't narrowed down specifics, but it's interesting, nonetheless.
This is undefined behavior, per the C++11 draft standard, section 6.6.3 The return statement, paragraph 2, which says:
[...] Flowing off the end of a function is equivalent to a return with no value; this results in undefined behavior in a value-returning function. [...]
This means that the compiler is not obligated to provide an error or even a warning, usually because it can be difficult to diagnose in all cases. We can see this from the definition of undefined behavior in the draft standard in section 1.3.24, which says:
[...]Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message).[...]
Although in this case we can get both gcc and clang to generate a warning using the -Wall flag, which gives me a warning similar to this:
warning: control reaches end of non-void function [-Wreturn-type]
We can turn this particular warning into an error using the -Werror=return-type flag. I also like to use -Wextra -Wconversion -pedantic for my own personal projects.
As ComicSansMS mentions, in Visual Studio this code would generate C4716, which is an error by default; the message I see is:
error C4716: 'Min' : must return a value
and in the case where not all code paths would return a value then it would generate C4715, which is a warning.
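A quick sketch of the difference between the two diagnostics (my own illustration, not from the original answer), assuming MSVC's default settings:
int never_returns() {            // no return statement at all:
}                                // C4716 'never_returns' must return a value (error by default)
int sometimes_returns(int x) {   // only some paths return a value:
    if (x > 0)
        return x;
}                                // C4715 not all control paths return a value (warning)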
Maybe some elaboration on the why part of the question:
As it turns out, it is actually quite hard† for a C++ compiler to determine whether a function exits without a return value. In addition to the code paths that end in explicit return statements and the ones that fall off the end of the function, you also have to consider potential exception throws or longjmps in the function itself, as well as all of its callees.
While it is quite easy for a compiler to identify a function that looks like it might be missing a return, it is considerably harder to prove that it is missing a return. In order to relieve compiler vendors of this burden, the standard does not require this to generate an error.
So compiler vendors are free to generate a warning if they are quite sure that a function is missing a return and the user is then free to ignore/mask that warning in those rare cases where the compiler was actually wrong.
†: In the general case, this is equivalent to the halting problem, so it is actually impossible for a machine to decide this reliably.
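To make that concrete, here is a hedged sketch (my example, not from the answer above): whether the fall-through path below is ever taken depends entirely on a callee the compiler may not even be able to see.
bool validate_or_throw(int x);   // hypothetical helper defined in another translation unit;
                                 // in practice it throws on bad input and otherwise returns true
int square_checked(int x) {
    if (validate_or_throw(x))
        return x * x;
}   // reachable only if the helper returns false, which the author "knows" never
    // happens; the compiler cannot prove that, so the best it can do is warn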
Compile your code with -Wreturn-type option:
$ g++ -Wreturn-type source.cpp
This will give you warning. You can turn the warning into error if you use -Werror too:
$ g++ -Wreturn-type -Werror source.cpp
Note that this will turn all warnings into errors. So if you want error for specific warning, say -Wreturn-type, just type return-type without -W part as:
$ g++ -Werror=return-type source.cpp
In general you should always use the -Wall option, which includes the most common warnings, including the one for a missing return statement. Along with -Wall, you can also use -Wextra, which includes other warnings not included by -Wall.
Maybe some additional elaboration on the why part of the question.
C++ was designed so that a very large body of pre-existing C code would compile with a minimal amount of changes. Unfortunately, C itself was paying a similar duty to the earliest pre-standard C, which did not even have the void keyword and instead relied on a default return type of int. C functions usually did return values, and whenever code superficially similar to Algol/Pascal/Basic procedures was written without any return statements, the function was, under the hood, returning whatever garbage was left on the stack. Neither the caller nor the callee assigns the value of that garbage in a reliable way. If the garbage is then ignored by every caller, everything is fine, and C++ inherits the moral obligation to compile such code.
(If the returned value is used by the caller, the code may behave non-deterministically, similar to processing of an uninitialized variable. Could the difference be reliably identified by a compiler, in a hypothetical successor language to C? This is hardly possible. The caller and the callee may be in different compilation units.)
The implicit int is just a part of the C legacy involved here. A "dispatcher" function might, depending on a parameter, return a variety of types from some code branches, and return no useful value from other code branches. Such a function would generally be declared to return a type long enough to hold any of the possible types and the caller might need to cast it or extract it from a union.
So the deepest cause is probably the C language creators' belief that procedures that do not return any value are just an unimportant special case of functions that do; this problem got aggravated by the lack of focus on type safety of function calls in the oldest C dialects.
While C++ did break compatibility with some of the worst aspects of C (example), the willingness to compile a return statement without a value (or the implicit value-less return at the end of a function) was not one of them.
As already mentioned, this is undefined behavior and will give you a compiler warning. Most places I've worked require you to turn on compiler settings to treat warnings as errors - which enforces that all your code must compile with 0 errors and 0 warnings. This is a good example of why that is a good idea.
This is more a consequence of the standard C++ rules, which tend to be flexible about such things and stay close to C.
But when we talk about the compilers, GCC or VS, they are built for professional use and a wide variety of development purposes, and hence let you apply stricter development rules as your needs require.
That makes sense too, in my opinion, because the language is about providing features and defining their usage, whereas the compiler lets you choose the optimal and strictest way of using them for your needs.
As mentioned in the post above, the compiler sometimes gives an error, sometimes gives a warning, and also has options for suppressing those warnings, reflecting the freedom to use the language and its features in a way that suits us best.
Along with this there are several other questions mentioning this behaviour of returning a result without having a return statement. One simple example would be:
#include <stdio.h>
int foo(int a, int b){ int c = a+b; }  /* no return statement, so using foo's result is undefined */
int main(){
int c = 5;
int d = 5;
printf("f(%d,%d) is %d\n", c, d, foo(c,d));
return 0;
}
Could this anomaly be due to stack properties, and more specifically:
Zero-Address Machines
In zero-address machines, locations of both operands are assumed to be at a default location. These machines use the stack as the source of the input operands, and the result goes back onto the stack. Stack is a LIFO (last-in, first-out) data structure that all processors support, whether or not they are zero-address machines. As the name implies, the last item placed on the stack is the first item to be taken out of the stack. All operations on this type of machine assume that the required input operands are the top two values on the stack. The result of the operation is placed on top of the stack.
In addition to that, when accessing memory to read and write data, the same registers are used as data source and destination (the DS (data segment) register), first storing the variables needed for the calculation and then the returned result.
Note:
with this answer I would like to discuss one possible explanation of the strange behaviour at the machine (instruction) level, since the language-level side has already been covered adequately in the other answers.
Unlike Java, in C/C++ the following is allowed:
int* foo ()
{
if(x)
return p;
// What if control reaches here?
}
This often causes crashes and hard-to-debug problems. Why doesn't the standard enforce a final return for non-void functions? (Compilers do generate an error for a wrong return value.)
Is there a flag in GCC or MSVC to enforce this? (something like -Wunused-result)
It is not allowed (undefined behaviour). However, the standard does not require a diagnostic in this case.
The standard doesn't require the last statement to be return because of code like this:
while (true) {
if (condition) return 0;
}
This always returns 0, but a dumb compiler cannot see it. Note that the standard does not mandate smart compilers. A return statement after the while block would be a waste which a dumb compiler would not be able to optimise out. The standard does not want to require the programmer to write waste code just to satisfy a dumb compiler.
g++ -Wall is smart enough to emit a diagnostic on my machine.
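Spelled out as a complete function (my framing of the example above): the only way out of the loop is the return inside it, so the function never falls off the end, yet a compiler is not required to prove that.
int wait_until(bool (*condition)()) {
    while (true) {
        if (condition())
            return 0;
    }
    // return 0;   // would be dead code, written only to appease a weak compiler
}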
Use the -Wall flag in GCC.
warning: control reaches end of non-void function
Or more specifically, -Wreturn-type.
My guess: because sometimes the programmer knows better than the compiler. With this simple example, it's clear that something is wrong, but consider a switch over many values, or many checks in general. You, as the coder, know that certain values just will not be passed in to the function, but the compiler doesn't, and can only hint that there might be something wrong.
#include <iostream>
int foo(){
if(false)
return 5;
}
int main(){
int i = foo();
std::cout << i;
}
Note that even warning level 1 on MSVC gives the following warning:
warning C4715: 'foo' : not all control paths return a value
You can convert the warning into an error by using the following compiler options
-Wreturn-type -Werror=return-type.
The obvious answer is: because it's not an error. It's only an error if
x is false and if the caller uses the return value, neither of which
can necessarily be determined by the compiler, at least in the general
case.
In this particular case (returning a pointer), it wouldn't be too
difficult to require a return for all paths; Java does this. In
general, however, it's not reasonable in C++ to require this, since in
C++, you can return user defined types for which it may be impossible to
construct a value (no default constructor, etc.) So we have the
situation where the programmer might not be able to provide a return
in a branch that he or she knows can't be taken, and the compiler can't
determine that the branch can't be taken.
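A hedged sketch of that situation (my own types, not from the answer): there is no cheap dummy value to return from the branch the programmer knows is impossible.
struct Widget {
    explicit Widget(int id) : id(id) {}   // no default constructor
    int id;
};
Widget lookup(int key) {
    if (key >= 0)
        return Widget(key);
    // The author knows negative keys never reach this function, but there is no
    // trivially constructible Widget to return here just to satisfy a
    // hypothetical "every path must return" rule.
}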
Most compilers will warn in such cases, when it can determine the flow.
All of the ones I've seen also warn in some cases where it's clearly
impossible to fall off the end, however. (Both g++ and VC++ warn about:
int
bar( char ch )
{
switch ( ch & 0xC0 ) {
case 0x00:
case 0x40:
return 0;
case 0x80:
return -1;
case 0xC0:
return 1;
}
}
, at least with the usual options. Although it's quite clear that this
function never falls off the end.)
As far as I remember, Visual Studio 2008 warns you about an "execution path that does not have a return value". It is allowed in the sense that "C++ won't stop you from shooting yourself in the foot". So you are the one who has to think, not the compiler.
What the standard says about this kind of programming is that it produces undefined behavior.
Undefined behavior is the joy and pity of C/C++, but it is also a fundamental feature of the language design that allows for many of the low-level optimizations that make C a sort of "high level assembler" (it is not, actually, but just to give you an idea).
So, while deferring to John's answer about the switches to use with GCC, as to "why" the standard does not prevent this, I would point to a very interesting analysis of undefined behavior and all of its mysteries: What Every C Programmer Should Know About Undefined Behavior. It makes for very instructive reading.
Okay, little oddity I discovered with my C++ compiler.
I had a not-overly complex bit of code to refactor, and I accidentally managed to leave in a path that didn't have a return statement. My bad. On the other hand, this compiled, and segfaulted when I ran it and that path was hit, obviously.
Here's my question: Is this a compiler bug, or is there no guarantee that a C++ compiler will enforce the need for a return statement in a non-void return function?
Oh, and to be clear, in this case it was an unnecessary if statement without an accompanying else. No gotos, no exits, no aborts.
Personally I think this should be an error:
int f() {
}
int main() {
int n = f();
return 0;
}
but most compilers treat it as a warning, and you may even have to use compiler switches to get that warning. For example, on g++ you need -Wall to get:
[neilb#GONERIL NeilB]$ g++ -Wall nr.cpp
nr.cpp: In function 'int f()':
nr.cpp:2: warning: no return statement in function returning non-void
Of course, with g++ you should always compile with at least -Wall anyway.
There is no guarantee that a C++ compiler will enforce that. A C++ function could jump out of its control flow by mechanisms unknown to the compiler. Context switches when C++ is used to write an OS kernel is an example of that. An uncaught exception thrown by a called function (whose code isn't necessarily available to the caller) is another one.
Some other languages, like Java, explicitly enforce that, with knowledge available at compile time, all paths return a value. In C++ this isn't true, as with many other things in the language; accessing an array out of its bounds isn't checked either, for example.
The compiler doesn't enforce this because you have knowledge about what paths are practically possible that the compiler doesn't. The compiler typically only knows about that particular file, not others that may affect the flow inside any given function. So, it isn't an error.
In Visual Studio, though, it is a warning. And we should pay attention to all warnings.... right? :)
Edit:
There seems to be some discussion about when this could happen. Here's a modified but real example from my personal code library;
enum TriBool { Yes, No, Maybe };
TriBool GetResult(int input) {
if (TestOne(input)) {
return Yes;
} else if (TestTwo(input)) {
return No;
}
}
Bear with me, because this is old code. Originally there was an "else return Maybe;" in there. :) If TestOne and TestTwo are in a different compilation unit, then when the compiler hits this code it cannot tell whether TestOne and TestTwo could both return false for a given input. You, as the programmer who wrote TestOne and TestTwo, know that if TestOne fails then TestTwo will succeed. Maybe there are side effects of those tests, so they have to be done. Would it be better to write it without the "else if"? Maybe. Probably. But the point is that this is legal C++, and the compiler can't know whether it is possible to exit without a return statement. It is, I agree, ugly and not good coding, but it is legal, and Visual Studio will give you a warning, but it will compile.
Remember that C++ isn't about protecting you from yourself. It is about letting you do what your heart desires within the constraints of the language even if that includes shooting yourself in the foot.
Can't a compiler warn (or even better, raise an error) when it notices a statement with undefined/unspecified/implementation-defined behaviour?
Probably to flag a statement as an error the standard would have to say so, but it could warn the coder at least. Are there any technical difficulties in implementing such an option? Or is it simply impossible?
The reason I ask is that in a statement like a[i] = ++i;, shouldn't the compiler be able to notice that the code references and modifies the same variable in the same statement, before a sequence point is reached?
It all boils down to
Quality of Implementation: the more accurate and useful the warnings are, the better it is. A compiler that always printed "This program may or may not invoke undefined behavior" for every program, and then compiled it, would be pretty useless, but would be standards-compliant. Thankfully, no one writes compilers like that :-).
Ease of determination: a compiler may not be easily able to determine undefined behavior, unspecified behavior, or implementation-defined behavior. Let's say you have a call stack that's 5 levels deep, with a const char * argument being passed from the top level to the last function in the chain, and the last function calls printf() with that const char * as the first argument. Do you want the compiler to check that const char * to make sure it is correct? (Assuming that the first function uses a literal string for that value.) How about when the const char * is read from a file, but you know that the file will always contain valid format specifiers for the values being printed? (See the sketch just after this list.)
Success rate: A compiler may be able to detect many constructs that may or may not be undefined, unspecified, etc.; but with a very low "success rate". In that case, the user doesn't want to see a lot of "may be undefined" messages—too many spurious warning messages may hide real warning messages, or prompt a user to compile at "low-warning" setting. That is bad.
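Here is a shortened sketch of that second scenario (identifiers are mine): by the time the string reaches printf(), the compiler can no longer see both the format literal and the arguments at the same call site, so -Wformat has nothing to check.
#include <cstdio>
void level3(const char* fmt, int value) { std::printf(fmt, value); }  // fmt is not a literal here
void level2(const char* fmt, int value) { level3(fmt, value); }
void level1(const char* fmt, int value) { level2(fmt, value); }
int main() {
    level1("%s\n", 42);   // mismatched specifier, but no single call site shows the
    return 0;             // compiler both the literal and the printf() that consumes it
}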
For your particular example, gcc gives a warning about "may be undefined". It even warns for printf() format mismatch.
But if your hope is for a compiler that issues a diagnostic for all undefined/unspecified cases, it is not clear if that should/can work.
Let's say you have the following:
#include <stdio.h>
void add_to(int *a, int *b)
{
*a = ++*b;
}
int main(void)
{
int i = 42;
add_to(&i, &i); /* bad */
printf("%d\n", i);
return 0;
}
Should the compiler warn you about *a = ++*b; line?
As gf says in the comments, a compiler cannot check across translation units for undefined behavior. A classic example is declaring a variable as a pointer in one file and defining it as an array in another; see comp.lang.c FAQ 6.1.
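A two-file sketch of that classic mismatch (illustration only; the file and identifier names are mine): each file compiles cleanly on its own, and no single translation unit gives the compiler enough information to complain.
// file1.cpp
char buf[100];        // defined as an array of char
// file2.cpp
extern char* buf;     // declared as a pointer to char: undefined behaviour once used,
                      // because the linker happily binds the two declarations together
int first_char() {
    return buf[0];    // reads the array's first bytes as if they were a pointer value
}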
Different compilers trap different conditions; most compilers have warning-level options. GCC specifically has many, but -Wall -Werror will switch on most of the useful ones and coerce them into errors. Use /W4 /WX for similar protection in VC++.
In GCC you could use -ansi -pedantic, but -pedantic is exactly what it says, and it will throw up many irrelevant issues and make it hard to use much third-party code.
Either way, because compilers catch different errors, or produce different messages for the same error, it is therefore useful to use multiple compilers, not necessarily for deployment, but as a poor-man's static analysis. Another approach for C code is to attempt to compile it as C++; the stronger type checking of C++ generally results in better C code; but be sure that if you want C compilation to work, don't use the C++ compilation exclusively; you are likely to introduce C++ specific features. Again this need not be deployed as C++, but just used as an additional check.
Finally, compilers are generally built with a balance of performance and error checking; to check exhaustively would take time that many developers would not accept. For this reason static analysers exist, for C there is the traditional lint, and the open-source splint. C++ is more complex to statically analyse, and tools are often very expensive. One of the best I have used is QAC++ from Programming Research. I am not aware of any free or open source C++ analysers of any repute.
gcc does warn in that situation (at least with -Wall):
#include <stdio.h>
int main(int argc, char *argv[])
{
int a[5];
int i = 0;
a[i] = ++i;
printf("%d\n", a[0]);
return 0;
}
Gives:
$ make
gcc -Wall main.c -o app
main.c: In function ‘main’:
main.c:8: warning: operation on ‘i’ may be undefined
Edit:
A quick read of the man page shows that -Wsequence-point will do it, if you don't want -Wall for some reason.
On the contrary, compilers are not required to issue any sort of diagnostic for undefined behavior:
§1.4.1:
The set of diagnosable rules consists of all syntactic and semantic rules in this International Standard except for those rules containing an explicit notation that “no diagnostic is required” or which are described as resulting in “undefined behavior.”
Emphasis mine. While I agree it may be nice, compilers have enough problems trying to be standards-compliant, let alone teaching the programmer how to program.
GCC warns as much as it can when you do something outside the norms of the language while still being syntactically correct, but beyond a certain point you have to be informed enough yourself.
You can call GCC with the -Wall flag to see more of that.
If your compiler won't warn of this, you can try a Linter.
Splint is free, but only checks C: http://www.splint.org/
Gimpel Lint supports C++ but costs US $389 - maybe your company can be persuaded to buy a copy? http://www.gimpel.com/