Removing standard features from C++

Is it possible to remove standard features from C++? For example, could I somehow "undefine" part of the language, such as classes:
class someclass
{
    someclass() {
        // Whatever
    }
};
and then get an error like "'class': undeclared identifier"?

Not that I recommend it, but you can use preprocessor macros to undefine anything you want.
If you compile the program below, you will get lots of errors.
#define vector
#include <vector>

int main()
{
    vector<int> a;
}
You don't need the #define vector in the file. You can define it in the command used to invoke the compiler.
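For example, with GCC or Clang (assuming a GCC-style driver that accepts -D), the same effect can be achieved with:
g++ -Dvector= main.cpp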
Note that using the above trick makes your program subject to undefined behavior:
From the C++11 Standard:
17.6.4.3 Reserved names
...
2 If a program declares or defines a name in a context where it is reserved, other than as explicitly allowed by this Clause, its behavior is undefined.
17.6.4.3.1 Macro names [macro.names]
1 A translation unit that includes a standard library header shall not #define or #undef names declared in any standard library header.
2 A translation unit shall not #define or #undef names lexically identical to keywords, to the identifiers listed in Table [tab:identifiers.special], or to the attribute-tokens described in [dcl.attr].


Can a declaration affect the std namespace?

#include <iostream>
#include <cmath>

/* Intentionally incorrect abs() which seems to override std::abs() */
int abs(int a) {
    return a > 0 ? -a : a;
}

int main() {
    int a = abs(-5);
    int b = std::abs(-5);
    std::cout << a << std::endl << b << std::endl;
    return 0;
}
I expected the output to be -5 and 5, but the output is -5 and -5.
Why does this happen? Does it have something to do with the use of std?
The language specification allows implementations to implement <cmath> by declaring (and defining) the standard functions in the global namespace and then bringing them into namespace std by means of using-declarations. It is unspecified whether this approach is used:
20.5.1.2 Headers
4 [...] In the C++ standard library, however, the declarations (except for names which are defined as macros in C) are within namespace scope (6.3.6) of the namespace std. It is unspecified whether these names (including any overloads
added in Clauses 21 through 33 and Annex D) are first declared within the global namespace scope and are then injected into namespace std by explicit using-declarations (10.3.3).
Apparently, you are dealing with one of the implementations that decided to follow this approach (e.g. GCC). That is, your implementation provides ::abs, while std::abs simply "refers" to ::abs.
One question that remains in this case is why, in addition to the standard ::abs, you were able to declare your own ::abs, i.e. why there's no multiple-definition error. This might be caused by another feature provided by some implementations (e.g. GCC): they declare standard functions as so-called weak symbols, thus allowing you to "replace" them with your own definitions.
These two factors together create the effect you observe: weak-symbol replacement of ::abs also results in replacement of std::abs. How well this agrees with the language standard is a different story... In any case, don't rely on this behavior - it is not guaranteed by the language.
In GCC this behavior can be reproduced by the following minimalistic example. One source file:
#include <iostream>
void foo() __attribute__((weak));
void foo() { std::cout << "Hello!" << std::endl; }
Another source file:
#include <iostream>
void foo();
namespace N { using ::foo; }
void foo() { std::cout << "Goodbye!" << std::endl; }
int main()
{
    foo();
    N::foo();
}
In this case you will also observe that the new definition of ::foo ("Goodbye!") in the second source file also affects the behavior of N::foo. Both calls will output "Goodbye!". And if you remove the definition of ::foo from the second source file, both calls will dispatch to the "original" definition of ::foo and output "Hello!".
The permission given by the above 20.5.1.2/4 is there to simplify implementation of <cmath>. Implementations are allowed to simply include C-style <math.h>, then redeclare the functions in std and add some C++-specific additions and tweaks. If the above explanation properly describes the inner mechanics of the issue, then a major part of it depends on replaceability of weak symbols for C-style versions of the functions.
Note that if we simply globally replace int with double in the above program, the code (under GCC) will behave "as expected" - it will output -5 5. This happens because the C standard library does not have an abs(double) function. By declaring our own abs(double), we do not replace anything.
But if, after switching from int to double, we also switch from abs to fabs, the original weird behavior will reappear in its full glory (output -5 -5).
This is consistent with the above explanation.
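For reference, the double variant described above looks like this (a sketch; as noted, whether it behaves this way depends on the implementation, and the output shown is what the answer reports for GCC):
#include <iostream>
#include <cmath>
/* Does not replace anything: the C standard library has no abs(double). */
double abs(double a) {
    return a > 0 ? -a : a;
}
int main() {
    std::cout << abs(-5.0) << std::endl       // our intentionally incorrect version: -5
              << std::abs(-5.0) << std::endl; // the standard function: 5
}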
Your code causes undefined behaviour.
C++17 [extern.names]/4:
Each function signature from the C standard library declared with external linkage is reserved to the implementation for use as a function signature with both extern "C" and extern "C++" linkage, or as a name of namespace scope in the global namespace.
So you cannot provide a function with the same prototype as the Standard C library function int abs(int), regardless of which headers you actually include or whether those headers also put C library names into the global namespace.
However, it would be allowed to overload abs if you provide different parameter types.
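For instance, an overload for a user-defined type does not collide with the reserved signature (a minimal sketch; the Fraction type here is just an illustration):
#include <cstdlib>

struct Fraction { int num, den; };

// OK: a different parameter type, so this does not clash with the
// reserved C library signature int abs(int).
Fraction abs(Fraction f) {
    if (f.num < 0) f.num = -f.num;
    if (f.den < 0) f.den = -f.den;
    return f;
}

int main() {
    Fraction g = abs(Fraction{-3, 4}); // calls our overload
    int i = std::abs(-5);              // calls the standard function
    (void)g; (void)i;
}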

Can a #define be assigned the result of defined?

Is the result of the following code defined in C or C++?
#define FOO
#define BAR defined(FOO)
#if BAR
int x = 1;
#else
int x = 2;
#endif
Using defined inside a macro that is then expanded in a conditional directive is undefined behavior.
From the C99 Standard, 6.10.1/4 | C++ Standard Working Draft (n4296), 16.1/4:
If the token defined is generated as a result of this replacement
process or use of the defined unary operator does not match one of the
two specified forms prior to macro replacement, the behavior is
undefined.
From the C99 Standard, 6.10.8/4 | C++ Standard Working Draft (n4296), 16.8/4:
None of these macro names, nor the identifier defined, shall be the
subject of a #define or a #undef preprocessing directive.
GNU CPP documentation, section 4.2.3:
If the defined operator appears as a result of a macro expansion, the
C standard says the behavior is undefined. GNU cpp treats it as a
genuine defined operator and evaluates it normally. It will warn
wherever your code uses this feature if you use the command-line
option ‘-pedantic’, since other compilers may handle it differently.
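If the goal is simply to factor the test out into a macro, one well-defined alternative is to make the macro expand to 0 or 1 up front, instead of to an expression containing defined (a sketch using the FOO and BAR names from the question):
#define FOO
#ifdef FOO
#define BAR 1
#else
#define BAR 0
#endif
#if BAR
int x = 1;
#else
int x = 2;
#endif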

C++ function declarations

I'm a newbie to C++. I don't understand why it is okay (i.e. why the compiler allows it) for one function to be declared twice. For example, the following code is legal:
#include <iostream>
#include <string>

int hello();
int hello();

int main() {
    std::cout << "hello, world" << std::endl;
}

int hello() {
    return 1;
}
Why does the compiler not complain?
In C and C++ forward declarations are very weak. They provide a formal "promise" to the compiler that if a function with a specified signature appears at all, it would have the signature that you specify. The function is not even guaranteed to appear: unless you call or otherwise reference the declared function, the compiler is not going to complain that there is a declaration with no definition. The standard requires compilers to treat identical forward declarations as a single declaration.
Unlike definitions, which must be unique according to the one-definition rule:
3.2 No translation unit shall contain more than one definition of any variable, function, class type, enumeration type, or template
declarations are merely required to refer to the same definition, i.e. be equivalent to each other:
3.3.4 Given a set of declarations in the same declarative region, each of which specifies the same unqualified name, they shall all refer to the same entity, or all refer to functions or function templates, [...]
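A minimal sketch of the difference, using the hello function from the question:
int hello();                  // declaration
int hello();                  // OK: identical redeclaration of the same entity
int hello() { return 1; }     // the one and only definition
// int hello() { return 2; }  // error: a second definition would violate the ODR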
Your doubt will be cleared up by the "One Definition Rule". It is defined in the ISO C++ Standard (ISO/IEC 14882:2003), section 3.2.
It states that:
In any translation unit, a template, type, function, or object can
have no more than one definition. Some of these can have any number of
declarations.
Read more about it on Wikipedia (http://en.wikipedia.org/wiki/One_Definition_Rule)

Can types in `cname` and `name.h` be different types?

Is this code standard conforming?
#include <stdio.h>
#include <cstdio>

int main() {
    FILE *f1 = 0;
    std::FILE *f2 = f1;
}
Explanation: The standard says [headers]:
[...] the contents of each header cname shall be the same as that of the corresponding header name.h [...] as if by inclusion. In the C++ standard library, however, the declarations [...] are within namespace scope (3.3.6) of the namespace std. It is unspecified whether these names are first declared within the global namespace scope and are then injected into namespace std by explicit using-declarations (7.3.3).
So in case they aren't injected by an explicit using-declaration, may they be different types? I don't think the "as if by inclusion" phrase is conclusive, since the other half of the text clearly contradicts this requirement by requiring that the names are within the std namespace.
Yes, that's standard conforming: FILE is declared in stdio.h, std::FILE in cstdio, and the two are the same because of the paragraph you cited.
(The only thing that's unspecified is whether, if you only include <cstdio>, you also have the same FILE* in the global namespace or not.)
Update: I believe that the types are actually the same on the nose, and that each type is declared only once and then injected into the other namespace via a using-declaration. The only thing that's unspecified is which one comes first. The corresponding opposite standard quote is D.5(2):
Every C header, each of which has a name of the form name.h, behaves as if each name placed in the standard library namespace by the corresponding cname header is placed within the global namespace scope. It is unspecified whether these names are first declared or defined within namespace scope (3.3.6) of the namespace std and are then injected into the global namespace scope by explicit using-declarations (7.3.3).
Basically, that means that two implementations are possible:
"C came first":
// foo.h
struct Gizmo { /* ... */ };
// cfoo
#include "foo.h"
namespace std { using ::Gizmo; }
"C++ with C-compatibility:
// cfoo
namespace std
{
    struct Gizmo { /* ... */ };
}

// foo.h
#include <cfoo>
using std::Gizmo;
I don't believe that paragraph says that they have to be identical. It is just a revision of the original (C++98) paragraph which said:
Every C header, each of which has a name of the form name.h behaves as if each name placed in the Standard library namespace by the corresponding cname header is also placed within the namespace scope of namespace std and is followed by an explicit using-declaration (7.3.3)
This was between hard and impossible to follow, because it conflicted with the existing real C headers on most systems. So, in C++11 the text was changed to the one you quote. It allows implementations to do it the other way round, like they have in practice done all along - use the existing system-provided C headers and import the names into namespace std.
However, there is another paragraph saying that whichever way the implementation does this, the names in the headers must mean the same thing:
For each type T from the Standard C library, the types ::T and std::T are reserved to the implementation and, when defined, ::T shall be identical to std::T. ([extern.types], 17.6.4.3.4)
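One way to see this requirement in action on a given implementation is a static_assert over std::is_same (a minimal sketch, requiring C++11):
#include <cstdio>
#include <stdio.h>
#include <type_traits>

// [extern.types] requires ::FILE and std::FILE to be identical when both are defined.
static_assert(std::is_same< ::FILE, std::FILE>::value,
              "::FILE and std::FILE are the same type");

int main() { }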
Yes, they can be different types. Use the C++ types; the C headers are only there for compatibility.
Consider: if, as the comment to the answer above suggests, the C++ header were implemented as namespace std { #include "stdio.h" }, then ::FILE and std::FILE would represent different types.

Defining a top-level no-op in C++?

Is the following legal according to the C++ standard? (If the answer differs from standard to standard, I would like to know that, too.)
#define VERY_OLD_COMPILER 1
#ifdef VERY_OLD_COMPILER
#define USING_NAMESPACE_STD enum { }
#else
#define USING_NAMESPACE_STD using namespace std
#endif
USING_NAMESPACE_STD;
int main(int argc, char *argv[]) {
// etc.
The goal is to define a macro that I can invoke at the top-level and follow with a semicolon, such that it has no effect. I am pretty sure stray semicolons at the top-level are not allowed (GCC complains about them, anyway), so simply defining an empty macro does not work.
Declaring an empty anonymous struct does not work because it needs a name, and I do not want to pollute the namespace.
Does an anonymous empty enum declaration (enum { }) do the trick? It works on all of the compilers I have tried, but of course that is not the same thing as being permitted by the spec.
Any other ideas/comments welcome. Well, anything other than "throw out that compiler". Believe me, I would love to.
Looking at the latest public C++0x draft, it seems semicolons at top-level are allowed and ignored.
The grammar treats a translation-unit as a sequence of declarations, and amongst the various kinds of declarations there is an empty-declaration that is just a simple semicolon.
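A minimal illustration (accepted by a conforming C++11 compiler; older compilers may reject it):
;              // empty-declaration at namespace scope
int main() { }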
Pragmatic solution: considering that your VERY_OLD_COMPILER constant suggests that the whole thing is part of a workaround for an older compiler, I'd just pick a solution that works with this compiler, be it standardised or not (note that __LINE__ has to go through an extra macro so it is expanded before token pasting):
#define USNS_CONCAT_(a, b) a##b
#define USNS_CONCAT(a, b) USNS_CONCAT_(a, b)
#define USING_NAMESPACE_STD static int USNS_CONCAT(dummy, __LINE__)
An empty enum is in fact illegal according to C++03:
7/3 Declarations:
In a simple-declaration, the optional init-declarator-list can be omitted only when declaring a class (clause 9) or enumeration (7.2), that is, when the decl-specifier-seq contains either a class-specifier, an elaborated-type-specifier with a class-key (9.1), or an enum-specifier. In these cases and whenever a class-specifier or enum-specifier is present in the decl-specifier-seq, the identifiers in these specifiers are among the names being declared by the declaration (as class-names, enum-names, or enumerators, depending on the syntax). In such cases, and except for the declaration of an unnamed bit-field (9.6), the decl-specifier-seq shall introduce one or more names into the program, or shall redeclare a name introduced by a previous declaration.
[Example:
enum { }; // ill-formed
typedef class { }; // ill-formed
—end example]
So I would agree with MSN's answer to declare a dummy enum, struct forward declaration, or typedef declaration with a name that is obviously not going to conflict with anything (throw a GUID in there for good measure). The nice thing about these things is that you can have the declaration show up more than once, and as long as it's the same as before there's no problem.
Comeau says no:
> Comeau C/C++ 4.3.10.1 (Oct 6 2008 11:28:09) for
> ONLINE_EVALUATION_BETA2 Copyright 1988-2008 Comeau Computing. All
> rights reserved. MODE:strict errors C++ C++0x_extensions
>
> "ComeauTest.c", line 1: error: declaration does not declare anything
> enum { }; ^
>
> 1 error detected in the compilation of "ComeauTest.c".
You can use
#define USING_NAMESPACE_STD struct very_long_name_that_i_hope_doesnt_collide_because_if_it_does_oh_noes
Sorry, I forgot you needed to do it at the top level.
How about
extern int _;
? I don't know what undesirable side effects this would have, though I can't think of any.
#define VERY_OLD_COMPILER 1
#ifdef VERY_OLD_COMPILER
#define USING_NAMESPACE_STD typedef unsigned long uint32
#else
#define USING_NAMESPACE_STD using namespace std
#endif
USING_NAMESPACE_STD;
Edit:
This should also work:
#define VERY_OLD_COMPILER 1
#ifdef VERY_OLD_COMPILER
#define USING_NAMESPACE_STD double fabs(double)
#else
#define USING_NAMESPACE_STD using namespace std
#endif
USING_NAMESPACE_STD;
#define VERY_OLD_COMPILER 1
#ifdef VERY_OLD_COMPILER
#define USING_NAMESPACE_STD ;
#else
#define USING_NAMESPACE_STD using namespace std
#endif
USING_NAMESPACE_STD;
int main(int argc, char *argv[]) {
// etc.