Why is C++ still using stdio.h?

This is probably a dumb question, but I couldn't find the answer I was looking for. Also, I was unsure whether this was a C++ question or a VS2010 question, but the answer I'm looking for is a technical one, so I ended up here.
When you start a new Console Application project in VS2010, it automatically includes stdafx.h, which in turn includes stdio.h.
The answers I found regarding stdio.h vs. iostream were more or less:
stdio.h was used in C and iostream is used in C++
I don't know if this is right or wrong, but...
My question is: why is stdio.h still automatically included in C++ projects? Wouldn't iostream be sufficient?

IO streams in older C++ implementations were pretty slow, leading programmers to keep using stdio.h. Apparently, that got included in stdafx.h in the past and cannot be removed from that header anymore as removing it would break existing code.

Usually projects are created using Create Empty Project, so that you can customize your includes and precompiled headers yourself.
I have no idea why this "default" include happens, but it's a good idea to set up your project from scratch as described above.

Even if you're using stream output, being able to do some formatting is nice. So, if nothing else, sprintf will sometimes be used, and sprintf lives in stdio.h.
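For example, a minimal sketch (the format string and buffer size here are arbitrary) mixing stdio-style formatting with stream output:
#include <cstdio>    // std::snprintf / sprintf
#include <iostream>

int main() {
    char buf[32];
    // snprintf is the bounds-checked sibling of sprintf; both come from stdio.h/<cstdio>
    std::snprintf(buf, sizeof(buf), "%08.3f", 3.14159);
    std::cout << buf << '\n';   // prints "0003.142"
}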

Possibly because Visual Studio targets "Mort" programmers, who wouldn't be able to get their 'my first C++' program done without printf, and they would decide that the product didn't work right.
Before downvoting, please google "visual studio mort persona".

Related

How to determine which header files to include?

Say I have the below (very simple) code.
#include <iostream>
int main() {
    std::cout << std::stoi("12");
}
This compiles fine on both g++ and clang; however, it fails to compile on MSVC with the following error:
error C2039: 'stoi': is not a member of 'std'
error C3861: 'stoi': identifier not found
I know that std::stoi is part of the <string> header, which presumably the two former compilers include as part of <iostream> and the latter does not. According to the C++ standard [res.on.headers]
A C++ header may include other C++ headers.
Which, to me, basically says that all three compilers are correct.
This issue arose when one of my students submitted work, which the TA marked as not compiling; I of course went and fixed it. However, I would like to prevent future incidents like this. So, is there a way to determine which header files should be included, short of compiling on three different compilers to check every time?
The only way I can think of is to ensure that for every std function call, an appropriate include exists; but if you have existing code which is thousands of lines long, this may be tedious to search through. Is there an easier/better way to ensure cross-compiler compatibility?
Example with the three compilers: https://godbolt.org/z/kJhS6U
Is there an easier/better way to ensure cross-compiler compatibility?
This is always going to be a bit of a chore if you have a huge codebase and haven't been doing this so far, but once you've gone through fixing your includes, you can stick to a simple procedure:
When you write new code that uses a standard feature, like std::stoi, plug that name into Google, go to the cppreference.com article for it, then look at the top to see which header it's defined in.
Then include that, if it's not already included. Job done!
(You could use the standard for this, but that's not as accessible.)
Do not be tempted to sack it all off in favour of cheap, unportable hacks like <bits/stdc++.h>!
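Applied to the snippet from the question, that just means spelling out the <string> dependency instead of relying on <iostream> to drag it in:
#include <iostream>
#include <string>   // std::stoi is declared here, so this builds on MSVC as well

int main() {
    std::cout << std::stoi("12");
}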
tl;dr: documentation
Besides reviewing documentation and doing that manually (painful and time-consuming), you can use tools that do it for you.
You can use ReSharper in Visual Studio, which can organize includes (in fact, VS without ReSharper is not very usable). If an include is missing, it recommends adding it, and if an include is obsolete, its line is shown in paler colors.
Or you can use CLion (available for all platforms), which also has this capability (it comes from the same vendor, JetBrains).
There is also a tool called include-what-you-use, but its aim is to take advantage of forward declarations; I never used it personally (a teammate did for our project).

How to quickly identify functions from header files?

I'm somewhat new to C/C++, and I find myself spending exorbitant amounts of time searching through header files (one innocent include might actually bring dozens more header files with it). Even helpful IDEs like Visual Studio aren't always helpful; sometimes when I try to go to the definition of a function, it prompts me to choose from several--the very thing I was trying to avoid.
So...in a very large project with thousands of header files and many occurrences of functions that share the same name and parameters, what's the best way to determine, without question, which specific function is being called?
Try adding /showIncludes to the compiler command line (In the project settings, Configuration Properties, C/C++, Command Line). This outputs all the headers used in a given .cpp file compilation. This is not a fast way, but it is a sure way.
When Intellisense isn't working, I recommend Find in Files. It is easier to track down the definition in the header this way. I find I can usually tell which is the relevant declaration.
Keep in mind that you cannot find the source in the header, unless you are dealing with templates or inlined functions. So there is generally no reason to attempt to discriminate which declaration(s) are being applied. If the definition exists in a SOURCE file (.c,.cpp), then there can only be one function of that name and signature for it to compile. It is generally better to google the function name if it is a published API from Microsoft or another source.
Tools such as Visual Assist for Visual Studio improve the ability to locate such definitions, as well.
Also, you can massage Intellisense into working better. Try deleting the Intellisense Database and having it be rebuilt. You can see where it has trouble by the "errors" it shows in the error view. Often you need to improve the includes directories, especially if this is a makefile project. If it grays out code that shouldn't be grayed out, some preprocessor symbol is wrong. Maintaining the Intellisense is often worth it because it's great when it works.
Tools such as Visual Assist for Visual Studio have their own, often improved intellisense-like method of finding definitions.
Admittedly, I'm still new to C++ (and programming in general) as well. But, I think that the Visual Studio feature you describe in your question is the most help you'll get. It will narrow things down a little for you, but you'll still have to do some good ole sleuthing.
Ask the compiler. No, really! It's the only way to be sure.
Try using Microsoft Visual Studio 2005. You can easily jump to the function definition or declaration, and you can also see the function call and caller graphs.

Abolish include-files in C++

Suppose I have the following code (literally) in a C++ source file:
// #include <iostream> // superfluous, commented-out
using std::cout;
using std::endl;
int main()
{
    cout << "Hello World" << endl;
    return 0;
}
I can compile this code even though #include <iostream> is commented-out:
g++ -include my_cpp_std_lib_hack source.cpp
Where my_cpp_std_lib_hack is a file in some central location that includes all the files of the C++ Standard Library:
#include <ciso646>
#include <climits>
#include <clocale>
...
#include <valarray>
#include <vector>
Of course, I can use proper compilation options for all compilers I care about (that being MS Visual Studio and maybe a few others), and I also use precompiled headers.
Using such a hack gives me the following advantages:
Fast compilation (because all of the Standard Library is precompiled)
No need to add #includes when all I want is to add some debugging output
No need to remember or look up all the time where the heck std::max is declared
A feeling that the STL is magically built into the language
So I wonder: am I doing something very wrong here?
Will this hack break down when writing large projects?
Maybe everyone else already uses this, and no one told me?
So I wonder: am I doing something very wrong here?
Yes. Sure, your headers are precompiled, but the compiler still has to do things like name lookups on the entire included mass of stuff which slows down compilation.
Will this hack break down when writing large projects?
Yes, that's pretty much the problem. Plus, if anyone else looks at that code, they're going to be wondering where std::cout (well, assume that's a user-defined type) came from. Without the #includes they're going to have no idea whatsoever.
Not to mention, now you have to link against a ton of standard library features that you may have (probably could have) avoided linking against in the first place.
If you want to use precompilation that's fine, but someone should be able to build each and every implementation file even when precompilation is disabled.
The only thing "wrong" is that you are relying upon a compiler-specific command-line flag to make the files compilable. You'd need to do something different if not using GCC. Most compilers probably do provide an equivalent feature, but it is best to write portable source code rather than to unnecessarily rely on features of your specific build environment.
Other programmers shouldn't have to puzzle over your Makefiles (or Ant files, or Eclipse workspaces, or whatever) to figure out how things are working.
This may also cause problems for users of IDE's. If the IDE doesn't know what files are being included, it may not be able to provide automatic completion, source browsing, refactoring, and other such features.
(FWIW, I do think it is a good idea to have one header file that includes all of the Standard Library headers that you are using in your project. It makes precompilation easier, makes it easier to port to a non-standard environment, and also helps deal with those issues that sometimes arise when headers are included in different orders in different source files. But that header file should be explicitly included by each source file; there should be no magic.)
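A sketch of what that looks like in practice (the file name project_std.h is made up): the umbrella header can double as the precompiled header, but every source file still names it explicitly.
// project_std.h (hypothetical) - the one place that lists the standard headers the project uses
#pragma once
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// some_source.cpp - no hidden -include flag; the dependency is visible in the file itself
#include "project_std.h"

int main() {
    std::vector<std::string> names{"alpha", "beta"};
    std::cout << names.size() << '\n';
}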
Forget the compilation speed-up - a precompiled header with templates isn't really "precompiled" except for the name and the parse, as far as I've heard. I won't believe in the compilation speed up until I see it in the benchmarks. :)
As for the usefulness:
I prefer to have an IDE which handles my includes for me (this is still bad for C++, but Eclipse already adds known includes with ctrl+shift+n with... well, acceptable reliability :)).
Doing 'clandestine' includes like this would also make testing more difficult. You want to compile a smallest-possible subset of code when testing a particular component. Figuring out what that subset is would be difficult if the headers/sources aren't being honest about their dependencies, so you'd probably just drag your my_cpp_std_lib_hack into every unit test. This would increase compilation time for your test suites a lot. Established code bases often have more than three times as much test code as regular code, so this is likely to become an issue as your code base grows.
From the GCC manual:
-include file
Process file as if #include "file" appeared as the first line of the primary source file. However, the first directory searched for file is the preprocessor's working directory instead of the directory containing the main source file. If not found there, it is searched for in the remainder of the #include "..." search chain as normal.
So what you're doing is essentially equivalent to starting each file with the line
#include "my_cpp_std_lib_hack"
which is what Visual Studio does when it gathers up commonly-included files in stdafx.h. There are some benefits to that, as outlined by others, but your approach hides this include in the build process, so that nobody who looked directly at one of your source files would know of this hidden magic. Making your code opaque in this way does not seem like a good style to me, so if you're keen on all the precompiled header benefits I suggest you explicitly include your hack file.
You are doing something very wrong. You are effectively including lots of headers that may not be needed. In general, this is a very bad idea, because you are creating unnecessary dependencies, and a change in any header would require recompilation of everything. Even if you are avoiding this by using precompiled headers, you are still linking to lots of object that you may not need, making your executable much larger than it needs to be.
There is really nothing wrong with the standard way of using headers. You should include everything you are using, and no more (forward declarations are your friends). This makes code easier to follow, and helps you keep dependencies under control.
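As a small illustration of the forward-declaration point (all names here are hypothetical): a header that only handles pointers or references to a type does not need that type's full definition.
// widget.h (hypothetical)
#pragma once

class Engine;                        // forward declaration; no #include "engine.h" needed here

class Widget {
public:
    explicit Widget(Engine& engine); // references and pointers don't require the complete type
private:
    Engine* engine_;
};

// widget.cpp would #include "engine.h", because it actually uses Engine's members there.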
We try not to include unused or even rarely used stuff; for example, in VC++ there is
#define WIN32_LEAN_AND_MEAN // exclude rarely used stuff
and what we hate in MFC is that if you want to make a simple application, you end up with a large executable containing the whole library (if statically linked), so it's not a good idea; what if you only want to use cout and nothing else?
Another thing: I don't like passing arguments via the command line, because I may leave the project for a while and forget what the arguments were. E.g. I prefer using
#pragma comment(lib, "xxx.lib")
over the command line; it at least reminds me which file I want. Both points are sketched below.
That's my own opinion: make your code stable and easy to compile in order to avoid rot, as code rotting is a very nasty thing!
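A minimal sketch of both points; note that WIN32_LEAN_AND_MEAN only has an effect if it is defined before <windows.h> is included, and #pragma comment(lib, ...) is an MSVC-specific way to name the import library right in the source:
#define WIN32_LEAN_AND_MEAN          // must appear before <windows.h> to exclude the rarely used parts
#include <windows.h>
#pragma comment(lib, "user32.lib")   // ask the MSVC linker for user32 without a command-line flag

int main() {
    MessageBoxA(nullptr, "Hello", "Demo", MB_OK);
    return 0;
}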

Consistent header file names between C++ libraries

In my project I use two libraries, v8 and boost. Boost uses the .hpp extension for its headers, while v8 uses the .h extension for its headers.
At the end of the day, my source code starts like this:
#include "v8.h"
#include "boost/filesystem.hpp"
...
In another question I asked about this subject, the general answer was that it is okay, but that I should just be consistent with names.
This code compiles fine, but in terms of coding styles/standards, is it okay? Is there any solution to this problem (like automatically renaming all .hpp to .h somehow)?
Thanks. And sorry for those stupid questions.
Don't worry about the inconsistency, it doesn't matter. Too much time is often spent obsessing about such details, and everyone is guilty of it.
Just be consistent with your own coding standards.
You'll eventually use some 3rd-party library, or several, that use different conventions than you do. There's nothing you can do about it, and often two of the libraries you use will conflict with your standards and with each other. That's not only for include extensions, but also for naming conventions like function_that_does_something vs. FunctionThatDoesSomething. It's fine.
I would definitely advise strongly against trying to change someone else's library to fit your coding standard, e.g. renaming boost's .hpp files to .h. This is a bad idea, and when you want to upgrade to newer versions of the library it will be a nightmare.
Spend your time solving the problem you're solving in a more elegant way rather than worrying about details like this.
It's fine. Coding standards don't really come into it since you have to go with what you're given. If the v8 people only provide .h and the boost people only provide .hpp then, short of copying one set of files to the other extension or providing your own wrapper header files (sketched below), you have few options.
Both of those options have their downsides for what are really dubious benefits, so I wouldn't concern yourself with the fact that you have to include two different file extensions.
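For completeness, the wrapper-header option mentioned above looks roughly like this (the wrapper file name is made up); it buys a uniform .h extension at the cost of one extra indirection:
// boost_filesystem.h (hypothetical wrapper, so project code only ever includes .h files)
#pragma once
#include "boost/filesystem.hpp"

// elsewhere in the project: #include "boost_filesystem.h" instead of the .hpp header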

Namespace loop or code leak in boost::function?

I'm really baffled by this. Have I managed to do something to cause this, or is it an unclosed namespace block in boost, or some bug in VS C++ 2008? I'm definitely sure I've closed all my own namespaces properly, all includes are outside of and above them, and all my header files have include guards.
(screenshot: http://lowtown.se/stuffs/superboost.png)
boost/function.hpp is only included in this header. Two other headers in my library include boost/cstdint.hpp, but they don't have this problem.
Visual C++'s intellisense is a bit quirky. Sometimes it screws up. That doesn't mean there is a problem in your code. Always take C++ intellisense with a grain of salt.
Sometimes intellisense does that. If you use Visual Assist X it will fix that, but it is a very expensive program :(
Usually deleting the .ncb file solves most Intellisense problems. If it doesn't help, buy VA.