I have been thinking about this problem for a while but still have no idea about it: if my project is mainly .cpp files, should a C file be named .c, or should it be named .cpp to stay consistent with the other .cpp files?
I'll list some advantages and disadvantages (to my current knowledge) of using .c (I don't know if the following ideas are correct):
advantages of .c:
it is immediately clear that the file does not contain C++ content (e.g. classes, std::string)
it is easy to separate from the .cpp files by searching on the name
disadvantages of .c:
not consistent with the other files (because the other files are mostly .cpp)
it may need to be renamed to .cpp if I later want to rewrite the function in an OOP style or add some OOP features to it
some scripts or build files may need *.c added as an input if the original version only handles *.cpp (e.g. *.c needs to be added to Android.mk for the Android JNI build)
Also, I don't know whether the compiler handles .c and .cpp differently, or whether it affects anything else (e.g. performance, platform- or compiler-specific issues...). Does anyone have an idea about this?
Depends what you mean by "C" code.
Are you going to compile it with a C compiler?
Call it file.c
Or do you just mean "C-like" C++ code? C++ code that, at time of writing, happens to also be valid C?
Call it file.cpp
Rule of thumb - name it according to which compiler you intend to use for it. This keeps your makefiles nice and simple.
So if your "C code" is C++ code that could be compiled as valid C but that's not what you are doing, then name it *.cpp and let your makefile invoke the C++ compiler on it.
If your code is actual C, to be compiled with a C compiler, then name it *.c - and remember the (appropriately-#ifdefed) extern "C" in the header file so that C++ built against it can link successfully.
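A typical shape for such a header, as a generic sketch (the library and function names here are hypothetical):
/* mylib.h -- a C header that C++ code can also include */
#ifndef MYLIB_H
#define MYLIB_H

#ifdef __cplusplus
extern "C" {
#endif

int mylib_frobnicate(int value);   /* implemented in mylib.c, built by the C compiler */

#ifdef __cplusplus
}   /* extern "C" */
#endif

#endif /* MYLIB_H */
The extern "C" block stops the C++ compiler from name-mangling these declarations, so C++ translation units link against the symbols the C compiler actually produced.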
C++ is largely compatible with C, so the compiler will usually be just fine with C-style code in a .cpp file.
And as Quentin mentioned above: if your C code is never going to be used in a C-only project, I would leave it in a .cpp file.
I have a third-party source file and its corresponding header (containing the declarations and include directives for GSL etc.) which are written in C. I am trying to build an R package around these source files, basically making wrappers for the functions using Rcpp. The problem is that these files contain the restrict qualifier, which is not part of the C++ standard, so R CMD INSTALL cannot compile the package. It does use the C compiler for the .c file, but it wants to compile the .h file (which is included in the .cpp file) with the C++ compiler, and that fails as soon as it finds restrict in the header.
I am not that familiar with C, compiler details, Rcpp etc., so I am not sure what the best approach would be here.
The easiest thing would probably be to remove the restrict keyword. This is what I have currently done (I am surprised that R CMD INSTALL works when I remove restrict from the header file but leave it in the .c file). But I would rather not alter the .c and .h files, as they are also used outside of R by others (in an executable and in Python), and it would be nice to have identical files for all projects.
I also tried defining restrict as an empty keyword so that it would effectively be removed from the function declarations when the compilation is done with the C++ compiler, but I couldn't get that to work. I read about a similar approach somewhere, but apparently it doesn't work that way.
Would it work if I could somehow tell the compiler (via Makevars or something?) that this particular .h file should be compiled with the C compiler? Or will there be problems with the C++ wrapper calling those functions?
Or does the keyword even matter in terms of performance if those functions are called from R via a C++ wrapper?
One option would be to just ditch Rcpp and use .C instead of .Call from R, but as performance is key here, that doesn't feel like a good option; as I understand it, .Call is faster (and more reliable).
Note that eventually this package could find its way to CRAN, so the solution should be fairly portable. There seem to be some compiler-specific C++ keywords for restrict, but I guess those are not an option due to portability.
It sounds like you are making a .cpp file which does #include <x.h> where x.h is a C header which uses restrict. If that's true, I think you can modify your .cpp file to do this:
#define restrict // nothing
extern "C"
{
#include <x.h>
}
Then the compilation of your C++ code will not see the restrict keyword. I have also wrapped the header in extern "C": if the header does not do that internally itself, you need to, so that your C++ compiler does not apply C++ name mangling to the functions declared inside.
I'm using a header-only library (glm) for a project and am currently trying to debug some problems I'm having. I trust that glm is giving me the correct values; however, it is dog slow when built without optimisations (I'm using Visual Studio 2012/2013/2010, whichever is easiest to do this in, as all three are installed).
Is there a way to enable optimisations (specifically /O2), and disable debug symbols for just the GLM header files, while retaining the debug information for the rest of the solution?
EDIT:
I'd like to throw in that I'd rather not change libraries at this point, as it's almost at the end of the project and I have other things to do as well, so rewriting to use Eigen/CML isn't really on the table.
You can try:
1) Create one code file and include all headers you need.
2) Explicitly instantiate all the template classes you want to use in this source file (e.g. "template class ClassA<float>;"), as sketched below.
3) Compile this source file with optimization and link against it later.
4) Create a header file and declare all these classes without the function definitions (simply copy the original header files and erase all the function definitions).
5) Use this header file for your project.
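A minimal sketch of how these steps might fit together, using a made-up template rather than GLM itself (the file names and the /O2 setting are assumptions):
// fastmath.hpp -- step 4: declarations only, used by the rest of the project
#pragma once
template <typename T>
struct Vec3 {
    T x, y, z;
    T dot(const Vec3& other) const;   // no definition here
};
extern template struct Vec3<float>;   // the instantiation lives in fastmath.cpp

// fastmath.cpp -- steps 1-3: compiled once with /O2, then linked in
#include "fastmath.hpp"
template <typename T>
T Vec3<T>::dot(const Vec3& other) const {
    return x * other.x + y * other.y + z * other.z;
}
template struct Vec3<float>;          // step 2: explicit instantiation
Any translation unit that includes fastmath.hpp and only uses Vec3<float> then links against the separately optimised object file instead of instantiating the template itself.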
For gcc they should be the same, right? Which one of them is more popular? I am now preparing a project from scratch and I would like to pick one of these two.
Thanks
In C++, the file extension doesn't actually matter. The use of .h, .hpp, .hxx, or no file extension are all by convention.
The standard library uses no file extension for its header files. Many projects, including Boost, use .hpp. Many projects use .h. Just pick one and be consistent in your project.
The compiler doesn't distinguish between the two extensions, so technically it doesn't matter which one you use. Personally, I use the .hxx extension for header files that are only used internally in the project, and .hpp for those that should be released with the library/software.
I propose that we re-open this discussion, in view of a recent discovery that I made. For the last 9 years, I have used the following naming convention for the source files in my C and C++ projects.
C = Straight C source code, containing one or more related entry points
CPP = C++ source code, containing one or more related entry points
H = Declaration of functions, macros, structures, typedefs, etc.
INL = Inline function bodies: the shared bodies of two or more functions whose main definition files are C or CPP files, into which they are incorporated by #include
As an example of these common function bodies, MyStringFunctionA, the ANSI implementation, is defined in MyStringFunctionA.cpp, while MyStringFunctionW, the Unicode (wide character) implementation, is defined in MyStringFunctionW.cpp. MyStringFunctionA.cpp and MyStringFunctionW.cpp contain the prototype, the opening and closing brackets, and the headers, subject to UNICODE for the wide-character version. The function body is defined in the INL file, which is #included inline, within the function definition block.
Combined with generic TCHAR mappings, this approach greatly simplifies maintaining Unicode and ANSI versions, both of which remain in active use.
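A sketch of how those pieces might fit together; the file contents below are illustrative, not taken from the original project:
// MyStringFunctionA.cpp -- ANSI version; this translation unit is built without _UNICODE
#include <tchar.h>
#include <cstddef>
size_t MyStringFunctionA(const char *pszInput)
{
#include "MyStringFunction.inl"
}

// MyStringFunctionW.cpp -- wide-character version; this translation unit is built with _UNICODE (e.g. a per-file /D_UNICODE)
#include <tchar.h>
#include <cstddef>
size_t MyStringFunctionW(const wchar_t *pszInput)
{
#include "MyStringFunction.inl"
}

// MyStringFunction.inl -- the shared body, written against the generic-text mappings
return _tcslen(pszInput);   // resolves to strlen or wcslen in each translation unit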
This naming convention worked great with Visual Studio 6. However, when I began migrating my code base to Visual Studio 2013, I discovered an annoying change that was initially confusing. Although everything compiled cleanly, when one of my INL files was open in the code editor, I would see dozens of IntelliSense "errors" listed in the Errors window. I put "errors" in quotes because they are not true errors; they vanish when the INL file is closed, and the C and CPP files into which the INL is pulled compile without errors, link, and run correctly.
Header file extensions don't usually make a difference, but I know that in some cases the extension of the .cpp file can make a difference. Depending on your compiler the front-end may choose to compile the source file as either C or C++.
This can make a difference, especially if you are combining compilation with the link phase, as it can lead to different libraries being linked (e.g. g++ vs gcc) and so you can control the outcome in your makefile.
An HXX file is a source code header file written in the C++ programming language; it may include data types, constants, variables, and other definitions, and is used for declaring and storing reusable components of code.
HXX files can be pulled into a C++ program using the #include directive. For example, #include "myHeader.hxx" instructs the C++ compiler to include myHeader.hxx in the current program file.
I am working on a large C++ project in Visual Studio 2008, and there are a lot of files with unnecessary #include directives. Sometimes the #includes are just artifacts and everything will compile fine with them removed, and in other cases classes could be forward declared and the #include could be moved to the .cpp file. Are there any good tools for detecting both of these cases?
While it won't reveal unneeded include files, Visual Studio has a setting, /showIncludes (right-click a .cpp file, Properties->C/C++->Advanced), that will output a tree of all included files at compile time. This can help in identifying files that shouldn't need to be included.
You can also take a look at the pimpl idiom to let you get away with fewer header file dependencies to make it easier to see the cruft that you can remove.
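For reference, the pimpl idiom in its simplest modern-C++ form (the class name and the commented-out heavy header are made up):
// widget.h -- no heavyweight includes needed here
#pragma once
#include <memory>

class Widget {
public:
    Widget();
    ~Widget();                      // defined in widget.cpp, where Impl is complete
    void draw();
private:
    struct Impl;                    // defined only in widget.cpp
    std::unique_ptr<Impl> pimpl;    // callers never see the implementation's dependencies
};

// widget.cpp -- the only file that needs the heavy headers
#include "widget.h"
// #include "heavy_graphics_library.h"   // whatever heavy dependency Impl actually needs

struct Widget::Impl {
    void draw() { /* real work goes here */ }
};

Widget::Widget() : pimpl(std::make_unique<Impl>()) {}
Widget::~Widget() = default;
void Widget::draw() { pimpl->draw(); }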
PC Lint works quite well for this, and it finds all sorts of other goofy problems for you too. It has command line options that can be used to create External Tools in Visual Studio, but I've found that the Visual Lint addin is easier to work with. Even the free version of Visual Lint helps. But give PC-Lint a shot. Configuring it so it doesn't give you too many warnings takes a bit of time, but you'll be amazed at what it turns up.
There's a new Clang-based tool, include-what-you-use, that aims to do this.
!!DISCLAIMER!! I work on a commercial static analysis tool (not PC Lint). !!DISCLAIMER!!
There are several issues with a simple non-parsing approach:
1) Overload Sets:
It's possible that an overloaded function has declarations that come from different files. Removing one header file might then result in a different overload being chosen rather than a compile error! The result would be a silent change in semantics that may be very difficult to track down afterwards (see the sketch after point 2).
2) Template specializations:
Similar to the overload example, if you have partial or explicit specializations for a template you want them all to be visible when the template is used. It might be that specializations for the primary template are in different header files. Removing the header with the specialization will not cause a compile error, but may result in undefined behaviour if that specialization would have been selected. (See: Visibility of template specialization of C++ function)
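As a contrived sketch of the first point (all file and function names here are made up), suppose the two overloads are declared in separate headers:
// scale_int.h
void scale(int factor);

// scale_double.h
void scale(double factor);

// caller.cpp
#include "scale_int.h"
#include "scale_double.h"
void do_scaling() {
    scale(1.5);   // with both headers visible this calls scale(double);
}                 // drop scale_double.h and it still compiles, but 1.5 is
                  // silently converted to int and scale(int) is called instead
A tool that merely checks "does it still compile?" would happily suggest removing #include "scale_double.h".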
As pointed out by 'msalters', performing a full analysis of the code also allows for analysis of class usage. By checking how a class is used through a specific path of files, it is possible that the definition of the class (and therefore all of its dependencies) can be removed completely, or at least moved to a level closer to the main source in the include tree.
I don't know of any such tools, and I have thought about writing one in the past, but it turns out that this is a difficult problem to solve.
Say your source file includes a.h and b.h; a.h contains #define USE_FEATURE_X and b.h uses #ifdef USE_FEATURE_X. If #include "a.h" is commented out, your file may still compile, but may not do what you expect. Detecting this programmatically is non-trivial.
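Roughly (illustrative file contents):
// a.h
#define USE_FEATURE_X

// b.h
#ifdef USE_FEATURE_X
#define BUFFER_SIZE 4096
#else
#define BUFFER_SIZE 256
#endif

// source.cpp
#include "a.h"   // commenting this out still compiles,
#include "b.h"   // but BUFFER_SIZE silently shrinks from 4096 to 256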
Whatever tool does this would need to know your build environment as well. If a.h looks like:
#if defined( WINNT )
#define USE_FEATURE_X
#endif
Then USE_FEATURE_X is only defined if WINNT is defined, so the tool would need to know which macros are predefined by the compiler itself, as well as which ones are specified on the compile command line rather than in a header file.
Like Timmermans, I'm not familiar with any tools for this. But I have known programmers who wrote a Perl (or Python) script to try commenting out each include line one at a time and then compile each file.
It appears that now Eric Raymond has a tool for this.
Google's cpplint.py has an "include what you use" rule (among many others), but as far as I can tell, no "include only what you use." Even so, it can be useful.
If you're interested in this topic in general, you might want to check out Lakos' Large Scale C++ Software Design. It's a bit dated, but goes into lots of "physical design" issues like finding the absolute minimum of headers that need to be included. I haven't really seen this sort of thing discussed anywhere else.
Give Include Manager a try. It integrates easily in Visual Studio and visualizes your include paths which helps you to find unnecessary stuff.
Internally it uses Graphviz but there are many more cool features. And although it is a commercial product it has a very low price.
You can build an include graph using C/C++ Include File Dependencies Watcher, and find unneeded includes visually.
If your header files generally start with
#ifndef __SOMEHEADER_H__
#define __SOMEHEADER_H__
// header contents
#endif
(as opposed to using #pragma once) you could change that to:
#ifndef __SOMEHEADER_H__
#define __SOMEHEADER_H__
// header contents
#else
#pragma message("Someheader.h superfluously included")
#endif
And since the compiler outputs the name of the cpp file being compiled, that would let you know at least which cpp file is causing the header to be brought in multiple times.
PC-Lint can indeed do this. One easy way to do this is to configure it to detect just unused include files and ignore all other issues. This is pretty straightforward - to enable just message 766 ("Header file not used in module"), just include the options -w0 +e766 on the command line.
The same approach can also be used with related messages such as 964 ("Header file not directly used in module") and 966 ("Indirectly included header file not used in module").
FWIW I wrote about this in more detail in a blog post last week at http://www.riverblade.co.uk/blog.php?archive=2008_09_01_archive.xml#3575027665614976318.
Adding one or both of the following #defines will exclude often-unnecessary header files and may substantially improve compile times, especially if the code is not using Windows API functions.
#define WIN32_LEAN_AND_MEAN
#define VC_EXTRALEAN
See http://support.microsoft.com/kb/166474
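Note that these only take effect if they are defined before the first #include <windows.h> (typically in the project's precompiled header), roughly like this:
#define WIN32_LEAN_AND_MEAN   // trims rarely-used parts of <windows.h>
#define VC_EXTRALEAN          // honoured by the MFC headers; implies WIN32_LEAN_AND_MEAN there
#include <windows.h>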
If you are looking to remove unnecessary #include files in order to decrease build times, your time and money might be better spent parallelizing your build process using cl.exe /MP, make -j, Xoreax IncrediBuild, distcc/icecream, etc.
Of course, if you already have a parallel build process and you're still trying to speed it up, then by all means clean up your #include directives and remove those unnecessary dependencies.
Start with each include file, and ensure that each include file only includes what is necessary to compile itself. Any include files that are then missing for the C++ files can be added to the C++ files themselves.
For each include and source file, comment out each include file one at a time and see if it compiles.
It is also a good idea to sort the include files alphabetically, and where this is not possible, add a comment.
If you aren't already, using a precompiled header to include everything that you're not going to change (platform headers, external SDK headers, or static already completed pieces of your project) will make a huge difference in build times.
http://msdn.microsoft.com/en-us/library/szfdksca(VS.71).aspx
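A typical precompiled header under MSVC looks something like this (stdafx.h is just the conventional name; adjust the contents to your project):
// stdafx.h -- compiled once with /Yc, consumed by the other sources with /Yu
#pragma once
#define WIN32_LEAN_AND_MEAN
#include <windows.h>   // platform headers
#include <string>      // stable standard-library headers
#include <vector>
#include <map>
// ... plus external SDK headers and finished, rarely-changing project headers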
Also, although it may be too late for your project, organizing your project into sections and not lumping all local headers into one big main header is good practice, although it takes a little extra work.
If you work with Eclipse CDT, you could try out http://includator.com to optimize your include structure. However, Includator might not know enough about VC++'s predefined includes, and setting up CDT to use VC++ with correct includes is not built into CDT yet.
The latest Jetbrains IDE, CLion, automatically shows (in gray) the includes that are not used in the current file.
It is also possible to get a list of all the unused includes (and also unused functions, methods, etc.) from the IDE.
Some of the existing answers state that it's hard. That's indeed true, because you need a full compiler to detect the cases in which a forward declaration would be appropriate. You can't parse C++ without knowing what the symbols mean; the grammar is simply too ambiguous for that. You must know whether a certain name names a class (which could be forward-declared) or a variable (which can't). Also, you need to be namespace-aware.
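The classic textbook illustration of that ambiguity:
// Without knowing what X is, this line cannot be parsed correctly:
X * y;   // declares y as a pointer to X if X names a type,
         // but is a multiplication expression if X names a variable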
Maybe a little late, but I once found a WebKit perl script that did just what you wanted. It'll need some adapting I believe (I'm not well versed in perl), but it should do the trick:
http://trac.webkit.org/browser/branches/old/safari-3-2-branch/WebKitTools/Scripts/find-extra-includes
(this is an old branch because trunk doesn't have the file anymore)
If there's a particular header that you think isn't needed anymore (say string.h), you can comment out that include and then put this below all the includes:
#ifdef _STRING_H_
# error string.h is included indirectly
#endif
Of course, your interface headers might use a different #define convention to record their inclusion in the preprocessor, or no convention at all, in which case this approach won't work.
Then rebuild. There are three possibilities:
It builds OK. string.h wasn't compile-critical, and the include for it can be removed.
The #error trips. string.h was included indirectly somehow; you still don't know whether string.h is required. If it is required, you should #include it directly (see below).
You get some other compilation error. string.h was needed and isn't being included indirectly, so the include was correct to begin with.
Note that depending on indirect inclusion when your .h or .c directly uses another .h is almost certainly a bug: you are in effect promising that your code will only require that header as long as some other header you're using requires it, which probably isn't what you meant.
The caveats mentioned in other answers, about headers that modify behavior rather than declaring things whose absence would cause build failures, apply here as well.