Closed 7 years ago. This question needs to be more focused and is not currently accepting answers.
I was wondering whether there are any official recommendations regarding the use of #define in C++, specifically: is it best to put your #defines in the header or in the source file?
I am asking in order to know whether there are official standards to live by, or whether it is just plain subjective. I don't need the whole set of standards; the source, or a link to the guidelines, will suffice.
LATER EDIT:
What is the explanation for const and constexpr having become the status quo? I am referring to #define used as a means of avoiding repetitive typing; it is clear in my mind that programmers should use the full potential of the C++ language. On the other hand, if #define is so feared, why not remove it altogether? I mean, as far as I understand, #define is used solely for conditional compilation, especially for making the same code work with different compilers.
A secondary, tiny question: is the potential for errors also the main reason why Java doesn't have a true C-style #define?
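For concreteness, this is the kind of replacement I am referring to (a made-up sketch):
// Old style: the preprocessor substitutes text, with no type and no scope.
#define MAX_USERS 64
// The status quo: a typed, scoped compile-time constant.
constexpr int kMaxUsers = 64;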
A short list of guidelines for using #define in C++; points 2, 4, 6 and 7 actually address the question:
1. Avoid them.
2. Use them for the common "include guard" pattern in header files.
3. Otherwise, don't use them, unless you can explain why you are using #define instead of const, constexpr, an inline function, a template function, etc.
4. Use them to allow passing compile-time options on the compiler command line, but only when having the option as a run-time option is not feasible or desirable.
5. Use them when a library you are using requires them (for example, defining NDEBUG to disable the assert() macro).
6. In general, put everything in the narrowest possible scope. For some uses of #define macros, this means a #define just before a function in the .cpp file and an #undef right after it (see the sketch after this list).
7. The exact use case for a #define determines whether it belongs in the .h or the .cpp file. But note that most use cases actually violate point 3 above, and you should not use #define at all.
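As a minimal sketch of point 6 (the macro and function are made up for illustration), a macro can be confined to the single function that needs it:
// some_file.cpp

// Hypothetical local helper macro, needed only by the next function.
#define SQUARE(x) ((x) * (x))

int sum_of_squares(int a, int b)
{
    return SQUARE(a) + SQUARE(b);
}

// Undefine immediately, so the macro cannot leak into the rest of the translation unit.
#undef SQUARE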
Closed 7 years ago. This question is opinion-based and is not currently accepting answers.
I found several places where it is discussed whether it is better to put definitions in headers or not (e.g. here). However, I could not find something like a "guide to header-only code". The answer to the linked question mentions some downsides:
increased compile time
not possible to have circular dependencies
no (simple) global objects
But is that all?
What are the consequences of putting (all) code in the header?
Am I safe if I use header guards, or are there other pitfalls?
The reason I am asking this is the following:
I am in a situation where I think it would be easiest to put all the code in my header files. It is a (rather small) collection of classes and functions that is supposed to be included by others in their code, and to be used in different environments and frameworks. At the moment, I do not see why I should build my code into a library when the one using it can simply include the header she/he needs and compile it. However, independent of this project, I always have a "bad feeling" when putting code in headers, even when none of the three points I mentioned above matters. It would be really nice if someone could shed some light on this for me, so I can decide where to put the code on a more reasonable basis.
Speaking from my personal experience, I usually put only one-line functions (getters and setters) in the header file, because other function bodies make the header difficult to read and understand at a glance.
Moreover, if your project includes the header file in many places (and you have written function bodies in it), you will see increasing compile times, since all of that code has to be processed by the compiler every time the header is included.
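For instance (the class is made up for illustration), these are the kinds of one-liners meant here:
// point.h
#ifndef POINT_H
#define POINT_H

class Point {
public:
    // Trivial accessors: short enough to live in the header without hurting readability.
    int x() const { return x_; }
    void set_x(int x) { x_ = x; }

private:
    int x_ = 0;
};

#endif // POINT_H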
There are several examples of brilliant libraries implemented mostly in header files, e.g. the standard library or Boost. In particular, if you want to distribute a template library, you have no real alternative.
The worst consequences of such an approach, imho, are:
exploding compilation times: for every edit you make to your code, you have to rebuild every file that includes that header; this is a really serious issue, unless you keep using the ".h"/".cpp" approach while developing and only rearrange your code into the header at the end
binary code bloat: all your functions will have to be declared "inline", so you may get a performance improvement, but you may (1) also get a replication of binary code every time you use a function
(1) see Klaus's comment and the description of inline at cppreference.com (quoted below):
The intent of the inline keyword is to serve as an indicator to the optimizer that inline substitution of the function is preferred over function call, that is, instead of executing the call CPU instruction to transfer control to the function body, a copy of the function body is executed without generating the call. This avoids extra overhead created by the function call (copying the arguments and retrieving the result) but it may result in a larger executable as the code for the function has to be repeated multiple times.
Since this meaning of the keyword inline is non-binding, compilers are free to use inline substitution for any function that's not marked inline, and are free to generate function calls to any function marked inline. Those choices do not change the rules regarding multiple definitions and shared statics listed above.
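As an illustration of the above (the file and function names are made up), a function defined in a header has to be marked inline so that including the header from several .cpp files does not break the one-definition rule:
// distance.h
#ifndef DISTANCE_H
#define DISTANCE_H

#include <cmath>

// Defined in the header, so it must be inline: every translation unit that
// includes this file gets a copy, and the linker merges them.
inline double distance(double x1, double y1, double x2, double y2)
{
    return std::sqrt((x2 - x1) * (x2 - x1) + (y2 - y1) * (y2 - y1));
}

#endif // DISTANCE_H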
Closed 8 years ago. This question needs details or clarity and is not currently accepting answers.
Is there any way, within a C or C++ program, of getting information about all the functions that could be called? Perhaps a compiler macro of some sort? I know there are programs that can take source files or .o files and extract the symbols or prototypes, and I suppose I could just run those programs from within a C program, but I'm curious about something like getting function pointers, or an array of the function prototypes available in the current scope, or something related.
I'm not phrasing this very well, but the question is part of my curiosity of what I can learn about a program from within the program (and not necessarily by just reading its own code). I kind of doubt that there is anything like what I'm asking for, but I'm curious.
Edit: It appears that what I was wondering about but didn't know how to describe very well was whether reflection was possible in C or C++. Thank you for your answers.
The language doesn't support reflection yet. However, since you are looking for some sources of information, take a look at the Boost.Reflect library to help you add reflection to your code, to a certain extent. Also, look at ClangTooling and libclang for libraries that let you do automated code-analysis.
C and C++ have no way to gather the names of all the functions available.
However, you can use macros to test standards (ANSI, ISO, POSIX, etc) compliance, which can then be used to guarantee the presence of each standard's functions.
For example, if _POSIX_C_SOURCE is defined, you can (usually) assume that functions specified by POSIX will be available:
#ifdef _POSIX_C_SOURCE
/* you can safely call POSIX functions */
#else
/* the system probably isn't POSIX compliant */
#endif
Edit: If you're on a Linux system, you can find some common compatibility macros under feature_test_macros(7). OS X and the BSDs should have roughly the same macros, even though they may not have that manual page. Windows uses the WINVER and _WIN32_WINNT macros to control function visibility across releases.
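As a rough sketch of the Windows side (0x0600 and 0x0601 are the conventional values for Vista and Windows 7; check the SDK documentation for the exact value you need):
// Request at least the Windows 7 API surface before including any Windows headers.
#ifndef _WIN32_WINNT
#define _WIN32_WINNT 0x0601
#endif

#include <windows.h>

#if _WIN32_WINNT >= 0x0600
/* APIs introduced in Windows Vista or later are declared here */
#else
/* fall back to older APIs */
#endif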
No.
C++'s meta-programming power is weak and doesn't include any form of reflection. You can, however, use tools like gcc-xml to parse a C++ program and export its contents in an easier-to-analyze format.
Writing your own parser for C++ to extract function declarations is going to be a nightmare, unless you only need it to work on your specific project and you're ready to cut some corners.
Closed 8 years ago. This question needs to be more focused and is not currently accepting answers.
I would like to write a project in C on Linux. The simplicity and universality of binding to C from other programming languages makes it a preferential choice over other languages, such as C++, Obj-C, D, C#, etc.
Unfortunately, some of C's limitations drive me batty. (IDEs don't solve them all. They plaster over some. Besides, I use emacs, gcc, and cgdb.)
I would rather have multi-pass forward scanning of function definitions, so I don't need prototypes; and I would rather not have to have .h files. I can then put everything from one "module" into one and just one .c file. Maybe this needs a "public" keyword to designate any function symbols I want to export.
I would love optional arguments on functions: function x(y =0).
These are a collection of itches, all fairly pedestrian. Nothing as complex as a full language, much less a real new feature such as garbage collection or inheritance. More like C 11.1. It would just require a more sophisticated preprocessor. Writing such a preprocessor for C [in Perl] would not be too hard, but writing all the tools that go with it would require in-depth knowledge of the common support tools (emacs, gdb, etc.), which I do not have.
(More pedestrian requests: a pragma stating that all structs and arrays are zeroed upon creation; pass-through of '...' varargs; true doc support, since doxygen has idiosyncrasies; multiline support.)
Are there any such extendable C solutions in gcc? The gap between C and C++ is way too wide, but the valley in between seems to have few choices that retain the advantages of C.
This does not solve all of the C woes, but it improves things:
http://www.hwaci.com/sw/mkhdr/
It provides makeheaders, which generates a .h file for each .c file, taking care of prototypes (including forward references) and keeping the .c and .h files in sync. A big improvement for me.
Closed 8 years ago. This question is opinion-based and is not currently accepting answers.
Currently, I am refactoring an old project written by one of our former coworkers. I have come across exception throwing wrapped in a #define.
Something like this:
#define THROWIT(msg) throw common::error(msg)
Example from the code:
#define THROW_FD_ERROR( fd, op )                     \
    throw common::error_fd( errno,                   \
                            __ERR_FD_API_3           \
                                .arg( fd )           \
                                .arg( op )           \
                                .arg( strerror(errno) ), \
                            __FILE__,                \
                            __LINE__ )
I can see some benefits to it, but they are not big enough for me to do it this way.
Anyway, is this a common technique?
In your opinion, what advantages can be gained from it?
Do you use #defines for throwing exceptions? If so, what is the purpose of that?
UPDATE: added the #define from the code.
UPDATE 2: Thanks all for your answers. I've decided to take out all the macros. For debugging purposes I will extend the base error class with backtrace information; in my opinion that is better than just using the standard defines for file and line.
Typically, the preprocessor is only used if you need a preprocessor-specific feature, like __FILE__ or __LINE__. This macro does nothing a function cannot and therefore it is quite atypical and bad.
The macro as presented doesn't have a whole lot of benefit.
However, a macro can have a benefit if you want to include the file name, function name and line number in the exception message:
#define POSSIBLY_USEFUL_THROWIT(msg) throw common::error(__FILE__, __FUNCTION__, __LINE__, msg)
Oh, and THROWIT is a horrible name for this.
Alf highlights a good point:
You can use a macro to collect the information, and it's the only way to do it. However, tying that to the throwing of an exception is a conflation of responsibilities. This means you would need separate such macros for logging, UI message, and so on. A single macro would be far preferable.
I think what he means is having something like this:
// Construct new temporary object source_line_info
#define CURRENT_SRC_LINE_INFO() common::source_line_info(__FILE__, __FUNCTION__, __LINE__)
and then using it like this:
throw common::error(CURRENT_SRC_LINE_INFO(), msg);
to have only that part macro'fied that really needs it.
Personally, I would then prefer to have an additional macro like
#define THROW_COMMON_ERROR(...) throw common::error(CURRENT_SRC_LINE_INFO(), __VA_ARGS__)
Because if I'm going to have a "macro call" on multiple lines, I might just as well make it as short and as centralized as possible, even if that means introducing another macro.
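To make the pieces above concrete, here is a minimal self-contained sketch; the members of source_line_info and the base class of common::error are assumptions for illustration, not taken from the original code:
#include <stdexcept>
#include <string>

namespace common {

// Assumed helper carrying the location data gathered by the macro.
struct source_line_info {
    const char* file;
    const char* function;
    int line;
};

// Assumed stand-in for the project's error type.
class error : public std::runtime_error {
public:
    error(const source_line_info& loc, const std::string& msg)
        : std::runtime_error(std::string(loc.file) + ":" +
                             std::to_string(loc.line) + " (" +
                             loc.function + "): " + msg) {}
};

} // namespace common

// Only the location gathering truly needs the preprocessor...
#define CURRENT_SRC_LINE_INFO() \
    common::source_line_info{__FILE__, __FUNCTION__, __LINE__}

// ...while the throw itself is wrapped separately, if at all.
#define THROW_COMMON_ERROR(...) \
    throw common::error(CURRENT_SRC_LINE_INFO(), __VA_ARGS__)

// Usage:
void open_or_fail(bool ok)
{
    if (!ok) {
        THROW_COMMON_ERROR("could not open file");
    }
}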
No. Don't. Bad. It makes the code harder to understand and isn't all that much shorter to type.
If you really must, use a function. But I don't think you really must, in this case.
The advantages are that there are fewer characters to type and that you could change the throw statement (for example, to throw another type) at a single point (the macro). However, you could also use an ordinary function instead of a macro. Using a macro where a function can do exactly the same thing is considered bad practice because of the problems macros have (such as ignoring scope and potentially polluting every file that includes the macro-defining header). Macros are, at most, a tool to be used when no other language feature can do the same thing and you desperately need it.
Thus, I would not consider this good practice.
No, it's better to use inline functions in C++. Macros are substituted without any compiler checks. Preprocessor macros should only be used where there is no other way to do the task.
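As a sketch of the function-based alternative (the names are assumptions), an ordinary inline function does the same job without the preprocessor; the trade-off is that __FILE__ and __LINE__ information from the call site is not available this way:
#include <stdexcept>
#include <string>

namespace common {

// Assumed stand-in for the project's error type.
class error : public std::runtime_error {
public:
    using std::runtime_error::runtime_error;
};

// Obeys scope rules, is type-checked, and can be stepped into in a debugger.
[[noreturn]] inline void throw_error(const std::string& msg)
{
    throw error(msg);
}

} // namespace common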
Closed 9 years ago. This question is opinion-based and is not currently accepting answers.
Google C++ Style Guide (http://google-styleguide.googlecode.com/svn/trunk/cppguide.xml#Preprocessor_Macros) says:
"Instead of using a macro to conditionally compile code ... well, don't do that at all"
Why is it so bad to have functions like
void foo()
{
    // some code
#ifdef SOME_FUNCTIONALITY
    // code
#endif
    // more code
}
?
As they say in the doc you linked to:
Macros mean that the code you see is not the same as the code the compiler sees. This can introduce unexpected behavior, especially since macros have global scope.
It's not too bad if you have just one conditional compilation, but it can get complicated quickly if you start having nested ones like:
#if PS3
    ...
    #if COOL_FEATURE
        ...
    #endif
    ...
#elif XBOX
    ...
    #if COOL_FEATURE
        ...
    #endif
    ...
#elif PC
    ...
    #if COOL_FEATURE
        ...
    #endif
    ...
#endif
I believe some of the arguments against it go:
#ifdef cuts across C++ expression/statement/function/class syntax. That is to say, like goto it is too flexible for you to trust yourself to use it.
Suppose the code in // code compiles when SOME_FUNCTIONALITY is not defined. Then just use if with a static const bool and trust your compiler to eliminate the dead code (see the sketch after this list).
Suppose the code in // code doesn't compile when SOME_FUNCTIONALITY is not defined. Then you're creating a dog's breakfast of valid code mixed with invalid code, and relevant code with irrelevant code, that could probably be improved by separating the two cases more thoroughly.
The preprocessor was a terrible mistake: Java is way better than C or C++, but if we want to muck around near the metal we're stuck with them. Try to pretend the # character doesn't exist.
Explicit conditionals are a terrible mistake: polymorphism baby!
Google's style guide specifically mentions testing: if you use #ifdef, then you need two separate executables to test both branches of your code. This is hassle, you should prefer a single executable, that can be tested against all supported configurations. The same objection would logically apply to a static const bool, of course. In general testing is easier when you avoid static dependencies. Prefer to inject them, even if the "dependency" is just on a boolean value.
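A minimal sketch of the static const bool idea from the second point (the flag name is made up); the branch still has to compile in every configuration, but the compiler can drop it as dead code when the flag is false:
// The value can still come from the build system, e.g. -DSOME_FUNCTIONALITY=1,
// but it is funneled into a single ordinary constant.
#ifndef SOME_FUNCTIONALITY
#define SOME_FUNCTIONALITY 0
#endif

static const bool kSomeFunctionality = SOME_FUNCTIONALITY != 0;

void foo()
{
    // some code
    if (kSomeFunctionality) {
        // code: type-checked in every build, eliminated when the flag is false
    }
    // more code
}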
I'm not wholly sold on any argument individually -- personally I think messy code is still occasionally the best for a particular job under particular circumstances. But the Google C++ style guide is not in the business of telling you to use your best judgement. It's in the business of setting a uniform coding style, and eliminating some language features that the authors don't like or don't trust.