I'm new to using macro functions and I understand there are some pitfalls in their use when it comes to order of operations. Is there a way to expand the macro after the preprocessor goes through it so I can see what it looks like?
In VS2017, I've tried Properties > C/C++ > Preprocessor > Preprocess to a File, which creates an *.i file, but it's around 50k lines long and I can't seem to find where my macro was expanded.
edit: I know macros are bad news bears; however, the code base I'm stepping into uses them quite a bit, so I'm trying to better understand them.
You can help yourself by declaring a dummy variable before the line where a macro is used.
E.g.
extern int dummyIntVariable;
MY_COMPLICATED_MACRO(arg1, arg2);
After that, you look for dummyIntVariable in the .i file. The line below it will contain what MY_COMPLICATED_MACRO expands to.
Or, as @Sneftel pointed out in a comment, you can use any old string that helps you navigate through the .i file.
THIS IS A UNIQUE STRING
MY_COMPLICATED_MACRO(arg1, arg2);
Since the file is only preprocessed, not compiled, that will also work.
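For a concrete illustration, here is a minimal sketch; the macro body is a made-up stand-in, since the real MY_COMPLICATED_MACRO isn't shown in the question:

#include <utility>

// Made-up stand-in for the real macro under investigation.
#define MY_COMPLICATED_MACRO(a, b) do { if ((a) > (b)) std::swap(a, b); } while (0)

void f(int x, int y)
{
    extern int dummyIntVariable;   // easy-to-find marker
    MY_COMPLICATED_MACRO(x, y);
}

Searching the generated .i file for dummyIntVariable then lands you directly above the expansion, which in this case reads roughly: do { if ((x) > (y)) std::swap(x, y); } while (0);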
Related
I have some C/C++ source file (.hpp,.cpp) containing something like
...
#define SOME_DEFINE(t) some_ns::some_type<t>
...
// define is somehow used later in the code
I would like to produce a modified source (for readability) in which every SOME_DEFINE(t) in this file has been substituted.
So I definitely don't want to apply a full preprocessor pass - only this #define should be substituted, and only in this source file.
You have several options:
Run the preprocessor and store the output. With gcc, -E gives the output after preprocessing. Depending on how much other preprocessing the sources use, this may or may not be viable.
Use a regex to search and replace.
Use an alias template: template <typename T> using SOME_DEFINE = some_ns::some_type<T>;. Then search and replace via regex to turn SOME_DEFINE(t) into SOME_DEFINE<t> (see the sketch after this list).
Find a tool that does it out of the box. I am not aware of one, though tool recommendations are off-topic anyhow. In a comment, https://dotat.at/prog/unifdef/ was mentioned.
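A minimal sketch of the alias-template option, assuming some_ns::some_type is a class template taking a single type parameter:

namespace some_ns {
    template <typename T> struct some_type {};   // stand-in for the real template
}

// Alias template that replaces the macro (remove the old #define).
template <typename T>
using SOME_DEFINE = some_ns::some_type<T>;

// Old macro spelling:  SOME_DEFINE(int) x;
// New alias spelling:  SOME_DEFINE<int> x;
SOME_DEFINE<int> x;

After that, occurrences of SOME_DEFINE(t) can be rewritten to SOME_DEFINE<t> with a regex along the lines of s/SOME_DEFINE\((\w+)\)/SOME_DEFINE<\1>/.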
It's easy to get a list of unused functions and variables with linker feedback, but how can I detect unused macro definitions & typedefs? Do I have to browse the code line by line and git grep the whole project?
For macros defined in source files you might try the -Wunused-macros gcc/clang flag.
There's also -Wunused-local-typedefs in gcc.
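As a small, hypothetical example, a file like this triggers both warnings:

// unused.cpp
// Compile with: g++ -Wunused-macros -Wunused-local-typedefs -c unused.cpp
#define NEVER_USED 42              // reported by -Wunused-macros

int f()
{
    typedef int never_used_t;      // reported by -Wunused-local-typedefs
    return 0;
}

Note that -Wunused-macros only reports macros defined in the main file being compiled, not macros that come from included headers.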
Static analysis tools for C and C++ programs may include a check on unused preprocessor macros.
For instance see PC-Lint.
Another possibility would be to go into specific include files and use #if 0 to remove large sections of macros, then review the compiler errors using a kind of divide-and-conquer approach.
However, I would expect a static analysis tool to be a much better approach as the source code grows large.
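A rough sketch of that divide-and-conquer idea (the macro names are placeholders):

#if 0   /* temporarily disable a block of suspect macros */
#define MAYBE_UNUSED_A(x) ((x) + 1)
#define MAYBE_UNUSED_B(x) ((x) - 1)
#endif

If the project still builds cleanly, nothing in the active code referenced the disabled macros; if it breaks, split the block and repeat.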
For unused macros, you might want to take a look at coan. It has options that might assist with this task. From the about page:
What symbols would appear within active preprocessor directives under a given configuration?
(A preprocessor directive is active if it is not within the scope of any false #if). Supposing again that you are interested in the C-source in app, you can display a list of these symbols, with file names and line numbers, with the command:
$>coan symbols --recurse --locate --active --once --filter c,h app
It has options to remove conditioned out chunks of code (#if 0 and friends), and many other useful features for dealing with the C preprocessor. I would use it to collect all the #defined symbols and all the #ifdef or defined symbols and friends. I'd sort and uniq those two collections of symbols and diff them. This is a pretty good way of locating typos. Then I'd take a histogram of them separately and start with the least frequent and work my way up the lists.
For unused typedefs, that's another challenge. You could use a cross-reference type program like OpenGrok or GNU Global, but that's not very automatic.
There is cscout (now open source) at https://github.com/dspinellis/cscout which finds unused 'extern' declarations and #defines.
I've looked at practically every file included in the libcurl source package and can't seem to find where the CURLOPT_* options are defined. I gather that they're probably integers, perhaps an enum, but for the life of me I can't find them.
The language I'm writing in is RealBasic, if that matters at all. Generally when using an external library written in C I need to manually find and translate the various #define blocks in the headers. But I have to know where the #define block is before I can do anything!
They're defined with the CINIT() macro within the curl/curl.h header file. In a very recent such file (as of me writing this) they start at line 782.
The macro actually creates a line within a big enum construct.
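A simplified sketch of the pattern, paraphrased from curl/curl.h (the exact names and base values vary between versions):

/* each option is the type's base value plus a small ordinal */
#define CURLOPTTYPE_LONG         0
#define CURLOPTTYPE_OBJECTPOINT  10000

#define CINIT(na, t, nu) CURLOPT_ ## na = CURLOPTTYPE_ ## t + nu

typedef enum {
  CINIT(FILE, OBJECTPOINT, 1),   /* CURLOPT_FILE = 10001 */
  CINIT(URL,  OBJECTPOINT, 2),   /* CURLOPT_URL  = 10002 */
  CINIT(PORT, LONG,        3),   /* CURLOPT_PORT = 3     */
  CURLOPT_LASTENTRY
} CURLoption;

So from a binding's point of view, each CURLOPT_* option is just an integer constant in the CURLoption enum.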
I am interested in defining my own language inside a C++ block (let's say, for example, main), and for that purpose I need to use the preprocessor and its directives. My problem relates to the rule below:
#define INSERT create() ...
is called a function-like definition, and the preprocessor does not allow any whitespace in what we define.
So when I use a function of my own language, I need to transform the statement below:
INSERT INTO variable_name VALUES(arg_list)
into two different function calls, let's say
insertINTO(variable_name) and valuePARSE(arg_list)
but since the preprocessor rules do not allow me to have whitespace in my definition, how can I reach variable_name and then make the first function call I want?
Any clues would be helpful.
PS: I tried using g++ -E file.cpp to see how the preprocessor works and to adjust the syntax to valid C++.
The preprocessor included with most C++ compilers is probably way too weak for this kind of task. It was never designed for this kind of abuse. The boost preprocessor library could help you on the way, but I still think you're heading down a one-way street here.
If you really want to define your language this way, I suggest you either write your own preprocessor, or use one that is more powerful than the default one. Here is one chap who tried using Python as a C++ preprocessor.
1) #define INSERT create() is not a function-like macro, it's object-like; something like #define INSERT(a, b, c) create(a, b, c) would be;
2) if you want to expand INSERT INTO variable_name VALUES(arg_list) into insertINTO(variable_name); valuePARSE(arg_list); you can do something like:
#define INSERT insertINTO(
#define INTO
#define VALUES(...) ); valueParse(__VA_ARGS__);
3) as you can see, macros get ugly pretty easily, and even the slightest error in your syntax will have you spending a lot of time tracking it down;
4) since it's tagged C++ take a look at Boost.Proto or Boost.Preprocessor.
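Here is a minimal compilable sketch of that trick; the two stub functions are assumptions, just to make the expansion visible:

#include <iostream>

void insertINTO(int&)          { std::cout << "insertINTO\n"; }
void valueParse(int a, int b)  { std::cout << "valueParse " << a << " " << b << "\n"; }

#define INSERT insertINTO(
#define INTO
#define VALUES(...) ); valueParse(__VA_ARGS__)

int main()
{
    int variable_name = 0;
    // Expands to: insertINTO( variable_name ); valueParse(1, 2);
    INSERT INTO variable_name VALUES(1, 2);
}

Running g++ -E on this shows exactly that expansion, which is a handy way to debug such macros.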
When we see #include <iostream>, it is said to be a preprocessor directive.
#include ---> directive
And, I think:
<iostream> ---> preprocessor
But, what is meant by "preprocessor" and "directive"?
It may help to think of the relationship between a "directive" and being "given directions" (i.e. orders). "preprocessor directives" are directions to the preprocessor about changes it should make to the code before the later stages of compilation kick in.
But, what's the preprocessor? Well, its name reflects that it processes the source code before the "main" stages of compilation. It's simply there to process the textual source code, modifying it in various ways. The preprocessor doesn't even understand the tokens it operates on - it has no notion of types or variables, classes or functions - it's all just quote- and/or parenthesis-grouped, comma- and/or whitespace-separated text to be manhandled. This extra process gives more flexibility in selecting, combining and even generating parts of the program.
EDIT addressing @SWEngineer's comment: Many people find it helpful to think of the preprocessor as a separate program that modifies the C++ program, then gives its output to the "real" C++ compiler (this is pretty much the way it used to be). When the preprocessor sees #include <iostream> it thinks "ahhha - this is something I understand, I'm going to take care of this and not just pass it through blindly to the C++ compiler". So, it searches a number of directories (some standard ones like /usr/include and wherever the compiler installed its own headers, as well as others specified using -I on the command line) looking for a file called "iostream". When it finds it, it then replaces the line in the input program saying "#include <iostream>" with the complete contents of the file called "iostream", adding the result to the output. BUT, it then moves to the first line it read from the "iostream" file, looking for more directives that it understands.
So, the preprocessor is very simple. It can understand #include, #define, #if/#elif/#endif, #ifdef and #ifndef, #warning and #error, but not much else. It doesn't have a clue what an "int" is, a template, a class, or any of that "real" C++ stuff. It's more like some automated editor that cuts and pastes parts of files and code around, preparing the program that the C++ compiler proper will eventually see and process. The preprocessor is still very useful, because it knows how to find parts of the program in all those different directories (the next stage in compilation doesn't need to know anything about that), and it can remove code that might work on some other computer system but wouldn't be valid on the one in use. It can also allow the program to use short, concise macro statements that generate a lot of real C++ code, making the program more manageable.
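As a small self-contained example of that cut-and-paste behaviour (the file name is arbitrary), run a file like this through g++ -E (or cl /E with MSVC):

// greeting.cpp
#define GREETING "hello"

#ifdef GREETING
const char* msg = GREETING;    // survives as: const char* msg = "hello";
#else
const char* msg = "goodbye";   // this branch is removed from the -E output
#endif

The preprocessed output contains only the first assignment, with GREETING already replaced by "hello" - the compiler proper never sees the #define or the #ifdef at all.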
#include is the preprocessor directive, <iostream> is just an argument supplied in addition to this directive, which in this case happens to be a file name.
Some preprocessor directives take arguments, some don't, e.g.
#define FOO 1
#ifdef NDEBUG
....
#else
....
#endif
#warning Untested code !
The common feature is that they all start with #.
In Olden Times the preprocessor was a separate tool which pre-processed source code before passing it to the compiler front-end, performing macro substitutions and including header files, etc. These days the pre-processor is usually an integral part of the compiler, but it essentially just does the same job.
Preprocessor directives, such as #define and #ifdef, are typically used to make source programs easy to change and easy to compile in different execution environments. Directives in the source file tell the preprocessor to perform specific actions. For example, the preprocessor can replace tokens in the text, insert the contents of other files into the source file...
#include is a preprocessor directive, meaning that it is used by the preprocessor part of the compiler. This happens 'before' the compilation process. The #include needs to specify 'what' to include; this is supplied by the argument iostream. This tells the preprocessor to pull in the contents of the header file iostream.
More information:
Preprocessor Directives on MSDN
Preprocessor directives on cplusplus.com