Using a function from a specific library - C++

I have a problem: I would like to use the abs function from the complex library.
However, I get an error telling me that the abs being used is #define abs(x) (x > 0) ? x : -(x).
So I think the problem comes from my includes. Since I also include the stdio and stdlib headers, the compiler may be picking up the abs defined in one of them.
My question is: how can I use the abs function from the complex library without removing any of my includes?
Thanks a lot in advance for your response.

Wrap parens around it.
(abs)(whatever);
This will force the compiler to use the function version because the macro no longer matches.
Function-like macros work by matching an identifier followed by a left paren (. Since we've wrapped the function name itself in parens, we have instead an identifier followed by a right paren ), which fails to match the macro. The parens are semantically transparent, but they inhibit the macro syntax.
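A minimal self-contained sketch of the difference (the macro here just mirrors the one from the question; you would not normally define a macro with a standard library name yourself):

#include <cstdlib>   // declares std::abs for integer types

#define abs(x) ((x) > 0 ? (x) : -(x))   // stand-in for the offending macro

int main()
{
    int a = abs(-3);          // macro expands: ((-3) > 0 ? (-3) : -(-3))
    int b = (std::abs)(-3);   // "abs" followed by ')' - no macro match,
                              // so this calls the real std::abs function
    return a + b;
}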
IIRC, it was splint, the C checker, that taught me this. While writing a PostScript interpreter, I created nice short macros to access the stack.
#define push(o) (*tos++ = (o))
#define pop() (*--tos)
These were great until the tricky parts, where they appeared inside larger expressions that also involved tos. To avoid undefined behavior I had to create function versions and use those in the tricky spots. For the new design, I skipped the macros altogether.
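To make the pitfall concrete, here is a small sketch (the stack, tos, and object names are illustrative, not the real interpreter's):

typedef int object;

object stack[16];
object *tos = stack;      // top-of-stack pointer

#define push(o) (*tos++ = (o))
#define pop()   (*--tos)

int main()
{
    push(2);
    push(3);
    // Looks harmless, but expands to (*tos++ = ((*--tos) + (*--tos))):
    // tos is modified three times with no sequencing between the
    // modifications, which is undefined behavior.
    push(pop() + pop());
    return 0;
}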
Edit: I've got a nagging feeling that it was actually the coelacanth book (Peter van der Linden's Expert C Programming: Deep C Secrets) where I learned this, the above situation being where I first needed it. IIRC, his example involved putchar or getchar, which are often implemented as both functions and macros in conforming C implementations.

Use #undef
#include "header1.h"
#include "header2.h"
#undef abs // remove abs macro
x = std::abs(y);

Whilst several suggestions above are very good, I would take a completely different angle. It is almost certainly something like windows.h that causes the "bad" macro definition of abs(). You should be able to NOT include "windows.h" in the file that does the complex math [in most types of programs, at least]. I'm not aware of a single Windows function that takes complex<T> as an argument, so I'm pretty certain you don't need both "complex.h" and "windows.h" in the same source file. This approach is called "isolating the system dependencies", and doing it is a very good thing.
Have a look at your code, find where you are ACTUALLY using Windows functions, and then only include "windows.h" in the files that actually need it. You'll probably find, if you are using Visual Studio, that "windows.h" is included as part of "stdafx.h", which means that all sorts of interesting macros etc. get pulled in all over the place, because "stdafx.h" is included in ALL source files.
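A minimal sketch of what isolating the system dependency can look like (the file and function names here are made up for illustration):

// message_box.h - no windows.h anywhere in this header
void show_message(const char* text);

// message_box.cpp - the only file that includes windows.h
#include <windows.h>
#include "message_box.h"

void show_message(const char* text)
{
    MessageBoxA(nullptr, text, "Info", MB_OK);
}

// complex_math.cpp - includes <complex> freely and never sees the abs macro
#include <complex>
#include "message_box.h"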

Both the #undef and the parentheses solutions will work, but I would advise something a little stronger, because both of them require you to repeat the workaround every time you call abs, and the next time you forget you'll end up with a bug.
What you can do:
Change the name of your function to a less common name: absolute, myAbs, etc.
Put your function/class in a namespace (for C++, not for C); then the calls are explicit: myNameSpace::abs(x).
If that doesn't work, as the comment here suggested, I would still wrap the call in my own function:
template <typename T>
T myAbs(T param)
{
    return (abs)(param);
}

Related

How to suppress #define locally?

Just caught a silly bug. I have a zip processing library with a CreateFile() function in it. Winbase.h, included somewhere deep in my headers, redefines it as CreateFileW, and the linker goes nuts.
Of course I will exclude winbase.h in this particular case; it just shouldn't be in scope in the first place. But the theoretical question is still interesting:
Is there a way to suppress some defines locally?
You can get around the macro by putting parentheses around the name:
(CreateFile)(arguments);
This works because the macro CreateFile is a function-like macro (i.e. it takes a list of arguments in parentheses); the right parenthesis after the name doesn't match the syntax for using a function-like macro, so the preprocessor does not expand it.
Of course, the "right" solution is to name the function properly, i.e., create_file. <g>
Removing the offending header file is ALWAYS the best solution for this (especially one as large as windows.h or winbase.h - they are included far too freely for my taste in many projects).
The only other solution is #undef offending_symbol.
Of course, another important thing is "do not use names that match the Windows/Linux system call names" - but CreateFile is a very obvious name for a function that creates a file, so I can see the temptation.
Preprocessor macros have no notion of C++ scope; #defines are just text replacements. If you want a 'local' #define, you do something like this (winbase.h defines CreateFile as the macro, expanding to CreateFileW or CreateFileA):
#define CreateFile CreateFileW
... // here I can use the macro
#undef CreateFile
Or in your case
#undef CreateFile
... // Here the macro is not available
#define CreateFile CreateFileW
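A non-standard but widely supported alternative is to save and restore the macro around the region; MSVC and GCC both support these pragmas (a sketch):

#include <windows.h>         // defines CreateFile as a macro

#pragma push_macro("CreateFile")
#undef CreateFile
// Here CreateFile refers to the zip library's function, not the macro.
#pragma pop_macro("CreateFile")
// The macro is restored, so Win32 calls behave as usual again.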
There is
#undef
which removes defines (but nothing else).
Apart from the aforementioned #undef there technically is not much you can do against #defines, at least not portably.
The best way is to not use #define at all, or at least as little as possible and as constrained as possible. Sometimes you just need a macro to generate some boilerplate code a few times. Be sure to #undef that macro once you are done. The only other valid applications of #define I can think of are include guards and flags for conditional preprocessing.
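For example, a boilerplate-generating macro that is #undef'ed as soon as it has done its job (the names are illustrative):

#define DECLARE_GETTER(Type, name) \
    Type get_##name() const { return name##_; }

class Point {
    int x_ = 0;
    int y_ = 0;
public:
    DECLARE_GETTER(int, x)
    DECLARE_GETTER(int, y)
};

#undef DECLARE_GETTER   // don't let the macro leak any further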
For #define diseases like the WinAPI headers, you should constrain them as much as possible. Don't use the #defined types of that API in your own headers. You almost never want to use an API all over your application, so use it only in the .cpp files of a small layer around the API. Reducing the dependencies that way gives you a lot more than just disinfecting the rest of your code.

A macro for long and short function names in C++

I am currently working on a general-purpose C++ library.
Well, I like using real-word function names, and my project actually has a consistent naming system: functions (or methods) start with a verb, unless they return bool, in which case they start with is_.
The problem is that this can be somewhat cumbersome for some programmers. Consider this function:
#include "something.h"
int calculate_geometric_mean(int* values)
{
    // insert code here
}
I think such names look formal, and that is why I name my functions this way.
However, I designed a simple macro system that lets the user switch between function names.
#define SHORT_NAMES
#include "something.h"
#ifdef SHORT_NAMES
int calc_geometric_mean(int* values)
#else
int calculate_geometric_mean(int* values)
#endif
{
    // some code
}
Is this wiser than using an alias (since each alias of a function would be allocated in memory), or is this solution pure evil?
FWIW, I don't think this dual-naming system adds a lot of value. It does, however, have the potential to cause a lot of confusion (to put it mildly).
In any case, if you are convinced it is a great idea, I would implement it through inline functions rather than macros.
// something.h
int calculate_geometric_mean(int* values); // defined in the .cpp file

inline int calc_geo_mean(int* values) {
    return calculate_geometric_mean(values);
}
What symbols will be exported to the object file/library? What if you attempt to use the other version? Will you distribute two binaries with their own symbols?
So - no, bad idea.
Usually, the purpose behind a naming system is to aid the readability and understanding of the code.
Now, you effectively have 2 systems, each of which has a rationale. You're already forcing the reader/maintainer to keep two approaches to naming in mind, which dilutes the end goal of readability. Never mind the ugly #defines that end up polluting your code base.
I'd say choose one system and stick to it, because consistency is the key. I wouldn't say this solution is pure evil per se - I would say that this is not a solution to begin with.

Is there any situation where you wouldn't want include guards?

I know why include guards exist, and that #pragma once is not standard and thus not supported by all compilers etc.
My question is of a different kind:
Is there any sensible reason to ever not have them? I've yet to come across a situation where, theoretically, there would be any benefit to not providing include guards in a file that is meant to be included somewhere else. Does anyone have an example where there is an actual benefit of not having them?
The reason I ask: to me they seem pretty redundant, since you always use them anyway, and the behaviour of #pragma once might as well just be applied automatically to literally everything.
I've seen headers that generate code depending on macros defined before their inclusion. In such cases it's sometimes desirable to define those macros to one (set of) value(s), include the header, redefine the macros, and include it again.
Everybody who sees such code agrees that it's ugly and best avoided, but sometimes (e.g. if the code in said headers is generated by some other means) it's the lesser evil.
Other than that, I can't think of a reason.
@sbi already talked about code generation, so let me give an example.
Say that you have an enumeration of a lot of items, and that you would like to generate a bunch of functions for each of its elements...
One solution is to use this multiple inclusion trick.
// myenumeration.td
MY_ENUMERATION_META_FUNCTION(Item1)
MY_ENUMERATION_META_FUNCTION(Item2)
MY_ENUMERATION_META_FUNCTION(Item3)
MY_ENUMERATION_META_FUNCTION(Item4)
MY_ENUMERATION_META_FUNCTION(Item5)
Then people just use it like so:
#define MY_ENUMERATION_META_FUNCTION(Item_) \
    case Item_: return #Item_;

char const* print(MyEnum i)
{
    switch(i) {
    #include "myenumeration.td"
    }
    __unreachable__("print");
    return 0; // to shut up gcc
}
#undef MY_ENUMERATION_META_FUNCTION
Whether this is nice or hackish is up to you, but clearly it is useful not to have to crawl through all the utility functions each time a new value is added to the enum.
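For completeness, the same .td file can also generate the enum definition itself, which keeps the list of items in exactly one place (a sketch reusing the names above):

enum MyEnum {
#define MY_ENUMERATION_META_FUNCTION(Item_) Item_,
#include "myenumeration.td"
#undef MY_ENUMERATION_META_FUNCTION
};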
<assert.h> (and its C++ counterpart <cassert>) is the classic example of a header that is designed to be included more than once: "The assert macro is redefined according to the current state of NDEBUG each time that <assert.h> is included."
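In other words, the following repeated inclusion is well-defined and is exactly the point:

#include <assert.h>
// assert(...) is active here.

#define NDEBUG
#include <assert.h>
// assert(...) now expands to a no-op for this region.

#undef NDEBUG
#include <assert.h>
// assert(...) is active again.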
It can be a problem if you have two headers in a project which use the same include guard, e.g. if you have two third party libraries, and they both have a header which uses an include guard symbol such as __CONSTANTS_H__, then you won't be able to successfully #include both headers in a given compilation unit. A better solution is #pragma once, but some older compilers do not support this.
Suppose you have a third party library, and you can't modify its code. Now suppose including files from this library generates compiler warnings. You would normally want to compile your own code at high warning levels, but doing so would generate a large set of warnings from using the library. You could write warning disabler/enabler headers that you could then wrap around the third party library, and they should be able to be included multiple times.
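A sketch of such a pair of headers, deliberately written without include guards (the warning numbers are placeholders, and the MSVC pragmas are just one possible implementation):

// warnings_off.h - no include guard on purpose
#if defined(_MSC_VER)
#pragma warning(push)
#pragma warning(disable: 4100 4244)
#endif

// warnings_on.h - no include guard on purpose
#if defined(_MSC_VER)
#pragma warning(pop)
#endif

// usage:
#include "warnings_off.h"
#include <third_party_library.h>
#include "warnings_on.h"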
Another more sophisticated kind of use is Boost's Preprocessor iteration construct:
http://www.boost.org/doc/libs/1_46_0/libs/preprocessor/doc/index.html
The problem with #pragma once, and the reason it is not part of the standard, is that it just doesn't always work everywhere. How does the compiler know if two files are the same file or not, if included from different paths?
Think about it, what happens if the compiler makes a mistake and fails to include a file that it should have included? What happens if it includes a file twice, that it shouldn't have? How would you fix that?
With include guards, the worst that can happen is that it takes a bit longer to compile.
Edit:
Check out this thread on comp.std.c++ "#pragma once in ISO standard yet?"
http://groups.google.com/group/comp.std.c++/browse_thread/thread/c527240043c8df92

Can I redefine a C++ macro then define it back?

I am using both the JUCE library and a number of Boost headers in my code. JUCE defines "T" as a macro (groan), and Boost often uses "T" in its template definitions. The result is that if you somehow include the JUCE headers before the Boost headers, the preprocessor expands the JUCE macro in the Boost code, and the compiler gets hopelessly lost.
Keeping my includes in the right order isn't hard most of the time, but it can get tricky: a JUCE class includes some other classes, somewhere up the chain one of those files includes Boost, and if any of the files before it needed a JUCE include, you're in trouble.
My initial hope at fixing this was to
#undef T
before any includes for Boost. But the problem is, if I don't re-define it, then other code gets confused that "T" is not declared.
I then thought that maybe I could do some circular #define trickery like so:
// some includes up here
#define ___T___ T
#undef T
// include boost headers here
#define T ___T___
#undef ___T___
Ugly, but I thought it may work.
Sadly, no. In places that use "T" as a macro I get errors saying
'___T___' was not declared in this scope.
Is there a way to make these two libraries work reliably together?
As greyfade pointed out, your ___T___ trick doesn't work because the preprocessor is a pretty simple creature. An alternative approach is to use pragma directives:
// JUCE includes here
#pragma push_macro("T")
#undef T
// include boost headers here
#pragma pop_macro("T")
That should work in MSVC++, and GCC has added support for push_macro and pop_macro for compatibility with it. Technically it is implementation-dependent, though; I don't think there's a standard way of temporarily suppressing a macro definition.
Can you wrap the offending library in another include and trap the #define T inside?
eg:
JUICE_wrapper.h:
#include "juice.h"
#undef T
main.cpp:
#include "JUICE_wrapper.h"
#include "boost.h"
rest of code....
I then thought that maybe I could do some circular #define trickery like so:
The C Preprocessor doesn't work this way. Preprocessor symbols aren't defined in the same sense that a symbol is given meaning when, e.g., you define a function.
It might help to think of the preprocessor as a text-replace engine. When a symbol is defined, its replacement text is recorded as raw tokens and substituted until the end of the file or until it's undefined; a macro's body is never expanded at the point of definition, so #define ___T___ T records the single token T rather than capturing T's current definition. Therefore, the only way to restore the definition of T after you've #undefed it is to completely reproduce its replacement text in a new #define later in your code.
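A sketch of why the save/restore attempt fails (JUCE's real definition of T is more involved; juce_T here is only a stand-in):

#define T juce_T
#define ___T___ T     // the body is recorded as the raw token "T", not "juce_T"
#undef T              // the original definition is now gone for good
#define T ___T___     // from here on, T expands to the identifier ___T___ ...
#undef ___T___        // ... which is no longer a macro, so the compiler sees an
                      // undeclared name - exactly the error quoted in the question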
The best you can do is to simply not use Boost or petition the developers of JUCE to not use T as a macro. (Or, worst case, fix it yourself by changing the name of the macro.)

compile error: cpumask.h: "and" may not appear in macro parameter list

I'm trying to move a project from an old Linux platform to Kubuntu 9.04. Now I get this error when compiling with gcc 4.3.3:
/usr/src/linux-headers-2.6.28-11-generic/include/linux/cpumask.h:600:37: error: "and" may not appear in macro parameter list
If I understand the message correctly, "and" is not allowed as a macro parameter because it is a reserved word. Two questions about that:
How is this possible? I cannot imagine there being such a mistake in the Linux header files... Did I do something wrong beforehand? I tried #undef and, but that doesn't help.
How do I fix this error? It cannot be true that I have to change the Linux header files, can it?
Thanks for help.
I believe the problem is that and is a keyword in C++ but not in C (C just spells it &&).
The kernel guys sometimes use macros as an alternative to inline functions. Sometimes, however, they need macros because what they want to do has to happen in the scope of the calling function, and defining a function won't work (for instance, a macro to find out the name of the current function).
Assuming the macros in question are really fake inlined functions, it would be possible to write your own .c file full of nothing but functions calling these macros, compile it, and refer to those functions via an extern "C" header. You would get the same behavior, but slightly worse performance (which is unlikely to be a problem).
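A hedged sketch of that layering (the header path, macro, and function names below are placeholders, not real kernel identifiers):

/* wrappers.c - compiled as C, where the kernel headers and their macros are valid */
#include <linux/some_header.h>        /* placeholder for the header defining the macro */

unsigned int wrap_the_macro(unsigned int arg)
{
    return THE_MACRO(arg);            /* placeholder for the real macro */
}

/* wrappers.h - what the C++ code includes instead of the kernel header */
#ifdef __cplusplus
extern "C" {
#endif

unsigned int wrap_the_macro(unsigned int arg);

#ifdef __cplusplus
}
#endif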
If the macros actually have to be macros, then your best bet is to hand edit them to be C++ compliant.
The linux headers are C headers, not C++.
Change #define for_each_cpu_and(cpu, mask, and) to #define for_each_cpu_and(cpu, mask, and_deb) (renaming the parameter inside the macro body as well).
Found this solution at http://www.linux.org.ru/forum/development/4797542
It would help if you also showed the line in question. Perhaps it's all down to context: if you do something crazy before including the header, the compiler might get confused and generate a non-obvious error message.
"and" is indeed a reserved word in C++, and since the problem is C++-only, the kernel developers won't care too much, because the kernel is focused on C.