ASSERT vs. ATLASSERT vs. assert - c++

I am refactoring some MFC code that is littered with ASSERT statements, and in preparation for a future Linux port I want to replace them with the standard assert. Are there any significant differences between the two implementations that people know of that could bite me on the backside?
Similarly, I have also come across some code that uses ATLASSERT which I would also like to replace.

No. The MFC version just includes a convenient debug breakpoint that drops you into the debugger at the point of failure.

Replace them with your own assertion macro. That's how you get the most benefit out of it (logging, stack trace, etc.)

I would recommend either using your own macro, or #defines for the Linux compilation. There's no compelling reason to give up any extra helpfulness on the Windows side (e.g. the built-in breakpoint), and no compelling reason to change a lot of code when some simple compatibility #defines will suffice.
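For example, a minimal sketch of such a compatibility shim (the name MY_ASSERT and the platform test are illustrative assumptions, not taken from the original code):

// my_assert.h -- hypothetical compatibility shim
#if defined(_MSC_VER) && defined(_DEBUG)
    // Assumes the MFC headers that define ASSERT are already included;
    // keeps the built-in debugger breakpoint on Windows debug builds.
    #define MY_ASSERT(expr) ASSERT(expr)
#else
    #include <cassert>
    #define MY_ASSERT(expr) assert(expr)   // standard assert everywhere else
#endif

You would then replace the scattered ASSERT/ATLASSERT calls with MY_ASSERT, and any future logging or stack-trace support only has to be added in one place.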


What's the deal with the CRT SECURE warnings/errors on Visual Studio?

I have encountered this issue a dozen, if not a million, times already: I compile a C++ program in Visual Studio and get a dozen, if not a million, warnings and/or errors suggesting that I am doing something very dangerous and that there is no way my compiler will let me do that. The warnings/errors tell me that I am using a deprecated function and that I should consider using some other, safer function that may or may not do the same thing as this one, but I have no idea what this one does in the first place since I did not write it.
After some research (I do it every time, I am not a quick learner) I find out I am not the first one facing this particular problem, and that I can coerce my compiler into building this program with the proper macro definition (for the future readers who don't care about my question but want to compile their program: you have to define _CRT_SECURE_NO_DEPRECATE, and don't you ever dare follow Visual Studio's advice and use the allegedly safe function).
I have often read in the manual, or on this very website along with the answer, that I should not do that if I don't know precisely what I am doing.
I must confess: I have no idea what I am doing, and I would be very grateful if someone would accept to explain it to me.
So here are my questions:
What are those functions that are unsafe? Why do they exist in the first place?
What is unsafe about them?
Why are they so often found in perfectly honourable libraries?
I have come to the understanding that there is no safe and portable alternative to those functions: why is it so? How about we have some people think about it and try to define a way to do it, and everyone would accept to do it that way, and we would call it standard maybe?
To tackle your questions in order:
They exist in the first place because the standard defined them that way. Standards authors are human, so they don't think of everything, and this left some security weaknesses in the C API. You can find a list of these deprecated functions at http://msdn.microsoft.com/en-us/library/ms235384.aspx.
Many of the functions are unsafe because they allow things such as buffer overruns to occur, but other security vulnerabilities may be exposed depending on the function.
Honourable libraries generally aim for some cross-platform compatibility, so I suspect they will try to stick to standard C rather than use compiler-specific functions and extensions.
The "perfect" standard will probably never exist, for the reason in my first point :) Some of the C API problems can be avoided by using C++, but that's a big hammer to crack a small nut and brings security vulnerabilities of its own.

Dos and Don'ts of Conditional Compile [closed]

When is doing conditional compilation a good idea and when is it a horribly bad idea?
By conditional compile I mean using #ifdefs to only compile certain bits of code in certain conditions. The #defines themselves may be in either a common header file or introduced via the -D compiler option.
The good ideas:
header guards (you can't do much better for portability)
conditional implementation (juggling with platform differences)
debug specific checks (asserts, etc...)
per suggestion: extern "C" { and } so that the same headers may be used by the C++ implementation and by the C clients of the API
The bad idea:
changing the API between compile flags, since it forces the client to change its usage with the same compile flags... urk!
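A hypothetical sketch of that last anti-pattern, as a caution (the flag and function names are made up): the function's signature silently depends on a compile flag, so every client must be built with exactly the same -D options or calls and linkage quietly diverge.

// api.h -- what NOT to do
#ifdef USE_WIDE_API
void log_message(const wchar_t* msg);   // clients built with -DUSE_WIDE_API see this
#else
void log_message(const char* msg);      // everyone else sees a different API
#endif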
Don't put ifdef in your code.
It makes it really hard to read and understand. Please make the code as easy to read as possible for the maintainer (he knows where you live and owns an axe).
Hide the conditional code in separate functions and use the ifdef to define what functions are being used.
DON'T use the #else part to make a definition. If you are doing that, you are saying one platform is unique and all the others are the same. That is unlikely; what is more likely is that you know what happens on a couple of platforms, so you should use the #else section to stick a #error in, so that when the code is ported to a new platform a developer has to explicitly fix the condition for his platform.
x.h
#if defined(WINDOWS)
#define MyPlatformSleepSeconds(x) Sleep((x) * 1000)   // Win32 Sleep() takes milliseconds
#elif defined(UNIX)
#define MyPlatformSleepSeconds(x) sleep(x)            // POSIX sleep() takes seconds
#else
#error "Please define an appropriate sleep for your platform"
#endif
Don't be tempted to expand a macro into multiple lines of code. That leads to madness.
p.h
#if defined(SOLARIS_3_1_1)
#define DO_SOME_TASK(x,y) doPartA(x); \
doPartB(y); \
couple(x,y)
#elif defined(WINDOWS)
#define DO_SOME_TASK(x,y) doAndCouple(x,y)
#else
#error "Please define appropriate DO_SOME_TASK for your platform"
#endif
If you develop the code on Windows and then test on Solaris 3.1.1 later, you may find unexpected bugs when people do things like:
int loop;
for(loop = 0;loop < 10;++loop)
    DO_SOME_TASK(loop,loop); // Windows: works fine.
                             // Solaris: only doPartA() is in the loop;
                             // the other statements are done when the loop finishes.
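If a multi-line macro truly cannot be avoided, the usual workaround (a common idiom, not part of the answer above) is to wrap the statements in a do { ... } while (0) block so the macro expands to a single statement:

#define DO_SOME_TASK(x, y) do { \
        doPartA(x);             \
        doPartB(y);             \
        couple(x, y);           \
    } while (0)

// With this form the loop above runs all three calls on every iteration,
// on Solaris just as on Windows.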
Basically, you should try to keep the amount of code that is conditionally compiled to a minimum, because you should be trying to test all of it, and having lots of conditions makes that more difficult. It also reduces the readability of the code; conditionally compiling whole files is clearer, e.g., by putting platform-specific code in a separate file for each platform and having them all present the same API from the perspective of the rest of the program. Also try to avoid using it in function headers; again, that's because it is a place where it is particularly confusing.
But that's not to say that you should never use conditional compilation. Just try to keep it short and minimal. (Where I can, I use conditional compilation to control the definitions of other macros which are then just used in the rest of the code; that seems to be clearer to me at least.)
It's a bad idea whenever you don't know what you're doing. It can be a good idea when you're effectively solving an issue this way :).
The way you describe conditional compiling, include guards are part of it. It's not only a good idea to use it. It's a way to avoid compilation errors.
For me, conditional compiling is also a way to target multiple compilers and operating systems. I'm involved in a lib that's supposed to be compilable on Windows XP and newer, 32 or 64 bit, using MinGW and Visual C++, on Linux 32 and 64 bit using gcc/g++, and on MacOS using I-don't-know-what (I'm not maintaining that, but I assume it's a gcc port). Without the preprocessor conditions, it would be pretty much impossible to create a single source file that's compilable anywhere.
Another pragmatic use of conditional compilation is to "comment out" sections of code which contain standard "C" comments (i.e. /* */). These comments do not nest, for example:
/* comment out block of code
.... code ....
/* This is a standard
* comment.
*/ ... oops! The compiler will try to compile the code after this closing comment.
.... code ....
end of block of code*/
(As you can see in the syntax highlighting, StackOverflow does not nest comments.)
Instead you can use #ifdef to get the right effect, for example:
#ifdef _NOT_DEFINED_
.... code ....
/* This is a standard
* comment.
*/
.... code ....
#endif
In the past, if you wanted to produce truly portable code, you'd have to resort to some form of conditional compilation. With the proliferation of portable libraries (such as APR, boost etc.) this reason carries little weight IMHO. If you are using conditional compilation simply to compile out blocks of code that are not needed for particular builds, you should really revisit your design - I should imagine that this would become a nightmare to maintain.
Having said all that, if you do need to use conditional compilation, I would hide as much of it as I can away from the main body of the code and limit it to very specific cases that are very well understood.
Good/justifiable uses are based on cost/benefit analysis. Obviously, people here are very conscious of the risks:
in linking objects that saw different versions of classes, functions etc.
in making code hard to understand, test and reason about
But, there are uses which often fall into the net-benefit category:
header guards
code customisations for distinct software "ecosystems", such as Linux versus Windows, Visual C++ versus GCC, CPU-specific optimisations, sometimes word size and endianness factors (though with C++ you can often determine these at compile time via template hackery, but that may prove messier still) - this abstracts away lower-level differences to provide a consistent API across those environments
interacting with existing code that uses preprocessor defines to select versions of APIs, standards, behaviours, thread safety, protocols etc. (sad but true)
compilation that may use optional features when available (think of GNU configure scripts and all the tests they perform on OS interfaces etc)
request that extra code be generated in a translation unit, such as adding main() for a standalone app versus without for a library
controlling code inclusion for distinct logical build modes such as debug and release
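For instance, a minimal sketch of that last point (the macro name is illustrative, not from the answer):

#ifdef NDEBUG
    #define TRACE_MSG(msg) ((void)0)                            // release build: compiled out entirely
#else
    #include <cstdio>
    #define TRACE_MSG(msg) std::fprintf(stderr, "%s\n", (msg))  // debug build: log to stderr
#endif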
It is always a bad idea. What it does is effectively create multiple versions of your source code, all of which need to be tested, which is a pain, to say the least. Unfortunately, like many bad things it is sometimes unavoidable. I use it in very small amounts when writing code that needs to be ported between Windows and Linux, but if I found myself doing it a lot, I would consider alternatives, such as having two separate development sub-trees.

debugging C++ when compared to debugging C

Hi,
I am normally a C programmer.
I regularly debug C programs in a Unix environment using tools like gdb and dbx.
I have never debugged big C++ applications.
Is that much different from how we debug in C?
Theoretically I am quite good at C++, but I have never had a chance to debug C++ programs.
I am also not sure what kind of technical problems we face in C++ that would lead a developer to switch on the debugger to find the problem.
What are the common issues in C++ that make a developer reach for the debugger?
What are the challenges that a C programmer might face while debugging a C++ program?
Is it difficult and complex when compared to C?
It is basically the same.
Just remember that when setting breakpoints manually you need to fully qualify the method name with both the namespace(s) and the class (as a result I sometimes find it easier to use line numbers to define breakpoints).
Don't forget that calls to destructors are invisible in the source, but you can still step into them at the end of a block.
A few minor differences:
When typing a fully-qualified symbol such as foo::bar::fum(args) in the gdb shell, you have to start with a single quote for gdb to recognize it and calculate completions.
As others have said, library templates expose their internals in the debugger. You can poke around in std::vector pretty easily, but poking through std::map may not be a wise way to spend your time.
The aggressive and abundant inlining common in C++ programs can make a single line of code have seemingly endless steps. Things like shared_ptr can be particularly annoying because every access to the pointer expands inline to the template internals. You never really get used to it.
If you've got a ton of overloaded symbol names, selecting which one you want from the readline completion can be unpleasant. (Which "foo" did you want? All of them? Just these two?)
GDB can be used to debug C++ as well, so if you have an understanding of how C++ works (and understand problems that can stem from the object-oriented side of things), then you shouldn't have all that much trouble (at least, not much more than you would debugging a C program). I think...
Quite a few issues really, but it also depends on the debugger you are using, its versioning etc:
Accessing individual members of a templatized class is not easy
Exception handling is a problem -- I have seen debuggers doing a better job with setjmp/longjmp
Setting breakpoints with something like obj1 == obj2, where these are not POD types, may not work
The good thing that I like about debuggers is that to access private/protected class members I don't have to call get routines; just [obj-name].[var-name] is good enough.
Arpan
GDB has had a rocky past with regard to debugging C++. For a while it couldn't efficiently break inside constructors/destructors.
Also, STL containers were notoriously difficult to inspect in gdb. std::string was painful but generally workable. std::map was so difficult that I generally added print statements unless there was no other way.
The constructor/destructor problem has been fixed for a few years.
The stl support got fixed in gdb 7.0.
You might still have issues with boost's libraries. I at times had difficulty getting gdb to give me access to the contents of a shared_ptr.
So I guess debugging your own C++ isn't really that difficult, it's debugging 3rd party classes and template code that could be a problem.
C++ objects might sometimes be harder to analyze. Also, as data is sometimes nested in several classes (across several layers), it might take some time to "unfold" it (as already said by others in this thread). It's hard to say in general, as it depends very much on the C++ features used, the programming style, and the complexity of the problem to analyze (actually, that is language independent).
IMO: if someone finds himself needing to debug very often, he should reconsider his programming style.
Usually for me it is all about error handling at the end. If a program behaves unexpectedly, your error logs should indicate enough information to reconstruct what happened at any stage.
This also gives you the benefit that you can "debug" problems offline later once your program gets shipped to end users.
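A minimal sketch of the kind of context such a log entry could capture (the macro name and format are made up, not from this answer):

#include <cstdio>

// Record where and why something went wrong so the failure can be
// reconstructed from the log later, without re-running under a debugger.
#define LOG_ERROR(msg) \
    std::fprintf(stderr, "[%s:%d %s] %s\n", __FILE__, __LINE__, __func__, (msg))

// Example use inside a function:
//     LOG_ERROR("config file missing, falling back to defaults");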

Is it possible to turn off support for "and" / "or" boolean operator usage in gcc?

GCC seems to allow "and" / "or" to be used instead of "&&" / "||" in C++ code; however, as I expected, many compilers (notably MSVC 7) do not support this. The fact that GCC allows this has caused some annoyances for us in that we have different developers working on the same code base on multiple platforms and occasionally, these "errors" slip in as people are switching back and forth between Python and C++ development.
Ideally, we would all remember to use the appropriate syntax, but for those situations where we occasionally mess up, it would be really nice if GCC didn't let it slide. Anybody have any ideas on approaches to this?
If "and" and "or" are simply #defines then I could #undef when using GCC but I worry that it is more likely built into the compiler at more fundamental level.
Thanks.
They are part of the C++ standard, see for instance this StackOverflow answer (which quotes the relevant parts of the standard).
Another answer in the same question mentions how to do the opposite: make them work in MSVC.
To disable them in GCC, use -fno-operator-names. Note that, by doing so, you are in fact switching to a non-standard dialect of C++, and there is a risk that you end up writing code which might not compile correctly on standard-compliant compilers (for instance, if you declare a variable with a name that would normally be reserved).
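For instance, a small sketch of that trade-off (the file and variable names are made up):

// example.cpp -- compiles with: g++ -fno-operator-names example.cpp
// A conforming compiler rejects it, because "and" is a keyword in standard C++.
int and = 1;

int main() { return and; }

// Note: with that flag the alternative tokens themselves stop working, so
// "if (a and b)" becomes a compile error -- which is exactly the enforcement asked for.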
The words are standard in C++ without the inclusion of any header.
The words are standard in C if you include the header <iso646.h>.
MSVC is doing you no service by not supporting the standards.
You could, however, use tools to enforce the non-use of the keywords. And it can be a coding guideline, and you can quickly train your team not to make silly portability mistakes. It isn't that hard to avoid the problem.
Have you considered any code analysis tools? Something similar to FxCop? With FxCop you can write your own rules (check for &&) and you can set it to run during the pre-compile stage.
-pedantic-errors may help for this, among other gnu-isms.