I recently came upon the need to have compile-time assertions in C++ to check that the sizes of two types were equal.
I found the following macro on the web (stated to have come from the Linux kernel):
#define X_ASSERT(condition) ((void)sizeof(char[1 - 2*!!(condition)])) /* array size becomes -1, a compile error, when the condition is true */
which I used like so:
X_ASSERT(sizeof(Botan::byte) != sizeof(char)); // fails to compile if the sizes differ
This got me curious: although this works, is there a cleaner way to do it? (Obviously there's more than one way, as it is.) Are there advantages or disadvantages to certain methods?
In C++0x, there is a new language feature, static_assert, which provides a standard way to generate compile-time assertions. For example,
static_assert(sizeof(Botan::byte) == 1, "byte type has wrong size");
Visual C++ 2010 supports static_assert, as do g++ 4.3 (and greater) and Intel C++ 11.0.
You might want to take a look at Boost StaticAssert. The internals aren't exactly clean (or weren't the last time I looked) but at least it's much more recognizable, so most people know what it means. It also goes to some pains to produce more meaningful error messages if memory serves.
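For reference, a minimal sketch of typical usage (the condition mirrors the question's):
#include <boost/static_assert.hpp>

// Fails to compile if the two sizes differ.
BOOST_STATIC_ASSERT(sizeof(Botan::byte) == sizeof(char));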
Some other interesting options are here: http://www.jaggersoft.com/pubs/CVu11_3.html
Neat reading as the author walks the C (not C++) spec looking for syntax that can be leveraged as compile-time assertions.
To do it right you need a C++0x-friendly compiler; see James McNellis's and Jerry Coffin's answers.
You can't do much with the 1998 or 2003 C++ standards. Take a look at these links for examples:
http://en.wikipedia.org/wiki/Assertion_(computing)#Static_assertions
http://ksvanhorn.com/Articles/ctassert.html
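As a taste of what those links describe, here is a minimal C++98 sketch (the names are illustrative, not taken from either article):
// Intentionally left undefined; defined only for true.
template <bool> struct compile_time_check;
template <>     struct compile_time_check<true> {};

// sizeof requires a complete type, so this line compiles only when the
// condition holds; otherwise the compiler reports an incomplete type.
enum { byte_size_ok = sizeof(compile_time_check<sizeof(char) == 1>) };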
There's an excellent #error preprocessor directive (see here for a good essay about it), but I believe it needs to be inside a #if rather than used free-standing as in your example.
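Note that #error can only test preprocessor-visible conditions (so no sizeof); a sketch of typical usage:
#include <climits>
#if CHAR_BIT != 8
#error "This code assumes 8-bit bytes"
#endif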
GCC seems to allow "and" / "or" to be used instead of "&&" / "||" in C++ code; however, as I expected, many compilers (notably MSVC 7) do not support this. The fact that GCC allows this has caused some annoyances for us: we have different developers working on the same code base on multiple platforms, and occasionally these "errors" slip in as people switch back and forth between Python and C++ development.
Ideally, we would all remember to use the appropriate syntax, but for those situations where we occasionally mess up, it would be really nice if GCC didn't let it slide. Anybody have any ideas on approaches to this?
If "and" and "or" are simply #defines, then I could #undef them when using GCC, but I worry that they are more likely built into the compiler at a more fundamental level.
Thanks.
They are part of the C++ standard, see for instance this StackOverflow answer (which quotes the relevant parts of the standard).
Another answer in the same question mentions how to do the opposite: make them work in MSVC.
To disable them in GCC, use -fno-operator-names. Note that, by doing so, you are in fact switching to a non-standard dialect of C++, and there is a risk that you end up writing code which might not compile correctly on standard-compliant compilers (for instance, if you declare a variable with a name that would normally be reserved).
The words are standard in C++ without the inclusion of any header.
The words are standard in C if you include the header <iso646.h>.
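For illustration, a minimal sketch of the alternative tokens in C++ (no header required):
// Standard C++: "and", "or", "not" are built-in alternative tokens.
bool both(bool a, bool b)    { return a and b; }       // same as a && b
bool either(bool a, bool b)  { return a or b; }        // same as a || b
bool neither(bool a, bool b) { return not (a or b); }  // same as !(a || b)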
MSVC is doing you no service by not supporting the standards.
You could, however, use tools to enforce the non-use of the keywords. And it can be a coding guideline, and you can quickly train your team not to make silly portability mistakes. It isn't that hard to avoid the problem.
Have you considered any code analysis tools? Something similar to FxCop? With FxCop you can write your own rules (e.g. check for &&) and you can set it to run during the pre-compile stage.
-pedantic-errors may help for this, among other GNU-isms.
I wanted to do some regular expressions in C++, so I looked on the interwebz (yes, I am a beginner/intermediate with C++) and found this SO answer.
I really don't know what to choose between boost::regex and boost::xpressive. What are the pros/cons?
I also read that boost::xpressive, as opposed to boost::regex, is a header-only library. Is it hard to statically compile boost::regex on Linux and Windows (I almost always write cross-platform applications)?
I'm also interested in comparisons of compile time. I have a current implementation using boost::xpressive and I'm not too content with the compile times (but I have no comparisons to boost::regex).
Of course I'm open for other suggestions for regex implementations too. The requirements are free (as in beer) and compatible with http://nclabs.org/license.php.
One fairly important difference is that Boost Regex can support linking to ICU for Unicode support (character classes, etc.); see Boost Regex ICU Support.
As far as I can tell, Boost Xpressive doesn't have this kind of support built-in.
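For reference, a rough sketch of the ICU-enabled interface (it requires Boost.Regex built against ICU; the function name and pattern are mine):
#include <boost/regex/icu.hpp>  // needs Boost.Regex built with ICU support
#include <string>

// Returns true if the UTF-8 input is entirely Unicode decimal digits,
// whatever the script ([[:Nd:]] is the Unicode "decimal number" category).
bool all_digits(const std::string& utf8_input)
{
    static const boost::u32regex re = boost::make_u32regex("[[:Nd:]]+");
    return boost::u32regex_match(utf8_input, re);
}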
Well, if you need to create a regular expression at runtime (i.e. letting the user type in a regular expression to search for), you can't use xpressive's static regexes, since those are fixed at compile time.
On the other hand, since a static regex is a compile-time construct, it should benefit more from your optimizer than regex does.
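For reference, a minimal sketch of the two xpressive front ends (the price pattern is illustrative, in the style of the Boost examples):
#include <boost/xpressive/xpressive.hpp>
using namespace boost::xpressive;

// Static regex: the pattern is C++ code, checked and optimized at compile time.
sregex price_static  = '$' >> +_d >> '.' >> _d >> _d;

// Dynamic regex: the pattern is a string, parsed at run time (e.g. user input).
sregex price_dynamic = sregex::compile("\\$\\d+\\.\\d\\d");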
I do enough stuff with Boost.MPL, StateChart, and Spirit that 220KB of compiler warnings and errors don't really bother me much. If that sounds like hell to you, stick with Boost.Regex.
If you do use xpressive, I highly recommend turning on -Wfatal-errors as this will stop compilation (and further errors) after the first 'error:' line.
For compilation time, it's no contest. Boost.Regex will be faster*. The fact that xpressive uses MPL will cause compile times to be dramatically increased.
*This assumes you only build the dll/so once
When using the Boost libraries I tend to lean toward header-only libraries, due to cross-platform compatibility issues. The downside is that when your compiler reports an error related to your use of the library, the header-only output tends toward the arcane.
Assuming you're using a reasonably recent compiler, there's a pretty decent chance that it includes a regex package already. Try just doing #include <regex> and see if the compiler finds it.
The only trick to things is that it could be in either (or both) of two different namespaces. Regexes were included in TR1 of the C++ standard, and are also in (the final drafts of) C++11. The TR1 version is in a namespace named tr1, where the standard version is in std, just like the rest of the library.
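A quick way to probe this; a minimal sketch, with the TR1 variant shown as a comment:
#include <iostream>
#include <regex>

int main()
{
    std::regex re("a.c");           // C++11 puts it in namespace std
    // std::tr1::regex re("a.c");   // TR1-era compilers put it in std::tr1
    std::cout << std::regex_match("abc", re) << '\n';  // prints 1
}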
FWIW, this is essentially the same as Boost regex, not Boost Xpressive.
I would like to supplement the other answers by going deeper into the topic of compile-time regular expressions (CTR) versus run-time (dynamic) regular expressions (RTR) in a more theoretical way (a topic the OP's question implies indirectly, IMHO). Run-time regexes are better known and more popular (they are what most languages' core libraries implement), I suppose for historical reasons. They are fine when the regular expression is only determined at run time, unlike CTR. Both work on a finite-state-machine basis.
RTR are "compiled" and interpreted by a kind of universal finite state machine ("universal" meaning it is an interpreter whose scheme is supplied at run time: the regex string you pass is "compiled" into some internal data structure, then interpreted at run time).
CTR, in contrast, are compiled at compile time and are specific to one particular regex, so you can't use them when the regex is only given at run time (applications like text editors and file/internet search engines).
But CTR are a priori more efficient (theoretically, at least), since a finite state machine customized at compile time will beat an interpreter driven by a preset table describing the same machine (comparable cases are reflective field access vs. compile-time access, or a function specialized for some fixed parameter, as pointed out there). Another advantage is compile-time syntax checking. CTR can be implemented through metaprogramming and/or code generation.
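A toy illustration of the difference, for the fixed pattern ab*c (the names and pattern are mine):
#include <regex>
#include <string>

// Run-time flavour: the pattern is data, interpreted by a generic engine.
bool rtr_match(const std::string& s)
{
    static const std::regex re("ab*c");  // parsed when the program runs
    return std::regex_match(s, re);
}

// Compile-time flavour (hand-rolled sketch): the pattern is code, so the
// compiler sees the whole state machine and can optimize it directly.
bool ctr_match(const char* s)
{
    if (*s++ != 'a') return false;
    while (*s == 'b') ++s;
    return s[0] == 'c' && s[1] == '\0';
}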
As for specific implementations: there are many RTR, but CTR are not so numerous. For C++ there are the above-mentioned Boost and C++0x STL implementations. You may need them to optimize regex performance, the size of the generated code, or memory usage, which is mostly relevant for embedded systems or specific high-performance applications.
SO question about CTR
Finding CTR implementations is harder. One example I found is the Re2C code-generator project; there are also a Java CTR implementation and a C# implementation featuring run-time compilation of the regex (into IL code, not an internal data structure) [there is an SO question about it].
P.S. Sorry, I couldn't post some relevant links due to reputation.
I haven't coded in C++ for years. I recently discovered that during those years it has changed quite dramatically. I'm not sure I like the changes, but that's another discussion.
I still have some C++ code knocking around my hard drive. If I got it out and tried to compile it with a nice new C++ compiler, say the latest version of g++, would it compile? Without warnings (assuming it compiled without warnings before)?
I did get to mess around a little with VC++ 2010 recently and found some things I expected to work just not working, and I got different messages depending on context when I tried to use NULL. But in one part of that code I used NULL without even a warning.
It depends. Usually newer compilers are more standard-compliant, so many constructs that compiled on earlier compilers don't compile now without fixing. For example:
for (int i = 0; /* ... */ ;) { }
i++; // relies on i leaking out of the for-loop's scope
compiled in Visual C++ 7, but not in Visual C++ 9.
In general, yes it is backwards compatible. However, the devil is in the details. You likely will find things where conventions change, or specific libraries fall into or out of use.
NULL is a macro - prefer to use 0 (or nullptr in C++0x).
Not sure just how old your code is, but Visual C++ v6 suffers from limitations that result in code that simply won't compile on newer compilers. VS2005 and up are a lot better (in the sense of more correct wrt the contemporaneous C++ standard).
I would not expect it to be a huge amount of work to get old code compiled, though. I've done some pretty major ports from VC6 -> VS2005 and for the most part it's a matter of hours, not days. Perhaps once the culture shock has worn off this will not seem so daunting. Really, VC++ 10 is very nice.
This depends on what you're comparing to.
Visual Studio 2010 partially implements the upcoming C++0x draft (recent versions of GCC also implement a subset of this draft, which is expected to be standardized next year)
C++98/C++03 was the first standardized version of C++, and is still the official one (as C++0x is still only a draft)
and of course, there are the dialects from before the language was standardized
C++0x is pretty much backwards compatible with C++03/98. There may be a couple of obscure corner cases that have changed, but you are unlikely to encounter them.
However, a lot of changes occurred when the language was first standardized, meaning that C++98 isn't entirely (but nearly) compatible with pre-standard C++.
But more likely, what you're into isn't a question of C++ backwards compatibility, but simply that compilers have gotten stricter. They have become much better at following the standard, and no longer allow many non-standard tricks that were common earlier. Most likely, your old code was never valid C++, but worked because compilers used to deviate more from the standard.
The language itself hasn't changed since it was standardized in 1998. Idioms have changed, and compilers have both improved their support for the standard and become stricter about non-standard code, but any standard-conforming C++ code that compiled in the past should still compile today. Code that relies on non-standard compiler features or quirks might not work, but compilers typically offer command-line options to make them accept non-standard code that used to be accepted by previous versions of the same compiler.
NULL is a macro that's defined in <cstddef>. Other standard headers may include that header as an implementation detail, so it's often possible to use NULL without explicitly including <cstddef>, but relying on that has always been non-portable. It's OK to use NULL as long as you include its header, though just using 0 is preferred in idiomatic C++.
All versions of C++ should be backwards compatible.
Although there might be some unusual corner cases where there could be a problem, e.g. noexcept on destructors in C++0x (although this has not yet been decided).
Newer C++ standards clarify things and make decisions about which usages are "more correct" or accepted, so I would expect warnings. Other decisions change what can and cannot be done, so I would also expect errors. I've come across the same situation, and I had to tweak the code for it to compile. It was not a big deal, but that was taking into account my knowledge of C++. In general, you should not encounter great problems.
There are also other things. For example, not all compilers implement every rule in the C++ standard, and they have bugs. When you get used to a compiler, some of those errors or missing features may pass unnoticed by you, but they may cause errors in future versions of the same compiler.
That's the major reason why we have standards. You don't have to worry about compatibility; you can just tell the compiler to compile the code as C++98 (or C++03).
MSVC is unfortunately very bad at supporting C++ standards. It is slowly getting better (and that's why old code doesn't compile, since it shouldn't have compiled in the first place).
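With GCC, for instance, the invocation would look something like this (the file name is illustrative):
g++ -std=c++98 -pedantic-errors old_code.cpp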
Well, I know a lot of our old VS6 code started throwing tons of warnings when we upgraded to VS 2005 (with full warnings on, of course, as everyone should). Most were good things, though: warning of potential information loss, warning that some types may be 64-bit instead of 32-bit like old code might expect, etc.
For your specific example of NULL, that wasn't even really standard C back in the day. Every compiler just had a #define for it that set it to 0. These days it is generally considered more correct (and clear) to just use 0.
If you have used old versions of various libraries such as Boost, expect some problems.
Slightly random, but the keyword export is being removed from the standard, so previously standards-compliant code that used export would now be illegal. Of course, very few compilers ever even began implementing that keyword.
Very similar to sharptooth's answer, there are bits of older C and C++ code that will need /Zc:forScope- set (i.e. don't force conformance in for-loop scope), e.g.:
int myfunc(int x, int y, int z)
{
    int j;
    for (int i = 0; i <= 10; i++)
    {
        if (f(i) == 0)
            break;
    }
    j = i;                     // under standard scoping, i no longer exists here
    for (i = 0; i <= 10; i++)  // likewise relies on the leaked i
    {
        if (g(i) == 0)
            break;
    }
    if (i > j)
        return j;
    return i;
}
This type of thing is quite common in much older code, where bytes cost money and variable re-use was commonplace.
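The conforming fix is mechanical: hoist the declaration out of the first loop, e.g.:
int i;
for (i = 0; i <= 10; i++)
{
    if (f(i) == 0)
        break;
}
// i now stays in scope for the rest of the function, as the old code assumed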
While the C++ Standards Committee works hard to define its intricate but powerful features and maintain backward compatibility with C, in my personal experience I've found many aspects of programming in C++ cumbersome due to the lack of tools.
For example, I recently tried to refactor some C++ code, replacing many uses of shared_ptr with T& to remove unnecessary pointer usage within a large library. I had to perform almost the whole refactoring manually, as none of the refactoring tools out there would help me do it safely.
Dealing with STL data structures in the debugger is like trying to coax a phone number out of a stranger who wants nothing to do with you.
In your experience, what essential developer tools are lacking in C++?
My dream tool would be a compile-time template debugger. Something that'd let me interactively step through template instantiations and examine the types as they get instantiated, just like the regular debugger does at runtime.
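Until such a tool exists, a crude stand-in is to force a compile error that prints the type. This is a deliberately non-compiling sketch, and show/probe are my own names:
// Deliberately left undefined: using it forces a compile error whose
// message spells out T - a poor man's template inspector.
template <typename T> struct show;

template <typename T>
void probe(T value)
{
    show<T> dump;  // the error here reveals the deduced T
    (void)value;
}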
In your experience, what essential developer tools are lacking in C++?
Code completion. Seriously. Refactoring is a nice-to-have feature, but I think code completion is much more fundamental and more important for API discoverability and usability.
Basically, tools that require any understanding of C++ code suck.
Code generation of class methods. When I type in the declaration, you should be able to figure out the definition. And while I'm on the topic, can we fix "goto declaration / goto definition" always going to the declaration?
Refactoring. Yes, I know it's formally impossible because of the preprocessor - but the compiler could still do a better job of a search-and-replace on a variable name than I can manually. You could also syntax-highlight locals, members and parameters while you're at it.
Lint. So the variable I just defined shadows a higher one? C would have told me that in 1979, but C++ in 2009 apparently prefers me to find out on my own.
Some decent error messages. If I promise never to define a class with the same name inside the method of a class, do you promise to tell me about a missing "}"? In fact, can the compiler have some knowledge of history - so that if I added an unbalanced "{" or "(" to a previously working file, we could consider mentioning this in the message?
Can the STL error messages please (sorry to quote another comment) not look like you read "/dev/random", stuck "#!/bin/perl" in front, and then ran the tax code through the result?
How about some warnings for useful things? "Integer used as bool performance warning" is not useful: it doesn't make any performance difference, I don't have a choice - it's what the library does - and you have already told me 50 times.
But if I miss a ";" from the end of a class declaration or a "}" from the end of a method definition, you don't warn me - you go out of your way to find the least likely (but theoretically correct) way to parse the result.
It's like the built-in spell checker in this browser, which happily accepts my misspelling of "wether" (because that spelling is an archaic term for a castrated male goat! How many times do I write about soprano herbivores?).
How about spell checking? 40 years ago, mainframe Fortran compilers had spell checking, so if you misspelled "WRITE" you didn't come back the next day to a pile of cards and a snotty error message. You got a warning that "WRIET" had been changed to WRITE in line X. Now the compiler happily continues and spends 10 minutes building some massive browse file and debugger output before telling you that you misspelled prinft 10,000 lines ago.
PS: Yes, a lot of these only apply to Visual C++.
PPS: Yes, they are coming with my medication now.
If we're talking about MS Visual Studio C++, Visual Assist is a very handy tool for code completion and some refactorings - e.g. rename all/selected references, find/goto declaration - but I still miss the richness of Java IDEs like JBuilder or IntelliJ.
What I still miss is a semantic diff tool - you know, one which does not compare the two files line-by-line, but statement-by-statement and expression-by-expression. What I've found on the internet are only some abandoned attempts - if you know one, please write a comment.
The main problem with C++ is that it is hard to parse. That's why there are so very few tools out there that work on source code. (And that's also why we're stuck with some of the most horrific error messages in the history of compilers.) The result is that, with very few exceptions (I only know doxygen and Visual Assist), it's down to the actual compiler to support everything needed to assist us in writing and massaging the code. With compilers traditionally being rather streamlined command-line tools, that's a very weak foundation to build rich editor support on.
For about ten years now I've been working with VS; meanwhile, its code completion is almost usable. (Yes, I'm working on dual-core machines. I wouldn't have said this otherwise, would I?) If you use Visual Assist, code completion is actually quite good. Both VS itself and VA come with some basic refactoring nowadays. That, too, is almost usable for the few things it aims for (even though it's still notably less so than code completion). Of course, after >15 years of refactoring with search & replace as the only tool in the box, my demands are probably much too deteriorated compared to other languages, so this might not mean much.
However, what I am really lacking is still: fully standard-conforming compilers and standard library implementations on all platforms my code is ported to. And I'm saying this more than ten years after the release of the last standard, and about a year before the release of the next one! (Which just adds another item to the list: C++1x being widely adopted by 2011.)
Once these are solved, there are a few things that keep being mentioned now and then, but which vendors, still fighting with compliance to a more than ten-year-old standard (or, as is actually the case with some features, having even given up on it), never got around to actually tackling:
usable, sensible, comprehensible compiler messages (como is actually pretty good, but only if you compare it to other C++ compilers)
a linker that doesn't just throw up its hands and say "something's wrong, I can't continue" (if you have taught C++ as a first language, you'll know what I mean)
concepts ('nuff said)
an IO stream implementation that doesn't throw away all the compile-time advantages which overloading operator<<() gives us by resorting to calling the run-time-parsing printf() under the hood (Dietmar Kühl once set out to do this, unfortunately his implementation died without the techniques becoming widespread)
STL implementations on all platforms that give rich debugging support (Dinkumware is already pretty good in that)
standard library implementations on all platforms that use every trick in the book to give us stricter checking at compile time and run time and more performance (whatever happened to yasli?)
the ability to debug template meta programs (yes, jalf already mentioned this, but it cannot be said too often)
a compiler that renders tools like lint useless (no need to fear, lint vendors, that's just wishful thinking)
If all these and a lot of others that I have forgotten to mention (feel free to add) are solved, it would be nice to get refactoring support that almost plays in the same league as, say, Java or C#. But only then.
A compiler which tries to optimize the compilation model.
Rather than naively including headers as needed, parsing them again in every compilation unit, why not parse each header once first, build a complete syntax tree for it (which would have to include preprocessor directives, since we don't yet know which macros are defined), and then simply run through that syntax tree whenever the header is included, applying the known #defines to prune it?
It could even be used as a replacement for precompiled headers, so every header could be precompiled individually, just by dumping this syntax tree to disk. We wouldn't need one single monolithic and error-prone precompiled header, and would get finer granularity on rebuilds, rebuilding as little as possible even if a header is modified.
Like my other suggestions, this would be a lot of work to implement, but I can't see any fundamental problems rendering it impossible.
It seems like it could dramatically speed up compile-times, pretty much rendering it linear in the number of header files, rather than in the number of #includes.
A fast and reliable indexer. Most of the fancy features come after this.
A common tool to enforce coding standards.
One that takes all the common standards and lets you turn them on/off as appropriate for your project.
Currently, a bunch of Perl scripts usually has to substitute for such a tool.
I'm pretty happy with the state of C++ tools. The only thing I can think of is a default install of Boost in VS/gcc.
Refactoring, refactoring, refactoring. And compilation while typing. For refactorings I am missing at least half of what most modern Java IDEs can do. While Visual Assist X goes a long way, a lot of refactoring support is missing. The task of writing C++ code is still pretty much just that: writing C++ code. The more the IDE supports high-level refactoring, the more coding becomes construction; the more malleable the structure is, the easier it is to iterate over it and improve it. Pick up a demo version of IntelliJ and see what you are missing. These are just some examples that I remember from a couple of years ago.
Extract interface: take a few classes with a common interface, move the common functions into an interface class (for C++ this would be an abstract base class) and declare the designated functions as abstract.
Better extract method: mark a section of code and have the IDE write a function that executes that code, constructing the correct parameters and return values.
Know the type of each of the symbols that you are working with, so that not only can code completion be correct for derived values, e.g. symbol->..., but the IDE can also offer only functions that return a type usable in the current expression; e.g. for
UiButton button = window->...
at the "..." it would only suggest functions that actually return a UiButton.
A tool all on its own: Naming Conventions.
Intelligent Intellisense/Code Completion even for template-heavy code.
When you're inside a function template, of course the compiler can't say anything for sure about the template parameter (at least not without Concepts), but it should be able to make a lot of guesses and estimates. Depending on how the type is used in the function, it should be able to narrow the possible types down - in effect, a kind of conservative ad-hoc Concepts. If one line in the function calls .Foo() on a template type, obviously a Foo member function must exist, and Intellisense should suggest it in the rest of the function as well.
It could even look at where the function is invoked from, and use that to determine at least one valid template parameter type, and simply offer Intellisense inside the function based on that.
If the function is called with an int as a template parameter, then obviously use of int must be valid, and so the IDE could use it as a "sample type" inside the function and offer Intellisense suggestions based on it.
JavaScript just got Intellisense support in VS, which had to overcome a lot of similar problems, so it can be done. Of course, with C++'s level of complexity, it'd be a ridiculous amount of work. But it'd be a nice feature.