Macros and Visual C++ - c++

I'm trying to get a better understanding of what place (if any) Macros have in modern C++ and Visual C++, also with reference to Windows programming libraries: What problem (if any) do Macros solve in these situations that cannot be solved without using them?
I remember reading about Google Chrome's use of WTL macros (amongst other things) in this blog post, and they are also used in MFC. Here is an example of a macro-based message map from that blog post that I'd like explained in a superb amount of detail if possible:
// CWindowImpl
BEGIN_MSG_MAP(Edit)
  MSG_WM_CHAR(OnChar)
  MSG_WM_CONTEXTMENU(OnContextMenu)
  MSG_WM_COPY(OnCopy)
  MSG_WM_CUT(OnCut)
  MESSAGE_HANDLER_EX(WM_IME_COMPOSITION, OnImeComposition)
  MSG_WM_KEYDOWN(OnKeyDown)
  MSG_WM_LBUTTONDBLCLK(OnLButtonDblClk)
  MSG_WM_LBUTTONDOWN(OnLButtonDown)
  MSG_WM_LBUTTONUP(OnLButtonUp)
  MSG_WM_MBUTTONDOWN(OnNonLButtonDown)
  MSG_WM_MOUSEMOVE(OnMouseMove)
  MSG_WM_MOUSELEAVE(OnMouseLeave)
  MSG_WM_NCCALCSIZE(OnNCCalcSize)
  MSG_WM_NCPAINT(OnNCPaint)
  MSG_WM_RBUTTONDOWN(OnNonLButtonDown)
  MSG_WM_PASTE(OnPaste)
  MSG_WM_SYSCHAR(OnSysChar)  // WM_SYSxxx == WM_xxx with ALT down
  MSG_WM_SYSKEYDOWN(OnKeyDown)
END_MSG_MAP()
I've read these articles to do with Macros on MSDN but am trying to also pick up best practices for writing or avoiding Macros and where/when to use them.

All of those macros are defined in public SDK header files, so you can go read what they do yourself if you want. Basically what is happening here is that you're generating a WndProc using macros. Each MSG_WM_* entry is a case that handles the given window message by translating its arguments from wParam and lParam back into their original types and calling a helper function (which you then get to implement) that takes those types as arguments (if appropriate; some window messages have 0 or 1 arguments).
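To make that concrete, here is a rough, simplified sketch of how such a cracked message-map entry is typically put together. This is illustrative only (the MY_MSG_WM_CHAR name is made up); the real definitions live in WTL's atlcrack.h and atlwin.h and differ in detail:
// Illustrative sketch only; not the actual WTL definition.
// BEGIN_MSG_MAP opens a ProcessWindowMessage(...) member function that receives
// uMsg, wParam, lParam and an lResult reference; END_MSG_MAP closes it; each
// entry expands to a block like this inside it:
#define MY_MSG_WM_CHAR(func) \
    if (uMsg == WM_CHAR) \
    { \
        /* crack the packed parameters back into typed arguments */ \
        func((TCHAR)wParam, (UINT)lParam); \
        lResult = 0; \
        return TRUE; \
    }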
In my opinion all this is crap. You save writing a bunch of boilerplate up front, but in exchange you make the code much harder to debug later. Macros don't have much of a place in my modern programming world, other than checking whether they are defined or not. Macros that do flow control, or that assume specific local variables are always defined, make me quite unhappy. Especially when I have to debug them.

Peter, you might as well ask if vi or EMACS is the better editor. (EMACS, by the way.) There are a lot of people who think the C preprocessor is a horrible idea; Stroustrup and Jim Gosling among them. That's why Java has no preprocessor, and there are innumerable things Stroustrup put into C++, from const to templates, to avoid using the preprocessor.
There are other people who find it convenient to be able to add a new language, as in that code.
If you read the original Bourne shell code, you'll find it looks like
IF a == b THEN
    do_something();
    do_more();
ELSE
    do_less();
FI
Steve Bourne basically used macros to give it an Algol 68-like look (Algol 68 being the Really Cool Language at the time). It could be very difficult to work with for anyone but Steve Bourne.
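The macros behind that style were plain token substitutions, roughly along these lines (a sketch of the idea, not the verbatim contents of Bourne's mac.h):
/* Sketch of Bourne's Algol-style macros; not the original definitions. */
#define IF      if (
#define THEN    ) {
#define ELSE    } else {
#define FI      }
So the snippet above is still C once the preprocessor has run; it just doesn't look like it in the source.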
Then have a look at the Obfuscated C contest, where some of the most amazing obfuscations take advantage of the #define.
Personally, I don't mind too much, although that example is a little daunting (how do you debug that?) But doing that kind of thing is working without a net; there aren't many Wallendas out there.

(The question is kind of fragmented, but ignoring the Windows part):
X-Macros use the preprocessor to avoid code duplication.
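A minimal sketch of the idiom, using hypothetical names, where one list is expanded twice:
#define COLOR_LIST \
    X(Red) \
    X(Green) \
    X(Blue)

#define X(name) name,
enum Color { COLOR_LIST };                         // enum Color { Red, Green, Blue, };
#undef X

#define X(name) #name,
const char* const kColorNames[] = { COLOR_LIST };  // { "Red", "Green", "Blue", }
#undef X
Adding a new color to COLOR_LIST updates both the enum and the name table, which is exactly the duplication the technique avoids.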


Is there a way to pre-compile C++ templates and get the C++ source

When dealing with macros we can use a compiler option (for gcc, -dM) to let the preprocessor unravel the macro definitions into the C++ source. I am looking for a tool, or better yet a compiler option, to do the same with templates (even in a limited fashion). If I inherited code with multilayered templates mixed with multiple inheritance, that would be very helpful, especially since the machine has to know exactly what the state of the code is after template instantiation.
I would not even complain about the mangled names, as long as the flattened structure of the code is exposed.
You can't really do that.
Macros are super simple. They're little more than text replacement.
However, templates are a part of the semantic, hypothetical, theoretical, academic, unknowable, esoteric, satanic, ethereal juice that powers your hobby.
They exist only in the space between source code and program.
The void, between life and death.
There is no textual representation. There is only a feeling. A set of thoughts. An instinct that your compiler holds, from the time that you feed it words, to the time that it spits out actions.
Okay, sure, in theory there's some human-readable format in which a compiler could dump template instantiations in all their glory, but let's be honest: the easiest way for it to do that is to spit out the C++ that you gave it in the first place.
So, yeah, no.
That being said, if you really want a headache, learn to use the LLVM backend API.

What would make C++ preprocessor macros an accepted development tool?

Apparently, preprocessor macros in C++ are justifiably feared and shunned by the C++ community.
However, there are several cases where C++ macros are beneficial.
Seeing as preprocessor macros can be extremely useful and can reduce repetitive code in a very straightforward manner, this leaves me with the question: what exactly is it that makes preprocessor macros "evil", or, as the question title says, which feature (or removal of a feature) would be needed to make them useful as a "good" development tool (instead of a fill-in that everyone is ashamed of using)? (After all, the Lisp languages seem to embrace macros.)
Please Note: This is not about #include or #pragma or #ifdef. This is about #define MY_MACRO(...) ...
Note: I do not intend for this question to be subjective. Should you think it is, feel free to vote to move it to programmers.SE.
Macros are widely considered evil because the preprocessor is a stupid text replacement tool that has little to no knowledge of C/C++.
Four very good reasons why macros are evil can be found in the C++ FAQ Lite.
Where possible, templates and inline functions are a better choice. The only reason I can think of why C++ still needs the preprocessor is for #includes and comment removal.
A widely disputed advantage is to use it to reduce code repetition; but as you can see by the boost preprocessor library, much effort has to be put to abuse the preprocessor for simple logic such as loops, leading to ugly syntax. In my opinion, it is a better idea to write scripts in a real high-level programming language for code generation instead of using the preprocessor.
Most preprocessor abuse comes from misunderstanding. To quote Paul Mensonides (the author of the Boost.Preprocessor library):
Virtually all issues related to the misuse of the preprocessor stems from attempting to make object-like macros look like constant variables and function-like macro invocations look like underlying-language function calls. At best, the correlation between function-like macro invocations and function calls should be incidental. It should never be considered to be a goal. That is a fundamentally broken mentality.
As the preprocessor is well integrated into C++, it's easier to blur the line, and most people don't see a difference. For example, ask someone to write a macro to add two numbers together, and most people will write something like this:
#define ADD(x, y) ((x) + (y))
This is completely wrong. Run this through the preprocessor:
#define ADD(x, y) ((x) + (y))
ADD(1, 2) // outputs ((1) + (2))
But the answer should be 3, since adding 1 to 2 is 3. Yet instead, a macro was written to generate a C++ expression. Not only that, it could be thought of as a C++ function, but it's not. This is where it leads to abuse. It's just generating a C++ expression, and a function is a much better way to go.
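For contrast, a plain (or constexpr) function actually computes the sum rather than generating an expression; a minimal sketch:
constexpr int add(int x, int y) { return x + y; }   // a real function: the result is 3

static_assert(add(1, 2) == 3, "evaluated by the compiler, not pasted as text");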
Furthermore, macros don't work like functions at all. The preprocessor works through a process of scanning and expanding macros, which is very different than using a call stack to call functions.
There are times when it can be acceptable for macros to generate C++ code, as long as it isn't blurring the lines. Just as if you were to use Python as a preprocessor to generate code, the preprocessor can do the same, and it has the advantage that it doesn't need an extra build step.
Also, the preprocessor can be used with DSLs, like here and here, but these DSLs have a predefined grammar in the preprocessor that it uses to generate C++ code. It's not really blurring the lines since it uses a different grammar.
Macros have one notable feature: they are very easy to abuse and rather hard to debug. You can write just about anything with macros; then macros are expanded into one-liners, and when nothing works you have a very hard time debugging the resulting code.
That feature alone makes one think ten times about whether and how to use macros for the task at hand.
And don't forget that macros are expanded before actual compilation, so they automatically ignore namespaces, scopes, type safety and a ton of other things.
The most important thing about macros is that they have no scope and do not care about context. They are almost a dumb text replacement tool. So when you #define max(...), then everywhere you have a max it gets replaced; if someone adds overly generic macro names to their headers, they tend to influence code they were not intended to.
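A classic illustration of that scoping problem (hypothetical snippet; the same thing famously happens with the max macro from <windows.h>):
#include <algorithm>

#define max(a, b) ((a) > (b) ? (a) : (b))   // e.g. dragged in by some header

int m = std::max(1, 2);   // fails to compile: the preprocessor rewrites max(1, 2)
                          // before the compiler ever sees std::max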
Another thing is that when used without care, they lead to quite hard to read code, since no one can easily see what the macro could evaluate to, especially when multiple macros are nested.
A good guideline is to choose unique names, and when generating boilerplate code, #undef them as soon as possible to not pollute the namespace.
Additionally, they do not offer type safety or overloading.
Sometimes macros are arguably a good tool to generate boilerplate code; for example, with the help of Boost.PP you could create a macro that helps you create enums like:
ENUM(xenum,(a,b,(c,7)));
which could expand to
enum xenum { a, b, c=7 };
std::string to_string( xenum x ) { .... }
Things like assert() that need to react to NDEBUG are also often easier to implement as macros.
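A minimal sketch of such an assert-like macro (simplified; the real <cassert> machinery is more involved):
#include <cstdio>
#include <cstdlib>

#ifdef NDEBUG
#define MY_ASSERT(cond) ((void)0)   // compiles away entirely in release builds
#else
#define MY_ASSERT(cond) \
    ((cond) ? (void)0 \
            : (std::fprintf(stderr, "Assertion failed in %s, line %d: \"%s\"\n", \
                            __FILE__, __LINE__, #cond), \
               std::abort()))
#endif
A macro can capture __FILE__, __LINE__ and the stringized condition at the call site, which an ordinary function cannot do (at least not before C++20's std::source_location, and even then not the stringizing).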
There are many uses where a C developer uses macros and a C++ developer uses templates.
There are obviously corner cases where they're useful, but most of the time it's bad habits from the C world applied to C++ by people who believe there is such a language as C/C++.
So it's easier to say "it's evil" than to risk a developer misusing them.
Macros do not offer type safety
Problems where parameters are evaluated twice, e.g. #define MAX(a,b) ((a)>(b) ? (a) : (b)) applied to MAX(i++, y--) (see the sketch after this list)
Problems with debugging, as macro names do not occur in the symbol table.
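A small illustration of the double-evaluation pitfall from the list above (hypothetical snippet):
#include <iostream>

#define MAX(a, b) ((a) > (b) ? (a) : (b))

int main() {
    int i = 3, y = 5;
    int m = MAX(i++, y--);   // expands to ((i++) > (y--) ? (i++) : (y--)),
                             // so y-- is evaluated twice on this path
    std::cout << m << ' ' << i << ' ' << y << '\n';   // prints 4 4 3, not the 5 4 4
                                                      // a real max() function would give
}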
Forcing the programmer to use proper naming for the macros... and better tools to track the replacement of macros would fix most of my problems. I can't really say I've had major issues so far... It's something you burn yourself with and learn to take special care with later on. But macros badly need better integration with IDEs and debuggers.

Is a preprocessor needed for a viable language?

How useful is the C++ preprocessor, really? Even in C#, it still has some functionality, but I've been thinking of ditching its use altogether for a hypothetical future language. I guess that there are some languages like Java that survive without such a thing. Is a language without a preprocessing step competitive and viable? What steps do programs written in languages without a preprocessor take to emulate its functionality, e.g. different code for debug and release builds, and how do these compare to #ifdef DEBUG?
In fact, most languages deal very well without a preprocessor. I'd go so far as to say that the necessity of using the preprocessor with C/C++ is rooted in their lack of several pieces of functionality.
For example:
Most languages don't need header files and include guards, because they have the notion of a "module".
Conditional compilation can be easily obtained through static ifs or an analogous mechanism (see the sketch after this list).
Code repetition can almost always be reduced in more clear ways than what you can achieve with the preprocessor: using templates/generics, a reflection system, etc, etc.
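As a concrete example of the "static if" point above, C++17's if constexpr covers some of the same ground as an #ifdef for compile-time branching (a simplified sketch with made-up names):
#include <iostream>

constexpr bool kVerboseLogging = false;   // could be fed in as a build-time constant

template <typename T>
void log_value(const T& value) {
    if constexpr (kVerboseLogging) {
        // Discarded at compile time when kVerboseLogging is false,
        // much like the body of an #ifdef VERBOSE block.
        std::cout << "value = " << value << '\n';
    }
}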
So my conclusion is: for most "features" you can get with preprocessor and metaprogramming, a more clear alternative exists which is safer and more convenient to use.
The D programming language, as a compiled low-level language, is a nice example of how to provide most features usually obtained via the preprocessor without actually preprocessing: it includes all I've mentioned, plus string mixins and template mixins, and probably some other clever solutions to problems usually solved with preprocessing in C/C++.
I would say no, macros are not needed for a language to be viable and competitive.
That doesn't mean macros are not needed in some languages.
If you need macros it's probably because there is a deficiency in your language. (Or because you're trying to be compatible with some other deficient language, like C++ is with C. :)). Make the language "good enough" and you will need macros so rarely that the language can do without them.
(Of course, it depends what your language's goals are what "good enough" actually means, and whether or not macros are a good way to achieve certain things or just a band-aid for missing concepts/features.)
Even in a "good enough" language, there may still be the odd time where you wish macros were there. Maybe they would still be worth having. Maybe not. Depends on what they bring to the language and what problems they introduce in terms of complexity (to the compiler/runtime and to the programmer). But that is true of any language feature. You wouldn't design a language with the aim to "have every single feature" so you have to pick & choose based on the trade-offs and benefits.
e.g. Templates are a fantastically powerful feature in C++, and something I find I miss occasionally in C#, but should C# have them? Should every language have the feature? Perhaps not, given the complexity they would bring with them (and the fact you can usually use C++/CLI for that kind of work).
BTW, I'm not saying "any good language doesn't have macros"; I'm just saying a good language doesn't need them. And FWIW, it used to irritate me that Java didn't have them, but that was probably because Java was lacking in certain areas and not because macros are essential.
It is very useful; however, it should be used with care.
A few examples where you need it:
Currently there is no standard way to handle #include other than the preprocessor, since it is part of the standard, and you also need #define to write include guards. But this is a very C++-specific issue that does not exist in other languages.
The preprocessor is very useful for conditional compilation. When you need to configure your system to work with different APIs, different OSes or different toolkits, it is the only way to go (unless you want to create abstract interfaces and then do the conditional compilation at the build-system level).
With the current C++ standard (2003), which lacks variadic templates, the preprocessor makes life much easier in certain situations. For example, when you need to create a bunch of classes like:
template<typename R>
class function<R()> { ... }
template<typename R,typename T1>
class function<R(T1)> { ... }
template<typename R,typename T1,typename T2>
class function<R(T1,T2)> { ... }
...
It is almost impossible to do this properly without the preprocessor under the current C++ standard. (In C++0x there are variadic templates that make it much easier.)
In fact, great tools like boost::function, boost::signal and boost::bind require quite complicated template handling to make this stuff work with current compilers.
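For contrast, here is a simplified sketch of how variadic templates collapse that whole family into a single partial specialization (not the actual boost::function implementation):
template <typename Signature>
class function;                      // primary template, left undefined

template <typename R, typename... Args>
class function<R(Args...)> {
    // one partial specialization now covers R(), R(T1), R(T1, T2), ...
};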
Sometimes macros provide very nice constructs that are impossible without the preprocessor, for example:
assert(ptr!=0);
That aborts the program, printing:
Assertion failed in foo.cpp, line 134 "ptr!=0"
And of course it is really useful for unit testing:
TEST(3.14159 <=pi && pi < 3.141599);
That aborts the program, printing:
Test failed in foo.cpp, line 134 "3.14159 <=pi && pi < 3.141599"
Logging. Usually logging is something much easier to implement with macros. Why?
You either need to write:
if(should_log(info))
log(info) << "this is the message in file foo.cpp, line 10, foo::doit()" << "Value is " << x;
or simpler:
LOG_INFO() << "Value is " << x;
Which already includes the file, line number, function name and the should-log condition. Very valuable.
In fact, boost::log and the Apache logging libraries use such things.
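A minimal sketch of how such a macro can be put together (hypothetical names; real libraries such as Boost.Log are far more elaborate):
#include <iostream>
#include <sstream>

inline bool should_log_info() { return true; }      // placeholder logging policy

struct LogLine {
    std::ostringstream out;
    std::ostream& stream() { return out; }          // lvalue stream to insert into
    ~LogLine() { std::cerr << out.str() << '\n'; }  // flushed at the end of the statement
};

// The if/else guards against the dangling-else problem and skips all the
// formatting work when logging is disabled.
#define LOG_INFO() \
    if (!should_log_info()) ; \
    else LogLine().stream() << __FILE__ << ':' << __LINE__ << ' ' << __func__ << ": "

// Usage (inside a function): LOG_INFO() << "Value is " << x;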
Yes, the preprocessor is sometimes evil, but in many cases it is extremely useful, so use it smartly and with care and it is fine.
However, if you use it to implement:
macros instead of inline functions,
unreadable and unclear macros to "reduce the code",
constants,
then you are doing it wrong.
Bottom line
As with every tool it can be abused (and believe me, I have seen very crazy preprocessor abuse in real code), but today it is a very useful thing.
You do not need a preprocessing step to implement conditional compilation. You do not really need macros (one can live without the stringize and token-paste operators, and even those could be done without the preprocessor). #includes are a very special kind of nightmare, which model referencing other code all wrong.
What else is so important?
It would depend upon what you consider 'viable', but
In C++, a lot of the necessity for (and desirability of) using macros has been obviated by features such as templates, inlining and namespaces. The only reason I find myself using macros in C++ is for integration with C (#ifdef __cplusplus in headers and processing definitions). With other languages this is unnecessary, given tools like JNI or SWIG to generate the C headers/libraries.
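The classic C-integration case looks like this in a shared header (illustrative; the function name is made up):
#ifdef __cplusplus
extern "C" {
#endif

void c_api_function(int value);   /* hypothetical C API, callable from both languages */

#ifdef __cplusplus
}   /* extern "C" */
#endif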
Rather than use macros to control 'debug' and 'nodebug' builds, it would make more sense for the compiler to include a debug compile option to include/enable the necessary features.
Several languages just work without macros; if you're looking for a C-like language to compare with, I suggest D: http://www.digitalmars.com/d/2.0/comparison.html

debugging C++ when compared to debugging C

Hi,
I am normally a C programmer.
I regularly debug C programs in a Unix environment using tools like gdb and dbx.
I have never debugged big C++ applications.
Is that much different from how we debug in C?
Theoretically I am quite good at C++, but I have never had a chance to debug C++ programs.
I am also not sure what kind of technical problems we face in C++ that would lead a developer to switch on the debugger to find the problem.
What are the common issues we face in C++ that make us reach for the debugger?
What challenges might a C programmer face while debugging a C++ program?
Is it difficult and complex compared to C?
It is basically the same.
Just remember that when setting breakpoints manually you need to fully qualify the method name with both the namespace(s) and the class. (As a result I sometimes find it easier to use line numbers to define breakpoints.)
Don't forget that calls to destructors are invisible in the source, but you can still step into them at the end of a block.
A few minor differences:
When typing a fully-qualified symbol such as foo::bar::fum(args) in the gdb shell you have to start with a single quote for gdb to recognize it and calculate completions.
As others have said, library templates expose their internals in the debugger. You can poke around in std::vector pretty easily, but poking through std::map may not be a wise way to spend your time.
The aggressive and abundant inlining common in C++ programs can make a single line of code have seemingly endless steps. Things like shared_ptr can be particularly annoying because every access to the pointer expands inline to the template internals. You never really get used to it.
If you've got a ton of overloaded symbol names, selecting which one you want from the readline completion can be unpleasant. (Which "foo" did you want? All of them? Just these two?)
GDB can be used to debug C++ as well, so if you have an understanding of how C++ works (and understand problems that can stem from the object-oriented side of things), then you shouldn't have all that much trouble (at least, not much more than you would debugging a C program). I think...
Quite a few issues really, but it also depends on the debugger you are using, its versioning etc:
Accessing individual members of a templatized class is not easy
Exception handling is a problem; I have seen debuggers do a better job with setjmp/longjmp
Setting breakpoints on conditions like obj1 == obj2, where these are not POD types, may not work
The good thing that I like about debuggers is that to access private/protected class members I don't have to call get routines; just [obj-name].[var-name] is good enough.
Arpan
GDB has had a rocky past with regard to debugging C++. For a while it couldn't efficiently break inside constructors/destructors.
Also, STL containers were notoriously difficult to inspect in gdb. std::string was painful but generally workable; std::map was so difficult that I generally added print statements unless there was no other way.
The constructor/destructor problem has been fixed for a few years.
The stl support got fixed in gdb 7.0.
You might still have issues with Boost's libraries. I at times had difficulty getting gdb to give me access to the contents of a shared_ptr.
So I guess debugging your own C++ isn't really that difficult, it's debugging 3rd party classes and template code that could be a problem.
C++ objects can sometimes be harder to analyze. Also, as data is sometimes nested in several classes (across several layers), it might take some time to "unfold" it (as already said by others in this thread). It's hard to say in general, as it depends very much on the C++ features used, the programming style and the complexity of the problem to analyze (actually that is language independent).
IMO: if someone finds himself needing to debug very often, he should reconsider his programming style.
Usually for me it is all about error handling in the end. If a program behaves unexpectedly, your error logs should contain enough information to reconstruct what happened at any stage.
This also gives you the benefit that you can "debug" problems offline later once your program gets shipped to end users.

C++ Developer Tools: The Dark Areas [closed]

While the C++ Standards Committee works hard to define its intricate but powerful features and maintain backward compatibility with C, in my personal experience I've found many aspects of programming with C++ cumbersome due to lack of tools.
For example, I recently tried to refactor some C++ code, replacing many shared_ptr by T& to remove pointer usages where not needed within a large library. I had to perform almost the whole refactoring manually as none of the refactoring tools out there would help me do this safely.
Dealing with STL data structures using the debugger is like raking out the phone number of a stranger when she disagrees.
In your experience, what essential developer tools are lacking in C++?
My dream tool would be a compile-time template debugger. Something that'd let me interactively step through template instantiations and examine the types as they get instantiated, just like the regular debugger does at runtime.
In your experience, what essential developer tools are lacking in C++?
Code completion. Seriously. Refactoring is a nice-to-have feature but I think code completion is much more fundamental and more important for API discoverability and usability.
Basically, tools that require any understanding of C++ code suck.
Code generation of class methods. When I type in the declaration you should be able to figure out the definition. And while I'm on the topic, can we fix "go to declaration / go to definition" always going to the declaration?
Refactoring. Yes, I know it's formally impossible because of the pre-processor, but the compiler could still do a better job of a search and replace on a variable name than I can manually. You could also syntax highlight locals, members and parameters while you're at it.
Lint. So the variable I just defined shadows one in an outer scope? C would have told me that in 1979, but C++ in 2009 apparently prefers me to find out on my own.
Some decent error messages. If I promise never to define a class with the same name inside the method of a class - do you promise to tell me about a missing "}". In fact can the compiler have some knowledge of history - so if I added an unbalanced "{" or "(" to a previously working file could we consider mentioning this in the message?
Can the STL error messages please (sorry to quote another comment) not look like you read /dev/random, stuck "#!/bin/perl" in front and then ran the tax code through the result?
How about some warnings for useful things? "Integer used as bool performance warning" is not useful, it doesn't make any performance difference, I don't have a choice - it's what the library does, and you have already told me 50 times.
But if I miss a ";" from the end of a class declaration or a "}" from the end of a method definition you don't warn me; you go out of your way to find the least likely (but theoretically possible) way to parse the result.
It's like the built-in spell checker in this browser, which happily accepts my misspelling of "wether" (because that spelling is an archaic term for a castrated male goat! How many times do I write about soprano herbivores?)
How about spell checking? 40 years ago mainframe Fortran compilers had spell checking, so if you misspelled "WRITE" you didn't come back the next day to a pile of cards and a snotty error message. You got a warning that "WRIET" had been changed to WRITE in line X. Now the compiler happily continues and spends 10 minutes building some massive browse file and debugger output before telling you that you misspelled prinft 10,000 lines ago.
ps. Yes a lot of these only apply to Visual C++.
pps. Yes they are coming with my medication now.
If we're talking about MS Visual Studio C++, Visual Assist is a very handy tool for code completion and some refactorings, e.g. rename all/selected references, find/goto declaration, but I still miss the richness of Java IDEs like JBuilder or IntelliJ.
What I still miss is a semantic diff tool: you know, one which does not compare the two files line-by-line but statement by statement, expression by expression. What I've found on the internet are only some abandoned attempts; if you know of one, please mention it in a comment.
The main problem with C++ is that it is hard to parse. That's why there are so very few tools out there that work on source code. (And that's also why we're stuck with some of the most horrific error messages in the history of compilers.) The result is that, with very few exceptions (I only know doxygen and Visual Assist), it's down to the actual compiler to support everything needed to assist us in writing and massaging the code. With compilers traditionally being rather streamlined command line tools, that's a very weak foundation to build rich editor support on.
For about ten years now I've been working with VS. Meanwhile, its code completion is almost usable. (Yes, I'm working on dual-core machines. I wouldn't have said this otherwise, would I?) If you use Visual Assist, code completion is actually quite good. Both VS itself and VA come with some basic refactoring nowadays. That, too, is almost usable for the few things it aims for (even though it's still notably less so than code completion). Of course, after >15 years of refactoring with search & replace as the only tool in the box, my demands are probably much too deteriorated compared to other languages, so this might not mean much.
However, what I am really lacking is still: Fully standard conforming compilers and standard library implementations on all platforms my code is ported to. And I'm saying this >10 years after the release of the last standard and about a year before the release of the next one! (Which just adds this: C++1x being widely adopted by 2011.)
Once these are solved, there are a few things that keep being mentioned now and then, but which vendors, still fighting with compliance to a >10-year-old standard (or, as is actually the case with some features, having even given up on it), never got around to actually tackling:
usable, sensible, comprehensible compiler messages (como is actually pretty good, but that's only if you compare it to other C++ compilers)
a linker that doesn't just throw up its hands and say "something's wrong, I can't continue" (if you have taught C++ as a first language, you'll know what I mean)
concepts ('nuff said)
an IO stream implementation that doesn't throw away all the compile-time advantages which overloading operator<<() gives us by resorting to calling the run-time-parsing printf() under the hood (Dietmar Kühl once set out to do this, unfortunately his implementation died without the techniques becoming widespread)
STL implementations on all platforms that give rich debugging support (Dinkumware is already pretty good in that)
standard library implementations on all platforms that use every trick in the book to give us stricter checking at compile-time and run-time and more performance (whatever happened to yasli?)
the ability to debug template meta programs (yes, jalf already mentioned this, but it cannot be said too often)
a compiler that renders tools like lint useless (no need to fear, lint vendors, that's just wishful thinking)
If all these and a lot of others that I have forgotten to mention (feel free to add) are solved, it would be nice to get refactoring support that almost plays in the same league as, say, Java or C#. But only then.
A compiler which tries to optimize the compilation model.
Rather than naively include headers as needed, parsing them again in every compilation unit, why not parse the headers once first, build complete syntax trees for them (which would have to include preprocessor directives, since we don't yet know which macros are defined), and then simply run through that syntax tree whenever the header is included, applying the known #defines to prune it.
It could even be used as a replacement for precompiled headers, so every header could be precompiled individually, just by dumping this syntax tree to disk. We wouldn't need one single monolithic and error-prone precompiled header, and would get finer granularity on rebuilds, rebuilding as little as possible even if a header is modified.
Like my other suggestions, this would be a lot of work to implement, but I can't see any fundamental problems rendering it impossible.
It seems like it could dramatically speed up compile times, pretty much rendering them linear in the number of header files, rather than in the number of #includes.
A fast and reliable indexer. Most of the fancy features come after this.
A common tool to enforce coding standards.
Take all the common standards and allow you to turn them on/off as appropriate for your project.
Currently a bunch of Perl scripts usually has to substitute for this.
I'm pretty happy with the state of C++ tools. The only thing I can think of is a default install of Boost in VS/gcc.
Refactoring, refactoring, refactoring. And compilation while typing. For refactoring I am missing at least half of what most modern Java IDEs can do. While Visual Assist X goes a long way, a lot of refactoring is missing. The task of writing C++ code is still pretty much just that: writing C++ code. The more the IDE supports high-level refactoring, the more it becomes construction; the more malleable the structure is, the easier it will be to iterate over the structure and improve it. Pick up a demo version of IntelliJ and see what you are missing. These are just some that I remember from a couple of years ago.
Extract interface: take a few classes with a common interface, move the common functions into an interface class (for C++ this would be an abstract base class) and declare the designated functions as abstract
Better extract method: mark a section of code and have the IDE write a function that executes that code, constructing the correct parameters and return values
Know the type of each of the symbols that you are working with, so that not only can completion be correct for derived values, e.g. symbol->..., but it can also offer only functions that return a type usable in the current expression, e.g. for
UiButton button = window->...
at the ... only insert functions that actually return a UiButton.
A tool all on its own: naming conventions.
Intelligent Intellisense/Code Completion even for template-heavy code.
When you're inside a function template, of course the compiler can't say anything for sure about the template parameter (at least not without Concepts), but it should be able to make a lot of guesses and estimates. Depending on how the type is used in the function, it should be able to narrow the possible types down, in effect a kind of conservative ad-hoc Concepts. If one line in the function calls .Foo() on a template type, obviously a Foo member method must exist, and Intellisense should suggest it in the rest of the function as well.
It could even look at where the function is invoked from, and use that to determine at least one valid template parameter type, and simply offer Intellisense inside the function based on that.
If the function is called with an int as a template parameter, then obviously use of int must be valid, and so the IDE could use that as a "sample type" inside the function and offer Intellisense suggestions based on that.
JavaScript just got Intellisense support in VS, which had to overcome a lot of similar problems, so it can be done. Of course, with C++'s level of complexity, it'd be a ridiculous amount of work. But it'd be a nice feature.