Using MSBuild in VC++ 2010 to do custom preprocessing on files - c++

I am trying to insert a custom preprocessor into the VC++ 2010 build pipeline after the regular preprocessor has finished. So far I have figured that the way to do this is via MSBuild.
To this point I wasn't able to find out much more, so my questions are:
Is this possible at all?
If so, what do I need to look at to get going?

If you are talking about the C/C++ preprocessor, then you are probably out of luck. AFAIK, the preprocessor is built into the compiler itself. You can get the compiler to output a preprocessed file, and then you MIGHT be able to send that through the compiler a second time to get the final output.
This may not work anyway, because the preprocessed output, at least from previous versions of cl.exe, doesn't seem to be 100% correct (whitespace gets mangled slightly, which can cause errors).
If you want to take this path, what you'd need to do is have an MSBuild 'Target' that runs before the 'ClCompile' target.
This new target would have to run the 'cl.exe' program with all the settings that 'ClCompile' usually sends it with as well as the '/P' option which will "preprocess to a file".
Then you would run your tool over the preprocessed files and finally feed the results into 'ClCompile'.
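For illustration, here is a minimal, untested sketch of such a target as it might appear in the .vcxproj; the tool name "mytool.exe" and the .i output naming are hypothetical, and in practice you would also have to forward the real compiler settings from the ClCompile item metadata:

    <Target Name="CustomPreprocess" BeforeTargets="ClCompile">
      <!-- /P preprocesses to a file; /Fi names the output file (VC++ 2010 and later) -->
      <Exec Command="cl.exe /P /Fi&quot;%(ClCompile.Filename).i&quot; &quot;%(ClCompile.Identity)&quot;" />
      <!-- Hypothetical custom preprocessor run over the generated .i file -->
      <Exec Command="mytool.exe &quot;%(ClCompile.Filename).i&quot;" />
    </Target>

The remaining (and fiddly) part is pointing 'ClCompile' at the processed output instead of the original sources.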
If you need any more info, just reply in the comments and I'll try to add some when I get the time (this question is rather old, so I'm not sure whether it's worthwhile investing more time into this answer).

Related

What could be the simplest way to incorporate Windows WPP Software Tracing into SCons builds?

I ask my question in such a specific way because I am afraid that a more generic form could lead to excessively theoretical discussions of how things should best be done (like a general question about pre- and post-processing actions in SCons).
Incorporating WPP actually requires running an additional command (or commands) before a file is compiled, and only when the build process decides the file needs to be compiled anyway, without any regard to WPP.
I would remark that this is easily achieved with a few lines of definitions in a shared Visual Studio property sheet, making it work for multiple files in multiple projects, folders, etc., in a way that is completely transparent to developers.
Thus I am wondering whether this can be done in a similarly simple way with SCons? I do not have any deep knowledge of either the SCons or the MSBuild framework; I work with them for simple practical purposes, so I would truly appreciate practical and useful advice.
Here's what I'd suggest.
SCons builds command lines from Environment() variables.
For example, the command line for compiling C++ into a shared object is stored in SHCXXCOM (the string displayed to the user when the command runs defaults to the command itself, but can be changed by setting SHCXXCOMSTR).
Back to the problem at hand.
Assuming you have a limited number of build steps you want to wrap, you can do something like this:
env['SHCXXCOM'] = [ 'MPP PRE COMMAND LINE', env['SHCXXCOM'], 'MPP POST COMMAND LINE']
You'll have to figure out which variables you need to do this with, but take a look at the manpage to figure that out.
https://scons.org/doc/production/HTML/scons-man.html
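As a concrete illustration, a minimal SConstruct sketch along those lines (untested, per the caveat below; the wrapper tool names are hypothetical placeholders):

    env = Environment()
    # Run a pre-step, then the original compile command, then a post-step
    # for every shared-object C++ compile; $SOURCE/$TARGET are expanded by SCons.
    env['SHCXXCOM'] = [
        'wpp_pre.exe $SOURCE',
        env['SHCXXCOM'],
        'wpp_post.exe $TARGET',
    ]
    env.SharedLibrary('example', ['example.cpp'])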
p.s. I've not tried this, but in theory it should work. Let us know if not.

Exclude all messages in PC-Lint

I am using PC-Lint for my C++ project.
Is there a way to switch off all error and warning messages by default, so I can then reintroduce the required messages explicitly?
I have read the chapter of the PC-Lint manual entitled "Error Inhibition Options", and the best I could find there was setting the -w<level> option to -w0 (no messages except for fatal errors).
Yes, it is possible; you can simply use -e* or -w0. However, the manual rightly states (Chapter 16, "Living with Lint"):
DO NOT simply suppress all warnings with something like -e* or -w0, as this can disguise hard errors and make subsequent diagnosis very difficult.
So, yes, you can use it if your code is basically clean already and you just want to review recent changes for a certain set of messages. But if you want to start cleaning up your code and are swamped with messages because of the default warning level -w3, I suggest starting with -w1 and resolving all issues there; most of the warnings/errors given at level one indicate problems with finding all header files, having all implicit macros set properly, and/or mimicking your usual compiler in a sufficiently precise way.
As always, I hesitate to advertise my own work, but if you want, take a look at my "How to wield PC Lint" PDF, where I have documented detailed instructions for handling the initial deployment of PC-Lint and for tackling the many warnings/errors/infos/notes you may be buried under.
When I started introducing PC-Lint to a new project, I did the following:
1. As suggested by Johan Bezem, run a -w1 level check over the whole thing. This doesn't actually find any new problems, but it checks that your program is syntactically valid and flags any configuration issues. Nothing major, assuming your project already compiles.
2. Run the test again at the -w2 level. This found 53,000 issues, which was a bit much to tackle in one go.
3. Pick a typical bad file, then suppress any messages that seem irrelevant or non-urgent (e.g. error 525: "Warning -- Negative indentation from line xxx") by adding -e525 to the command line or config file, until you find one that seems serious. In my case this was error 442: "Warning -- for clause irregularity: testing direction inconsistent with increment direction", i.e. a 'for' loop that looked like it should be counting up was actually counting down.
4. Reset the test level back to -w1, but add the critical problem back in by number: -w1 +e442 in this case (a minimal options-file sketch follows this workflow). Re-run it over the whole project, then fix all instances of that problem.
5. Go back to stage 2 and try again.
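A minimal options-file sketch for stage 4 (the file name and contents are just an example; pass the file on the PC-Lint command line):

    // project.lnt -- start from level 1, re-enable only the message under review
    -w1      // level 1: only the most serious diagnostics
    +e442    // re-enable "for clause irregularity"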
This combination of fixing actual problems and suppressing likely false alarms soon gets your numbers under control.
So that everything gets better over time, we also implemented a script that does a thorough (full -w2 or -w3) check on any files that are created or modified.
I also found the tool LintProject very helpful, as it can do an entire Visual Studio solution in one go and produces tables of error counts and worst offenders!

Is it possible to regenerate symbols for an exe?

One of my co-workers shipped a hot fix build to a customer, and subsequently deleted the pdb file. The build in question is crashing (intermittently) and we have a couple of crash dumps. We have all the source code in version control, and can compile it to an equivalent .exe and get symbols for that one. However, those symbols don't match the crash dump exactly. It seems like several of the functions are off by some constant offset, but we've only looked at a handful.
I'd love to be able to do the following (I can fake parts of this manually, but it's a huge amount of work): get a stack trace for each thread in the dump and cast pointers in the dump to the appropriate type and have them show up in the Visual Studio debugger. I'm using 2005, if that matters.
Is there a tool to let us recreate a pdb given the source code, all the .obj files, and the original .exe? Or is there a setting when we compile/link to say "make it exactly like this other exe you just did" or something like that?
Quick update, based on answers so far: I have the exe file that we sent to the customer, just not the pdb that corresponds to it, if that helps. I'd just as soon not send them a new build (if possible), because it takes about a week of running to get the crash dumps, and the customer is already at the "why isn't this already fixed?" stage. (If we do send another build, I'd prefer it to be one that either fixes the problem or has additional debugging in the area of interest, not just the same code.) I know it's possible to do some of this manually with a lot of guesswork; that's what we're currently doing. But it's a pain, so I'm hoping there's a way to automate it.
You cannot recreate a PDB to match a pre-existing executable. The PDB contains a "fingerprint" that is unique for each compilation. Unless you can make the old PDB magically reappear, you should whack your cow-orker in the back of the head (Gibbs-style, if you watch NCIS), recompile the whole thing, store the PDB somewhere safe, ship a new executable to your customer, and let the crashes come.
If your build system enables you to recreate any binary from any revision you have in your history, then you should be able to get the build ID from the customer, and regenerate that same exact build ID, along with all the binaries and so forth. That will take a while if you have a large project, of course, but it will also yield the debugging file that you need.
If you have no way to perform an exact reproduction of a build, then look at this situation, think hard about some others that might crop up, and start moving to make it possible to regenerate all successful builds and associated files in the project's history. This will make it much easier to be able to work problems like this in the future.
When you have the sources, it's quite easy to find the correspondence between them and the exe file. Just ask them to send you the exe file along with the crash log and use IDA.
What you are asking is much more difficult than that, considering also that you need it for "one use only".

VS2008 is very slow on a specific large C++ solution

I have a solution with 21 C++ projects and 1 VB.NET project.
The IDE responds very slowly when I simply move the caret in a file or try to open a menu. The process seems to take 50% of the CPU for each movement.
It only happens with this solution and only on my machine.
The solution has total of 2380 source and header files, of which 1280 are header files.
I tried to remove all connection to the source control (Perforce) but it didn't help.
Also, I have Visual Assist installed but even after removing it (uninstall), the same behavior continued.
Any idea?
Deactivate intellisense.
IntelliSense parses the whole project and slows down the IDE drastically. If you use Visual Assist then you won't really need it; Visual Assist is less resource-hungry and scans in the background, while IntelliSense steals too many resources during its parsing.
Could this apply in your case?
http://coolthingoftheday.blogspot.com/2008/03/visual-basic-2008-hotfix-to-fix-slow.html
Note that disabling Intellisense may also break stuff like the Class Wizard (at least I'm pretty sure it does in VS2005). As already suggested it's a good idea to get rid of all the temporary files like .ncb regularly, because they can get huge and will slow down the IDE.
Also, if you're using Visual Assist, try rebuilding the database, disabling it or installing a different version.
I have a few solutions with over 100 projects, so I know exactly how you feel. Solutions containing some managed projects are especially bad. Disabling Intellisense helps a lot. I've never seen such problems from Visual Assist (or other similar refactoring tools), and that fills in a lot of the missing functionality from losing Intellisense.
I've also encountered some projects that had code that would cause the Intellisense thread to endlessly loop and never finish parsing the code. Most of those times we were never able to pin down the exact bit of code that caused the problem. Certain heavy use of templates and nested macros were often high on the suspicion list.
The only good way to be sure that Intellisense is disabled is to create a directory with the same name as the ncb file. Go to your solution directory, delete the ncb, and create a directory named your_solution_name.ncb. Because it can't recreate the ncb file, you'll get an error box to click through every time you open the solution, but that's a small price to pay.
Simply deleting the ncb will mean that VS will just create it again. The methods that I've seen from inside the VS options will turn off some of the features but will not prevent it from trying to parse all your code.
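For example, from a command prompt in the solution directory (a sketch; "MySolution" stands in for your solution's name):

    del MySolution.ncb
    mkdir MySolution.ncb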

C++ vim IDE. Things you'd need from it

I am going to create an extensible C++ IDE plugin for Vim. It is not a problem to make one that satisfies my own needs.
The plugin is going to work with workspaces, projects, and their dependencies.
This is for Unix-like systems, with GCC as the C++ compiler.
So my question is: what are the most important things you'd need from such an IDE? Please take into account that this is Vim, where almost everything (almost!) is possible.
Several questions:
How often do you manage different workspaces with projects inside them, and the relationships between them? What are the most annoying things in this process?
Is it necessary to recreate the "project" from the Makefile?
Thanks.
Reasons to create this plugin:
With a bunch of existing plugins, plus self-written ones, we can simulate most of these things. That is OK when we work on one big "infinite" project.
It is good when we already have a makefile or jam file, and bad when we have to create our own, mostly by copying and pasting existing ones.
All ctags- and cscope-related things have to know the list of the real project files, and we have to create such lists. Something like <project#get_list_of_files()> and many similar functions could make a good project API for cooperating with existing and future plugins.
Cooperation with existing makefiles can help to find out the list of the real project files and the executable name.
With a plugin system inside the plugin, there can be different project templates.
Those are some of the reasons why I will start the job. I'd like to hear yours.
There are multiple problems. Most of them are already solved by independent and generic plugins.
Regarding the definition of what a project is:
Given a set of files in the same directory, each file can be the only file of its own project -- I always have a tests/ directory where I host pet projects, or where I test the behaviour of the compiler. Conversely, the files from a set of directories can all be part of one and the same very big project.
In the end, what really defines a project is a (leaf) "makefile" -- and why restrict ourselves to makefiles? What about scons, autotools, ant, (b)jam, aap? And BTW, Sun Makefiles or GNU Makefiles?
Moreover, I don't see any point in having Vim know the exact files in the current project. And even so, the well-known project.vim plugin already does the job. Personally, I use a local_vimrc plugin (I'm maintaining one, and I've seen two others on SourceForge). With this plugin, I just have to drop a _vimrc_local.vim file in a directory, and whatever is defined in it (:mappings, :functions, variables, :commands, :settings, ...) will apply to each file under that directory -- I work on a big project with a dozen subcomponents; each component lives in its own directory and has its own makefile (not even named Makefile, nor named after the directory).
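As a tiny illustration, a hypothetical _vimrc_local.vim (the settings themselves are just examples of what you might put there):

    " Sourced by the local_vimrc plugin for every buffer under this directory
    setlocal tabstop=4 shiftwidth=4
    setlocal makeprg=make\ -C\ build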
Regarding C++ code understanding
Every time we want to do something complex (refactorings like rename-function, rename-variable, generate-switch-from-current-variable-which-is-an-enum, ...), we need vim to have an understanding of C++. Most of the existing plugins rely on ctags. Unfortunately, ctags comprehension of C++ is quite limited -- I have already written a few advanced things, but I'm often stopped by the poor information provided by ctags. cscope is no better. Eventually, I think we will have to integrate an advanced tool like elsa/pork/ionk/deshydrata/....
NB: That's where, now, I concentrate most of my efforts.
Regarding Doxygen
I don't know how difficult it is to jump to the Doxygen documentation associated with the current token. The first difficulty is to understand what the cursor is on (I guess omnicppcomplete has already done a lot of work in this direction). The second difficulty will be to understand how Doxygen generates the page name for each symbol in the code.
Opening vim at the right line of code from a doxygen page should be simple with a greasemonkey plugin.
Regarding the debugger
There is the pyclewn project for those who run Vim under Linux with gdb as the debugger. Unfortunately, it does not support other debuggers such as dbx.
Responses to other requirements:
When I run or debug my compiled program, I'd like the option of having a dialog pop up which asks me for the command line parameters. It should remember the last 20 or so parameters I used for the project. I do not want to have to edit the project properties for this.
My BuildToolsWrapper plugin has a g:BTW_run_parameters option (easily overridden with project/local_vimrc solutions). Adding a mapping that asks for the arguments to use is really simple (see :h inputdialog()).
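A minimal vimscript sketch of such a prompt (the variable and function names here are hypothetical, not part of BuildToolsWrapper):

    " Remember the last arguments and pre-fill the prompt on each run
    if !exists('g:my_run_args')
      let g:my_run_args = ''
    endif
    function! AskRunArgs()
      let g:my_run_args = inputdialog('Program arguments: ', g:my_run_args)
      return g:my_run_args
    endfunction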
work with source control system
There already exist several plugins addressing this issue. This has nothing to do with C++, and it must not be addressed by a C++ suite.
debugger
source code navigation tools (I am currently using the http://www.vim.org/scripts/script.php?script_id=1638 plugin and ctags)
compile a lib/project/single source file from the IDE
navigation between files in the project
work with a source control system
easy access to file change history
functions to rename a file/variable/method
easy access to C++ help
easy changes to project settings (Makefiles, jam, etc.)
fast autocompletion for paths/variables/methods/parameters
smart indentation for new scopes (it would also be good if the developer could set up the indentation rules)
highlighting indentation that violates the code convention (tabs instead of spaces, spaces after ";", spaces near "(" or ")", etc.)
reformatting a selected block according to the convention
Things I'd like in an IDE that the ones I use don't provide:
When I run or debug my compiled program, I'd like the option of having a dialog pop up which asks me for the command line parameters. It should remember the last 20 or so parameters I used for the project. I do not want to have to edit the project properties for this.
A "Tools" menu that is configurable on a per-project basis
Ability to rejig the keyboard mappings for every possible command.
Ability to produce lists of project configurations in text form
Intelligent floating (not docked) windows for debugger etc. that pop up only when I need them, stay on top and then disappear when no longer needed.
Built-in code metrics analysis so I get a list of the most complex functions in the project and can click on them to jump to the code
Built-in support for Doxygen or similar, so I can click in a Doxygen document and go directly to the code. It should also navigate in reverse, from code to Doxygen.
No doubt someone will now say Eclipse can do this or that, but it's too slow and bloated for me.
Adding to Neil's answer:
integration with gdb as in emacs. I know of clewn, but I don't like that I have to restart vim to restart the debugger. With clewn, vim is integrated into the debugger, but not the other way around.
Not sure if you are developing on Windows, but if you are I suggest you check out Viemu. It is a pretty good VIM extension for Visual Studio. I really like Visual Studio as an IDE (although I still think VC6 is hard to beat), so a Vim extension for VS was perfect for me. Features that I would prefer worked better in a Vim IDE are:
The macro recording is a bit error-prone, especially with indentation. I find I can easily and often record macros in Vim while I am editing code (e.g. taking an enum definition from a header and cranking out a corresponding switch statement), but I found that Viemu is a bit flaky in that department.
The Vim code completion picks up words in the current buffer, whereas Viemu hooks into the VS code completion machinery. This means that if I have just created a method name and I want to hit Ctrl-] to auto-complete it, Vim will pick it up, but Viemu won't.
For me, it's just down to the necessities:
nice integration with ctags, so you can jump to a definition
intelligent completion that also gives you the function prototype
an easy way to switch between code and headers
interactive debugging with breakpoints (maybe)
maybe folding
extra bonus points for refactoring tools like rename or extract method
I'd say stay away from defining projects - just treat the entire file branch as part of the "project" and let users have a settings file to override that default
99% of the difference in speed I see between IDE and vim users is code lookup and navigation. You need to be able to grep your source tree for a phrase (or intelligently look for the right symbol using ctags), show all the hits, and switch to that file in like two or three keystrokes.
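For example, a hedged mapping along those lines (assuming your 'grepprg' accepts -r):

    " Grep the tree for the word under the cursor and open the result list
    nnoremap <Leader>g :grep! -rn <C-R><C-W> .<CR>:copen<CR>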
All the other crap like repository navigation or interactive debugging is nice, but there are other ways to solve those problems. I'd say drop the interactive debugging even. Just focus on what makes IDEs good editors - have a "big picture" view of your project, instead of single file.
In fact, are there any plugins for vim that already achieve this?