C++ header files intermittently not found

I have noticed some odd behaviour in VS2012 while working with some C++ projects. If I put the following line at the top of a header file (myclass.h):
#include <D3DX11.h>
The compiler will tell me that it can't find the file (even though IntelliSense suggests it). However, if I put the same line in a different file (myclass.cpp), everything works just fine. I don't understand why it can be found when used in one file but not the other. What is going on?
EDIT / NOTE:
This isn't really a DirectX specific issue. That is just what I was working with when I decided to ask. This can happen in other places as well.

In the newer versions of Visual Studio the include paths are set at the project level. Maybe the project containing the .cpp has the proper path set (see the comment from Jesse Good on your question), but that might not be the case for another project in your solution. As soon as you put the include in the header, every other project that includes your header needs to know where to find the DirectX headers as well.
Beware of IntelliSense: if you open a header, it has to guess which .cpp might include it, and the displayed information is not always correct. I think in newer versions it even depends on which other files are open and which project is selected.
The compiler output should tell you which .cpp actually causes the problem when it includes the header.
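By way of illustration, here is a minimal sketch (class and member names are invented for this answer) of the usual workaround: forward-declare the DirectX types in the header and keep #include <D3DX11.h> in the .cpp, so only the project that owns myclass.cpp needs the DirectX include path set.

// myclass.h - no DirectX include required here
#pragma once

struct ID3D11Device;                 // forward declarations are enough for pointers
struct ID3D11ShaderResourceView;

class MyClass
{
public:
    bool LoadTexture(ID3D11Device* device, const wchar_t* path);
private:
    ID3D11ShaderResourceView* m_texture;
};

// myclass.cpp - only this project needs to know where the DirectX headers live
#include "myclass.h"
#include <D3DX11.h>

bool MyClass::LoadTexture(ID3D11Device* device, const wchar_t* path)
{
    // Assumes a Unicode build, so the LPCTSTR parameter accepts a wide string.
    return SUCCEEDED(D3DX11CreateShaderResourceViewFromFile(
        device, path, nullptr, nullptr, &m_texture, nullptr));
}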

Related

IntelliSense Including Things that Shouldn't be Included

I'm working on a project in Visual Studio 2010. My project is not supposed to be limited to Windows; however, one of the files is OS-dependent.
For this reason, I have #include <windows.h> in only one of the .cpp files. No other file includes this .cpp file. Therefore, logically, the windows.h header should be invisible to the rest of the solution.
This means that I don't have any name conflicts with declarations in the Windows library in all files but that one .cpp, and my project compiles just fine.
However, IntelliSense keeps on insisting that I have name conflicts. When I press Ctrl+Space, IntelliSense suggests identifiers from windows.h. And this is in the scope where windows.h should be invisible!
Is there a setting I can change to stop this annoying behaviour?
I'm 100% positive that the issue is not with the structure of my solution because if I use an identifier already defined in windows.h in another part of my project, the compiler does not recognize it and it doesn't compile...but IntelliSense recognizes it!
Quoting Andy Rich, a Microsoft employee who works on VC++, from a comment on this blog article: Troubleshooting Tips for IntelliSense Slowness
The browsing database will find all source files that are somehow included in your project, either directly or as a result of other #include directives. This is not configurable, and is necessary in order for the IDE to be able to provide accurate answers.
So unfortunately the answer is no, there's nothing you can do beyond disabling IntelliSense altogether.
Since IntelliSense works on a per-project basis (.vcproj), not per-solution (.sln), the easy answer is to move the Windows-specific parts to their own project within your solution.
This also helps with porting, as you can more easily replace the Windows-specific parts.
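As a sketch of what that split can look like (the names here are invented for illustration): the portable project only ever sees a small header, and windows.h stays confined to the Windows-specific project.

// platform_clock.h - lives in the portable project; no windows.h in sight
#pragma once
#include <cstdint>

std::uint64_t GetMillisecondTicks();   // implemented once per platform

// win_clock.cpp - lives in the Windows-only project, the sole place windows.h is included
#include <windows.h>
#include "platform_clock.h"

std::uint64_t GetMillisecondTicks()
{
    return GetTickCount();             // DWORD widens cleanly to uint64_t
}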

Visual Studio 2010, Intellisense and PCH: what are the alternatives to ugly stdafx.h?

I recently switched to Visual Studio 2010, and for IntelliSense not to take half a minute to show up when using the Boost libraries, Microsoft's suggestion seems to be to use precompiled headers.
Except that I never used them before (except when forced to by Ugly ATL Wizards (TM)), so I searched around to figure out how they work.
Basically, the Big Centralized stdafx.h approach seems plain wrong. I never want to include (even cheaply) a whole bunch of header files in all my sources. Since I don't use Windows libraries (I make C++/CLI higher-level wrappers, then use .NET for talking to the outside world), I don't have "a whole truckload of non-changing enormous headers", just Boost and standard library headers scattered around.
There is an interesting approach to this problem, but I can't quite figure out how to make it work. It seems that each source file must be compiled twice (please correct me if I'm wrong): once with /Yc and once with /Yu. This adds a burden on the developer, who must manually tweak the build system.
I was hoping to find some "automatically generate one precompiled header for each source file" trick, or at least some "best practices", but most people seem happy with including the world into stdafx.h.
What are the options available to me to use precompiled headers on a per-source-file basis? I don't really care about build times (as long as they don't skyrocket); I just want IntelliSense to work fast.
For starters, you are reading the article wrong. Every file is NOT compiled twice. The file stdafx.cpp gets compiled once with /Yc (c, for create) before anything else, and then every other file in your project gets compiled once with /Yu (u, for use) and imports the saved state previously created from stdafx.cpp.
Secondly, the article is 7 years old and is talking about VC++ 6, so you should start off distrusting it. But even assuming the information in it still applies to VC++ 2008 or 2010, it seems like bad advice. The approach it recommends, using #pragma hdrstop, is a solution looking for a problem. If you have headers that contain things you don't want in every file, then they simply shouldn't go in your precompiled header.
Your problem basically seems to be that IntelliSense is slow for Boost in VS2010? I don't have a direct solution for this, but could Visual Assist X be an option for you? I have used it in various versions of Visual Studio with great pleasure. Not a direct solution, but it might work for you.
Precompiled headers aren't too bad if you use them properly.
Don't use them as a replacement for proper and precise #includes, but as a way to speed things up. Achieve this by making the precompiled header do nothing in release builds, only speeding stuff up in debug.
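One way to read that advice, as a sketch rather than the only possible layout: every .cpp still includes "stdafx.h" first, but in release builds the header expands to nothing, so the precise per-file #includes remain the ones that actually matter.

// stdafx.h
#pragma once

#ifdef _DEBUG
// Heavy, rarely-changing headers are pre-compiled only in debug builds.
#include <vector>
#include <string>
#include <boost/shared_ptr.hpp>
#include <boost/lexical_cast.hpp>
#endif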
You are wrong; each file is only compiled once. You have one .cpp file that is compiled with /Yc and the rest are compiled with /Yu. The file with /Yc, which is stdafx.cpp by default, contains one line: #include "myMainHeader.h" (changed the name from the default). All other .cpp files must start with #include "myMainHeader.h". When your /Yc file is compiled, the entire internal state of the compiler is saved, and that saved state is loaded when each of your other files is compiled. That is why you must start by including the PCH header: so that the /Yu option doesn't change the result of compilation, only the time. Xcode does not make this requirement and will use a PCH regardless of whether your .cpp file starts with the right include directive. I have used libraries that relied on this and could not be built without PCH.
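A sketch of the layout described above, using the same (made-up) header name:

// myMainHeader.h - everything you want pre-compiled
#pragma once
#include <vector>
#include <string>

// stdafx.cpp - the only file compiled with /Yc"myMainHeader.h"; compiling it writes the .pch
#include "myMainHeader.h"

// any other .cpp - compiled with /Yu"myMainHeader.h"; must start with this exact include
#include "myMainHeader.h"
#include "widget.h"   // the file's own includes follow as usual
// ... rest of the file ...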

Using precompiled headers, header file changes aren't picked up - expected?

Visual Studio C++ lib project
Project is set to use precompiled headers
stdafx.cpp is set to create precompiled header
I have a header file, MyClass.h
If I build, then make a change to MyClass.h that should fail to compile, the compile still succeeds.
If I do a rebuild, or if I make a change to a cpp file that includes "MyClass.h", then the compile fails as expected.
Is this expected because I'm using precompiled headers? Is there any way to fix it so a second build picks up header changes without turning off precompiled headers?
Make sure that the header file you are altering is referenced by your project in Solution Explorer. If this is the case, the full build should trigger when it is changed.
In the project properties, set "Enable Minimal Rebuild" to No
Are you sure that stdafx.cpp includes the header in question?
Visual Studio can often get pretty damn stupid over changes. It can go either way, but usually it goes the way you're running into.
I've had it catch onto changes in a header used by one file, but not the fact that it's used in others. So it compiles the one but not the others. Then I get really weird linker errors.
It could of course still be your own damn fault, but VS is, in fact, notoriously stupid. Sometimes a complete rebuild will fix the issue permanently, until next time. Other times you have somehow hosed the project file and hopefully you can get back to the original (like a source server revert). "Undo" most usually does not undo this kind of fubar.
I've noticed this several times with headers that aren't even in the precompiled header. It seems somewhat random, but one common correlation is that the header is full of templates. VS is just plain hopeless with templates.

How does visual studio know which cpp files to rebuild when an include file is changed?

In some of my VS 2005 projects, when I change an include file some of the cpp files are not rebuilt, even though they have a simple #include line in them.
Is this a known bug, or something strange about the projects? Is there any information about how VS works out the dependencies and can I view the files for that?
btw I did try some googling but couldn't find anything about this. I probably need the right search term...
I've experienced this problem from time to time, and with other IDEs too, not just VS. It seems that their internal dependency tree sometimes gets out of whack with reality. In these cases, I've found that deleting the precompiled headers (this is important) and doing a complete rebuild always solves the problem. Luckily, it doesn't happen often.
To be honest, I have never faced such a problem using Visual Studio. Your .cpp should be rebuilt as well if it includes the header. The only reason I can come up with is that the same include file is taken from two different sources.
You can try to debug this at compile time by telling the preprocessor to output preprocessed files. Right-click the .cpp file, go to Properties, then C/C++ -> Preprocessor, and under "Generate Preprocessed File" select the option with or without line numbers.
Go to your include file and put pragmas around your newly added definitions, like:
#pragma starting_definition_X
...
#pragma ending_definition_X
Now compile everything. There will be a newly created file with the same name as the .cpp but with the extension .i.
Search it to see whether your pragmas are there. If not, your include comes from another place.
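To make the search concrete, here is roughly what you might see in the generated .i file, assuming you chose the "with line numbers" variant (the path and definition are invented for this example). The preprocessor passes unknown #pragma directives through to the output, and the nearest preceding #line directive names the copy of the header the definitions were actually pulled from.

#line 1 "c:\\some\\unexpected\\path\\myheader.h"
#pragma starting_definition_X
int my_new_function(int value);
#pragma ending_definition_X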
If you use precompiled headers, your .cpp should rebuild. There is also a #pragma once directive in MSVC, which makes the include file be parsed only once, but that should still cause your .cpp file to be recompiled.
Hope that helps,
Ovanes
Do you have the "Minimal rebuild" option turned on?
Visual Studio compares the timestamps on the files. So you might want to check that your system clock is set correctly and also that none of the files has a funny timestamp on it. Look at the include files, the .cpp files, the .pch files and the .obj files and make sure all the timestamps look reasonable. In particular, make sure none of them are in the future.
Were the .h files added to the project? If not, VS may be unable to work out the dependency.
Thanks for all the answers they have helped point me in the right direction.
I have discovered that deleting the .idb file and rebuilding will then allow subsequent modifications of .h files to cause the correct .cpp files to be built. However, this causes the entire project to be rebuilt, which just brings me back to Neil Butterworth's suggestion of doing a full rebuild. I don't think there is much else I can do about it.
As an aside, looking at the bad and good idb files I can see that the cpp file that was not being built is not in the bad idb, whereas it is in the good idb. The header file that is being changed is mentioned several times in both files.
win_pdbx (download) can extract the idb file and moyix has published some information about the streams in these files.
Stream 4 contains the file paths of the cpp files but I have not been able to determine the format.

Visual C++ 'Force Includes' option

I have just come across a Visual C++ option that allows you to force file(s) to be included - this came about when I was looking at some code that was missing a #include "StdAfx.h" in each .cpp file, but was actually pulling it in via this option.
The option can be found on the Advanced C/C++ Configuration Properties page and equates to the /FI compiler option.
This option could prove really useful but before I rush off and start using it I thought I'd ask if there are any gotchas?
I would say the opposite of litb's advice below if you're using precompiled headers. If you use "stdafx.h" as your precompiled header and have code like this:
#include "afile.h"
#include "stdafx.h"
then you'll spend an age trying to figure out why "afile.h" isn't being included. When using precompiled headers, all the #includes and #defines are ignored up to the #include "stdafx.h". So, if you force-include "stdafx.h", the above can never happen and you'll get a project that uses the precompiled option efficiently.
As for litb's comment about finding macros, good IDEs usually have an option to jump to the definition of a symbol, whether it be a #define, function, class, etc.
I would discourage the use of /FI (MSDN says it's called /FI; not sure whether I looked at the right page, though), simply because people reading the files - yourself included - won't notice that a header is magically included anyway.
You can be sure this will cost much debugging time for someone who wants to figure out where specific macros come from when there are no #include lines at the top of the file.
Force includes is also helpful for automatically generated source files. Our system uses a tool that generates many source files but they don't include our pre-compiled header file. With "force includes" compilation of these files is now faster.
Optionally, it would be possible to write a script to insert the #include line in those files after generation and before compilation. But why go to that trouble?
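For illustration, a sketch of how that can look in practice (the file and function names are invented): the generated file never mentions the precompiled header, yet still compiles with /Yu because the forced include is applied first.

// bindings_gen.cpp - hypothetical generator output; note there is no #include "stdafx.h"
// Project setting: C/C++ -> Advanced -> Forced Include File = stdafx.h  (the /FIstdafx.h switch)
// The compiler then behaves as if #include "stdafx.h" were the first line of this file.
#include "bindings_gen.h"

void RegisterGeneratedBindings()
{
    // ... generated registration code ...
}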
I'd side with litb: don't do the forced includes. Having the code be explicit makes it easier for a new user to know what's going on. It also makes it easier to know what's going on if you ever need to port the code to a new platform.
Note that if the code uses templates, Visual Studio usually can't track down the definitions correctly. Perhaps 2010 will be better, but even VS 2008 is problematic on this front.
I wouldn't use it that often, but it has its uses. I have used it to add a header that suppressed some warnings to all .cpp files so that I could turn on /W4 or /Wall for the project without having to edit every .cpp file to include the warning-suppression header first. Once everything was working I DID go back and edit all the .cpp files, but for a proof of concept /FI was useful.
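A sketch of the kind of warning-suppression header described above (the specific warning numbers are only examples), which /FI then injects ahead of every .cpp so /W4 can go project-wide without touching each file:

// warning_suppression.h
#pragma once
#pragma warning(disable: 4127)   // conditional expression is constant
#pragma warning(disable: 4512)   // assignment operator could not be generated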
Likewise, you can use it to force a precompiled header into .cpp files in some build configurations but not all (in case you want a build configuration that DOESN'T use precompiled headers and that makes sure each .cpp includes exactly what it needs). However, using #pragma hdrstop is, IMHO, a better way to achieve this.
I've talked about all of this on my blog here: http://www.lenholgate.com/blog/2004/07/fi-stlport-precompiled-headers-warning-level-4-and-pragma-hdrstop.html in a little more detail.
Save this option for when something weird comes up - like if you want to include a header in a generated source file. Even then there are likely better ways.