Files related to C++

I program in C++. When I look in the folder where a .cpp file is saved, I find files like .cpp, .o and .exe.
Are there other files related to running the program, such as .bak or .tds? What is the difference between them, and when is each one created? My impression is that the .cpp file is created when we save and the object file when we compile. When are the .bak and .exe files made? Or correct me.

Now, unless you created the file yourself, it's most likely somehow related to your toolset.
*.cpp, *.h => Source files. These are the ones you'll edit to do your programming.
*.o, *.obj => Object files. These are the translated versions of the source files (the *.cpp files, to be precise) and are the raw material for the linker.
*.exe => The executable. Once the linker is through with your object files, it chains all of them together to create the actual executable, which can be run by your OS (a minimal sketch of this pipeline follows the list).
*.bak => Typically a 'backup' file, often written before a risky operation so that it is easy to recover if something goes wrong.
*.tds => I know this one as a Turbo Debugger file. It is required by the IDE to step through the compiled code, keep symbols, etc.
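To make this pipeline concrete, here is a minimal sketch (the file name is made up, and the compile/link commands assume g++; any toolchain performs the same two steps):
// hello.cpp - a single translation unit
#include <iostream>
int main() {
    std::cout << "hello\n";
}
// compile: g++ -c hello.cpp       produces the object file hello.o
// link:    g++ hello.o -o hello   produces the executable (hello.exe on Windows)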

.cpp files are for source code.
.h files are for headers.
There may also be other files with various extensions which contain resources, project definitions,
build definitions, make-files - it depends on the development environment / compiler.
The .o files are binaries generated from the source code.
The .exe file is a Windows executable - probably the one file you need in order to run the program; it is created during the build from all the libraries and .o files.
.bak files are just backups of some previous version of the .cpp or .h files. Whether they are created depends on the development environment.

bak tends to indicate backup files of some sort, probably made by an editor when you edit the files. I had no idea initially about tds files but a cursory web search turned up the fact that they are likely files produced for Borland Turbo Debugger (tds stands for Turbo Debugger Symbols).
You may want to consider upgrading to a compiler that's been updated sometime in the last fifteen years. There's rarely an excuse for using technology that dated when there are far better (and free) alternatives available.
Generally, you write the .cpp files and the compiler compiles them. The .o files are source files that have been compiled into object files, and these are then combined into an .exe executable file.
Note that this all depends on your environment of course. For example, UNIX compilers rarely produce exe files since that's a DOS/Windows-ism.

Related

Do I have to recompile C++ code every time during development?

Let's say we have a large code base and we are doing development in C++. Do we have to recompile every time in order to test the code?
If yes, development is going to take ages.
What's the solution to this problem?
Yes, you'll definitely need to compile C++ code if you want to test it. C++ code can't be executed without being compiled.
However, if you organize your project smartly, compilation could take only a few seconds, or maybe up to a minute, even if there are thousands of files (or even more).
By default, your build system will run an incremental build, unless you explicitly request a "rebuild" or did a "clean" previously. It will then invoke the compiler/linker accordingly and make sure it compiles/links only what needs to be rebuilt: if a .cpp file did not change, there is no need to compile it. This is all based on file timestamps - if the generated "object" file is newer than the .cpp source file, the build system knows it is up to date and won't generate it again. Whether you use Visual Studio, CMake, or any other IDE/build system, they all support this!
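To illustrate the timestamp rule, here is a minimal sketch (this is not what any particular build system literally runs; it assumes C++17 and the <filesystem> library, and the file names are hypothetical):
#include <filesystem>
#include <iostream>
namespace fs = std::filesystem;
// True when the object file is missing or older than its source, i.e.
// when an incremental build would recompile this translation unit.
bool needs_rebuild(const fs::path& source, const fs::path& object) {
    return !fs::exists(object)
        || fs::last_write_time(object) < fs::last_write_time(source);
}
int main() {
    std::cout << std::boolalpha << needs_rebuild("main.cpp", "main.o") << '\n';
}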
Additionally, you can follow some guidelines to make this even faster:
Firstly, organize your project in modules (libraries), ideally dynamically linked. Then when a file from a library is changed, only that library needs to be compiled (other libraries or programs using the modified library won't have to be compiled again).
When you modify only an implementation file (.cpp file), only that file needs to be recompiled, plus a link of the module using it.
When you modify a header file (.h file), all .cpp files including it will need to be recompiled, so you must be careful to optimize your includes. Prefer forward declarations to includes whenever possible (a sketch follows this list); otherwise your header becomes a dependency of all .cpp files using the other header file that includes yours... as a cascade, modifying such a header file ends up requiring compilation of tons of .cpp files. Don't include files you don't need (because it will trigger a new, useless build when that header file changes). Possibly use precompiled headers to speed up compilation.
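For example, here is what the forward-declaration guideline looks like in practice (Widget and Registry are made-up names):
// registry.h
class Widget;                 // forward declaration: no #include "widget.h" needed
class Registry {
public:
    void add(Widget* w);      // pointers/references only need the declaration
private:
    Widget* last_ = nullptr;
};
// registry.cpp is then the only file that includes widget.h (it needs the full
// definition), so edits to widget.h no longer recompile everything that
// includes registry.h.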
Note: As commented, there are apparently some tools that can interpret C++ without compiling it... but that's not what C++ was designed for in the first place. And I doubt they will be as fast as compiled code at runtime... so you'll probably save 20 seconds of incremental build time and then lose minutes at runtime.

Path of least resistance when unit testing C++ code in an exe, in Visual Studio 2012

I'm in need of some sage advice here. Long story short, I'm rebuilding a - for me - relatively complex app comprising about 7000 lines of code. I ran into a number of issues when I created the first iteration of my application, and it seems to me that test-driven development might just be the ticket.
I was pleased to see that Visual Studio 2012 now natively supports TDD in C++, so I went ahead and read as much as I could. Unfortunately, VS2012 is fairly new and I feel the documentation is somewhat lacking. But this is a little beside the point. I'm relying mainly on the following guide on the MSDN site:
http://msdn.microsoft.com/en-us/library/hh419385.aspx#objectRef
It fairly clearly states that if the code under test is to be built as an .exe, then the way forward is creating a separate test project and linking the output object file. I'm guessing they mean the object files? Or maybe not?
I'm honestly a little confused as to how many .obj's I need to link. At first I thought I needed to link every single obj file which is fairly tedious.
If anyone has experience doing this and could perhaps also recommend which macros or similar short cuts to use in order to make this process as painless as possible, I'd be much obliged!
This will depend on how you have your solution structured. The way I like to structure my solutions is to have three projects.
A .lib project that has my source code in it.
An executable project, linked against the .lib. Its main() simply calls into the .lib (see the sketch after this list).
A test project (exe), linked with the .lib.
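As a sketch of how thin the executable project can be (run_app is a hypothetical entry point exported by the .lib project):
// main.cpp in the executable project
#include "run_app.h"             // header from the .lib project (made-up name)
int main(int argc, char* argv[]) {
    return run_app(argc, argv);  // all testable logic lives in the library
}
The test project links the same .lib and exercises the library's classes directly, so no application code has to be linked twice.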
With this structure you can use the Add New Reference... button in the Common Properties section and the references will be sorted for you (except the header include path found in C++\General\Additional include directories).
If you do not want to restructure your projects you can tell the linker about each obj file (Linker\Input\Additional dependencies). This may be a significant number of .obj files if you have a lot of classes that you want to test. Unfortunately, you may have issues if you use pre-compiled headers.
I would suggest restructuring the projects if you can.
There's a nifty option when you use a project dependency that lets you choose between linking the output file or having the IDE automatically select all the object files from the other project as dependencies.
(I originally used this in a project where a C++/CLI DLL included a native static library project; just do the same thing with a native test project including a native DLL or EXE project, choosing to link with the inputs.)
Unit Test Project for a Native Application (.exe) Project
Add the Unit Test Project to the Solution
Right Click on the Solution, Add, New Project. Under Visual C++, choose Native Unit Test Project.
Add the Application as a Reference to the Unit Test Project
Right click the unit test project, Properties, Common Properties, References: add the application project as a reference. This tells MSVC to rebuild the application if it has changed since the last unit test build, before rebuilding the unit test project.
Tell MSVC Where to Find the Application's Library and Object Files
Right click the unit test project, Properties, Linker, General: edit Additional Library Directories and add the path(s) to your application's object and library files.
Collect all the .obj and .lib Names
Run this batch file from the subdirectory or subdirectories where your application's object and library files are located, and concatenate the .txt files if there is more than one directory. For convenience, you might want to add the .txt file to your project.
:: *** CollectObjLibFilenames.bat ***
dir /B *.obj > ObjLibFilenames.txt
dir /B *.lib >> ObjLibFilenames.txt
Tell MSVC to Link the Application Object Files to the Unit Test Application
Right click the unit test project, Properties, Linker, Input: edit Additional Dependencies and add the application's object and library (.obj and .lib) file names (copy and paste them from ObjLibFilenames.txt).
If your application project uses precompiled headers, don't forget to include the precompiled header object file(s), usually stdafx.obj. If you omit it, you will get a LNK2011 error.
Microsoft says "If you use precompiled headers, LINK requires that all of the object files created with precompiled headers must be linked in."
I thought there would be a name collision if I added the object file containing my application's entry point, main(int argc, char *argv[]), but my unit test projects link successfully with or without main.obj. I have not tried linking a file with other entry-point flavors (WinMain, wWinMain, wmain). If you have a name collision with one of those, you could always change the name of your entry point (which would be weird): Properties, Linker, Advanced, edit the Entry Point, and rename the application's entry point function correspondingly. The option is not specified in the unit test project I just looked at, which I assume means the default, which is almost surely main(int argc, char *argv[]).
My main.cpp files have only one function (main) and no globals, i.e. no other part of the application refers to anything in main.cpp. I assume you can get away with omitting any object file if nothing in it is referenced by a linked file. It's not worth the effort to figure out which files satisfy that requirement for small applications. For large applications... good luck with that; eventually you'll want to test all your execution paths anyway.
You will likely have a precompiled header object file, stdafx.obj, in the unit test project as well as the one in your application project. That will not be a problem, as the default names for the precompiled header files are $(TargetName).pch, where $(TargetName) resolves to the project name, i.e. the .pch files will have different names.
Suggestion: rather than copying the contents of my application's stdafx.h file into the corresponding unit test file, include the application's stdafx.h in the unit test project's stdafx.h file, so you don't have to update the unit test's version when the application's file changes. #include <stdafx.h> works, but I use the relative path between the two projects (if their relative paths are stable), or the full pathname of the application's source file if that's more stable, to be sure the right file is found. See difference-between-include-hpp-and-include-hpp for an unsettling explanation of how #include "header.h" and #include <header.h> are interpreted. Spoiler: it's another implementation-specific feature of C++.
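A sketch of that suggestion (the relative path is hypothetical; adjust it to your solution layout):
// stdafx.h in the unit test project
#pragma once
#include "../MyApplication/stdafx.h"  // forward to the application's PCH header,
                                      // so the two stay in sync automatically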
_________________________________________________________________________
As an aside, precompiled header files are specified on a per-source-file (.cpp) basis. An individual .cpp file can use only one precompiled header file, but you can have more than one precompiled header file in the same project.

Building project on Codelite(c++) recompiles too many files

I have a C++ project with many files. When I build the project after making even small changes to the code, it recompiles a large number of files, which is increasing the compile time of the project. So I would like suggestions about ways I can improve the structure of the project, or any other optimisations that will help reduce its compile time.
Also, there are a couple of files which get recompiled even when I make no changes to the project. Somehow make doesn't detect that those files need not be recompiled, or maybe I am missing something.
I am using Codelite on Linux (Ubuntu) for my project. The language is C++.
The link provided above will give you a detailed explanation. Just to put it in a lighter way, I am adding a few more things.
If you change something in a .cpp file, only that file will get recompiled. If it is a header file, then it is a different story: when a header file is included across multiple modules and you make changes to it, all the associated .cpp files will get recompiled, and that is the way it should be. So you need to be careful while writing the code if you really care about this aspect; at a later stage it will be difficult to manage such things.
Regarding the compilation of some files even when you don't change anything, this may happen when:
1] we tell the compiler to do so,
2] some .cpp or header files are generated or modified automatically while running your build,
3] there is a change in a file timestamp,
4] there is a change in a folder name/structure.
Last but not least, also try changing the Codelite build target.
These are the possibilities I know. There will be more [definitely :)]...

How to build midas.obj from the midas source code

Recently I discovered a problem in midas and I fixed it. The problem now is that I want to use MidasLib, not midas.dll, and with the source code I'm only able to build the DLL.
The source is C++ and I have very little knowledge of it. I know MidasLib.pas internally uses midas.obj, so I need to create it in order to statically link midas into my application. How do I do it in C++ Builder? (XE)
When you compile C++ code, the compiler creates an .obj file for each .cpp/.c file you have and saves them somewhere on your computer. In most cases one would then run a linker on all of those .obj files to join them into a single EXE or DLL, but in your case you don't need that result. C++ Builder, like most programming IDEs, automatically does both the compilation and the linking.
If you just want the .obj, you need to find where in your project folder C++ Builder is placing its .obj files (typically called its "intermediate output", as it is the intermediate step between compilation and linking). There must be a source file called midas.cpp or midas.c that produces a corresponding output file called midas.obj.
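If you prefer to do it by hand, the compile-only step is a one-liner; as a sketch (assuming the bcc32 command-line compiler that ships with C++ Builder, whose -c switch means compile to .obj without linking):
bcc32 -c midas.cpp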

Sharing precompiled headers between projects in Visual Studio

I have a solution with many Visual C++ projects, all using PCH, but some have particular compiler switches turned on for project-specific needs.
Most of these projects share the same set of headers in their respective stdafx.h (STL, boost, etc). I'm wondering if it's possible to share PCH between projects, so that instead of compiling every PCH per-project I could maybe have one common PCH that most projects in the solution could just use.
It seems possible to specify the location of the PCH as a shared location in the project settings, so I have a hunch this could work. I'm also assuming that all source files in all projects that use a shared PCH would have to have the same compiler settings, or else the compiler would complain about inconsistencies between the PCH and the source file being compiled.
Has anyone tried this? Does it work?
A related question: should such a shared PCH be overly inclusive, or would that hurt overall build time? For example, a shared PCH could include many STL headers that are widely used, but some projects might only need <string> and <vector>. Would the time saved by using a shared PCH have to be paid back at a later point in the build process, when the optimizer would have to discard all the unused stuff dragged into the project by the PCH?
Yes, it is possible and I can assure you, the time savings are significant. When you compile your PCH, you have to copy the .pdb and .idb files from the project that is creating the PCH file. In my case, I have a simple two-file project that is creating a PCH file. The header will be your PCH header, and the source will be told to create the PCH under the project settings - this is similar to what you would normally do in any project. As you mentioned, you have to have the same compile settings for each configuration, otherwise a discrepancy will arise and the compiler will complain.
Copying the above-mentioned files every time there is a rebuild, or every time the PCH is recompiled, is going to be a pain, so we will automate it. To automate copying, perform a pre-build event where the above-mentioned files are copied over to the appropriate directory. For example, if you are compiling Debug and Release builds of your PCH, copy the files from the Debug folder of your PCH project over to your dependent project's Debug folder. So a copy command would look like this:
copy PchPath\Debug\*.pdb Debug\ /-Y
Note the /-Y at the end. After the first build, each subsequent build is compiled incrementally, so if you replace the files again, Visual Studio will complain about corrupted symbols. If they do get corrupted, you can always perform a rebuild, which will copy the files again (this time it won't skip them, as they no longer exist - the clean-up deletes the files).
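For completeness, a pre-build event covering both file types mentioned above might look like this (PchPath and the Debug folders are illustrative):
copy PchPath\Debug\*.pdb Debug\ /-Y
copy PchPath\Debug\*.idb Debug\ /-Y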
I hope this helps. It took me quite some time to be able to do this, but it was worth it. I have several projects that depend on one big framework, and the PCH needs to be compiled only once. All the dependent projects now compile very quickly.
EDIT: Along with several other people, I have tested this under VS2010 and VS2012 and it does appear to work properly.
While this is an old question, I want to give a new answer that works in Visual Studio 2017 and does not involve any copying. The only disadvantage: Edit and Continue doesn't work anymore.
Basically you have to create a new project for the precompiled header and have all other projects depend on it. Here is what I did:
Step by step:
Create a new project within your solution which includes the header (called pch.h from here on) and a one-line .cpp file which includes pch.h (a sketch of these two files follows this list). The project should create a static lib. Set up the new project to create a precompiled header. The output file needs to be accessible by all projects; for me this is relative to $(IntDir), but with default settings it could be relative to $(SolutionDir). The pch project must only have defines that all other projects have too.
Have all other projects depend on this new project. Otherwise the build order might be wrong.
Set up all other projects to use pch.h. The output file parameters must be the same as in the pch project. Additional include directories also need to point to the pch.h directory. Optionally, you can force-include the pch file in every .cpp (or you include it manually in the first line of every .cpp file).
Set up all projects (including the pch project) to use the same compiler symbol file (the linker symbol file is not affected). Again, in my example this is relative to $(OutDir), but in your solution this might vary. It has to point to the same file on disk. The Debug Information Format needs to be set to C7, otherwise Visual Studio will not be able to compile projects in parallel.
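A sketch of the two files in the pch project (the header contents are just an example of typically shared includes):
// pch.h - the shared precompiled header
#pragma once
#include <map>
#include <string>
#include <vector>
// pch.cpp - the one-line source file; it is the one marked "Create (/Yc)" in the
// precompiled header settings, while everything else uses "Use (/Yu)"
#include "pch.h"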
I hope I didn't forget anything. For my solution (130k LOC, 160 projects) this led to a compile time of ~2:30 min instead of ~3:30 min.
It seems it's not possible, because each source file has to be compiled against the same PDB against which the PCH was compiled. Darn it.
Samaursa's answer worked for me.
I also saw this link that works (look for Reginald's answer near the bottom).
This one uses copy while Reginald's uses xcopy (I prefer xcopy). Either way, thanks - this sped up my builds considerably.
This sounds like a case of "diminishing returns" to me. Suppose including the common headers directly wastes 1 second per .cpp file, and each target (DLL/EXE) has 10 .cpp files. By using a .pch per target, you save 10 seconds per target. If your whole project has 10 targets, you save 1.5 minutes on the whole build, which is good.
But by reducing it to one .pch for the whole project, you'd only save a further 9 seconds: each of the 10 per-target .pch compilations costs roughly that same 1 second, and you would still keep one of them. Is it worth it? The extra effort (which may be a lot more fiddly to set up, being a non-standard configuration unsupported by VS wizards) produces only a tenth of the saving.
On 2012, you can use a PDB and just build the pch from a lib project that builds only the pch, with the main project that depends on the pch lib building into the same directory (no copying). Unfortunately, this doesn't work with 2013+ except via a long-winded workaround.