I have a project in IAR Workbench that requires a custom build step to build an intermediate file; this file is put together from a set of other files with a common extension (i.e. there is an intermediate linking step for the input files, which are written in a domain-specific language).
It appears to me that the "Custom Tool" can only process a single input file at a time. Is this true, or is there a checkbox that needs to be set so that all files matching the extension list are passed in a single run?
There is no support for having custom build steps which can consume more than one source file at a time. The custom build step works like a compiler, and not like a linker.
Put the files with the common extension into their own group folder. Then right-click the group folder and select Options, Custom Build, Override inherited settings.
Then you can use a "make" program to generate the intermediate file from all of the secondary source files, by listing those filenames in a file that make reads.
Yes, make will be run once for every file in the group, but since the intermediate file will be newer than all of the secondary source files after the first run, the invocations for the remaining secondary source files won't do much.
Not perfect, but it should work. The downside is maintaining the file that lists all of the input files used to create the intermediate file.
In newer workbench versions (I checked EWARM 7.60 and newer) the custom build step accepts a list of files for both the input and output of the tool. These file lists are both added to the internal dependency tree.
The file extension for the custom build step does not necessarily need to match the "real" generated files. You can also use a "fake" file (e.g. dummy.step) to run an external tool via an external batch file, which then provides all the necessary files to the tool at once.
The disadvantage of this approach is that you need to manage the list of files manually, and in two places (in the external batch file for the tool and in the build step configuration for a correct dependency tree).
I have a C++ project for which I need to run a custom build tool on some header files to generate code that is required when compiling this project. In general my configuration works. I trigger a build, VS/MSBuild detects whether the output files are up-to-date and runs the custom build tool only if necessary.
However, a problem arises if the build is run in combination with another configuration of the same project. Both configurations depend on the output files of the custom build tool, so if they are run sequentially, only one configuration should trigger the custom build tool. For whichever configuration a build is triggered second, the output files of the custom build tool are already present and up to date, so there is no need to build them again. Unfortunately, building them again is exactly what happens. Since the custom build tool takes quite some time to run, this increases build times dramatically.
Another interesting aspect is that once both configurations have run, I can trigger any of them again and the custom build tool is not invoked.
What I would have expected from the documentation is that the custom build tool is triggered:
If any of the files specified as Outputs is missing
If the file for which I specified the custom build tool was modified later than any of the existing files specified as Outputs
If any of the files I specified as Additional Dependencies were modified later than any of the existing files specified as Outputs
But all of this should be independent of the configuration for which the build was triggered.
Does anyone have an idea on why this might happen? I checked that the settings for the custom build tool are identical for both configurations. The output files are generated into the same folder for both configurations.
The documentation you're referring to is basically correct, but it omits to say that everything in there applies per project configuration/platform: the build uses tracker.exe, which depends on .tlog files, and those go into the intermediate directory by default. So, as you figured out, making all configurations use the same location for the .tlog files should keep the tracker happy and only invoke the custom build tool when needed, independent of configuration/platform. I'm not sure I'd recommend any of this, though; sharing temporary object files might cause you problems later.
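If you do go that route, a minimal sketch of what it could look like in each .vcxproj follows. The TLogLocation property name and the shared folder path are assumptions to verify against your toolset version, not something taken from the question:

<!-- Sketch only: point every configuration at one shared .tlog folder so the tracker
     sees the same dependency logs regardless of configuration/platform. -->
<PropertyGroup>
  <TLogLocation>$(SolutionDir)SharedTLogs\$(ProjectName).tlog\</TLogLocation>
</PropertyGroup>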
Another way to deal with this is adding a separate project with just one configuration, say 'Custom', and doing the custom build there. Then make your current project(s) depend on that project, and in the solution's Configuration Manager adjust all entries so that each configuration you have now builds the 'Custom' configuration of the new project.
We debug our binary using an IAR workspace (.eww) that wasn't used to build the binary; the build was done using make from the command line. The makefiles were generated by a build system (exactly how is lost in the mists of time).
Is there a way to add the sources to the .eww after the make, i.e. automatically traverse the source file directory structure and add the same sources that make uses? There are multiple copies of some of the sources in the structure due to some sloppy copy & pasting, i.e. the same file, two copies, possibly identical, in different directories.
Project files (.ewp) are simply xml files. Source files are listed as
<file><name>path\to\source.c</name></file>
inside the <project> node.
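For reference, a slightly fuller fragment (the group name and paths here are invented) showing how groups and files nest:

<project>
  <group>
    <name>src</name>
    <file><name>$PROJ_DIR$\..\src\main.c</name></file>
    <file><name>$PROJ_DIR$\..\src\util.c</name></file>
  </group>
</project>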
You could create a script (Python, for example) which either searches for the files itself or receives them as parameters from make. The script could then filter out duplicates and update the IAR project file accordingly.
But if you are using the IAR compiler to build and the Workbench to debug, then maybe ditch the makefile completely and create a project file from the ground up, so you can build and debug directly from the IDE?
I have several projects in my solution, one of which has some test scripts that get copied as part of a post-build rule. Is there a way to run the post-build rule without doing a "rebuild only" for that project when I want them run?
You could use a Custom Build Step instead of the post-build event and specify some dummy, non-existent Output file. In this case the Custom Build Step will run on every build, even if the project itself is up to date.
Quote from MSDN:
In Outputs, specify the name of the output file. This is a required entry; without a value for this property, the custom build step will not run. If a custom build step has more than one output, separate file names with a semicolon. The name of the output file should be what is specified in the Command Line property. The project build system will look for the file and check its date. If the file is newer than the input file or if the file is not found, then the custom build step will run.
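In a .vcxproj this corresponds roughly to something like the following sketch (the command, paths, and file names are placeholders, not taken from the question); depending on your VS version you may also want to set the step's "Execute After" event in the Custom Build Step property page so it runs where you expect:

<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
  <CustomBuildStep>
    <!-- Placeholder command: copy the test scripts on every build -->
    <Command>xcopy /Y /I "$(ProjectDir)TestScripts\*" "$(OutDir)TestScripts\"</Command>
    <!-- Deliberately names a file that is never created, so the step is never "up to date" -->
    <Outputs>$(IntDir)always_run.dummy</Outputs>
    <Message>Copying test scripts...</Message>
  </CustomBuildStep>
</ItemDefinitionGroup>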
I'm setting up a TFS 2010 build server.
But I currently have the problem that projects with a "custom build tool" won't build because of "no access".
The projects are C++.
The custom build tool is "Pro*C". Basically, you have a *.pc file, and it generates a *.cpp file.
But on the build server, the directory/files are read-only, so the tool fails with "no access".
I have tried to remove the *.cpp file from TFS, so that the generated file does not exist. But apparently the folder is also read-only and won't allow a new file to be created.
Does anyone have a workaround for this?
I would suggest one of these options:
Don't put the .cpp files in source control if they are generated by the build process
Change the settings for Pro*C to write its output files to a writable intermediate folder
Add a build step that copies the problem files to an intermediate location for processing
Add a build step that forces those files to be writable for processing
If you only need to build those files occasionally, then build them manually with Pro*C on your developer machine and check in the results, so you don't waste time unnecessarily rebuilding unchanging files with every desktop or build-server build.
There is a project called MSBuild Community Tasks, which can be downloaded from http://msbuildtasks.tigris.org/. I have used it myself for an automated TFS build.
It provides several extensions for your MSBuild project. One of these is the so-called Attrib task, which lets you change file (and probably folder) attributes from within your project. It's not listed in the table on the website, but it is covered in their documentation. As a sample from that documentation, you can add the following line to a target of your choice:
<Attrib Files="Test\version.txt" ReadOnly="true" Hidden="true" System="true"/>
I think this will also work for an item group, as follows:
<Attrib Files="@(AllYourFiles)" ReadOnly="true" Hidden="true" System="true"/>
If you want to use it, don't forget to install it on the build server. ;)
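For the Pro*C case above, after installing the tasks you would import the targets file and could then clear the read-only flag on the generated files before the custom build tool rewrites them. This is only a sketch: the file pattern and the hook target are assumptions about your project layout, and it assumes Attrib clears an attribute when it is set to "false":

<Import Project="$(MSBuildExtensionsPath)\MSBuildCommunityTasks\MSBuild.Community.Tasks.Targets" />

<Target Name="MakeGeneratedWritable" BeforeTargets="CustomBuild">
  <ItemGroup>
    <!-- Hypothetical glob for the Pro*C-generated sources; adjust to your layout -->
    <GeneratedCpp Include="$(ProjectDir)**\*.cpp" />
  </ItemGroup>
  <!-- Assumption: ReadOnly="false" clears the flag rather than leaving it unchanged -->
  <Attrib Files="@(GeneratedCpp)" ReadOnly="false" />
</Target>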
I know this can be done using a makefile; alas, I am not using a makefile, but rather Eclipse's "managed" C++ project. :( In any case, this is what I'm trying to accomplish:
I have a C++ project in Eclipse. I have a number of XML files in this project that specify information about source files that are auto-generated. When the XML files change, I would like a custom tool executed to convert them. The challenges are:
These resources need to be built prior to compiling the project since they specify source and header files used in the compilation.
They should be built only when the XML file is modified (header files are generated, so this is to avoid needless recompilation because a file timestamp changed).
I'd like it included as part of the build process of this project, not a separate project.
I see moving to CMake in the future, but for the time being I am trying to make do.
You can add a pre-build command in Project Properties -> C/C++ Build -> Settings -> Build Steps. This command will be executed before each build.
I'm not aware of a built-in way to get the command to run only when the XML file is modified. If you want that, then instead of making the pre-build command the XML-converting command itself, you can make it a "make" invocation with a small makefile containing a single rule whose prerequisite is the XML file and whose recipe calls the XML-converting command.