I have a series of standard cpp files, and each of these files contains a file-specific #include statement. However, the content of those included files must be populated by a pre-processing tool prior to invoking the standard C++ compiler.
The tricky part is that I want this to be fully integrated into Visual Studio using MSBuild. Therefore, when I bring up Visual Studio's Property Window on a cpp file, I want to see all the standard C++ compiler options and, ideally, some Custom Properties controlling the pre-processor tool. As an OOP analogy, I want my build tool to inherit everything from the standard CL MSBuild Rule, and add some Custom Properties & Build Steps to it.
I have successfully done this through an extremely laborious process of basically creating a Custom MSBuild Rule and copy/pasting most of the C++ options into my Custom Rule. I then pass along the million C++ options to the standard C++ compiler through the CommandLineTemplate entry in my MSBuild .props file. It's ridiculously complicated, and the C++ options don't automatically get updated as I update Visual Studio.
I can find plenty of examples of Custom MSBuild Rules, but I haven't been able to find one where it piggybacks onto an existing one.
Not a lot of love for MSBuild, I take it...
Anyway, after years of going back and forth on that one, I finally found something, soon after I posted my question. The key was to search for "extending" an existing Rule which, apparently, I hadn't tried before.
Usually, when you create a Build Customization in VS, you end up with 3 files:
MyCustomBuild.xml:
Contains the Properties & Switches, as shown on the VS's Property Sheet.
MyCustomBuild.props:
Contains default values for those Properties. They can be made conditional through the use of the Condition attribute.
MyCustomBuild.targets:
Contains a line to load up your xml and the Target/Task entries.
So the first part was to extend the existing C/C++ Properties as shown in Visual Studio. I found this link, which finally gave me something to work with:
https://github.com/Microsoft/VSProjectSystem/blob/master/doc/extensibility/extending_rules.md
Here's the xml bit.
<Rule
Name="RuleToExend"
DisplayName="File Properties"
PageTemplate="generic"
Description="File Properties"
OverrideMode="Extend"
xmlns="http://schemas.microsoft.com/build/2009/properties">
<!-- Add new properties, data source, categories, etc -->
</Rule>
Name attribute:
The Name attribute must match the rule being extended. In this case, I wanted to extend the CL rule, so I set that attribute = "CL".
DisplayName attribute:
This is optional. When provided, it will overwrite the tool's name seen on the Property Sheet. In this case, the tool name shown is "C/C++". I can change it to show "My C/C++" by setting this attribute.
PageTemplate attribute:
If this attribute is provided, it must match the overridden rule's value; in this case, it would be "tool". Just leaving it out seems to work fine. I suspect it could be needed if 2 rules had the same name but a different template; you could use it to clarify which one you wanted to extend.
Description attribute:
Optional. I don't know where that even shows up within the VS GUI. Maybe it's just for documenting the xml file.
OverrideMode attribute:
This is an important one! It can be set to either "Extend" or "Replace". In my case, I chose "Extend".
xmlns attribute:
Required. Doesn't work properly if not present.
As the link suggests, you can then provide the properties, data source and categories. Keep in mind that categories are usually displayed in the order they appear in the xml file. Since I was extending an existing Rule, my custom categories would all show up after the standard C/C++ categories. Given that my tool is for pre-processing the files, I would have preferred having my custom options at the top of the Property Sheet, but I couldn't find a way around that.
Note that you do NOT need the ItemType/FileExtension or ContentType properties typically found in custom Rules.
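As an illustration, a minimal sketch of such an extension could look like this (the "PreProcessing" category and "PPToolOptions" property are placeholder names invented for this example):

<Rule
  Name="CL"
  DisplayName="My C/C++"
  OverrideMode="Extend"
  xmlns="http://schemas.microsoft.com/build/2009/properties">
  <Rule.Categories>
    <Category Name="PreProcessing" DisplayName="Pre-Processing" />
  </Rule.Categories>
  <!-- New properties attach to the ClCompile Item list, alongside the standard C/C++ properties -->
  <StringProperty
    Name="PPToolOptions"
    DisplayName="Pre-Processor Options"
    Description="Extra options passed to the pre-processing tool"
    Category="PreProcessing" />
</Rule>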
So once I entered all of that, my custom pre-processing options showed up alongside the standard C/C++ properties on the Property Sheet. Note that all these new properties would be attached to the "ClCompile" Item list, with all the other C/C++ properties.
The next step was to update the .props file. I'm not going to get into it since it's pretty much standard when creating these custom build Rules. Just know that you need to set the defaults using the "ClCompile" Item, as mentioned above.
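For example, a default value for the placeholder property from the sketch above could be declared like this:

<ItemDefinitionGroup>
  <ClCompile>
    <!-- Default: no extra pre-processor options -->
    <PPToolOptions></PPToolOptions>
  </ClCompile>
</ItemDefinitionGroup>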
The final step was to get the .targets file to do what I wanted.
The first part was to "import" (not really an import entry) the custom Rule through the typical entry:
<ItemGroup>
  <PropertyPageSchema Include="$(MSBuildThisFileDirectory)MyCustomBuild.xml" />
</ItemGroup>
Then I needed to pre-process every source file. Ideally, it would have been nicer to pre-process a file and then compile it - one file at a time. I could have done this by overriding the "ClCompile" Target within my own .targets file. This Target is defined in the "Microsoft.CppCommon.targets" file (the location under "C:\Program Files (x86)" varies, depending on the VS version). I basically could have Cut & Pasted the whole Target into my file, then added my pre-processing Task code before the "CL" Task. I also would have needed to convert the Target into a Target Batch, by adding an Outputs="%(ClCompile.Identity)" attribute to the "ClCompile" Target. Without this, my pre-processing Task would have run on all files before moving on to the "CL" Task, bringing me back to square one. Finally, I would have needed to deal with Pre-Compiled Header files, since they need to be compiled first.
All of this was just too much of a pain. So I selected the simpler option of defining a Target which looks like this:
<Target Name="MyPreProcessingTarget"
Condition="'#(ClCompile)' != ''"
Outputs ="%(ClCompile.Identity)"
DependsOnTargets="_SelectedFiles"
BeforeTargets="ClCompile">
There are a number of attributes defined but the most important one is the BeforeTargets="ClCompile" attribute. This is what forces this target to execute before the cpp files are compiled.
I also chose to do Target Batching here [Outputs="%(ClCompile.Identity)"] because it was just easier to do what I wanted if I could assume one file was being processed at a time within my Target.
The attribute DependsOnTargets="_SelectedFiles" is used to detect whether the user has files selected within VS's Solution Explorer. If so, those files will be stored in the @(SelectedFiles) Item List (generated by the "_SelectedFiles" Target). Typically, when selecting specific files within the Solution Explorer and choosing to compile them, VS will forcefully compile them even if they are up-to-date. I wanted to preserve that behavior for the automatically-generated pre-processed include files, and forcefully regenerate them as well for those selected files. So I added this block:
<ItemGroup Condition="'@(SelectedFiles)' != ''">
  <IncFilesToDelete Include="%(ClCompile.Filename)_pp.h"/>
</ItemGroup>
<Delete
  Condition="'@(IncFilesToDelete)' != ''"
  Files="%(IncFilesToDelete.FullPath)" />
Note that the automatically-generated include files are named SourceFileName_pp.h. By deleting those files, my pre-processing Task will forcefully re-generate them.
Next, I build a new Item list from the "ClCompile" Item list, but with the "_pp.h" versions of the files. I do so with the following code:
<ItemGroup>
  <PPIncFiles
    Condition="'@(ClCompile)' != '' and '%(ClCompile.ExcludedFromBuild)' != 'true'"
    Include="%(ClCompile.Filename)_pp.h" />
</ItemGroup>
The final part is a little uglier.
In order to run my pre-processing exe, I use the standard "Exec" Task. But I obviously only want to run it if the source file is newer than the generated file. I do so by storing the well-known metadata "ModifiedTime" of the source file and of the generated file into a couple of dynamic Properties. But I can't use the ModifiedTime metadata directly, as it's not a comparable value. So I used the following code, which I found on StackOverflow here:
Comparing DateTime stamps in Msbuild
<PropertyGroup>
  <SourceFileDate>$([System.DateTime]::Parse('%(ClCompile.ModifiedTime)').Ticks)</SourceFileDate>
  <PPIncFileDate Condition="!Exists('%(PPIncFiles.Identity)')">0</PPIncFileDate>
  <PPIncFileDate Condition="Exists('%(PPIncFiles.Identity)')">$([System.DateTime]::Parse('%(PPIncFiles.ModifiedTime)').Ticks)</PPIncFileDate>
</PropertyGroup>
Note that I can store the timestamps in Properties, given that the Item Lists only contain one Item per Target Pass, because of Target Batching.
Finally, I can invoke my pre-processor using the "Exec" Task, as follows:
<Exec
  Condition="'@(PPIncFiles)' != '' and $(SourceFileDate) > $(PPIncFileDate)"
  Command="pptool.exe [options] %(ClCompile.Identity)" />
Supplying the options was yet another headache.
Typically, the switches defined in the xml file are passed to a "CommandLineTemplate" metadata in the .props file using [OptionName], which passes along the "Switch" attribute of the Property defined in the xml file. But that implies defining your own Task, created from a TaskFactory, in the .targets file. In my case, I was just using the existing "Exec" Task, which doesn't know anything about my custom Properties. I didn't know how to retrieve the "Switch" attribute in this case; what seems to be available is just whatever the "Name" attribute contains. Luckily, a Property has both a Name and a DisplayName, and the DisplayName is what the GUI user sees. So I just copied the "Switch" value into the "Name" value when defining the Properties in the xml file. I could then pass the options to the Exec Task using something like:
<Exec
  Condition="'@(PPIncFiles)' != '' and $(SourceFileDate) > $(PPIncFileDate)"
  Command="pptool.exe %(ClCompile.Option1) %(ClCompile.Option2)... %(ClCompile.Identity)" />
Where I defined all my Properties as "EnumProperty", with one "EnumValue" having Name="" for the disabled option, and the other EnumValues having Name="switch" for the rest. Not very elegant, but I didn't know a way around this.
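As a sketch of that workaround (the option name and switch are invented for the example, matching the %(ClCompile.Option1) usage above):

<EnumProperty Name="Option1" DisplayName="Verbose Output" Category="PreProcessing">
  <!-- The Name attribute holds the actual switch, so %(ClCompile.Option1) expands to "" or "-v" -->
  <EnumValue Name="" DisplayName="Disabled" />
  <EnumValue Name="-v" DisplayName="Enabled" />
</EnumProperty>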
Finally, it is recommended that when automatically generating files, the .targets file should also include a way to clean them up when the user Cleans up the Project. That's pretty standard but I'll include it here for convenience.
<PropertyGroup>
  <CleanDependsOn>$(CleanDependsOn);PPIncCleanTarget</CleanDependsOn>
</PropertyGroup>
<Target Name="PPIncCleanTarget" Condition="'@(ClCompile)' != ''">
  <ItemGroup>
    <PPIncFilesToDelete Include="%(ClCompile.Filename)_pp.h" />
  </ItemGroup>
  <Delete Files="%(PPIncFilesToDelete.FullPath)" Condition="'@(PPIncFilesToDelete)' != ''"/>
</Target>
I'm using Kate to process text to create an XML file but I've hit a roadblock. The text now contains additional data that I need to remove based on its content.
To be specific, I have an XML element called <officers> that contains 0 or more <officer> elements, which contain further elements such as <title>, <name>, etc. While I probably could exclude these at run time using XSL, the file also drives another process - a general-purpose data importer for Scribus - whose code I don't want to touch.
What I want to do is remove an <officer> element if the <title> content isn't what I want. For example, I don't want the First VP, so I'd like to remove:
<officer>
  <title>First VP</title>
  <incumbent>Joe Somebody</incumbent>
  <address>....</address>
  <address>....</address>
  ......
</officer>
I don't know how many lines will be in any <officer> element, nor what position it will be in within the <officers> element.
The easy part is getting to the start of the content I want removed. The hard part is getting to the </officer> end tag. All the solutions I've found so far just result in Kate deciding that the RegEx is invalid.
Any suggestions are appreciated.
Regex is the wrong tool for this job; never process XML without a proper parser, except possibly for a one-off job on a single document where you will throw the code away after running it and checking the results by hand. You might find a regex that works on one sample document, but you'll never get it to work properly on a well-designed set of 100 test documents.
And it's easily done using XSLT. It's a stylesheet with two template rules: a default "identity template" rule to copy elements unchanged, and a second rule to delete the elements you don't want. In fact in XSLT 3.0 it gets even simpler:
<xsl:mode on-no-match="shallow-copy"/>
<xsl:template match="officer[title='First VP']"/>
I have the following Import in a .csproj file. How can I find the value of $(Variable) at the point of this Import?
<Import Project="<path_to_abc>\$(Variable)\abc.props"
        Condition=" '$(Variable)' != '' "
/>
I get a build error:
can not find props files : <path_to_abc>\\abc.props
How can I see the value of Variable here? If I put a Message task in the .csproj file above the Import, what Target dependency should I give?
<Target Name="PrintInfo" BeforeTargets="BeforeBuild">
<Message Text="'$(Variable)' $(Variable.length) " />
</Target>
gives me '' 0.
But is that because of BeforeBuild?
How does MSBuild work?
Does it process all properties first, in the order they appear?
And when it then runs the Targets, does it print the value as it is at that moment?
How can I see what values are in place while the props are being evaluated?
Sadly the imports aren't currently logged, but this is about to change with the upcoming MSBuild 15.3 release and its binary logging feature.
A call to MSBuild using /flp:Verbosity=diagnostic will emit property reassignment events to an msbuild.log file like this:
0>Property reassignment: $(Foo)="bar" (previous value: "foo") at /Users/martin.ullrich/tmp/test.proj (10,5)
The log will then contain an Initial Properties list with the project's fully evaluated properties (including imports).
It is essential to understand that property definitions and import statements are processed in order, so when an <Import> uses a property - either in its condition or in its project path - it will use the value of the property at that moment.
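To make that concrete, here is a small sketch (the property name mirrors the question):

<PropertyGroup>
  <Variable>v1</Variable>
</PropertyGroup>

<!-- At this point the Import sees $(Variable) = "v1" -->
<Import Project="dir\$(Variable)\abc.props" Condition="'$(Variable)' != ''" />

<PropertyGroup>
  <Variable>v2</Variable>
</PropertyGroup>

<!-- Any Target (including one hooked to BeforeBuild) runs after evaluation and sees "v2" -->
<Target Name="PrintInfo" BeforeTargets="BeforeBuild">
  <Message Text="Variable is now: $(Variable)" Importance="high" />
</Target>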
There are a few other important aspects:
Property groups are processed before item groups and item definition groups. Even across all imported projects! (so an <ItemGroup> with a condition will see the values of properties defined/imported afterwards)
See Property and Item evaluation order
Target conditions are evaluated at the time the target is considered for execution, and may be affected both by all imported project files and by modifications made in other targets that have already run.
See Target Build Order
I have a Target which I want to run once if none of the files in my ItemGroup exist.
<ItemGroup>
  <Foo Include="a.txt;b.txt;c.txt" />
</ItemGroup>
<Target Name="Bar" Condition="?">
My question is what to put in the '?'.
You can use another Target to go through the file list (which is what Targets are good at!) and leave a result in a Property. Make a wrapper Target that depends on the tester Target and on Bar. Bar has a Condition that uses the Property set by the first Target.
Note that a global Property set within a Target is not seen until that Target has finished, so the idiom is to wrap the thing producing the result and the thing consuming the result as dependencies of an empty Target.
I think you also need to make the tester a dependency of Bar, to make sure the order is correct.
Something like this:
<Target Name="TestLoop" Outputs="%(Foo.Identity)">
<PropertyGroup>
<Tested Condition="Exists(%(Foo.Identity))">Present</Tested>
</PropertyGroup>
</Target>
After TestLoop runs, the Property Tested will be set to "Present" if and only if at least one of the files is present. That is, it implements a looping logical OR.
Now if you use this as a dependency:
<Target Name="Wrapper" DependsOnTargets="TestLoop;Bar" />
then you can have Bar look at the state left by TestLoop.
Assuming they are executed in the correct order, not in parallel! To ensure that, also make TestLoop a prerequisite of Bar; the build engine will then determine the needed sequence and know not to try running Bar until after TestLoop is done.
Oh, and Wrapper is the target to ask for. As described above, if you ask for Bar directly, it will not see the property update (I think). So you might rename things to make Wrapper the exposed "Bar" and your Bar an internal "Bar_helper".
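Putting it together, a sketch of the whole arrangement might look like this (note Bar's Condition inverts the test, since the goal is to run only when none of the files exist):

<ItemGroup>
  <Foo Include="a.txt;b.txt;c.txt" />
</ItemGroup>

<!-- Batches once per Foo item; sets Tested if any of the files exists -->
<Target Name="TestLoop" Outputs="%(Foo.Identity)">
  <PropertyGroup>
    <Tested Condition="Exists('%(Foo.Identity)')">Present</Tested>
  </PropertyGroup>
</Target>

<!-- Runs only when TestLoop found no existing file -->
<Target Name="Bar" DependsOnTargets="TestLoop" Condition="'$(Tested)' != 'Present'">
  <Message Text="None of the files exist" Importance="high" />
</Target>

<Target Name="Wrapper" DependsOnTargets="TestLoop;Bar" />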
Which class of Tomcat is responsible for converting a .jsp file to a .class file? I want to see the source code written for the conversion. My aim is to check the logic for how scriptlet comments are eliminated, and based on that I'll write my own code that will remove HTML comments as well (I've not decided how I will implement it).
I am sure the source code should be available, as it's open source.
Or is it possible to implement some kind of filter so that each time the server returns a JSP page, it removes the comments? I can replace all HTML comments with scriptlet comments. But I want to ensure that if someone uses HTML comments in the future, they are not displayed. It's basically for security.
[Added]
As per the suggestion given by JB Nizet, we will be modifying the build.xml file to remove comments. I have come up with this to remove HTML comments -
<target name="-trim.html.comments">
  <echo message="Inside trim html comments" />
  <fileset id="html.fileset" dir="${build.dir}" includes="**/*.jsp, **/*.html" />
  <!-- HTML Comments -->
  <replaceregexp replace="" flags="g" match="&lt;![ \r\n\t]*(--([^\-]|[\r\n]|-[^\-])*--[ \r\n\t]*)&gt;">
    <fileset refid="html.fileset"/>
  </replaceregexp>
</target>
However, I am not sure how to remove comments that start with // or /* */. Any suggestion on how I can do so? I have searched the internet but didn't get a clue.
We are using ant script for build.
[Added]
To remove single-line comments that start with //, I am using the regex below. But somehow it's not working. Can anyone please tell me what I'm doing wrong? Thanks in advance.
<replaceregexp flags="gs" match="?:^\s*\/\/(?:.*)$" replace="">
Rather than doing it in Tomcat, use Apache directly. It supports modules which do exactly what you need. mod_pagespeed is probably closest to what you want; mod_deflate may be configurable to do the same thing, though it also compresses the data, which might be overkill.
As a nice side-effect, this allows you to leave your handy comments in and they'll be served to your internal users (developers) who use port 8080, while those using port 80 will see only the minified product.
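For instance, assuming mod_pagespeed is installed, a minimal httpd.conf sketch enabling its remove_comments filter might look like this (directive names per the mod_pagespeed documentation):

<IfModule pagespeed_module>
  ModPagespeed on
  # Strip HTML comments from the HTML served to clients
  ModPagespeedEnableFilters remove_comments
</IfModule>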