An unwanted parameter DATA_PATH_1 appears in XMLs when exporting a job - kettle

I have a transformation that uses a variable defined in kettle.properties to set the folder from which some files must be read. Let's call it T1.
This transformation is called from a job J1.
For our production environment we export this job using the File->Export->Linked Resources to XML... option, and the resulting file is later run by Kitchen.
In the generated T1.xml file, a parameter is generated and used instead of the desired variable. This is the declaration of said parameter in the xml file:
<parameter>
  <name>DATA_PATH_4</name>
  <default_value>file:///C:/Users/Downloads/</default_value>
  <description>Data file path discovered during export</description>
</parameter>
Is there a way to prevent this kind of "discovered" parameter?

Can I force the XSLT collection function to treat all files as XSL?

I'm currently building an XSLT stylesheet used to document other XSLT stylesheets in a series of folders and sub-folders. My code pulls out specific details about variables, functions, etc. and renders them in an output format. The sheets being read are created by a 3rd-party product. Most of them have an XSL extension, but some have proprietary extensions; for example, I have some files with a DTCBS extension that are just XSL stylesheets.
I'm currently loading the content of these files into a variable using the XSLT function "collection" as follows:
<xsl:variable name="Collection" select="collection(concat('file:///', encode-for-uri(replace($filePath, '\\', '/')),'?select=*.(xsl|dtcbs|xml);recurse=yes'))" as="node()*"/>
The variable works just fine if I use XSL|XML. But if I include the DTCBS extension, the variable blows up citing "the supplied value is xs:base64Binary".
If I manually put the xml declaration line at the top of my DTCBS file, the variable works fine. Those DTCBS files are auto-generated without the declaration line so I can't fix that, nor can I manually edit them each time I want to run my documenter code.
From what I can tell, because it's not an XSL extension, and the XML declaration line isn't present, the XSLT parser thinks it's base64 when it isn't.
I'm using Saxon as my XSLT parser and the Saxon documentation says it uses file extensions and http headers to detect the file type.
Does anyone know if there is a way to force collection() to treat every file as an XSL?
I tried adding the XML declaration line to the DTCBS file. This did correct the issue, but I can't do this in all cases as I am trying to automate the entire thing.
I also tried renaming the DTCBS extension to XSL, and the problem went away as well.
As well as Martin's suggestion, you can register content types with the Saxon configuration:
Processor processor = new Processor(false); // net.sf.saxon.s9api
processor.getUnderlyingConfiguration()
         .registerFileExtension("dtcbs", "application/xml");
This has been available since Saxon 9.7.
Try adding content-type=application/xml to the select expression, e.g. '?select=*.(xsl|dtcbs|xml);recurse=yes;content-type=application/xml'.
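Applied to the collection call from the question, that would look something like this (a sketch, assuming the Saxon version in use supports the content-type query parameter):
<xsl:variable name="Collection"
    select="collection(concat('file:///', encode-for-uri(replace($filePath, '\\', '/')),
            '?select=*.(xsl|dtcbs|xml);recurse=yes;content-type=application/xml'))"
    as="node()*"/>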

Pre-Processing C++ file using MSBuild and Visual Studio 2012+

I have a series of standard cpp files, and each of these files contains a file-specific #include statement. However, the content of those included files must be populated by a pre-processing tool prior to invoking the standard C++ compiler.
The tricky part is that I want this to be fully integrated into Visual Studio using MSBuild. Therefore, when I bring up Visual Studio's Property Window on a cpp file, I want to see all the standard C++ compiler options and, ideally, some Custom Properties controlling the pre-processor tool. As an OOP analogy, I want my build tool to inherit everything from the standard CL MSBuild Rule, and add some Custom Properties & Build Steps to it.
I have successfully done this through an extremely laborious process of basically creating a Custom MSBuild Rule and Copy/Pasting most of the C++ Options into my Custom Rule. I then pass along the million C++ Options to the standard C++ compiler through the CommandLineTemplate entry in my MSBuild .props file. It's ridiculously complicated, and the C++ options don't automatically get updated as I update Visual Studio.
I can find plenty of examples of Custom MSBuild Rules, but I haven't been able to find one where it piggybacks onto an existing one.
Not a lot of love for MSBuild, I take it...
Anyway, after years of going back and forth on this one, I finally found something soon after I posted my question. The key was to search for "extending" an existing Rule which, apparently, I hadn't tried before.
Usually, when you create a Build Customization in VS, you end up with 3 files:
MyCustomBuild.xml:
Contains the Properties & Switches, as shown on the VS Property Sheet.
MyCustomBuild.props:
Contains default values for those Properties. They can be made conditional through the use of the Condition attribute.
MyCustomBuild.targets:
Contains a line to load up your xml, plus the Target/Task entries.
So the first part was to extend the existing C/C++ Properties as shown in Visual Studio. I found this link, which finally gave me something to work with:
https://github.com/Microsoft/VSProjectSystem/blob/master/doc/extensibility/extending_rules.md
Here's the xml bit.
<Rule
  Name="RuleToExtend"
  DisplayName="File Properties"
  PageTemplate="generic"
  Description="File Properties"
  OverrideMode="Extend"
  xmlns="http://schemas.microsoft.com/build/2009/properties">
  <!-- Add new properties, data source, categories, etc -->
</Rule>
Name attribute:
The Name attribute must match the rule being extended. In this case, I wanted to extend the CL rule, so I set that attribute = "CL".
DisplayName attribute:
This is optional. When provided, it will overwrite the tool's name seen on the Property Sheet. In this case, the tool name shown is "C/C++". I can change it to show "My C/C++" by setting this attribute.
PageTemplate attribute:
If this attribute is provided, it must match the overwritten rule's value. In this case, it would be "tool". Just leaving it out seems to work fine. I suspect this could be needed if 2 rules had the same name, but a different template. You could use this to clarify which one you wanted to extend.
Description attribute:
Optional. I don't know where that even shows up within the VS GUI. Maybe it's just for documenting the xml file.
OverrideMode attribute:
This is an important one! It can be set to either "Extend" or "Replace". In my case, I chose "Extend".
xmlns attribute:
Required. Doesn't work properly if not present.
As the link suggests, you can then provide the properties, data source, and categories. Keep in mind that categories are usually displayed in the order they appear in the xml file. Since I was extending an existing Rule, my custom categories would all show up after the standard C/C++ categories. Given that my tool pre-processes the files, I would have preferred having my custom options at the top of the Property Sheet, but I couldn't find a way around that.
Note that you do NOT need the ItemType/FileExtension or ContentType Properties typically found in custom Rules.
So once I entered all of that, my custom pre-processing options showed up alongside the standard C/C++ properties on the Property Sheet. Note that all these new properties would be attached to the "ClCompile" Item list, with all the other C/C++ properties.
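For illustration, a trimmed-down extension of the CL rule with one hypothetical pre-processor property might look like this (the category, property, and display names are placeholders of mine, not part of the stock rule):
<Rule
  Name="CL"
  DisplayName="My C/C++"
  OverrideMode="Extend"
  xmlns="http://schemas.microsoft.com/build/2009/properties">
  <Rule.Categories>
    <Category Name="PreProcessing" DisplayName="Pre-Processing" />
  </Rule.Categories>
  <BoolProperty Name="RunPPTool"
                DisplayName="Run Pre-Processor"
                Description="Runs the pre-processing tool on this file before compiling."
                Category="PreProcessing" />
</Rule>
Since the rule is extended rather than replaced, the new property simply shows up as one more C/C++ option and is persisted on the "ClCompile" items.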
The next step was to update the .props file. I'm not going to get into it, since it's pretty much standard when creating these custom build Rules. Just know that you need to set the properties using the "ClCompile" Item, as mentioned above.
The final step was to get the .targets file to do what I wanted.
The first part was to "import" (not really an import entry) the custom Rule through the typical entry:
<ItemGroup>
  <PropertyPageSchema Include="$(MSBuildThisFileDirectory)MyCustomBuild.xml" />
</ItemGroup>
Then I needed to pre-process every source file. Ideally, it would have been nicer to pre-process a file and then compile it - one file at a time. I could have done this by overriding the "ClCompile" Target within my own .targets file. That Target is defined in the "Microsoft.CppCommon.targets" file (its location under "C:\Program Files (x86)" varies depending on the VS version). I basically could have Cut & Pasted the whole Target into my file, then added my pre-processing task code before the "CL" Task. I also would have needed to convert the Target into a Target Batch by adding an Outputs="%(ClCompile.Identity)" attribute to the "ClCompile" Target. Without this, my pre-processing task would have run on all files before moving on to the "CL" task, bringing me back to square one. Finally, I would have needed to deal with Pre-Compiled Header files, since they need to be compiled first.
All of this was just too much of a pain. So I selected the simpler option of defining a Target which looks like this:
<Target Name="MyPreProcessingTarget"
        Condition="'@(ClCompile)' != ''"
        Outputs="%(ClCompile.Identity)"
        DependsOnTargets="_SelectedFiles"
        BeforeTargets="ClCompile">
There are a number of attributes defined but the most important one is the BeforeTargets="ClCompile" attribute. This is what forces this target to execute before the cpp files are compiled.
I also chose to do Target Batch processing here [Outputs="%(ClCompile.Identity)"] because it was just easier to do what I wanted if I assumed one file was being processed at a time in my Target.
The DependsOnTargets="_SelectedFiles" attribute is used to know whether a GUI user has files selected within the VS Solution Explorer. If so, the files will be stored in the @(SelectedFiles) Item List (generated by the "_SelectedFiles" Target). Typically, when selecting specific files within the Solution Explorer and choosing to compile them, VS will forcefully compile them even if they are up-to-date. I wanted to preserve that behavior for the automatically-generated pre-processed include files, and forcefully regenerate them as well for those selected files. So I added this block:
<ItemGroup Condition="'@(SelectedFiles)' != ''">
  <IncFilesToDelete Include="%(ClCompile.Filename)_pp.h"/>
</ItemGroup>
<Delete
  Condition="'@(IncFilesToDelete)' != ''"
  Files="%(IncFilesToDelete.FullPath)" />
Note that the automatically-generated include files are named SourceFileName_pp.h. By deleting those files, my pre-processing Task will forcefully re-generate them.
Next, I build a new Item list from the "ClCompile" Item list, but with the "_pp.h" versions of the files. I do so with the following code:
<ItemGroup>
  <PPIncFiles
    Condition="'@(ClCompile)' != '' and '%(ClCompile.ExcludedFromBuild)' != 'true'"
    Include="%(ClCompile.Filename)_pp.h" />
</ItemGroup>
The final part is a little uglier.
In order to run my pre-processing exe, I use the standard "Exec" Task. But I obviously only want to run it if the source file is newer than the generated file. I do so by storing the well-known metadata "ModifiedTime" of the source file and of the generated file into a couple of dynamic Properties. But I can't use the ModifiedTime metadata directly, as it's not a comparable value. So I used the following code, which I found on StackOverflow here:
Comparing DateTime stamps in Msbuild
<PropertyGroup>
  <SourceFileDate>$([System.DateTime]::Parse('%(ClCompile.ModifiedTime)').Ticks)</SourceFileDate>
  <PPIncFileDate Condition="!Exists('%(PPIncFiles.Identity)')">0</PPIncFileDate>
  <PPIncFileDate Condition="Exists('%(PPIncFiles.Identity)')">$([System.DateTime]::Parse('%(PPIncFiles.ModifiedTime)').Ticks)</PPIncFileDate>
</PropertyGroup>
Note that I can store the timestamps in Properties, given that the Item Lists only contain one Item per Target Pass, because of Target Batching.
Finally, I can invoke my pre-processor using the "Exec" Task, as follows:
<Exec
  Condition="'@(PPIncFiles)' != '' and $(SourceFileDate) > $(PPIncFileDate)"
  Command="pptool.exe [options] %(ClCompile.Identity)" />
Supplying the options was yet another headache.
Typically, the switches defined under the xml file are just passed to a "CommandLineTemplate" metadata entry under the .props file using [OptionName]. This passes along the "Switch" attribute of the Property defined under the xml file. But that implies defining your own Task, made from a TaskFactory, under the .targets file. In my case, I was just using the existing "Exec" Task, which doesn't know anything about my custom Properties. I didn't know how to retrieve the "Switch" attribute in this case, and what seems to be available is just whatever the "Name" attribute contains. Luckily, a Property has both a Name and a DisplayName, and the DisplayName is what the GUI user sees. So I just copied the "Switch" value into the "Name" value when defining the Properties under the xml file. I could then pass the options to the Exec Task using something like:
<Exec
  Condition="'@(PPIncFiles)' != '' and $(SourceFileDate) > $(PPIncFileDate)"
  Command="pptool.exe %(ClCompile.Option1) %(ClCompile.Option2)... %(ClCompile.Identity)" />
where I defined all my Properties as "EnumProperty" entries, with one "EnumValue" having Name="" for the disabled option and the other EnumValues having the switch text as their Name. Not very elegant, but I didn't know a way around this.
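As a sketch, one such EnumProperty in the rule xml might look like this (the property and switch names are illustrative, not from the original project):
<EnumProperty Name="Option1" DisplayName="Verbosity" Category="PreProcessing">
  <EnumValue Name="" DisplayName="Quiet (default)" />
  <EnumValue Name="-verbose" DisplayName="Verbose" />
</EnumProperty>
With that, %(ClCompile.Option1) expands to either an empty string or -verbose on the Exec command line.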
Finally, it is recommended that, when automatically generating files, the .targets file also include a way to clean them up when the user Cleans the Project. That's pretty standard, but I'll include it here for convenience.
<PropertyGroup>
  <CleanDependsOn>$(CleanDependsOn);PPIncCleanTarget</CleanDependsOn>
</PropertyGroup>
<Target Name="PPIncCleanTarget" Condition="'@(ClCompile)' != ''">
  <ItemGroup>
    <PPIncFilesToDelete Include="%(ClCompile.Filename)_pp.h" />
  </ItemGroup>
  <Delete Files="%(PPIncFilesToDelete.FullPath)" Condition="'@(PPIncFilesToDelete)' != ''"/>
</Target>

generating subdocuments with doxygen

I have a large C++ software application documented with doxygen. How can I set it up so that I can generate subdocuments for specific classes? The classes are documented with in-source commenting, their own .dox files, and images/ directory. I need to be able to generate a standalone pdf file specific to a single class.
I can use grouping to identify what will be included in that subdocument, but how do I generate output for a single group?
If you have a specific .dox file per requested output entity, then all you need to do is define, in that file, the files declaring and defining that class as the INPUT.
Say, for example, you want output only for class MyClass, which is declared in file myclass.hpp with its implementation in myclass.cpp. Then in myclass.dox, just add this:
INPUT = ./myclass.cpp \
        ./myclass.hpp
Of course, you can have different paths for .cpp and .hpp. Or you can document more than one class.
Then, run doxygen on that myclass.dox file.
Also watch out for the output folder name. For the html output, the default name is html, so you might want to rename it to avoid mixing up all the different outputs. For example, you might want to add something like this to the dox file:
HTML_OUTPUT = html_myclass
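Putting it together, a minimal myclass.dox could look like the following sketch (the project name, paths, and output folder names are illustrative; with GENERATE_LATEX enabled, the standalone pdf is then built by running make in the LaTeX output folder):
PROJECT_NAME   = "MyClass"
INPUT          = ./myclass.cpp \
                 ./myclass.hpp
HTML_OUTPUT    = html_myclass
GENERATE_LATEX = YES
LATEX_OUTPUT   = latex_myclass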

How to use the original filename in a multi file template in resharper?

I have a multi-file template in ReSharper, and I can use the $NAME$ macro to get the name of the original file and use it to name the other files in the template. But I also want to use the $NAME$ of the original file in the content of the other file templates.
Is this possible? I can't see a suitable macro for the internal variables, as only the Current File Name seems to be available.
Anyone know if this is possible, or how I might work around it?
As a workaround, you may create a parameter $FILENAME$ (macro "Current file name without extension") in the first file, e.g. in a comment, like:
class Foo
{
//$FILENAME$
}
Then you may reference this parameter in the other files of the multi-file template - it will contain the name of the first file, since the first file is generated before the other ones.
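For instance, the content of a second file in the template could then reference it like this (a sketch; the class name is illustrative):
class $FILENAME$Helper
{
    // $FILENAME$ expands to the name of the first file in the template
}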
Unfortunately, there isn't a macro that will give you this. I've added a feature request that you can vote on and track (and more specific detail as to what your requirements are would be useful) - http://youtrack.jetbrains.com/issue/RSRP-415055
It is possible to write your own macros as part of a plugin, but there isn't a sure-fire way of getting the name of the first document in the created file set. The IHotspotSessionContext instance that is passed to the macro via IHotspotSession.Context property includes an enumerable of IDocument, from which you can get IDocument.Moniker, which will be the full path for file based documents. However, there's no guarantee of the order of the enumerable - it's backed by a hashset. You might be able to rely on implementation details (small set, no removes) to be able to use the first document as the original, but there is really no guarantee of this.

Specifying the source file name using parameter variables in Informatica 9?

I have a mapping like
SA-->SQ--->EXPR--->TGT
The source and the target will have the same structure.
There are a bunch of files (all with the same structure) which will go through this mapping.
So I want to use a parameter file through which I will give the file names for every run manually.
How do I use the param file in the session for the Source filename attribute?
Please suggest.
You could use the indirect source type, wherein your source file is basically a list of files, and the session in turn reads each of the files one by one.
The parameter file could reference the source file name (the list) as:
$InputFile_myName=/a/b/c.list
In line with what Raghav says, indicate the name of a file that will hold the list of input files in the 'Source filename' property box for the SQ in question in the Mapping tab, and set the 'Source filetype' property to 'Indirect' in the Session Properties. If you already know the names of the input files ahead of time, you can list them in that file and deploy it with the workflow to the location you indicate in the 'Source file directory' property box. However, if you won't know the names of the input files until run-time but you do know the files' naming standard (e.g. "Input_files_name_ABC_*", where "*" represents variable text, such as a numeric value incremented per input file generated by some other process), then one way to deal with that is to use a Pre-Session Command, specifiable in the 'Components' tab of the Session. Create one that builds a new file, at the location and with the name specified for the Indirect input file referenced above, by using the Unix shell (or, if running on Windows, the cmd shell) to list the files conforming to their naming standard and redirect the listing output to that file.
The tricky thing is that there must be one or more files listed in that Indirect type of input file. The workflow fails (abends) if the indirect file reader gets no files to read, or if a file listed in it is not present on the server. One way to get around this is to make sure an empty file conforming to the naming standard is present at all times. This can be assured by creating a "touchfile" before executing the listing command that builds the Indirect file's listing. In Unix, you'd use the 'touch {path}/{filename}' command ({filename} could be, for example, "Input_files_name_ABC_TOUCHFILE"); on Windows, you'd redirect an empty string to a likewise-named file via the cmd shell. Either way, that will help you avoid an abend. Cleaning up that file is easy: a Post-Session Command can delete the empty touchfile, and likewise the Indirect file itself if desired.
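A minimal Unix sketch of such a Pre-Session Command (the source directory is illustrative, and the listing is redirected to the /a/b/c.list file referenced in the parameter file example above):
touch /data/in/Input_files_name_ABC_TOUCHFILE
ls /data/in/Input_files_name_ABC_* > /a/b/c.list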