How to generate javadoc for unit tests in netbeans - unit-testing

I'm using NetBeans (currently 6.7) and I really like how I can generate Javadoc from my source code. However, my tests also have documentation (valuable documentation!). Is there any way I can generate Javadoc for them too (ideally for both at the same time)?
Thanks!

JUnit tests are plain, normal Java code, and they may contain normal Javadoc comments.
Why not treat them like normal Java code? Just run javadoc over the test packages and you're done.
You can generate your JavaDoc via the Build Menu.
Good luck!
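If the IDE's Generate Javadoc action only covers the source packages, a minimal sketch of pointing Ant's javadoc task directly at the test sources could look like this (the test directory name and the test.classpath reference are assumptions):
<!-- minimal sketch: run javadoc over the test source tree as well;
     "test" and "test.classpath" are assumed names -->
<javadoc destdir="dist/javadoc-tests" encoding="UTF-8">
  <classpath refid="test.classpath"/>
  <fileset dir="test" includes="**/*.java"/>
</javadoc>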

Adding the following Ant target to build.xml does the trick for me:
<target depends="init" name="-javadoc-build">
<javadoc additionalparam="${javadoc.additionalparam}" author="${javadoc.author}" charset="UTF-8" destdir="${dist.javadoc.dir}" docencoding="UTF-8" encoding="${javadoc.encoding.used}" failonerror="true" noindex="${javadoc.noindex}" nonavbar="${javadoc.nonavbar}" notree="${javadoc.notree}" private="${javadoc.private}" source="${javac.source}" splitindex="${javadoc.splitindex}" use="${javadoc.use}" useexternalfile="true" version="${javadoc.version}" windowtitle="${javadoc.windowtitle}">
<classpath>
<path path="${javac.classpath}:${javac.test.classpath}"/>
</classpath>
<fileset dir="${test.src.dir}" excludes="*.java,${excludes}" includes="${includes}">
<filename name="**/*.java"/>
</fileset>
<fileset dir="src/try" excludes="*.java,${excludes}" includes="${includes}">
<filename name="**/*.java"/>
</fileset>
<fileset dir="${src.dir}" excludes="*.java,${excludes}" includes="${includes}">
<filename name="**/*.java"/>
</fileset>
<fileset dir="${build.generated.sources.dir}" erroronmissingdir="false">
<include name="**/*.java"/>
<exclude name="*.java"/>
</fileset>
</javadoc>
<copy todir="${dist.javadoc.dir}">
<fileset dir="${src.dir}" excludes="${excludes}" includes="${includes}">
<filename name="**/doc-files/**"/>
</fileset>
<fileset dir="${build.generated.sources.dir}" erroronmissingdir="false">
<include name="**/doc-files/**"/>
</fileset>
</copy>
</target>
It's copied from nbproject/build-impl.xml with three changes:
expand the classpath to "${javac.classpath}:${javac.test.classpath}"
add a fileset element with dir="${test.src.dir}"; this is where the normal NetBeans "Test Packages" live
add a fileset element with dir="src/try"; this is an extra source directory that my project uses. (I don't know how to use a variable like ${test.src.dir} to refer to this directory.)

In build-impl.xml there is a target named "-javadoc-build". It contains two "fileset" sections. Perhaps it's possible to add a third section there.
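A sketch of such a third fileset, mirroring the two generated ones but pointing at the standard ${test.src.dir} test sources, might look like the snippet below. Note that nbproject/build-impl.xml is regenerated by NetBeans, so overriding the whole target in build.xml, as in the answer above, is the safer place to make the change.
<!-- hypothetical third fileset for the -javadoc-build target,
     mirroring the generated ones but pointing at the test sources -->
<fileset dir="${test.src.dir}" excludes="*.java,${excludes}" includes="${includes}">
  <filename name="**/*.java"/>
</fileset>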

Related

compile *.clj sources only if modified (with `clojure.core.Compile`)

I am using the method proposed in this answer to compile Clojure *.clj files to *.class (and subsequently jar them), using more or less the structure of the compile-clojure target from the build.xml file that's found at the root of the Clojure distribution (e.g. in clojure-1.5.1.zip). In my case:
<java classname="clojure.lang.Compile"
failonerror="true"
fork="true">
<classpath refid="compile.classpath"/>
<sysproperty key="clojure.compile.path" value="${cljbuild.dir}"/>
<arg value="${project.MainClass.name}"/>
</java>
The problem with this approach is that it keeps compiling the *.clj files even though they haven't changed. Any ways around this?
For building Clojure projects where, for various reasons, I could not use Leiningen, I have been much happier using the Zi plugin and letting Maven decide what needs to be re-compiled.
I ended up using ant-contrib's OutOfDate task (e.g. as also described in this answer for the more general case of invoking Ant's exec task).
<contrib:outofdate>
  <deletetargets all="true"/>
  <sourcefiles>
    <path refid="compile.dependency.artifacts"/>
  </sourcefiles>
  <targetfiles>
    <fileset dir="${cljbuild.dir}">
      <include name="**/*.class"/>
    </fileset>
  </targetfiles>
  <sequential>
    <java classname="clojure.lang.Compile"
          failonerror="true"
          fork="true">
      <classpath refid="compile.classpath"/>
      <sysproperty key="clojure.compile.path" value="${cljbuild.dir}"/>
      <arg value="${project.MainClass.name}"/>
    </java>
  </sequential>
</contrib:outofdate>
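For this to work, the contrib prefix has to be bound and ant-contrib loaded; a minimal sketch of that declaration (the jar location is an assumption) is:
<!-- hypothetical setup: bind the prefix with xmlns:contrib="antlib:net.sf.antcontrib"
     on the <project> element, then load the ant-contrib tasks;
     the jar path is an assumption -->
<taskdef resource="net/sf/antcontrib/antlib.xml" uri="antlib:net.sf.antcontrib">
  <classpath>
    <pathelement location="lib/ant-contrib-1.0b3.jar"/>
  </classpath>
</taskdef>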

Convert Ant build file to Makefile

I have an Ant build file which is used with DocBook. Now I am going to convert that Ant build file to a Makefile which uses the xsltproc processor. I am not particularly familiar with either Makefiles or Ant, so please help me convert it. Are there any resources I should follow?
Here is what I want to do:
1. copy a folder structure and its content into another folder
2. configure Java system properties
3. configure the classpath
In the Ant script, the code looks like this:
<copy todir="${output-dir}">
<fileset dir="${ant.file.dir}/template">
<include name="**/*"/>
</fileset>
</copy>
<java classname="com.nexwave.nquindexer.IndexerMain" fork="true">
<sysproperty key="htmlDir" value="${output-dir}/content"/>
<sysproperty key="htmlExtension" value="${html.extension}"/>
<classpath>
<path refid="classpath"/>
<pathelement location="${xercesImpl.jar}"/>
<pathelement location="/usr/share/xml-commons/lib/xml-apis.jar"/>
</classpath>
</java>
I want to convert the above two snippets to make.
Thank you!
Make and Ant are very different technologies. Your requirement would be difficult to fulfil for all but the simplest use cases.
Here are some of the technical challenges:
Ant is not Make. On the surface it looks similar, but underneath it works quite differently.
Surprisingly, make is not very cross-platform. Different flavours have subtle differences that could break an Ant-to-Makefile convertor.
Ant is designed to support Java programs, which means it has a rich syntax for managing nasty things like Java classpaths. Again, difficult to translate.
Update
The following Ant java task
<java classname="com.nexwave.nquindexer.IndexerMain" fork="true">
<sysproperty key="htmlDir" value="${output-dir}/content"/>
<sysproperty key="htmlExtension" value="${html.extension}"/>
<classpath>
<path refid="classpath"/>
<pathelement location="${xercesImpl.jar}"/>
<pathelement location="/usr/share/xml-commons/lib/xml-apis.jar"/>
</classpath>
</java>
can be translated into the following Unix java command line:
java \
-DhtmlDir=$PUT_OUTPUT_DIR_HERE \
-DhtmlExtension=$PUT_EXT_HERE \
-cp $CLASSPATH:$PATH_TO_XERCES_JAR:/usr/share/xml-commons/lib/xml-apis.jar \
com.nexwave.nquindexer.IndexerMain

Build with nant and reference libraries in subdirectories

I'm creating a build file for a project containing several 3rd-party libraries located inside a lib folder. My build script looks like this:
<csc target="library" ....>
<sources>
<include name="**/*.cs" />
<!-- common assembly-level attributes -->
<include name="../../src/CommonAssemblyInfo.cs" />
<exclude name="Properties/AssemblyInfo.cs" />
</sources>
<references>
<include name="${build.dir}/bin/lib/Should.Fluent.dll" />
</references>
</csc>
The compilation runs fine; however, at runtime it doesn't work, saying it can't find the library Should.Fluent.dll. How can I make the program find it?
The library has to be present either in the GAC or in the same directory as the referencing assembly. You can copy it manually to check whether this fixes the problem; if yes, then add a <copy> task that makes sure your references are present in your output folder.
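A minimal NAnt sketch of such a copy step, assuming the compiled assembly ends up in ${build.dir}/bin, might be:
<!-- hypothetical copy step: put the referenced library next to the built assembly;
     the ${build.dir}/bin output location is an assumption -->
<copy todir="${build.dir}/bin">
  <fileset basedir="${build.dir}/bin/lib">
    <include name="Should.Fluent.dll" />
  </fileset>
</copy>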

Running groovy unit tests in ant for a java project

I have a Java project with some unit tests written using JUnit. Recently some new unit tests have been added that are written in Groovy (also using JUnit), as it's easier to make those more expressive and generally easier to read. It also allows us to use the Spock framework.
The project is built and tested with Ant.
Before the Groovy classes were added, unit tests were run using the following Ant task:
<target name="test" depends="test-compile">
<junit printsummary="yes">
<classpath>
<path refid="test.classpath"/>
</classpath>
<formatter type="plain"/>
<batchtest fork="yes" todir="${test.dir}/report">
<fileset dir="${test.dir}/unit" includes="**/*.java"/>
</batchtest>
</junit>
</target>
However, this approach does not work for the Groovy tests, as those are in *.groovy files and the JUnit Ant task, understandably, does not recognise them in the fileset.
The alternative approach is to use *.class files for the batchtest fileset like this:
<batchtest fork="yes" todir="${test.dir}/report">
<fileset dir="${test.dir}/${build.dir}">
<include name="**/*Test*.class" />
</fileset>
</batchtest>
This generates false negatives, as closure class files are also included, so a possible workaround is to exclude those files:
<batchtest fork="yes" todir="${test.dir}/report">
<fileset dir="${test.dir}/${build.dir}">
<include name="**/*Test*.class" />
<exclude name="**/*$*.class" />
</fileset>
</batchtest>
Is there a better way to identify test classes for the junit Ant task? Perhaps one based on reflection and the @Test annotation, as manually listing all the test classes (which would work perfectly well) is not really a maintainable solution. Something like the SpecClassFileSelector from the Spock framework.
What about changing the include pattern to *Test rather than *Test*, as @jon-skeet suggested here?
This way it will not match the anonymous closure classes.
You'll have to rename your existing classes and ask the developers to follow this pattern.
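With that convention, the batchtest fileset from above would become something like:
<batchtest fork="yes" todir="${test.dir}/report">
  <fileset dir="${test.dir}/${build.dir}">
    <!-- closure classes such as FooTest$_closure1.class no longer match -->
    <include name="**/*Test.class" />
  </fileset>
</batchtest>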
Take a look at:
http://www.ibm.com/developerworks/java/library/j-pg11094/
There's a groovyc Ant taskdef for compiling the Groovy test cases and running them. The example there is Maven-based, but it shouldn't be too hard to adapt it to do what you want.
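A rough Ant sketch of that idea, reusing the directory properties from the question (the groovy-all jar location is an assumption), might be:
<!-- hypothetical joint-compilation step: compile the .groovy (and .java) test sources
     into the test build directory before running <junit>;
     the groovy-all jar location is an assumption -->
<taskdef name="groovyc" classname="org.codehaus.groovy.ant.Groovyc" classpath="lib/groovy-all.jar"/>
<groovyc srcdir="${test.dir}/unit" destdir="${test.dir}/${build.dir}">
  <classpath refid="test.classpath"/>
  <javac source="1.6" target="1.6"/>
</groovyc>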
Can't you just write something like this?
<fileset dir="${test.dir}/unit" includes="**/*.java,**/*.groovy"/>

Copy all files and folders using msbuild

Just wondering if someone could help me with some msbuild scripts that I am trying to write. What I would like to do is copy all the files and sub folders from a folder to another folder using msbuild.
{ProjectName}
|----->Source
|----->Tools
|----->Viewer
|-----{about 5 sub dirs}
What I need to be able to do is copy all the files and sub folders from the tools folder into the debug folder for the application. This is the code that I have so far.
<ItemGroup>
  <Viewer Include="..\$(ApplicationDirectory)\Tools\viewer\**\*.*" />
</ItemGroup>
<Target Name="BeforeBuild">
  <Copy SourceFiles="@(Viewer)" DestinationFolder="@(Viewer->'$(OutputPath)\\Tools')" />
</Target>
The build script runs but doesn't copy any of the files or folders.
Thanks
I was searching for help on this too. It took me a while, but here is what I did that worked really well.
<Target Name="AfterBuild">
<ItemGroup>
<ANTLR Include="..\Data\antlrcs\**\*.*" />
</ItemGroup>
<Copy SourceFiles="#(ANTLR)" DestinationFolder="$(TargetDir)\%(RecursiveDir)" SkipUnchangedFiles="true" />
</Target>
This recursively copied the contents of the folder named antlrcs to the $(TargetDir).
I think the problem might be in how you're creating your ItemGroup and calling the Copy task. See if this makes sense:
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="3.5">
  <PropertyGroup>
    <YourDestinationDirectory>..\SomeDestinationDirectory</YourDestinationDirectory>
    <YourSourceDirectory>..\SomeSourceDirectory</YourSourceDirectory>
  </PropertyGroup>
  <Target Name="BeforeBuild">
    <CreateItem Include="$(YourSourceDirectory)\**\*.*">
      <Output TaskParameter="Include" ItemName="YourFilesToCopy" />
    </CreateItem>
    <Copy SourceFiles="@(YourFilesToCopy)"
          DestinationFiles="@(YourFilesToCopy->'$(YourDestinationDirectory)\%(RecursiveDir)%(Filename)%(Extension)')" />
  </Target>
</Project>
I'm kinda new to MSBuild, but I find the Exec task handy for situations like these. I came across the same challenge in my project, and this worked for me and was much simpler. Someone please let me know if it's not good practice.
<Target Name="CopyToDeployFolder" DependsOnTargets="CompileWebSite">
<Exec Command="xcopy.exe $(OutputDirectory) $(DeploymentDirectory) /e" WorkingDirectory="C:\Windows\" />
</Target>
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="3.5">
  <PropertyGroup>
    <YourDestinationDirectory>..\SomeDestinationDirectory</YourDestinationDirectory>
    <YourSourceDirectory>..\SomeSourceDirectory</YourSourceDirectory>
  </PropertyGroup>
  <Target Name="BeforeBuild">
    <CreateItem Include="$(YourSourceDirectory)\**\*.*">
      <Output TaskParameter="Include" ItemName="YourFilesToCopy" />
    </CreateItem>
    <Copy SourceFiles="@(YourFilesToCopy)"
          DestinationFolder="$(YourDestinationDirectory)\%(RecursiveDir)" />
  </Target>
</Project>
\**\*.* helps to get the files from all the folders.
RecursiveDir helps to put each file into its respective folder...
This is the example that worked:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <MySourceFiles Include="c:\MySourceTree\**\*.*"/>
  </ItemGroup>
  <Target Name="CopyFiles">
    <Copy
      SourceFiles="@(MySourceFiles)"
      DestinationFiles="@(MySourceFiles->'c:\MyDestinationTree\%(RecursiveDir)%(Filename)%(Extension)')"
    />
  </Target>
</Project>
source: https://msdn.microsoft.com/en-us/library/3e54c37h.aspx
This is the copy task I used in my own project; it worked perfectly for me, copying the folder with its subfolders to the destination successfully:
<ItemGroup>
  <MyProjectSource Include="$(OutputRoot)/MySource/**/*.*" />
</ItemGroup>
<Target Name="AfterCopy" AfterTargets="WebPublish">
  <Copy SourceFiles="@(MyProjectSource)"
        OverwriteReadOnlyFiles="true" DestinationFolder="$(PublishFolder)api/%(RecursiveDir)"/>
</Target>
In my case I copied a project's publish folder to another destination folder; I think it is similar to your case.
Did you try to specify a concrete destination directory instead of
DestinationFolder="@(Viewer->'$(OutputPath)\\Tools')" ?
I'm not very proficient with advanced MSBuild syntax, but
@(Viewer->'$(OutputPath)\\Tools')
looks weird to me. The script otherwise looks good, so the problem might be in the values of $(ApplicationDirectory) and $(OutputPath).
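For illustration, a sketch of the BeforeBuild target with the destination spelled out as a plain folder plus %(RecursiveDir), along the lines of the working answers above (property and item names taken from the question):
<Target Name="BeforeBuild">
  <!-- batch over %(RecursiveDir) so the Tools subfolder structure is preserved -->
  <Copy SourceFiles="@(Viewer)" DestinationFolder="$(OutputPath)\Tools\%(RecursiveDir)" SkipUnchangedFiles="true" />
</Target>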
Here is a blog post that might be useful:
How To: Recursively Copy Files Using the <Copy> Task
Personally I have made use of CopyFolder which is part of the SDC Tasks Library.
http://sdctasks.codeplex.com/
The best way to recursively copy files from one directory to another using MSBuild is to use the Copy task with SourceFiles and DestinationFiles as parameters. For example, to copy all files from the build directory to a backup directory:
<PropertyGroup>
  <BuildDirectory Condition="'$(BuildDirectory)' == ''">Build</BuildDirectory>
  <BackupDirectory Condition="'$(BackupDirectory)' == ''">Backup</BackupDirectory>
</PropertyGroup>
<ItemGroup>
  <AllFiles Include="$(MSBuildProjectDirectory)/$(BuildDirectory)/**/*.*" />
</ItemGroup>
<Target Name="Backup">
  <Exec Command="if not exist $(BackupDirectory) md $(BackupDirectory)" />
  <Copy SourceFiles="@(AllFiles)"
        DestinationFiles="@(AllFiles->'$(MSBuildProjectDirectory)/$(BackupDirectory)/%(RecursiveDir)/%(Filename)%(Extension)')" />
</Target>
In the above Copy command, all source directories are traversed and the files are copied to the destination directory.
If you are working with a typical C++ toolchain, another way to go is to add your files to the standard CopyFileToFolders item list:
<ItemGroup>
  <CopyFileToFolders Include="materials\**\*">
    <DestinationFolders>$(MainOutputDirectory)\Resources\materials\%(RecursiveDir)</DestinationFolders>
  </CopyFileToFolders>
</ItemGroup>
Besides being simple, this is a nice way to go because the CopyFilesToFolders task generates appropriate inputs, outputs and even TLog files, making sure that copy operations run only when one of the input files has changed or one of the output files is missing. With the TLog, Visual Studio will also properly recognize the project as "up to date" or not (it uses a separate U2DCheck mechanism for that).