Include native binaries in JAR (issues w/ maven-shade-plugin) - java-native-interface

I am building a library that requires an SO file.
My goals:
automatically pick up the SO file from the artifact repository
include it as a resource in the installed JAR so that I do not need to add it manually and therefore do not have to put it in the GitHub repository as well.
I am referencing it in my library POM like this:
<dependency>
<groupId>com.myco.mygrp</groupId>
<artifactId>native</artifactId>
<type>so</type>
<scope>runtime</scope>
<version>0.0.1</version>
</dependency>
This does add it to my local Maven repository, but when I attempt to build an uber JAR from a second project (a client of this one that references it in its POM), I get:
Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:3.2.4:shade (default) on project i-am-the-client: Error creating shaded jar: error in opening zip file /Users/me/.m2/repository/com/myco/mygrp/0.0.1/native-0.0.1.so -> [Help 1]
While I want that SO file in my client uber JAR, I don't need it to be unzipped or opened. How do I get maven-shade-plugin to include it in the JAR without treating the SO file like a JAR to be unzipped? Here's the maven-shade-plugin configuration for reference:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.myco.somegrp.Client</mainClass>
</transformer>
</transformers>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>scala/tools/nsc/doc/html/resource/lib/jquery*</exclude>
</excludes>
</filter>
</filters>
</configuration>
</execution>
</executions>
</plugin>
I can circumvent this by adding the native SO to the resources directory manually, but that goes against one of my goals and would also raise eyebrows during code review: the binary is already in Nexus and is not necessarily wanted in GitHub as well.
As a secondary approach, I resorted to the maven-dependency-plugin, like so:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>3.1.2</version>
<executions>
<execution>
<id>copy</id>
<phase>package</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>com.myco.mygrp</groupId>
<artifactId>native</artifactId>
<type>so</type>
<overWrite>false</overWrite>
<destFileName>my_native.so</destFileName>
</artifactItem>
</artifactItems>
<overWriteReleases>false</overWriteReleases>
<overWriteSnapshots>true</overWriteSnapshots>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
When I mvn install, I get the expected "my_native.so" file in target/lib, but not in the installed JAR. I would like this file included in the installed JAR so that it can be loaded by the JNI code. Even though it will live inside the JAR, I intend to System.load it by first writing it out to a temp file on the file system and then loading that temp file instead.
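For reference, the loader I have in mind is roughly the following sketch (the resource path /my_native.so and the temp-file handling are placeholders, not settled code):

import java.io.FileNotFoundException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public final class NativeLoader {

    private NativeLoader() {
    }

    // Copies the bundled .so out of the JAR to a temp file and loads it via System.load.
    public static void loadBundledLibrary() throws IOException {
        // "/my_native.so" is a placeholder for wherever the .so ends up inside the JAR
        try (InputStream in = NativeLoader.class.getResourceAsStream("/my_native.so")) {
            if (in == null) {
                throw new FileNotFoundException("my_native.so not found on the classpath");
            }
            Path tmp = Files.createTempFile("my_native", ".so");
            tmp.toFile().deleteOnExit();
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            // System.load (unlike System.loadLibrary) requires an absolute path
            System.load(tmp.toAbsolutePath().toString());
        }
    }
}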
I have been working on this for some days, including scouring the web, before settling on this strategy, but I am open to suggestions if this is the wrong approach.

Related

Maven cargo plugin with Surefire + Jacoco report

I have a project that fires up tomcat using the cargo plugin and then runs integration tests against that tomcat instance. My goal is get integration test coverage reports of the code that runs in tomcat (not coverage of my integration tests).
The question is, how do I get code coverage results of code running in tomcat (separate JVM)?
I have been able to get coverage reports of the integration tests themselves and the Java classes within the test module; however, these are rather worthless.
In the jacoco-sessions.html file, I can only see classes that are available in the test module's classpath. Classes that are running in the tomcat server are not present.
In addition to Godin's answer, you also have to make sure that the coverage data of the test classes themselves doesn't use the same output file as the coverage data of the code running in the Cargo container. In my case, I think the coverage of the tests was overwriting the coverage of the code under test.
Note that I am using jacoco.exec for the coverage data of the code under integration test. Normally this file is used for the unit tests, but there are no unit tests in my module. This way I don't need to configure SonarQube with an extra filename, but you can use another filename here if you like.
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<executions>
<execution>
<id>prepare-agent-integration-cargo</id>
<goals>
<goal>prepare-agent-integration</goal>
</goals>
<configuration>
<destFile>${project.build.directory}/jacoco.exec</destFile>
<propertyName>argLineCargo</propertyName>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.cargo</groupId>
<artifactId>cargo-maven2-plugin</artifactId>
<configuration>
<container>
<!-- ... -->
</container>
<configuration>
<properties>
<cargo.start.jvmargs>${argLineCargo}</cargo.start.jvmargs>
<!-- ... -->
</properties>
</configuration>
<!-- ... -->
</configuration>
<!-- ... -->
</plugin>
And the JaCoCo configuration in the parent POM:
<build>
<plugins>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<configuration>
<includes>
<include>my/project/package/**/*</include>
</includes>
</configuration>
<executions>
<execution>
<id>prepare-agent</id>
<goals>
<goal>prepare-agent</goal>
</goals>
</execution>
<execution>
<id>prepare-agent-integration</id>
<goals>
<goal>prepare-agent-integration</goal>
</goals>
</execution>
<execution>
<id>reports</id>
<goals>
<goal>report</goal>
<goal>report-integration</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>@{argLine}</argLine>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<configuration>
<argLine>@{argLine}</argLine>
</configuration>
</plugin>
</plugins>
</build>
According to the documentation of jacoco-maven-plugin (http://www.jacoco.org/jacoco/trunk/doc/prepare-agent-mojo.html):
Prepares a property pointing to the JaCoCo runtime agent that can be passed as a VM argument to the application under test. Depending on the project packaging type by default a property with the following name is set:
tycho.testArgLine for packaging type eclipse-test-plugin and
argLine otherwise.
The argLine property affects maven-surefire-plugin and maven-failsafe-plugin, which start the JVM with the tests (see http://maven.apache.org/surefire/maven-surefire-plugin/test-mojo.html#argLine and http://maven.apache.org/surefire/maven-failsafe-plugin/integration-test-mojo.html#argLine respectively), but it has no effect on cargo-maven2-plugin, which starts the JVM with Tomcat.
According to https://stackoverflow.com/a/38435778/244993 and https://codehaus-cargo.github.io/cargo/Tomcat+9.x.html, you need to pass the property set by jacoco-maven-plugin to cargo-maven2-plugin as cargo.jvmargs.
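For illustration, a minimal sketch of that wiring (this assumes the agent property is left at its default name argLine rather than the custom argLineCargo used above):
<plugin>
  <groupId>org.codehaus.cargo</groupId>
  <artifactId>cargo-maven2-plugin</artifactId>
  <configuration>
    <configuration>
      <properties>
        <!-- forwards the JaCoCo agent VM argument to the container JVM -->
        <cargo.jvmargs>${argLine}</cargo.jvmargs>
      </properties>
    </configuration>
  </configuration>
</plugin>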

Can I get maven to generate two test jars with different classifiers?

I have a maven project where I generate some test code for some modules. I'd like this generated test code to be available for testing by other modules. Typically if the module bar wants to use the test code of module foo, the foo module must generate a foo-tests.jar and the module bar adds a dependency such as:
<dependency>
<artifactId>foo</artifactId>
<classifier>tests</classifier>
<scope>test</scope>
</dependency>
Which is fine, except that I only want to pull in the generated test code of foo, not all of foo's unit tests and helper classes (there may be unintended class conflicts, for example). I'd like to define a dependency such as:
<dependency>
<artifactId>foo</artifactId>
<classifier>test-libs</classifier>
<scope>test</scope>
</dependency>
Typically maven-jar-plugin is used to generate the foo-test artifact, so I was hoping I could configure that plugin to generate two test artifacts: one that contains the usual test code (unit tests, etc) in foo-tests and one that contains the generated code in foo-test-libs, using two different classifiers, e.g.:
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>generate-test-jar</id>
<phase>package</phase>
<goals><goal>test-jar</goal></goals>
<configuration>
<excludes>... all the generated code ...</excludes>
</configuration>
</execution>
<execution>
<id>generate-test-libs-jar</id>
<phase>package</phase>
<goals><goal>test-jar</goal></goals>
<classifier>test-libs</classifier>
<configuration>
<includes>... all the generated code ...</includes>
</configuration>
</execution>
</executions>
</plugin>
The problem here is that, unlike the jar goal, the test-jar goal of maven-jar-plugin does not support the classifier element. I assume the goal uses the tests classifier by default, so I cannot generate two test jars with different classifiers.
I am wondering if there's a good way of splitting the test jars for a Maven module. If all else fails, I can go back to adding dependencies on the complete test jar, but I'm hoping for a more elegant solution.
(And I know that using classifier is frowned upon, but I'm not sure if it can be avoided here ...)
I think I figured it out, using a combination of maven-antrun-plugin to create multiple test jar files (using the jar task) and build-helper-maven-plugin to attach the generated jar files as artifacts to the project.
Generating the test jar files.
I use maven-antrun-plugin combined with the ant jar task to split my test code:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<phase>package</phase>
<goals><goal>run</goal></goals>
<configuration>
<target>
<!-- The test-libs jar contains all test code in the x.y.z package. -->
<jar destfile="${project.build.directory}/artifacts/test-libs/${project.build.finalName}.jar"
basedir="${project.build.directory}/test-classes"
includes="x/y/z/**/*.class"/>
<!-- The tests jar contains all test code except the x.y.z package. -->
<jar destfile="${project.build.directory}/artifacts/tests/${project.build.finalName}.jar"
basedir="${project.build.directory}/test-classes"
excludes="x/y/z/**/*.class"/>
</target>
</configuration>
</execution>
</executions>
</plugin>
Note that I had to use the same name ${project.build.finalName}.jar for each generated jar file (this is important later) so I put each jar file into its own directory:
target/artifacts/test-libs/foo.jar
target/artifacts/tests/foo.jar
Attaching the jar files to the project.
Generating the jar files is only the first step. The jar files need to be attached to the Maven project so that they will be installed. For this, the build-helper-maven-plugin is required: it allows attaching files as artifacts, where each file has a location, a type, and a classifier. Here the type is jar and the classifier will be tests or test-libs as appropriate:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.9</version>
<executions>
<execution>
<id>attach-test-jars</id>
<phase>package</phase>
<goals><goal>attach-artifact</goal></goals>
<configuration>
<artifacts>
<!-- Attach the test-libs artifact. -->
<artifact>
<file>${project.build.directory}/artifacts/test-libs/${project.build.finalName}.jar</file>
<type>jar</type>
<classifier>test-libs</classifier>
</artifact>
<!-- Attach the tests artifact. -->
<artifact>
<file>${project.build.directory}/artifacts/tests/${project.build.finalName}.jar</file>
<type>jar</type>
<classifier>tests</classifier>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>
With this I now see the following output:
[INFO] Installing ...\target\artifacts\test-libs\foo-1.0.jar to ...\foo\1.0\foo-1.0-test-libs.jar
[INFO] Installing ...\target\artifacts\tests\foo-1.0.jar to ...\foo\1.0\foo-1.0-tests.jar
[INFO] Installing ...\target\foo-1.0.jar to ...\foo\1.0\foo-1.0.jar
My other projects can now add dependencies on either foo-tests.jar or foo-test-libs.jar as required. Huzzah!
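For completeness, a consuming module would then declare something along these lines (the groupId is a placeholder):
<dependency>
  <groupId>com.example</groupId> <!-- placeholder groupId -->
  <artifactId>foo</artifactId>
  <version>1.0</version>
  <classifier>test-libs</classifier>
  <scope>test</scope>
</dependency>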

How to get deterministic builds using lein?

Running lein uberjar twice in a row, I get two different builds. After some unzip / find / sort / diff shell magic, I saw it came down to one Maven file: more specifically, the pom.properties file.
Here's a diff:
< #Tue Jan 14 07:07:50 CET 2014
---
> #Tue Jan 14 07:07:01 CET 2014
How can I get deterministic Clojure builds using Leiningen (and hence Maven)?
I have a local patch to lein-voom (a project I maintain with Chouser) which will address this by pinning the pom.properties header timestamp to the VCS (currently only git) commit time when the working copy is entirely clean. I expect this commit to be finalized sometime next week, though I'm still thinking about the configurability of this feature.
This alone doesn't make for stable jars, but it is the first trivial piece. Also of interest are the timestamps of the files within the jar, which change the zip headers. Normalizing those timestamps should also be straightforward, but it is a separate step.
Deterministic builds are of interest to lein-voom, a project which may generally be of interest to you since it allows pointing dependencies directly to a particular source version by commit sha, avoiding artifacts altogether.
lein-voom is quite young and the documentation and CLI are pretty rough but the core functionality is solid. Feel free to post issues or questions on the GitHub project.
I wrote up an article a while back covering deterministic builds with Maven. I have extracted the salient points here:
Use the assembly plugin and configure it like this:
src/main/assembly/zip.xml:
<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.0 http://maven.apache.org/xsd/assembly-1.1.0.xsd">
<id>deterministic</id>
<baseDirectory>/</baseDirectory>
<formats>
<format>zip</format>
</formats>
<fileSets>
<fileSet>
<directory>${project.build.directory}/classes</directory>
<outputDirectory>/</outputDirectory>
</fileSet>
</fileSets>
</assembly>
Add in your own MANIFEST.MF, remembering the extra CRLF at the end or it won't be valid.
src/main/resources/META-INF/MANIFEST.MF:
Manifest-Version: 1.0
Archiver-Version: Plexus Archiver
Created-By: Apache Maven
Built-By: yourapp
Build-Jdk: 1.7.0
Add some plugins into your pom.xml:
pom.xml:
<plugins>
... other plugins ...
<!-- Step 1: Set all timestamps to same value -->
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<id>1-touch-classes</id>
<phase>prepare-package</phase>
<configuration>
<target>
<touch datetime="01/01/2000 00:10:00 am">
<fileset dir="target/classes"/>
</touch>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Step 2: Assemble as a ZIP to avoid MANIFEST.MF timestamp -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2.1</version>
<configuration>
<descriptors>
<descriptor>src/main/assembly/zip.xml</descriptor>
</descriptors>
</configuration>
<executions>
<execution>
<id>2-make-assembly</id>
<phase>prepare-package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Step 3: Rename ZIP as JAR -->
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.7</version>
<executions>
<execution>
<id>3-rename-assembly</id>
<phase>package</phase>
<configuration>
<target>
<move file="${project.build.directory}/${project.build.finalName}-deterministic.zip"
tofile="${project.build.directory}/${project.build.finalName}-deterministic.jar"/>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
... more plugins ...
</plugins>
This will create a deterministic JAR, but it will still depend on the exact version of the JVM and operating system you build it with. To overcome that, you should explore the Gitian approach used by the Bitcoin Core project and mandate a particular JVM within a VirtualBox environment. In this manner, multiple developers can build from source independently and then sign the binary to state that they are in agreement. When a certain threshold is reached, the code is considered proven to be deterministic and can be released.

Running tests after packaging

One of my projects needs a pretty complex setup for the resulting JAR file, so I'd like to run a test after the package phase to make sure the JAR contains what it should.
How do I do that with Maven 2?
You can use the surefire plugin for this. What you need to do is associate a phase with an execution (see below). You will need to change the phase to whatever you need; in your case, one after the package phase.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<skip>true</skip>
</configuration>
<executions>
<execution>
<id>unittests</id>
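<!-- as noted above, change this phase to one that runs after package (e.g. integration-test) -->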
<phase>test</phase>
<goals>
<goal>test</goal>
</goals>
<configuration>
<skip>false</skip>
<includes>
<include>**/**/**/*Test.java</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
Convert your project into a multi-module build. In the first module, build your original project. In the second module, add a dependency to the first.
This will add the first JAR to the classpath.
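A rough sketch of the second module's dependency on the first (the coordinates are placeholders):
<dependency>
  <groupId>com.example</groupId> <!-- placeholder -->
  <artifactId>my-original-module</artifactId> <!-- placeholder: the first module -->
  <version>${project.version}</version>
</dependency>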
Update by OP: This works but I had to add this to my POM:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${version.maven-surefire-plugin}</version>
<configuration>
<useSystemClassLoader>false</useSystemClassLoader>
</configuration>
</plugin>
The important part is <useSystemClassLoader>false</useSystemClassLoader>. Without this, my classpath only contained a couple of VM JARs plus the surefire bootstrap JAR (which contains the test classpath in the MANIFEST.MF). I have no idea why this test classpath isn't visible from the classes loaded from it.

Using maven2 to build autotools-based C/C++ package

I am working on a collection of MATLAB, Java, and C/C++ components that all interoperate but have distinctly different compilation/installation steps. We currently don't compile anything for MATLAB, use maven2 for our Java build and unit tests, and use autotools for our C/C++ build and unit tests.
I would like to move everything to a single build and unit test system, using maven2, but have not been able to find a plugin that will allow the C/C++ codestream to remain autotools-based and simply wrap it in a maven build. Having to rip out autotools support and recreate all the dependencies in maven is most likely a deal-breaker, so I'm looking for a way for maven and autotools to play nicely together, rather than having to choose between the two.
Is this possible or even desirable? Are there resources out there that I have overlooked?
I don't really know autotools, but can't you use the maven exec plugin, which lets you execute system commands (or Java programs)? For example:
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<execution>
<id>exec-one</id>
<phase>compile</phase>
<configuration>
<executable>autogen</executable>
<arguments>
<argument>-v</argument>
</arguments>
</configuration>
<goals>
<goal>exec</goal>
</goals>
</execution>
<execution>
<id>exec-two</id>
<phase>compile</phase>
<configuration>
<executable>automake</executable>
<arguments>
<argument>-v</argument>
<argument>[other arguments]</argument>
</arguments>
</configuration>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
I didn't test the pom fragment above, but it gives you some hints about how to proceed.
You did overlook the Maven cbuild parent suite. Take a look at the "make-maven-plugin" section for more details.