Arquillian TomEE remote JaCoCo code coverage

I am running integration tests with Arquillian against a remote TomEE-Plus 7.0.4 container and trying to collect code coverage with JaCoCo 0.8.2. No coverage is recorded because the code executes inside the remote TomEE JVM, and since the coverage check fails I cannot complete the build. I am looking for a working sample that combines arquillian-tomee-remote with JaCoCo coverage; any sample code or sample project would be appreciated.
I used the prepare-agent goal, which generates surefireArgLine (the -javaagent option), and passed it to the Surefire plugin. The issue is that I am using a remote TomEE and don't know how to build the correct agent argument. surefireArgLine is currently set to -javaagent:/home/user/.m2/repository/org/jacoco/org.jacoco.agent/0.8.2/org.jacoco.agent-0.8.2-runtime.jar=destfile=/home/user/project/target/coverage-reports/jacoco-ut.exec,append=true,excludes=/config/*.class:/util/*Constants.class
What is the correct javaagent option for my configuration that will work with the Arquillian remote TomEE container?
JaCoCo plugin
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>${plugin.maven.jacoco.version}</version>
<configuration>
<propertyName>coverageAgent</propertyName>
<append>true</append>
<excludes>
<exclude>**/config/*.class</exclude>
<exclude>**/util/*Constants.class</exclude>
</excludes>
</configuration>
<executions>
<execution>
<id>pre-unit-test</id>
<goals>
<goal>prepare-agent</goal>
</goals>
<configuration>
<destFile>${sonar.jacoco.reportPath}</destFile>
<propertyName>surefireArgLine</propertyName>
<append>true</append>
</configuration>
</execution>
<execution>
<id>post-unit-test</id>
<phase>test</phase>
<goals>
<goal>report</goal>
</goals>
<configuration>
<dataFile>${sonar.jacoco.reportPath}</dataFile>
<outputDirectory>${project.reporting.outputDirectory}/jacoco-ut</outputDirectory>
<append>true</append>
</configuration>
</execution>
<execution>
<id>check</id>
<goals>
<goal>check</goal>
</goals>
<configuration>
<dataFile>${sonar.jacoco.reportPath}</dataFile>
<haltOnFailure>true</haltOnFailure>
<rules>
<rule>
<element>BUNDLE</element>
<limits>
<limit>
<counter>LINE</counter>
<value>COVEREDRATIO</value>
<minimum>0.99</minimum>
</limit>
<limit>
<counter>BRANCH</counter>
<value>COVEREDRATIO</value>
<minimum>0.99</minimum>
</limit>
<limit>
<counter>CLASS</counter>
<value>MISSEDCOUNT</value>
<maximum>0</maximum>
</limit>
</limits>
</rule>
</rules>
</configuration>
</execution>
</executions>
</plugin>
Dependencies
<dependency>
<groupId>org.jboss.arquillian.testng</groupId>
<artifactId>arquillian-testng-container</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.config</groupId>
<artifactId>arquillian-config-api</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jboss.arquillian.extension</groupId>
<artifactId>arquillian-jacoco</artifactId>
<version>1.0.0.Alpha10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.jacoco</groupId>
<artifactId>org.jacoco.agent</artifactId>
<classifier>runtime</classifier>
<scope>test</scope>
<version>${plugin.maven.jacoco.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jacoco/org.jacoco.core -->
<dependency>
<groupId>org.jacoco</groupId>
<artifactId>org.jacoco.core</artifactId>
<version>${plugin.maven.jacoco.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.tomee</groupId>
<artifactId>arquillian-tomee-remote</artifactId>
<version>${tomee.version}</version>
<scope>test</scope>
</dependency>
Arquillian.xml
<extension qualifier="jacoco">
<property name="includes">com.demo.*</property>
</extension>

You can set catalina_opts in arquillian.xml for the TomEE container. Filter it with Maven to pass the JaCoCo javaagent and you are done :).
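For illustration, here is a minimal arquillian.xml sketch of that idea. The qualifier name is an assumption, and it relies on arquillian.xml sitting in a test-resources directory with Maven filtering enabled so that ${surefireArgLine} gets substituted:

<arquillian xmlns="http://jboss.org/schema/arquillian">
  <container qualifier="tomee-remote" default="true">
    <configuration>
      <!-- ${surefireArgLine} is filled in by Maven resource filtering and expands to the
           -javaagent:...org.jacoco.agent...-runtime.jar=destfile=... option
           generated by the jacoco prepare-agent goal -->
      <property name="catalina_opts">${surefireArgLine}</property>
    </configuration>
  </container>
</arquillian>

With that in place the remote TomEE JVM starts with the JaCoCo agent attached and writes the .exec file when it shuts down.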

I added the Java agent (surefireArgLine) to the remote TomEE server via catalina opts in the Surefire plugin, and it works.
surefireArgLine - populated by the JaCoCo prepare-agent goal at build time.
<tomee.catalina_opts>${surefireArgLine}</tomee.catalina_opts>
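For reference, this is roughly how that element can be wired in, forwarding it as a system property through Surefire. This is only a sketch: the tomee.-prefixed override is an assumption about how the adapter picks up the value, so adjust it to whatever your adapter version expects.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <systemPropertyVariables>
      <!-- forward the agent string produced by jacoco:prepare-agent
           to the remote TomEE container as its CATALINA_OPTS -->
      <tomee.catalina_opts>${surefireArgLine}</tomee.catalina_opts>
    </systemPropertyVariables>
  </configuration>
</plugin>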

Disclaimer: I'm not an expert in either Arquillian or TomEE, so you may need to adjust this answer for your purposes.
Anyway, in a nutshell, JaCoCo instruments bytecode in order to provide a coverage report.
When Arquillian is used, the actual test execution happens in the TomEE JVM, not in the JVM that drives the test suite (probably a CI server or just a build script), so configuring JaCoCo only on that build-side machine won't do much: you have to configure the server itself.
JaCoCo provides a Java agent for this, attached with the -javaagent JVM option; the agent "intercepts" the classes as the server loads them and instruments them.
When JaCoCo runs, it produces a jacoco.exec file containing the execution data, which can be turned into a coverage report later in various ways (a Jenkins plugin, Sonar integration, and so on).
This is by far the most common setup AFAIK. If you go with it, then once the instrumentation works and the tests are done, you'll have to locate the jacoco.exec file on the server machine, copy it back to the build machine, and feed it to your CI/Sonar integration.
However, there are alternative solutions:
JaCoCo Documentation states that there are three modes of running an instrumenting Java Agent:
File System: At JVM termination execution data is written to a local file.
TCP Socket Server: External tools can connect to the JVM and retrieve execution data over the socket connection. Optional execution data reset and execution data dump on VM exit is possible.
TCP Socket Client: At startup, the JaCoCo agent connects to a given TCP endpoint. Execution data is written to the socket connection on request. Optional execution data reset and execution data dump on VM exit is possible.
Technically you can just give different parameters to that javaagent so that it will run JaCoCo in one of these modes.
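As a rough sketch, these are what the agent options look like for the file mode versus the TCP server mode; the paths, address and port below are placeholders, not values from your project:

<properties>
  <!-- file mode (default): execution data is dumped to a local .exec file at JVM exit -->
  <jacoco.agent.file>-javaagent:${settings.localRepository}/org/jacoco/org.jacoco.agent/0.8.2/org.jacoco.agent-0.8.2-runtime.jar=destfile=${project.build.directory}/jacoco-it.exec,append=true</jacoco.agent.file>
  <!-- tcpserver mode: the agent keeps the data in memory and serves it over a socket -->
  <jacoco.agent.tcpserver>-javaagent:${settings.localRepository}/org/jacoco/org.jacoco.agent/0.8.2/org.jacoco.agent-0.8.2-runtime.jar=output=tcpserver,address=*,port=6300</jacoco.agent.tcpserver>
</properties>

In tcpserver mode the jacoco-maven-plugin's dump goal can later connect to that port and pull the execution data into a local .exec file, which avoids copying the file off the server by hand.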
Anyway, we've discussed the first option, but you can also work with the TCP configurations if required. Of course, you'll then have to handle security concerns (permission to expose/access the port, and so on).
If you work in TCP mode, there is a Maven plugin that can come in handy. I haven't used it myself, only found it by googling, so I can't say whether it's any good; it has only 2 stars on GitHub, so it's probably not production ready, but you might get some ideas from its source code.

Related

Spring Boot fails to deploy after adding .ebextensions for nginx SSL - [An error occurred during execution of command [app-deploy]]

I have a Spring Boot app that deploys just fine to AWS Elastic Beanstalk, and the default nginx proxy works, allowing me to connect via port 80.
After following the instructions here, https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/https-singleinstance.html, and verifying against another of my projects that works with this exact config, Beanstalk fails to deploy the app with the error:
2020/05/29 01:27:56.418780 [ERROR] An error occurred during execution of command [app-deploy] - [CheckProcfileForJavaApplication]. Stop running the command. Error: there is no Procfile and no .jar file at root level of your source bundle
The contents of my war file are as such:
app.war
-.ebextensions
-nginx/conf.d/https.conf
-https-instance-single.config
-https-instance.config
-web-inf/
My config files are valid YAML. (They are identical to those in the AWS doc, and to those that work in another project of mine.)
I am using a single instance, with port 443 set open.
These are the errors reported throughout the various log files:
----------------------------------------
/var/log/eb-engine.log
----------------------------------------
2020/05/29 01:37:53.054366 [ERROR] /usr/bin/id: healthd: no such user
...
2020/05/29 01:37:53.254965 [ERROR] Created symlink from /etc/systemd/system/multi-user.target.wants/healthd.service to /etc/systemd/system/healthd.service.
...
2020/05/29 01:37:53.732794 [ERROR] Created symlink from /etc/systemd/system/multi-user.target.wants/cfn-hup.service to /etc/systemd/system/cfn-hup.service.
----------------------------------------
/var/log/cfn-hup.log
----------------------------------------
ReadTimeout: HTTPSConnectionPool(host='sqs.us-east-1.amazonaws.com', port=443): Read timed out. (read timeout=23)
Taking into account Dean Wookey's answer for Java 11, I successfully deployed a Spring Boot application jar along with the .ebextensions folder. I just added the maven-antrun-plugin to my Maven build, and the output is a .zip file that contains the .ebextensions folder and the Spring Boot .jar at the same level. I then deploy this final zip file through the AWS console.
The following is the maven-antrun-plugin configuration:
....
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.8</version>
<executions>
<execution>
<id>prepare</id>
<phase>package</phase>
<configuration>
<tasks>
<copy todir="${project.build.directory}/${project.build.finalName}/" overwrite="false">
<fileset dir="./" includes=".ebextensions/**"/>
<fileset dir="${project.build.directory}" includes="*.jar"/>
</copy>
<zip destfile="${project.build.directory}/${project.build.finalName}.zip" basedir="${project.build.directory}/${project.build.finalName}"/>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
....
Issue with Java and Linux version
If you are using Java 8 with Linux 2.10.9, the code works and overrides the nginx configuration, but if you choose Corretto 11 with Linux 2.2.3 you get the following error:
Error: there is no Procfile and no .jar file at root level of your
source bundle
Creating a new environment with Java 8 and deploying the app again will resolve the issue.
Instead of changing to java 8 as described in vaquar khan's answer, an alternative is to package your source jar inside a zip that also contains the .ebextensions folder.
In other words:
source.zip
-.ebextensions
-nginx/conf.d/https.conf
-https-instance-single.config
-https-instance.config
-web-inf/
-app.war
If you look at the latest documentation https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/platforms-linux-extend.html, you'll see that the nginx config now goes in the .platform folder instead, so your structure would be:
source.zip
-.ebextensions
-https-instance-single.config
-https-instance.config
-.platform
-nginx/conf.d/https.conf
-web-inf/
-app.war
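If the project builds with Maven, one way to produce that bundle is a maven-assembly-plugin descriptor along these lines. This is only a sketch: the descriptor id, file names and war packaging are assumptions, and the plugin's single goal would be bound to the package phase.

<assembly>
  <id>eb</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <!-- Elastic Beanstalk option settings -->
    <fileSet>
      <directory>${project.basedir}/.ebextensions</directory>
      <outputDirectory>.ebextensions</outputDirectory>
    </fileSet>
    <!-- nginx overrides for the newer Amazon Linux 2 platforms -->
    <fileSet>
      <directory>${project.basedir}/.platform</directory>
      <outputDirectory>.platform</outputDirectory>
    </fileSet>
  </fileSets>
  <files>
    <!-- the application archive itself at the root of the bundle -->
    <file>
      <source>${project.build.directory}/${project.build.finalName}.war</source>
      <outputDirectory>/</outputDirectory>
    </file>
  </files>
</assembly>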
After following vaquar's answer above, also change the buildspec.yml file to use the correct Java version, e.g.:
runtime-versions:
java: corretto8 # previously this was openjdk8
Should work.
It is still possible to use .ebextensions within your war file.
Add following to your pom.xml in the <build><plugins> section:
<build>
<plugins>
...
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<executions>
<execution>
<id>add-resource-ebextensions</id>
<phase>generate-resources</phase>
<goals>
<goal>add-resource</goal>
</goals>
<configuration>
<resources>
<resource>
<directory>${basedir}/.ebextensions</directory>
<targetPath>.ebextensions</targetPath>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
...
</plugins>
</build>
This copies the .ebextensions folder into the WEB-INF/classes folder of the war. AWS picks it up from there at startup and applies the scripts.

Not able to run a JMeter performance test directly using the JAR created by Maven

I have been trying to run a JMeter script through Maven. I was able to do that using the command
"mvn verify" and I can also see the reports inside the /target folder.
However, I want to run the JMeter performance test directly through the JAR that is created after running the "mvn verify" command.
Below is my POM.xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>jmeter-testproject</artifactId>
<packaging>jar</packaging>
<version>1.0-SNAPSHOT</version>
<name>jmeter-testproject</name>
<url>http://maven.apache.org</url>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>com.lazerycode.jmeter</groupId>
<artifactId>jmeter-maven-plugin</artifactId>
<version>2.1.0</version>
<executions>
<execution>
<id>jmeter-tests</id>
<goals>
<goal>jmeter</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
A /target folder is generated after running "mvn verify", and jmeter-testproject-1.0-SNAPSHOT is also created.
However, when I run the package by directly executing the JAR, it only prints the "Hello World" output from the main class. I want the JMeter performance test to execute as well when the JAR is run directly. Please help.
You can't do this with the jmeter-maven-plugin.
The jmeter-maven-plugin takes your test plan/configuration and then runs a JMeter process for you.
It does not build a runnable jar with everything you need inside it to run the performance tests.

How can I utilize AWS::Serverless::LayerVersion in order to use external libraries in my AWS Lambda functions

I need to use an external library that is located on my local file system in order to successfully execute my Lambda function. Using the AWS SAM framework, I found out that this can be done by specifying an AWS::Serverless::LayerVersion resource.
What I am not sure about is how exactly this works and how I specify the path to my external library. Do I first need to deploy the external library to an S3 bucket?
You need to deploy the jar as a layer in the AWS Lambda Layers section.
AWS Lambda Layers:
You can configure your Lambda function to pull in additional code and content in the form of layers. A layer is a ZIP archive that contains libraries, a custom runtime, or other dependencies. With layers, you can use libraries in your function without needing to include them in your deployment package.
https://docs.aws.amazon.com/lambda/latest/dg/configuration-layers.html
The following are the steps to use AWS Lambda layers:
Write the Lambda layer code
Package the Lambda layer
Deploy the Lambda layer
Attach the layer to the function
Call a method
Verify the results
Once you have finished writing your function, make sure the pom.xml contains the artifact coordinates and the maven-shade-plugin:
<groupId>java-lambda-layer</groupId>
<artifactId>java-lambda-layer</artifactId>
<version>1.0-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Run Maven:
mvn clean install and mvn package
Please read further at the following link:
https://medium.com/@zeebaig/working-with-aws-lambda-layers-ddf5c91674d3

AWS Lambda NoClassDefFoundError

I am having difficulty with a Java-based Lambda function set up to receive messages from SNS. My function looks like the below:
package com.mycompany;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.LambdaLogger;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;
public class LambdaHandler {
public void Handler(SNSEvent event, Context context) {
//Process the event
}
}
It compiles just fine and I don't have any problems uploading the jar file to Lambda (via the web console).
However, when I publish to it (via SNS through to the subscribed Lambda function) with JSON representing the SNSEvent model, the Lambda function throws the following exception:
Error loading method handler on class com.mycompany.LambdaHandler:
class java.lang.NoClassDefFoundError java.lang.NoClassDefFoundError:
com/amazonaws/services/lambda/runtime/events/SNSEvent at
java.lang.Class.getDeclaredMethods0(Native Method) at
java.lang.Class.privateGetDeclaredMethods(Class.java:2701) at
java.lang.Class.privateGetPublicMethods(Class.java:2902) at
java.lang.Class.getMethods(Class.java:1615) Caused by:
java.lang.ClassNotFoundException:
com.amazonaws.services.lambda.runtime.events.SNSEvent at
java.net.URLClassLoader.findClass(URLClassLoader.java:381) at
java.lang.ClassLoader.loadClass(ClassLoader.java:424) at
java.lang.ClassLoader.loadClass(ClassLoader.java:357)
I use Maven + NetBeans and it's a Maven Java Application project. I downloaded the function from the Lambda console and confirmed that the jar has a lib/ directory with all of the jars for the imports, including aws-lambda-java-events-1.1.0.jar, which itself includes the /com/amazonaws/services/lambda/runtime/events/SNSEvent.class file.
Why is the runtime unable to find the class when it's definitely in the jar file? Is there anything else I need to do, set any environment variables, etc?
Any help would be appreciated!
EDIT 1
I tried downgrading to aws-lambda-java-events 1.0.0 and it's still reporting the same exception. As requested, below is my POM file (with just the project name changed). I don't know how to tell Maven to put the libraries in a tree structure.
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.app</groupId>
<artifactId>Handler</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-lambda</artifactId>
<version>1.10.6</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-core</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-lambda-java-events</artifactId>
<version>1.0.0</version>
</dependency>
</dependencies>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
Use the maven-shade-plugin so that the dependencies are packaged into the JAR as an uber-jar.
So, add this to your pom.xml:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Source: http://docs.aws.amazon.com/lambda/latest/dg/java-create-jar-pkg-maven-no-ide.html
You may potentially be hitting this issue, https://github.com/aws/aws-lambda-java-libs/issues/2, which requires a downgrade to aws-lambda-java-events-1.0.0.jar.
=== If this issue persists even after including the shaded jar ===
If you have this issue even with the shaded jar, then it is most likely related to the aws-lambda-java-events package version (some incompatibility between the AWS Lambda runtime and newer aws-lambda-java-events versions). For example, I had this issue with the latest version (2.0.2) of the aws-lambda-java-events package and had to downgrade to 1.3.0.
The newer aws-lambda-java-events versions don't seem to have many dependencies.
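For example, pinning the events dependency to one of the versions mentioned above looks like this (choose whichever version works with your setup):

<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-lambda-java-events</artifactId>
  <!-- downgraded from 2.0.2; the GitHub issue linked above required going back to 1.0.0 -->
  <version>1.3.0</version>
</dependency>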
Sometimes you have to upload your Lambda again. I also hit the same issue and fixed it with this pom.xml:
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-bom</artifactId>
<version>1.11.83</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
In the plugins section of your pom.xml, add the Apache Maven Shade Plugin. It is used during the build process to package a standalone .jar: the maven-shade-plugin takes the artifacts (jars) produced by the package goal and creates a standalone .jar that contains the compiled code and the resolved dependencies from the pom.xml.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.0.0</version>
</plugin>
If you have <scope>provided</scope> on the aws-lambda-java-events artifact, remove it.
Whenever we upload a Java-based jar or zip to the AWS Lambda console, we have to take care of some basics, such as the CodeUri and the handler, both of which are set in the SAM template (template.yml).
For example:
runwayDetails - package name
App - class name
handleRequest - Lambda handler method
The handler syntax should be packageName.className::methodName, i.e. runwayDetails.App::handleRequest here.
This will solve the error.
Upload the function.zip instead of the jar to Lambda.

Can maven surefire plugin run multiple test executions when invoked directly? (Sonar + Jenkins not running all unit tests)

I have a maven project that uses the surefire plugin for unit testing. Currently the testing is split into two executions default-test and unitTests, both bound to the test goal, as follows:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<executions>
<!-- Test one set of files -->
<execution>
<id>default-test</id>
<goals>
<goal>test</goal>
</goals>
<configuration>
<includes> ... </includes>
</configuration>
</execution>
<!-- Test a different set of files -->
<execution>
<id>unitTests</id>
<goals>
<goal>test</goal>
</goals>
<configuration>
<includes> ... </includes>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
This works fine when running mvn test:
$ mvn test
...
[INFO] maven-surefire-plugin:2.13:test (default-test)
...
[INFO] maven-surefire-plugin:2.13:test (unitTests)
I am using the Sonar plugin on Jenkins to generate code metrics, and Jenkins is set up as follows:
run the build action mvn clean install -DskipTests
run the Sonar plugin as a post-build action
This means that the Sonar plugin runs the unit tests, and the unit tests are run once only. And I see the following output when the Sonar plugin runs:
[INFO] maven-surefire-plugin:2.13:test (default-cli)
Note that only the default-cli execution is invoked rather than default-test + unitTests, because (I assume) the surefire plugin is invoked directly via mvn surefire:test instead of via the mvn test lifecycle.
I can add a default-cli execution to the POM file, and copy the configuration from the default-test execution, but this will only run one set of unit tests. The unit tests configured in the unitTests execution are not run at all, even though that execution is bound to the same goal.
How can I ensure that both the default-cli and unitTests executions are invoked when the Sonar plugin invokes mvn surefire:test?
I am thinking that perhaps my Jenkins setup should change, so that the unit tests are run in the normal build action, which generates code coverage reports, and then the Sonar plugin runs as a post-build action and loads these reports to perform analysis. Not sure if this is possible, or what changes are required to my POM files to support this (hoping to make minimal changes to POM files is another goal).
I am thinking that perhaps my Jenkins setup should change, so that the
unit tests are run in the normal build action, which generates code
coverage reports, and then the Sonar plugin runs as a post-build
action and loads these reports to perform analysis.
That seems like a good idea. You could use Maven profiles to do this: configure the surefire plugin in different profiles and make the one you want your tools to use active by default.
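A rough sketch of that setup, reusing the two executions from the question inside a default-active profile (the profile id is made up; keep your own includes):

<profiles>
  <profile>
    <id>all-unit-tests</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <executions>
            <!-- the same two executions as before, now only active in this profile -->
            <execution>
              <id>default-test</id>
              <goals>
                <goal>test</goal>
              </goals>
              <configuration>
                <includes> ... </includes>
              </configuration>
            </execution>
            <execution>
              <id>unitTests</id>
              <goals>
                <goal>test</goal>
              </goals>
              <configuration>
                <includes> ... </includes>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>

The regular Jenkins build (mvn clean install without -DskipTests) would then run both executions and produce the coverage reports, and the Sonar analysis could be pointed at those existing reports instead of re-running the tests itself.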