Run JUnit tests with SBT

I have an SBT 0.13.7 project with several sub-projects.
One of them is called webapp, and it has many JUnit tests in webapp/src/test/java.
When running:
sbt webapp/test
only the ScalaTest tests are run, but no JUnit tests.
Snippet of my build.sbt file:
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % Test
)
lazy val webapp = project
  .settings(
    Seq(
      projectDependencies ++= Seq(
        ....
        "org.scalatest" %% "scalatest" % "2.2.2" % Test,
        "junit" % "junit" % "4.11" % Test,
        "com.novocode" % "junit-interface" % "0.11" % Test
      )
    ): _*
  )
Example JUnit test:
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class CodificadorBase64Test {
    @Test
    public void testPlain() {
        byte b[] = {64, 127, 72, 36, 100, 1, 5, 9, 123};
        assertEquals("QH9IJGQBBQl7", CodificadorBase64.encode(b));
    }
}
UPDATE (some more research):
> webapp/testFrameworks
[info] List(TestFramework(WrappedArray(org.scalacheck.ScalaCheckFramework)), TestFramework(WrappedArray(org.specs2.runner.Specs2Framework, org.specs2.runner.SpecsFramework)), TestFramework(WrappedArray(org.specs.runner.SpecsFramework)), TestFramework(WrappedArray(org.scalatest.tools.Framework, org.scalatest.tools.ScalaTestFramework)), TestFramework(WrappedArray(com.novocode.junit.JUnitFramework)))
show webapp/loadedTestFrameworks
[info] Map(TestFramework(WrappedArray(
org.scalatest.tools.Framework,
org.scalatest.tools.ScalaTestFramework)
) -> org.scalatest.tools.Framework#65767aeb)
So JUnit support is known to SBT but not loaded.
Debug output:
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1#3ad42aff))
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1#97f54b))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2#6a589982))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2#1b95d5e6))
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1#5c997dac))
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1#406c43ef))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2#282ddefc))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2#4400c80))
Working with:
SBT 0.13.9, and
JUnit 4.x.
Related information:
Why don't junit tests get executed with "sbt test"?
SBT documentation

Finally, I've discovered that I have to add the following settings to the subproject:
lazy val webapp = project
  .settings(
    Seq(
      projectDependencies ++= Seq(
        ....
        "org.scalatest" %% "scalatest" % "2.2.2" % Test,
        "junit" % "junit" % "4.11" % Test,
        "com.novocode" % "junit-interface" % "0.11" % Test
      ),
      crossPaths := false
    ): _*
  )
It is important to declare the junit-interface dependency in the subproject and, in addition, to set the crossPaths setting to false (note that crossPaths is a setting, not a dependency, so it sits alongside the dependency list, not inside it).
The clue was given by this issue.
If the main project doesn't have JUnit tests, the test settings don't need to be provided there.
In addition, to see the failing method and the cause of the failure, we need this setting:
testOptions in Test := Seq(Tests.Argument(TestFrameworks.JUnit, "-a"))
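Putting the pieces together, here is a minimal sketch of the subproject definition (illustrative only; it uses the standard libraryDependencies key rather than projectDependencies and reuses the artifacts listed above):
lazy val webapp = project
  .settings(
    libraryDependencies ++= Seq(
      "org.scalatest" %% "scalatest" % "2.2.2" % Test,
      "junit" % "junit" % "4.11" % Test,
      "com.novocode" % "junit-interface" % "0.11" % Test
    ),
    // crossPaths := false is the extra setting needed for the JUnit tests to be picked up
    crossPaths := false,
    // "-a" makes junit-interface report the failing method and the assertion stack trace
    testOptions in Test := Seq(Tests.Argument(TestFrameworks.JUnit, "-a"))
  )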

This question about supporting JUnit in SBT has been asked multiple times with slightly different framings.
Multiple hits made it hard for me to find the simplest and most current answer.
This answer by @david.perez seems clear and works with current (2018) SBT 1.1.4.
(That particular question was about conflicting JUnit versions. The exclude("junit", "junit-dep") may not be necessary.)
I'll also copy-paste the code here for quick access:
libraryDependencies ++= Seq(
  "junit" % "junit" % "4.12" % Test,
  "com.novocode" % "junit-interface" % "0.11" % Test exclude("junit", "junit-dep")
)
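If you also want the failing method and assertion details (as in the earlier answer), the same junit-interface argument can be written with SBT 1.x slash syntax; a minimal sketch:
// "-a" tells junit-interface to print assertion stack traces for failing tests
Test / testOptions += Tests.Argument(TestFrameworks.JUnit, "-a")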

Related

object stream is not a member of package akka

I have the below build.sbt file:
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.7"

val akkaVersion = "2.6.18"

lazy val root = (project in file("."))
  .settings(
    name := "akka-sbt-multijvm-issue"
  )

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % akkaVersion,
  "com.typesafe.akka" %% "akka-stream" % akkaVersion,
  "com.typesafe.akka" %% "akka-cluster" % akkaVersion
)
I have the below main code:
package com.example

object MaterializerApp extends App {
  import akka.stream.Materializer
}
When I compile the code I get the below error:
sbt clean compile
[info] welcome to sbt 1.6.1 (Azul Systems, Inc. Java 11.0.12)
[info] loading global plugins from /Users/rajkumar.natarajan/.sbt/1.0/plugins
[info] loading settings for project akka-sbt-multijvm-issue-build from plugins.sbt ...
[info] loading project definition from /Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/project
[info] loading settings for project root from build.sbt ...
[info] set current project to akka-sbt-multijvm-issue (in build file:/Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 0 s, completed Jan 5, 2022, 9:13:16 PM
[info] compiling 1 Scala source to /Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/target/scala-2.13/classes ...
[error] /Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/src/main/scala/com/example/MaterializerApp.scala:5:15: object stream is not a member of package akka
[error] import akka.stream.Materializer
^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 4 s, completed Jan 5, 2022, 9:13:21 PM
Note: When I change the akkaVersion to 2.6.17, the compilation succeeds.
How can I fix this error?

Spark Unit Testing

My entire build.sbt is:
name := """sparktest"""
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.11.8"
scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8", "-Xexperimental")
parallelExecution in Test := false
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.2",
  "org.apache.spark" %% "spark-sql" % "2.0.2",
  "org.apache.avro" % "avro" % "1.8.1",
  "org.scalatest" %% "scalatest" % "3.0.1" % "test",
  "com.holdenkarau" %% "spark-testing-base" % "2.0.2_0.4.7" % "test"
)
I have a simple test. Obviously this is just a starting point; I'd like to test more:
package sparktest

import com.holdenkarau.spark.testing.DataFrameSuiteBase
import org.scalatest.FunSuite

class SampleSuite extends FunSuite with DataFrameSuiteBase {
  test("simple test") {
    assert(1 + 1 === 2)
  }
}
I run sbt clean test and get a failure with:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars
For my dev environment, I'm using spark-2.0.2-bin-hadoop2.7.tar.gz.
Do I have to configure this environment in any way? Obviously HiveConf is a transitive Spark dependency.
As @daniel-de-paula mentions in the comments, you will need to add spark-hive as an explicit dependency (you can restrict this to the test scope if you aren't using Hive in your application itself). spark-hive is not a transitive dependency of spark-core, which is why this error happened. spark-hive is excluded from spark-testing-base as a dependency so that people who are doing RDD-only tests don't need to add it as a dependency.
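A minimal sketch of that extra dependency, assuming the same Spark version (2.0.2) as in the build above and restricting it to the test scope:
// spark-hive provides org.apache.hadoop.hive.conf.HiveConf on the test classpath
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.0.2" % "test"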

Stack trace: ExitCodeException exitCode=1 when starting MapReduce job on Bigtable

We are using Google Cloud Bigtable for our Big Data.
When I run a MapReduce job I assemble a jar and run it, and now I'm getting this error:
Application application_1451577928704_0050 failed 2 times due to AM
Container for appattempt_1451577928704_0050_000002 exited with
exitCode: 1 For more detailed output, check application tracking
page:http://censored:8088/cluster/app/application_1451577928704_0050Then,
click on links to logs of each attempt. Diagnostics: Exception from
container-launch. Container id:
container_e02_1451577928704_0050_02_000001 Exit code: 1 Stack trace:
ExitCodeException exitCode=1: at
org.apache.hadoop.util.Shell.runCommand(Shell.java:545) at
org.apache.hadoop.util.Shell.run(Shell.java:456) at
org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at
org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at
org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:266) at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745) Container exited with a
non-zero exit code 1 Failing this attempt. Failing the application.
When I logged in to look at the worker node's logs I saw this error:
2016-02-15 02:59:54,106 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster
for application appattempt_1451577928704_0050_000001 2016-02-15
02:59:54,294 WARN [main] org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable 2016-02-15 02:59:54,319 INFO
[main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with
tokens: 2016-02-15 02:59:54,319 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind:
YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id {
id: 50 cluster_timestamp: 1451577928704 } attemptId: 1 } keyId:
-******) 2016-02-15 02:59:54,424 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred
newApiCommitter. 2016-02-15 02:59:54,755 WARN [main]
org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory: The
short-circuit local reads feature cannot be used because libhadoop
cannot be loaded. 2016-02-15 02:59:54,855 INFO [main]
org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in
config null 2016-02-15 02:59:54,911 INFO [main]
org.apache.hadoop.service.AbstractService: Service
org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED;
cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException:
java.lang.ClassCastException:
org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to
org.w3c.dom.Text
org.apache.hadoop.yarn.exceptions.YarnRuntimeException:
java.lang.ClassCastException:
org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to
org.w3c.dom.Text at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:478)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
at
org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
at java.security.AccessController.doPrivileged(Native Method) at
javax.security.auth.Subject.doAs(Subject.java:422) at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
Caused by: java.lang.ClassCastException:
org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to
org.w3c.dom.Text at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2603)
at
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2502)
at
org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:981)
at
org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1031)
at
org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1432)
at
org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:67)
at
org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:81)
at
org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:96)
at
org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:105)
at
org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:184)
at
org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
at
org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:474)
... 11 more
I tried an older jar and it runs perfectly fine, and I'm not sure why the new jar won't work; I didn't change anything.
Please advise.
Thanks!
Update 1: Here are some more details:
I set up the cluster with Dataproc.
We are using the newest versions; here are the library dependencies:
val BigtableHbase = "com.google.cloud.bigtable" % "bigtable-hbase-1.1" % "0.2.2"
val BigtableHbaseMapreduce = "com.google.cloud.bigtable" % "bigtable-hbase-mapreduce" % "0.2.2"
val CommonsCli = "commons-cli" % "commons-cli" % "1.2"
val HadoopCommon = "org.apache.hadoop" % "hadoop-common" % "2.7.1"
val HadoopMapreduceClientApp = "org.apache.hadoop" % "hadoop-mapreduce-client-app" % "2.7.1"
val HbaseCommon = "org.apache.hbase" % "hbase-common" % "1.1.2"
val HbaseProtocol = "org.apache.hbase" % "hbase-protocol" % "1.1.2"
val HbaseClient = "org.apache.hbase" % "hbase-client" % "1.1.2"
val HbaseServer = "org.apache.hbase" % "hbase-server" % "1.1.2"
val HbaseAnnotations = "org.apache.hbase" % "hbase-annotations" % "1.1.2"

libraryDependencies += BigtableHbase
libraryDependencies += BigtableHbaseMapreduce
libraryDependencies += CommonsCli
libraryDependencies += HadoopCommon
libraryDependencies += HadoopMapreduceClientApp
libraryDependencies += HbaseCommon
libraryDependencies += HbaseProtocol
libraryDependencies += HbaseClient
libraryDependencies += HbaseServer
libraryDependencies += HbaseAnnotations
Java version:
openjdk version "1.8.0_66-internal"
OpenJDK Runtime Environment (build 1.8.0_66-internal-b17)
OpenJDK 64-Bit Server VM (build 25.66-b17, mixed mode)
Alpn version: alpn-boot-8.1.3.v20150130
hbase version:
2016-02-15 20:45:42,050 INFO [main] util.VersionInfo: HBase 1.1.2
2016-02-15 20:45:42,051 INFO [main] util.VersionInfo: Source code repository file:///mnt/ram/bigtop/bigtop/output/hbase/hbase-1.1.2 revision=Unknown
2016-02-15 20:45:42,051 INFO [main] util.VersionInfo: Compiled by bigtop on Tue Nov 10 19:09:17 UTC 2015
2016-02-15 20:45:42,051 INFO [main] util.VersionInfo: From source with checksum 42e8a1890c700d37485c69a44a3
hadoop version:
Hadoop 2.7.1
Subversion https://bigdataoss-internal.googlesource.com/third_party/apache/bigtop -r 2a194d4d838b79460c3ceb892f3c9444218ba970
Compiled by bigtop on 2015-11-10T18:38Z
Compiled with protoc 2.5.0
From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a
This command was run using /usr/lib/hadoop/hadoop-common-2.7.1.jar
I found the problem in my case!
The hbase-site.xml was slightly different in the hbase.client.connection.impl property.
<property>
  <name>hbase.client.connection.impl</name>
  <value>com.google.cloud.bigtable.hbase1_1.BigtableConnection</value>
</property>
I got to this after extracting and comparing the two jars.
The newer versions of the bigtable client jar include newer versions of the gRPC jar. Newer versions of the gRPC jar depend on newer versions of alpn-boot or OpenSSL. In addition to a new version of the bigtable jar, you may need a new version of the alpn-boot jar. Unfortunately, the Jetty team isn't making new alpn-boot jars for Java 7, which bdutil depends on.
We are actively working on moving away from bdutil to Dataproc, which is the newer version of Google Cloud Hadoop management. Dataproc uses Java 8 and doesn't have the same problems as bdutil. There are still kinks we need to work out.
More information can be found at:
https://cloud.google.com/dataproc/examples/cloud-bigtable-example
and
https://github.com/grpc/grpc-java/blob/master/SECURITY.md

Tests fail when run with SBT on the command line but not when run in the IDE

When I run the unit tests of my Flink application through the IntelliJ IDE they pass without any issue. When I run them through SBT though, a few exceptions are thrown (see below). What may the cause for these exceptions be? I've been unable to track them down.
Edit: It's worth noting that the project in IntelliJ was created as an "sbt project", so the way the IDE knows about the project dependencies is also through the build.sbt file. Why is this file not enough when running sbt from the command line?
$ sbt clean test
[info] Loading global plugins from /Users/myuser/.sbt/0.13/plugins
[info] Loading project definition from /Users/myuser/projects/anonymizer/project
[info] Set current project to anonymizer (in build file:/Users/myuser/projects/anonymizer/)
[success] Total time: 8 s, completed Jan 20, 2016 6:15:20 PM
[info] Updating {file:/Users/myuser/projects/anonymizer/}anonymizer...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 8 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/classes...
[info] Compiling 2 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/test-classes...
[error] Test myorg.mypackage.TestMyClass failed: java.nio.file.FileAlreadyExistsException: /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/TestBaseUtils-logdir9030651763276830933.tmp/jobmanager.out, took 0.0 sec
[error] at sun.nio.fs.UnixException.translateToIOException(UnixException.java:88)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
[error] at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
[error] at java.nio.file.Files.newByteChannel(Files.java:361)
[error] at java.nio.file.Files.createFile(Files.java:632)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:136)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:124)
[error] at org.apache.flink.streaming.util.StreamingMultipleProgramsTestBase.setup(StreamingMultipleProgramsTestBase.java:72)
[error] ...
2016-01-20 18:15:38 INFO FlinkMiniCluster:230 - Starting FlinkMiniCluster.
2016-01-20 18:15:38 INFO Slf4jLogger:80 - Slf4jLogger started
2016-01-20 18:15:38 INFO BlobServer:94 - Created BLOB server storage directory /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/blobStore-07aa1e12-785d-4a18-bb16-6c2664f6b4f2
[...]
2016-01-20 18:15:39 INFO Task:470 - Loading JAR files for task Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO Task:858 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) switched to FAILED with exception.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO Task:672 - Freeing task resources for Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO TestingTaskManager:128 - Unregistering task and sending final execution state FAILED to JobManager for task Source: Collection Source -> Flat Map -> Sink: Unnamed (717fa0d09f9592d62db7b9e52f08de6e)
2016-01-20 18:15:39 INFO ExecutionGraph:934 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) (717fa0d09f9592d62db7b9e52f08de6e) switched from DEPLOYING to FAILED
2016-01-20 18:15:39 INFO TestingJobManager:137 - Status of job f8d59cfb76aaaeb016a14120125338fd (Flink Streaming Job) changed to FAILING.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Source: Collection Source -> Flat Map -> Sink: Unnamed(1/1) switched to FAILED
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Job execution switched to status FAILING.
[...]
This is my build.sbt file:
name := "anonymizer"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"org.apache.flink" % "flink-streaming-scala_2.11" % "0.10.1",
"org.apache.flink" % "flink-clients_2.11" % "0.10.1",
"org.apache.flink" % "flink-connector-kafka_2.11" % "0.10.1",
"com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.3"
)
// Dependencies needed by the unit tests when using junit
libraryDependencies ++= Seq(
"com.novocode" % "junit-interface" % "0.11" % "test",
"org.apache.flink" % "flink-streaming-contrib_2.11" % "0.10.1",
"org.apache.flink" % "flink-streaming-java_2.11" % "0.10.1" % "test" classifier "tests",
"org.apache.flink" % "flink-core_2.11" % "0.10.1" % "test" classifier "tests",
"org.apache.flink" % "flink-runtime_2.11" % "0.10.1" % "test" classifier "tests",
"org.apache.flink" % "flink-test-utils_2.11" % "0.10.1" % "test"
)

No unit test success reported to sonar

A Gradle build creates a JaCoCo report which is picked up by Sonar Runner. After the Sonar results are pushed to the Sonar server, unit test coverage is displayed, but unit test success shows 0 tests even though all tests run successfully.
Looking at the sonar runner log I see the following items reporting the source and classes locations:
15:52:36.087 INFO - Source dirs: /poc-sonar/src/main/java
15:52:36.087 INFO - Test dirs: /poc-sonar/src/test/java, /Volumes/Disk/Development/poc-sonar/src/test/groovy
15:52:36.088 INFO - Binary dirs: /poc-sonar/build/classes/main
This raises the first question: does Sonar have to see the compiled test classes for analysis?
Further down the log:
15:52:37.435 INFO - Sensor JaCoCoSensor...
15:52:37.445 INFO - Analysing /poc-sonar/build/jacoco/test.exec
15:52:37.546 INFO - No information about coverage per test.
15:52:37.548 INFO - Sensor JaCoCoSensor done: 113 ms
15:52:38.105 INFO - Execute decorators...
15:52:38.580 INFO - Store results in database
JaCoCo has picked up the test.exec file, but reports "No information about coverage per test".
What does that log statement mean? The Sonar server actually shows correct coverage! Is it an indicator for the missing test success reported by Sonar? What is missing for getting the unit test success reported?
Unit Tests Coverage
50,0%
50,0% line coverage
Unit test success
0 tests
The full Gradle build script:
ext {
    spockVersion = '0.7-groovy-2.0'
    groovyVersion = '2.2.1'
}

apply plugin: 'idea'
apply plugin: 'java'
apply plugin: 'groovy'
apply plugin: 'jacoco'
apply plugin: 'sonar-runner'

group = "poc"
version = "1.0.0-SNAPSHOT"
sourceCompatibility = 1.7
targetCompatibility = 1.7

repositories {
    maven {
        credentials {
            username "${artifactoryUsername}"
            password "${artifactoryPassword}"
        }
        url "${artifactoryContextUrl}"
    }
}

dependencies {
    testCompile "junit:junit-dep:4.11"
    testCompile "org.codehaus.groovy:groovy-all:$groovyVersion"
    testCompile "org.spockframework:spock-core:$spockVersion"
}

tasks.withType(Test) { task ->
    jacoco {
        destinationFile = file("$buildDir/jacoco/${task.name}.exec")
    }
}

sonarRunner {
    sonarProperties {
        property 'sonar.projectName', rootProject.name
        property 'sonar.projectDescription', rootProject.name

        // sonar server and database
        property "sonar.host.url", sonarHostUrl
        property "sonar.jdbc.url", sonarJdbcUrl
        //property "sonar.jdbc.driverClassName", "com.mysql.jdbc.Driver"
        property "sonar.jdbc.username", sonarJdbcUsername
        property "sonar.jdbc.password", sonarJdbcPassword

        property 'sonar.sourceEncoding', 'UTF-8'
    }
}

tasks.sonarRunner.dependsOn = []
In recent Sonar versions, the Sonar property for test report location has been renamed from sonar.surefire.reportsPath to sonar.junit.reportsPath. Hence you may have to set the latter manually. For example:
apply plugin: "sonar"
subprojects {
apply plugin: "java"
sonarRunner {
sonarProperties {
property "sonar.junit.reportsPath", test.reports.junitXml.destination
}
}
}