object stream is not a member of package akka

I have the below build.sbt file:

ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.13.7"

val akkaVersion = "2.6.18"

lazy val root = (project in file("."))
  .settings(
    name := "akka-sbt-multijvm-issue"
  )

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % akkaVersion,
  "com.typesafe.akka" %% "akka-stream" % akkaVersion,
  "com.typesafe.akka" %% "akka-cluster" % akkaVersion
)
I have the below main code:

package com.example

object MaterializerApp extends App {
  import akka.stream.Materializer
}
When I compile the code, I get the below error:
sbt clean compile
[info] welcome to sbt 1.6.1 (Azul Systems, Inc. Java 11.0.12)
[info] loading global plugins from /Users/rajkumar.natarajan/.sbt/1.0/plugins
[info] loading settings for project akka-sbt-multijvm-issue-build from plugins.sbt ...
[info] loading project definition from /Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/project
[info] loading settings for project root from build.sbt ...
[info] set current project to akka-sbt-multijvm-issue (in build file:/Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 0 s, completed Jan 5, 2022, 9:13:16 PM
[info] compiling 1 Scala source to /Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/target/scala-2.13/classes ...
[error] /Users/rajkumar.natarajan/Documents/Coding/akka-sbt-multijvm-issue/src/main/scala/com/example/MaterializerApp.scala:5:15: object stream is not a member of package akka
[error] import akka.stream.Materializer
^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 4 s, completed Jan 5, 2022, 9:13:21 PM
Note: When I change the akkaVersion to 2.6.17, the compilation succeeds.
How can I fix this error?
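One way to narrow this down (a sketch, not a confirmed fix) is to check what sbt actually resolved, since this compile error is often a symptom of a missing or partially downloaded artifact rather than of the code itself. sbt 1.4+ bundles a dependency-tree plugin:

// project/plugins.sbt
addDependencyTreePlugin

Then sbt "Compile / dependencyTree" should show com.typesafe.akka:akka-stream_2.13:2.6.18 on the compile classpath; if it is absent, clearing the local Ivy/Coursier caches and resolving again usually surfaces the underlying resolution error.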

Related

Unresolved dependencies in Akka http

I keep getting unresolved dependencies with the code below. Any clue what I can do to clear the error?
name := "AkkaDemo"
version := "1.0"
scalaVersion := "2.11.8"
val scalaTestVersion = "3.0.1"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
lazy val akkademoService = project.settings (libraryDependencies ++= Seq(
"mysql" % "mysql-connector-java" % "5.1.25",
"com.typesafe.slick" %% "slick"% "3.1.0",
"com.typesafe.slick" %% "slick-hikaricp" % "3.1.0",
"com.typesafe.akka" %% "akka-actor" % "2.4.16",
"com.typesafe.akka" %% "akka-http" % "10.0.1",
"com.typesafe.akka" % "akka-slf4j" % "2.3.14"
)).
dependsOn(instanceConfig)
lazy val instanceConfig = project
lazy val AkkaDemo = project.in(file(".")).aggregate(instanceConfig, akkademoService)
Here is the sbt output for the sbt run:
Error: Error while importing SBT project:
...
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.typesafe.akka#akka-actor_2.10;2.4.16: not found
[warn] :: com.typesafe.akka#akka-slf4j;2.3.14: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
This was solved by adding a build.sbt with the scalaVersion to use to each module. Apparently each module without a build.sbt specifying the Scala version to use defaults to 2.10.
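An alternative to adding a build.sbt to every module (a sketch, using the 0.13-era syntax of the question) is to set the Scala version once in the root build.sbt, from where it propagates to all subprojects. The second unresolved dependency has a different cause: akka-slf4j is a Scala library, so it needs %% (and a version aligned with the other Akka modules) for the _2.11 suffix to be appended:

// Root build.sbt: applies to instanceConfig and akkademoService as well
scalaVersion in ThisBuild := "2.11.8"

// %% appends the Scala suffix the resolver was missing; 2.4.16 keeps
// all Akka modules on the same version
"com.typesafe.akka" %% "akka-slf4j" % "2.4.16"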

Not able to start Jetty in embedded mode

I am trying to start Jetty in embedded mode to deploy a war file. I am using the Jetty libs, version 9.4.6.
I have the following task created in Gradle for starting Jetty and deploying the web application.
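// Note: these classes need imports at the top of the build script, e.g.:
// import org.eclipse.jetty.server.Server
// import org.eclipse.jetty.server.ServerConnector
// import org.eclipse.jetty.webapp.WebAppContext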
println 'Starting Jetty............'
project.ext.server = new Server();
ServerConnector connector = new ServerConnector(project.ext.server);
connector.setPort(jettyPort);
project.ext.server.addConnector(connector);
WebAppContext webapp = new WebAppContext()
webapp.setContextPath('/')
def warPath = 'build/libs/';
warPath += 'test-' + project.version + '.war';
println("Deploying WAR File : --> ${warPath}");
webapp.setWar(warPath)
project.ext.server.setHandler(webapp);
project.ext.server.start();
println 'Server started, waiting...'
new StopMonitor(jettyStopPort, project.ext.server).start();
println 'Jetty started.'
But the above script fails with the following error:
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.eclipse.jetty.server.session.SessionHandler
The exact line from the script that is failing is:
WebAppContext webapp = new WebAppContext()
Even if I keep this line as the only line in the script and remove everything else, I get the same error.
Interestingly, the class it is complaining about is present in the jetty-server jar file. The same script used to work with the Jetty 8.1 libs.
Note: In order to make the script work with Jetty 9.4, I had to use the ServerConnector class instead of BlockingConnector, which was removed in Jetty 9.4; the rest of the script is the same.
I am not sure why this is failing.
You are probably missing required jar files.
I would strongly encourage you to use a proper build tool; you have many to choose from.
Here's the jar dependency list (in tree form) for the example project at:
https://github.com/jetty-project/embedded-servlet-3.1
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ embedded-servlet-3.1 ---
[INFO] org.eclipse.jetty.demo:embedded-servlet-3.1:war:1-SNAPSHOT
[INFO] +- javax.servlet:javax.servlet-api:jar:3.1.0:compile
[INFO] +- org.eclipse.jetty:jetty-webapp:jar:9.4.6.v20170531:compile
[INFO] |  +- org.eclipse.jetty:jetty-xml:jar:9.4.6.v20170531:compile
[INFO] |  |  \- org.eclipse.jetty:jetty-util:jar:9.4.6.v20170531:compile
[INFO] |  \- org.eclipse.jetty:jetty-servlet:jar:9.4.6.v20170531:compile
[INFO] |     \- org.eclipse.jetty:jetty-security:jar:9.4.6.v20170531:compile
[INFO] |        \- org.eclipse.jetty:jetty-server:jar:9.4.6.v20170531:compile
[INFO] |           +- org.eclipse.jetty:jetty-http:jar:9.4.6.v20170531:compile
[INFO] |           \- org.eclipse.jetty:jetty-io:jar:9.4.6.v20170531:compile
[INFO] \- org.eclipse.jetty:jetty-annotations:jar:9.4.6.v20170531:compile
[INFO]    +- org.eclipse.jetty:jetty-plus:jar:9.4.6.v20170531:compile
[INFO]    |  \- org.eclipse.jetty:jetty-jndi:jar:9.4.6.v20170531:compile
[INFO]    +- javax.annotation:javax.annotation-api:jar:1.2:compile
[INFO]    +- org.ow2.asm:asm:jar:5.1:compile
[INFO]    \- org.ow2.asm:asm-commons:jar:5.1:compile
[INFO]       \- org.ow2.asm:asm-tree:jar:5.1:compile
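If the Jetty classes are loaded inside the Gradle build itself (as in the task above), the equivalent artifacts have to be on the buildscript classpath, not just in a project configuration. A minimal sketch, assuming Maven Central hosts the same coordinates shown in the tree above:

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        // jetty-webapp transitively brings in jetty-servlet, jetty-security,
        // jetty-server, jetty-http, jetty-io, jetty-xml and jetty-util
        classpath 'org.eclipse.jetty:jetty-webapp:9.4.6.v20170531'
        classpath 'javax.servlet:javax.servlet-api:3.1.0'
    }
}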

Stack trace: ExitCodeException exitCode=1 when starting MapReduce job on Bigtable

We are using Google Cloud Bigtable for our Big Data.
When I run a MapReduce job, I assemble a jar and run it, and now I'm getting this error:
Application application_1451577928704_0050 failed 2 times due to AM Container for appattempt_1451577928704_0050_000002 exited with exitCode: 1
For more detailed output, check application tracking page: http://censored:8088/cluster/app/application_1451577928704_0050 Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e02_1451577928704_0050_02_000001
Exit code: 1
Stack trace: ExitCodeException exitCode=1:
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
    at org.apache.hadoop.util.Shell.run(Shell.java:456)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
When I logged in to look at the worker node's logs, I saw this error:
2016-02-15 02:59:54,106 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1451577928704_0050_000001
2016-02-15 02:59:54,294 WARN [main] org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-02-15 02:59:54,319 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
2016-02-15 02:59:54,319 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 50 cluster_timestamp: 1451577928704 } attemptId: 1 } keyId: -******)
2016-02-15 02:59:54,424 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
2016-02-15 02:59:54,755 WARN [main] org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
2016-02-15 02:59:54,855 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
2016-02-15 02:59:54,911 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.ClassCastException: org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to org.w3c.dom.Text
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.ClassCastException: org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to org.w3c.dom.Text
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:478)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
    at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
Caused by: java.lang.ClassCastException: org.apache.xerces.dom.DeferredElementNSImpl cannot be cast to org.w3c.dom.Text
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2603)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2502)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2405)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:981)
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1031)
    at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1432)
    at org.apache.hadoop.hbase.HBaseConfiguration.checkDefaultsVersion(HBaseConfiguration.java:67)
    at org.apache.hadoop.hbase.HBaseConfiguration.addHbaseResources(HBaseConfiguration.java:81)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:96)
    at org.apache.hadoop.hbase.HBaseConfiguration.create(HBaseConfiguration.java:105)
    at org.apache.hadoop.hbase.mapreduce.TableOutputFormat.setConf(TableOutputFormat.java:184)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
    at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:474)
    ... 11 more
I tried an older jar and it runs perfectly fine, and I'm not sure why the new jar won't work; I didn't change anything.
Please advise?
Thanks!
Update 1: Here are some more details:
I set up the cluster with Dataproc.
We are using the newest versions; here are the library dependencies:
val BigtableHbase = "com.google.cloud.bigtable" % "bigtable-hbase-1.1" % "0.2.2"
val BigtableHbaseMapreduce = "com.google.cloud.bigtable" % "bigtable-hbase-mapreduce" % "0.2.2"
val CommonsCli = "commons-cli" % "commons-cli" % "1.2"
val HadoopCommon = "org.apache.hadoop" % "hadoop-common" % "2.7.1"
val HadoopMapreduceClientApp = "org.apache.hadoop" % "hadoop-mapreduce-client-app" % "2.7.1"
val HbaseCommon = "org.apache.hbase" % "hbase-common" % "1.1.2"
val HbaseProtocol = "org.apache.hbase" % "hbase-protocol" % "1.1.2"
val HbaseClient = "org.apache.hbase" % "hbase-client" % "1.1.2"
val HbaseServer = "org.apache.hbase" % "hbase-server" % "1.1.2"
val HbaseAnnotations = "org.apache.hbase" % "hbase-annotations" % "1.1.2"

libraryDependencies += BigtableHbase
libraryDependencies += BigtableHbaseMapreduce
libraryDependencies += CommonsCli
libraryDependencies += HadoopCommon
libraryDependencies += HadoopMapreduceClientApp
libraryDependencies += HbaseCommon
libraryDependencies += HbaseProtocol
libraryDependencies += HbaseClient
libraryDependencies += HbaseServer
libraryDependencies += HbaseAnnotations
Java version:
openjdk version "1.8.0_66-internal"
OpenJDK Runtime Environment (build 1.8.0_66-internal-b17)
OpenJDK 64-Bit Server VM (build 25.66-b17, mixed mode)
Alpn version: alpn-boot-8.1.3.v20150130
HBase version:
2016-02-15 20:45:42,050 INFO [main] util.VersionInfo: HBase 1.1.2
2016-02-15 20:45:42,051 INFO [main] util.VersionInfo: Source code repository file:///mnt/ram/bigtop/bigtop/output/hbase/hbase-1.1.2 revision=Unknown
2016-02-15 20:45:42,051 INFO [main] util.VersionInfo: Compiled by bigtop on Tue Nov 10 19:09:17 UTC 2015
2016-02-15 20:45:42,051 INFO [main] util.VersionInfo: From source with checksum 42e8a1890c700d37485c69a44a3
Hadoop version:
Hadoop 2.7.1
Subversion https://bigdataoss-internal.googlesource.com/third_party/apache/bigtop -r 2a194d4d838b79460c3ceb892f3c9444218ba970
Compiled by bigtop on 2015-11-10T18:38Z
Compiled with protoc 2.5.0
From source with checksum fc0a1a23fc1868e4d5ee7fa2b28a58a
This command was run using /usr/lib/hadoop/hadoop-common-2.7.1.jar
I found the problem in my case!
The hbase-site.xml was slightly different in the hbase.client.connection.impl property.
<property>
  <name>hbase.client.connection.impl</name>
  <value>com.google.cloud.bigtable.hbase1_1.BigtableConnection</value>
</property>
I got to this after extracting and comparing the two jars.
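For the record, each jar's bundled config can be compared without fully unpacking the archives; a quick sketch (the jar names here are hypothetical):

# print the hbase-site.xml bundled in each assembly and diff them
diff <(unzip -p old-job.jar hbase-site.xml) <(unzip -p new-job.jar hbase-site.xml)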
The newer versions of the bigtable client jar include newer versions of the gRPC jar. Newer versions of the gRPC jar depend on newer versions of alpn-boot or OpenSSL. In addition to a new version of the bigtable jar, you may need a new version of the alpn-boot jar. Unfortunately, the Jetty team isn't making new alpn-boot jars for Java 7, which bdutil depends on.
We are actively working on moving away from bdutil to dataproc, which is the newer version of Google Cloud Hadoop management. Dataproc uses Java 8, and doesn't have the same problems as bdutil. There are still kinks we need to work out.
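As a reference point for the alpn-boot requirement (paths hypothetical): alpn-boot is not an ordinary dependency; it has to be prepended to the JVM boot classpath, and its version must exactly match the JRE in use:

java -Xbootclasspath/p:/path/to/alpn-boot-8.1.3.v20150130.jar -cp job-assembly.jar com.example.MyJob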
More information can be found at:
https://cloud.google.com/dataproc/examples/cloud-bigtable-example
and
https://github.com/grpc/grpc-java/blob/master/SECURITY.md

Tests fail when run with SBT on the command line but not when run in the IDE

When I run the unit tests of my Flink application through the IntelliJ IDE, they pass without any issue. When I run them through SBT, though, a few exceptions are thrown (see below). What might be causing these exceptions? I've been unable to track them down.
Edit: It's worth noting that the project in IntelliJ was created as an "sbt project", so the way the IDE knows about the project dependencies is also through the build.sbt file. Why is this file not enough when running sbt from the command line?
$ sbt clean test
[info] Loading global plugins from /Users/myuser/.sbt/0.13/plugins
[info] Loading project definition from /Users/myuser/projects/anonymizer/project
[info] Set current project to anonymizer (in build file:/Users/myuser/projects/anonymizer/)
[success] Total time: 8 s, completed Jan 20, 2016 6:15:20 PM
[info] Updating {file:/Users/myuser/projects/anonymizer/}anonymizer...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 8 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/classes...
[info] Compiling 2 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/test-classes...
[error] Test myorg.mypackage.TestMyClass failed: java.nio.file.FileAlreadyExistsException: /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/TestBaseUtils-logdir9030651763276830933.tmp/jobmanager.out, took 0.0 sec
[error] at sun.nio.fs.UnixException.translateToIOException(UnixException.java:88)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
[error] at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
[error] at java.nio.file.Files.newByteChannel(Files.java:361)
[error] at java.nio.file.Files.createFile(Files.java:632)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:136)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:124)
[error] at org.apache.flink.streaming.util.StreamingMultipleProgramsTestBase.setup(StreamingMultipleProgramsTestBase.java:72)
[error] ...
2016-01-20 18:15:38 INFO FlinkMiniCluster:230 - Starting FlinkMiniCluster.
2016-01-20 18:15:38 INFO Slf4jLogger:80 - Slf4jLogger started
2016-01-20 18:15:38 INFO BlobServer:94 - Created BLOB server storage directory /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/blobStore-07aa1e12-785d-4a18-bb16-6c2664f6b4f2
[...]
2016-01-20 18:15:39 INFO Task:470 - Loading JAR files for task Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO Task:858 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) switched to FAILED with exception.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO Task:672 - Freeing task resources for Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO TestingTaskManager:128 - Unregistering task and sending final execution state FAILED to JobManager for task Source: Collection Source -> Flat Map -> Sink: Unnamed (717fa0d09f9592d62db7b9e52f08de6e)
2016-01-20 18:15:39 INFO ExecutionGraph:934 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) (717fa0d09f9592d62db7b9e52f08de6e) switched from DEPLOYING to FAILED
2016-01-20 18:15:39 INFO TestingJobManager:137 - Status of job f8d59cfb76aaaeb016a14120125338fd (Flink Streaming Job) changed to FAILING.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Source: Collection Source -> Flat Map -> Sink: Unnamed(1/1) switched to FAILED
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Job execution switched to status FAILING.
[...]
This is my build.sbt file:
name := "anonymizer"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-streaming-scala_2.11" % "0.10.1",
  "org.apache.flink" % "flink-clients_2.11" % "0.10.1",
  "org.apache.flink" % "flink-connector-kafka_2.11" % "0.10.1",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.3"
)

// Dependencies needed by the unit tests when using junit
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % "test",
  "org.apache.flink" % "flink-streaming-contrib_2.11" % "0.10.1",
  "org.apache.flink" % "flink-streaming-java_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-core_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-runtime_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-test-utils_2.11" % "0.10.1" % "test"
)
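One thing worth trying (a sketch, not a confirmed diagnosis): sbt runs tests in the same JVM as the build, with its own layered classloaders, and runs suites in parallel by default, whereas IntelliJ forks a plain JVM per run. Forking the tests and serialising them often addresses exactly this pair of symptoms, i.e. a ClassNotFoundException from Flink's task classloader plus a FileAlreadyExistsException from two mini clusters racing over the same temp file:

// Additions to build.sbt (0.13-style syntax, matching the file above)
fork in Test := true               // tests get their own JVM and a conventional classpath
parallelExecution in Test := false // avoid concurrent Flink mini clusters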

Run JUnit tests with SBT

I have a 0.13.7 SBT project with several sub-projects.
One of them is called webapp, and it has many JUnit tests in webapp/src/test/java.
When running:
sbt webapp/test
only the ScalaTest tests are run, but no JUnit tests.
Snippet of my build.sbt file:
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % Test
)

lazy val webapp = project
  .settings(
    Seq(
      projectDependencies ++= Seq(
        ....
        "org.scalatest" %% "scalatest" % "2.2.2" % Test,
        "junit" % "junit" % "4.11" % Test,
        "com.novocode" % "junit-interface" % "0.11" % Test
      )
    ): _*
  )
Example JUnit test:
import org.junit.Test;
import static org.junit.Assert.assertEquals; // needed for assertEquals

public class CodificadorBase64Test {

    @Test
    public void testPlain() {
        byte b[] = {64, 127, 72, 36, 100, 1, 5, 9, 123};
        assertEquals("QH9IJGQBBQl7", CodificadorBase64.encode(b));
    }
}
UPDATE (some more research):
> webapp/testFrameworks
[info] List(TestFramework(WrappedArray(org.scalacheck.ScalaCheckFramework)), TestFramework(WrappedArray(org.specs2.runner.Specs2Framework, org.specs2.runner.SpecsFramework)), TestFramework(WrappedArray(org.specs.runner.SpecsFramework)), TestFramework(WrappedArray(org.scalatest.tools.Framework, org.scalatest.tools.ScalaTestFramework)), TestFramework(WrappedArray(com.novocode.junit.JUnitFramework))
> show webapp/loadedTestFrameworks
[info] Map(TestFramework(WrappedArray(
  org.scalatest.tools.Framework,
  org.scalatest.tools.ScalaTestFramework)
) -> org.scalatest.tools.Framework@65767aeb)
So JUnit support is known by SBT but not loaded.
Debug output:
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.scalacheck.ScalaCheckFramework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.Specs2Framework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs2.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'org.specs.runner.SpecsFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Framework implementation 'com.novocode.junit.JUnitFramework' not present.
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1@3ad42aff))
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1@97f54b))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2@6a589982))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2@1b95d5e6))
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1@5c997dac))
[debug] Subclass fingerprints: List((org.scalatest.Suite,false,org.scalatest.tools.Framework$$anon$1@406c43ef))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2@282ddefc))
[debug] Annotation fingerprints: List((org.scalatest.WrapWith,false,org.scalatest.tools.Framework$$anon$2@4400c80))
Working with:
SBT 0.13.9, and
JUnit 4.x.
Related information:
Why don't junit tests get executed with "sbt test"?
SBT documentation
Finally, I've discovered that I have to add the following settings to the subproject:
lazy val webapp = project
  .settings(
    Seq(
      crossPaths := false,
      projectDependencies ++= Seq(
        ....
        "org.scalatest" %% "scalatest" % "2.2.2" % Test,
        "junit" % "junit" % "4.11" % Test,
        "com.novocode" % "junit-interface" % "0.11" % Test
      )
    ): _*
  )
It is important to declare the junit-interface dependency in the subproject and, in addition, to set the crossPaths setting to false.
The clue has been given by this issue.
If the main project doesn't have JUnit tests, then these test settings don't need to be provided there.
In addition, to see the failing method and the cause (junit-interface's -a switch prints stack traces for assertion failures), we need this setting:
testOptions in Test := Seq(Tests.Argument(TestFrameworks.JUnit, "-a"))
This question about supporting JUnit in SBT has been asked multiple times with slightly different framings.
Multiple hits made it hard for me to find the simplest and most current answer.
This answer by @david.perez seems clear and works with current (2018) SBT 1.1.4.
(That particular question was about conflicting JUnit versions. The exclude("junit", "junit-dep") may not be necessary.)
I'll also copy-paste the code here for quick access:
libraryDependencies ++= Seq(
  "junit" % "junit" % "4.12" % Test,
  "com.novocode" % "junit-interface" % "0.11" % Test exclude("junit", "junit-dep")
)