Jacoco Unit and Integration Tests coverage - individual and overall [closed] - unit-testing

I have a project (ProjectA) which contains some unit tests and integration tests.
Following is the structure.
ProjectA
- src/java (java source code)
- test/java (Junit unit tests)
- test/resources (resources required for Junit Unit tests)
- src/java-test (Integration tests)
- conf (contains .xml files required for building/testing/compiling purposes)
I run the following commands -- all of them work, but I have doubts about how the configuration in build.gradle / GRADLE_HOME/init.d/*.gradle files affects what I'm getting.
It seems like I'm missing something, and the output isn't ending up where I expect.
Commands:
- gradle clean build -- it works fine
- gradle clean build jacocoTestReport -- it works fine.
- gradle clean build integrationTest jacocoTestReport -- it works fine (if I have a tomcat instance up and running in another putty window for the same ProjectA).
After the 3rd command completes, I see an extra "build" folder and its subfolders (beyond what's checked out from source/version control) in my Jenkins job's workspace,
i.e. under JenkinsWorkspace:
/build
- classes (contains .class files for the following which are mentioned as one of the sourceSets section)
---- integrationTest
---- main
---- test
- resources (this contains all the .properties/.xml files which were under the "conf" folder in source control)
- reports (contains .xml/.html files for PMD/CheckStyle/Findbugs and Tests results for either Unit or IT Tests but NOT both).
---- checkstyle
---- findbugs
---- pmd
---- jacoco
---- tests (Note: this is plural i.e. it's not "test" which is defined as one entry in sourceSets)
- jacoco (This contains 2 .exec files, i.e. test.exec and integrationTest.exec; both have different file sizes)
---- test.exec
---- integrationTest.exec
- jacocoHtml (This folder contains lots of folders containing .html files, and mainly "index.html" under it.)
---- somexxxfolders
---- ---- somexxfolder's.html files
---- index.html
---- other etc files/folders
- test-results (This contains some .xml files BUT only for either Unit tests or Integration tests - but not for both of the test types at a given time).
i.e. if I run "gradle clean build", then you'll see Unit test related .xml files and
if I run "gradle clean build integrationTest", then Unit test .xml files are overwritten
and the .xml files I see are only related to/generated by integrationTest task.
Following is one of the common Gradle init scripts (GRADLE_HOME/init.d/some.common.gradle file):
//
//Extra file can hold global Gradle settings so that these don't have to be inserted in each project's
//specific build.gradle file.
//Filename: extraN.common<anyname>.gradle
allprojects {
apply plugin: 'java'
apply plugin: 'pmd'
apply plugin: 'findbugs'
apply plugin: 'checkstyle'
apply plugin: 'jacoco'
apply plugin: 'sonar-runner'
tasks.withType(Compile) {
options.debug = true
options.compilerArgs = ["-g"]
}
sourceSets {
main {
java {
// NOTE: If your project's build.gradle specifies the sourceSets section, the following
// values will be overwritten by what the project's build.gradle sets.
//
// If your project's sourceSet structure is different in each project, then in this
// global common .gradle file you can define srcDir for main or other sections like
// test, integrationTest etc. as shown below -- commented out. If that's the case,
// then uncomment the line below and comment out the srcDir 'src/java' line
// in the sourceSets.main.java section. The same rule applies to the other sections.
// srcDir 'no_src_dir_set_yet'
srcDir 'src/java'
}
resources {
srcDir 'conf'
}
}
test {
java {
srcDir 'test/java'
}
resources {
srcDir 'test/resources'
srcDir 'conf'
}
}
integrationTest {
java {
srcDir 'src/java-test'
}
resources {
srcDir 'conf'
}
}
}
def sonarServerUrl = "dev.sandbox.server.com"
sonarRunner {
sonarProperties {
property "sonar.host.url", "http://$sonarServerUrl:9000"
property "sonar.jdbc.url", "jdbc:h2:tcp://$sonarServerUrl:9092/sonar"
property "sonar.jdbc.driverClassName", "org.h2.Driver"
property "sonar.jdbc.username", "sonar"
property "sonar.jdbc.password", "sonar"
properties ["sonar.sources"] += sourceSets.main.allSource.srcDirs
//properties ["sonar.tests"] += sourceSets.test.java.srcDirs
properties ["sonar.tests"] += sourceSets.integrationTest.allSource.srcDirs
}
}
checkstyle {
configFile = new File(rootDir, "config/checkstyle.xml")
ignoreFailures = true
//sourceSets = [sourceSets.main, sourceSets.test, sourceSets.integrationTest]
sourceSets = [sourceSets.main]
}
findbugs {
ignoreFailures = true
sourceSets = [sourceSets.main]
}
pmd {
ruleSets = ["basic", "braces", "design"]
ignoreFailures = true
}
jacoco {
toolVersion = "0.6.2.201302030002"
reportsDir = file("$buildDir/customJacocoReportDir")
}
task testReport(type: TestReport) {
destinationDir = file("$buildDir/reports/allTests")
}
test {
jacoco {
//destinationFile = file("$buildDir/jacoco/jacocoTest.exec")
destinationFile = file("$buildDir/jacoco/test.exec")
//classDumpFile = file("$buildDir/jacoco/classpathdumps")
classDumpFile = file("$buildDir/build/classes/test")
}
}
jacocoTestReport {
group = "Reporting"
description = "Generate Jacoco coverage reports after running tests."
reports {
xml{
enabled true
destination "${buildDir}/reports/jacoco/jacoco.xml"
}
csv.enabled false
html{
enabled true
destination "${buildDir}/jacocoHtml"
}
}
additionalSourceDirs = files(sourceSets.main.allJava.srcDirs)
//additionalSourceDirs = files([sourceSets.main.allJava.srcDirs, xxxx, 'xxxxyyyy' ])
}
}
The project's build.gradle file looks like:
import com.tr.ids.gradle.CustomFileUtil
apply plugin: 'CustomSetup'
apply plugin: 'java'
apply plugin: 'customJarService'
apply plugin: 'customWarService'
sourceSets {
main {
java {
srcDir 'src/java'
}
}
test {
java {
srcDir 'test/java'
}
resources {
srcDir 'test/resources'
srcDir 'conf'
}
}
integrationTest {
java {
srcDir 'src/java-test'
}
}
}
// Read dependency lists from external files. Our custom plugin reads a dep-xxx.txt file for compile/test/war related .jar file entries
// where each entry is like: groupid:artifactid:x.x.x
// and these artifact jars are available in Artifactory
List depListCompile = customFileUtil.readIntoList( "$projectDir/dep-compile.txt" )
List depListTest = customFileUtil.readIntoList( "$projectDir/dep-testArtifacts.txt" )
List depListWar = customFileUtil.readIntoList( "$projectDir/dep-war.txt" )
// Define dependencies
dependencies {
// Compilation
compile depListCompile
// Unit Tests
testCompile depListTest
// Integration tests
// Everything from compile and testCompile targets
integrationTestCompile configurations.compile
integrationTestCompile configurations.testCompile
// Output of compiling "main" files
integrationTestCompile sourceSets.main.output
// Additional dependencies from war and others
integrationTestCompile depListTest, depListWar
// All configuration files
integrationTestRuntime files( 'conf' )
}
task deployTomcat( type: Copy, dependsOn: [ jar, compileIntegrationTestJava, warService ] ) {
from "$buildDir/customWar/${project.name}.war"
into "$projectDir/tomcat/webapps"
}
build {
dependsOn deployTomcat
}
task integrationTest( type: Test, dependsOn: cleanTest ) {
jacoco {
//destinationFile = file("$buildDir/jacoco/jacocoTest.exec")
destinationFile = file("$buildDir/jacoco/integrationTest.exec")
//classDumpFile = file("$buildDir/jacoco/classpathdumps")
classDumpFile = file("$buildDir/classes/integrationTest")
}
testClassesDir = sourceSets.integrationTest.output.classesDir
classpath = sourceSets.integrationTest.runtimeClasspath
}
apply plugin: 'eclipse'
eclipse.classpath {
// Define output directory so Eclipse does not accidentally clobber /bin
defaultOutputDir = file( 'out/classes' )
// Add integration test
plusConfigurations += configurations.integrationTestCompile
// Remove unnecessary files
file.whenMerged { classpath ->
classpath.entries.removeAll { entry -> ( entry.path.indexOf( '/build/classes/main' ) > 0 ) }
classpath.entries.removeAll { entry -> ( entry.path.indexOf( '/build/resources/main' ) > 0 ) }
}
}
My questions:
1) Why "gradle clean build integrationTest" -- which is working successfully, is overwriting test results in build/reports/tests and build/test-results folders.
2) No matter what name I give the .exec file (for the test task in the common gradle file and for the integrationTest task in build.gradle), jacoco always creates test.exec and integrationTest.exec, yet the resulting build/jacocoHtml/index.html doesn't show coverage/files for both Unit and Integration tests. To prove this, if I run "gradle clean build integrationTest jacocoTestReport sonarRunner", the job's workspace now contains a ".sonar" folder and a build/reports/sonar folder holding another file, "overall-xxx.exec", but its size is not close to the "sum" of the Unit test.exec and IT integrationTest.exec sizes, though it is a few bytes bigger than test.exec.
3) What configuration can I set to get overall coverage for both Unit and IT tests, i.e. so the overall...exec file ends up with a good size (after running the sonarRunner task)? During the sonarRunner task, I do see that its jacocoSensor step automatically picks up both the UT and IT .exec files as well as the overall .exec file (a nice feature of Sonar).

Found the answer to my 2nd question. High-level info:
Gradle 1.6's jacocoTestReport uses different variables than Gradle >= 1.7.
For example, we can control where the Unit test and Integration test .exec files are created by configuring the "test" or "integrationTest" task with the CORRECT variables for the Gradle version in use -- otherwise it won't work and will generate the default "test.exec" and "integrationTest.exec" file names. See the example below.
Either the task integrationTest(type: Test) or the test { ... } section can then use the correct variables for the given Gradle version.
task integrationTest (type: Test) {
testClassesDir = sourceSets.integrationTest.output.classesDir
classpath = sourceSets.integrationTest.runtimeClasspath
testReportDir = file("$buildDir/reports/tests/IT")
testResultsDir = file("$buildDir/test-results/IT")
ignoreFailures = true
jacoco {
//This works with 1.6
destPath = file("$buildDir/jacoco/IT/jacocoIT.exec")
classDumpPath = file("$buildDir/jacoco/IT/classpathdumps")
/*
Following works only with versions >= 1.7 version of Gradle
destinationFile = file("$buildDir/jacoco/IT/jacocoIT.exec")
classDumpFile = file("$buildDir/jacoco/IT/classpathdumps")
*/
}
}
Similarly, for test { .... } task, you can define it as ../../UT/jacocoUT.exec and ../../UT/classpathdumps...
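For reference, a sketch of the matching test { } block under that convention (pick the jacoco property names that match your Gradle version, exactly as in the integrationTest example above):
test {
    testReportDir = file("$buildDir/reports/tests/UT")
    testResultsDir = file("$buildDir/test-results/UT")
    jacoco {
        // Gradle 1.6 names; on Gradle >= 1.7 use destinationFile / classDumpFile instead
        destPath = file("$buildDir/jacoco/UT/jacocoUT.exec")
        classDumpPath = file("$buildDir/jacoco/UT/classpathdumps")
    }
}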
.sonar folder gets created if I run the "sonar-runner" Linux/Unix executable in my project's workspace, BUT if I run the Jenkins job which calls Gradle with "clean build integrationTest jacocoTestReport sonarRunner", then a build/sonar folder is created and becomes the WORKING DIR for SONAR (this shows up in the output).
In Jenkins, under the Post-build section, I configured the following, and NOW the JaCoCo code coverage report on the Jenkins dashboard -- for both jacoco file types (.html and .xml) -- includes Unit and Integration test coverage data.
The "Record Jacoco coverage report" section in Jenkins has the following boxes, which I filled in as:
Path to exec files: **/build/jacoco/UT/jacocoUT.exec, **/build/jacoco/IT/jacocoIT.exec
Path to class dirs: **/build/jacoco/**/classpathdumps/com/thc
(this is the location where the JaCoCo-instrumented classes sit); both build/jacoco/UT/classpathdumps/com/thc and build/jacoco/IT/classpathdumps/com/thc will be picked up, since ** matches any folder under build/jacoco. This value can be set to the "build/classes" folder as well.
Path to source files: **
(if I use src/java, a few of the source-file links don't work, i.e. the file contents don't show up; using ** works now).
The remaining boxes were left blank.
Still wondering why overall-jacoco.exec does not have a file size equal to the sum of jacocoUT.exec and jacocoIT.exec.
At this point, I'm able to see JaCoCo code coverage for both Unit and Integration tests, i.e. via the JaCoCo code coverage graph on the job's dashboard and its source-code links, and also by browsing the "build/reports/jacoco/html/index.html" or "build/jacocoHtml/index.html" file.
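To make that merge explicit on the Gradle side (rather than relying on defaults), one option -- a sketch on my part, assuming a Gradle >= 1.7 JacocoReport task and the UT/IT paths used above -- is to feed both .exec files to jacocoTestReport:
jacocoTestReport {
    // report over both the unit-test and integration-test execution data
    executionData = files("$buildDir/jacoco/UT/jacocoUT.exec", "$buildDir/jacoco/IT/jacocoIT.exec")
    sourceDirectories = files(sourceSets.main.allJava.srcDirs)
    classDirectories = files(sourceSets.main.output)
}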
Still trying to find out what needs to be done for SONAR to pick up these 2 .exec files (I have valid values set for the various sonar.xxx properties for sources, tests, binaries, ...reportsPath etc. for the UT/IT exec files). On the SonarQube dashboard, Unit test coverage is showing up fine but Integration test coverage is still 0.0%.
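One thing worth trying on the Sonar side -- this is an assumption on my part based on the legacy SonarQube JaCoCo property names of that era, not something taken from this build -- is to point Sonar at the two .exec files explicitly in the sonarRunner block:
sonarRunner {
    sonarProperties {
        // assumed legacy property names read by the SonarQube JaCoCo sensor
        property "sonar.jacoco.reportPath", "$buildDir/jacoco/UT/jacocoUT.exec"
        property "sonar.jacoco.itReportPath", "$buildDir/jacoco/IT/jacocoIT.exec"
    }
}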
I'll paste the copy of both my common.gradle and project's build.gradle file soon .... to have a better look.

OK, found the solution to the UT/IT folder issue and to my question (1) in this post. It's actually the "cleanTest" task, which comes from one of the default RULES in Gradle.
Running "gradle tasks -all" gives a big list of the tasks Gradle supports, and at the end this output describes such rules: if we call clean<TaskName>, it wipes that task's output folders. As you can see in my code above, integrationTest depended on cleanTest, so when I called "gradle clean build integrationTest" it ran the unit tests first (via the build task, since unit tests run by default as part of build in Gradle) and then the integration tests via the "integrationTest" task. While running the integration tests it therefore invoked cleanTest, which wiped out the "UT" folders for the reports/results directories that I had configured in the common gradle script (/init.d/common-some-name.gradle file), just as I had configured the IT folders.
Removing cleanTest from integrationTest's dependsOn resolved the wiping issue.
task integrationTest( type: Test, dependsOn: cleanTest ) {   // before: cleanTest wipes the unit-test output
task integrationTest( type: Test ) {                          // after: cleanTest dependency removed
Output of the following command (showing only the last few lines):
gradle tasks -all
integrationTest
classes - Assembles binary 'main'.
cleanTest
compileIntegrationTestJava - Compiles source set 'integrationTest:java'.
compileJava - Compiles source set 'main:java'.
integrationTestClasses - Assembles binary 'integrationTest'.
processIntegrationTestResources - Processes source set 'integrationTest:resources'.
processResources - Processes source set 'main:resources'.
jarService
sonarRunner [test]
Rules
-----
Pattern: build<ConfigurationName>: Assembles the artifacts of a configuration.
Pattern: upload<ConfigurationName>: Assembles and uploads the artifacts belonging to a configuration.
Pattern: clean<TaskName>: Cleans the output files of a task.
BUILD SUCCESSFUL

Related

Libgdx test from gradle command line assets not found

I have a LibGDX project with some tests. The structure is as follows:
core/src -> for my java sources code
core/test -> for my tests source code
core/assets -> for all my assets
When I run the tests from Eclipse, they all go green, but whenever I try to run them from the Gradle command line (./gradlew test) I get an error about the assets folder. I believe this is because the tests are not launched from the assets folder, as they are in Eclipse.
How can I solve this problem? Is there a way to tell Gradle to use core/assets as the working directory when running the tests?
Here is the error I get:
com.badlogic.gdx.utils.GdxRuntimeException: com.badlogic.gdx.utils.GdxRuntimeException: Couldn't load dependencies of asset: myasset.png
I found a way to achieve what I wanted. I'm posting my solution here for anyone who might need it. There is a property named workingDir for the test task in Gradle, so I just needed to set it to the right folder. Inside the build.gradle file of your project (the root folder), add the following section:
project(":core") {
apply plugin: "java"
// Add the following test section
test {
workingDir = new File("/assets")
}
// Rest of the file
}
That's it! My tests are running green from the command line now.
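A possible variant -- an untested sketch on my part, assuming the assets really live at core/assets under the root project -- is to resolve the folder relative to the root project instead of hard-coding an absolute path:
project(":core") {
    apply plugin: "java"
    test {
        // rootProject.file() resolves the path against the root project directory
        workingDir = rootProject.file("core/assets")
    }
}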

Gradle task not run when subproject task is run

I have a gradle c/cpp project where it has rootProject/project/subProject
In my root build.gradle I have
ext.libFolder = rootDir.toString() + '/lib/'
subprojects {
build.doLast {
copy {
from fileTree ( buildDir.getCanonicalPath() ).files
into rootProject.ext.libFolder
include '*.so'
include '*.a'
}
}
}
This works perfectly when I do gradle build, but if I do gradle :project:subProject:build it does not work... What do I need to change to make this work no matter how I call the build task?
On a side note, we are trying to make all subproject build.gradle files as bare-minimal, simple, and structurally similar as possible, as we have over 780 of them.
edit: I have added a clean task, as shown below, so that Gradle knows the files have been cleaned. This however does not seem to affect the core issue: the build task, when called any way other than gradle build, does not run the doLast block.
clean {
delete rootProject.ext.includesFolder
delete rootProject.ext.libFolder
}
Stick the build.doLast{} in an afterEvaluate{} block.
subprojects {
afterEvaluate{
build.doLast {
copy{
from fileTree ( buildDir.getCanonicalPath() ).files
into rootProject.ext.libFolder
include '*.so'
include '*.a'
}
}
}
}
Because you want to add the doLast functionality only after the subproject's build task is available.
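A related option, sketched here rather than taken from the answer above ("copyLibs" is a made-up task name): register a dedicated Copy task per subproject and wire it to build with finalizedBy, so the copy step gets Gradle's normal up-to-date checking instead of living inside a doLast block:
subprojects {
    afterEvaluate {
        task copyLibs(type: Copy) {
            // copy any built .so/.a files from the subproject's build dir into the shared lib folder
            from(buildDir) {
                include '**/*.so'
                include '**/*.a'
            }
            into rootProject.ext.libFolder
        }
        build.finalizedBy copyLibs
    }
}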

aggregating gradle multiproject test results using TestReport

I have a project structure that looks like the below. I want to use the TestReport functionality in Gradle to aggregate all the test results to a single directory.
Then I can access all the test results through a single index.html file for ALL subprojects.
How can I accomplish this?
.
|--ProjectA
|--src/test/...
|--build
|--reports
|--tests
|--index.html (testresults)
|--..
|--..
|--ProjectB
|--src/test/...
|--build
|--reports
|--tests
|--index.html (testresults)
|--..
|--..
From Example 4. Creating a unit test report for subprojects in the Gradle User Guide:
subprojects {
apply plugin: 'java'
// Disable the test report for the individual test task
test {
reports.html.enabled = false
}
}
task testReport(type: TestReport) {
destinationDir = file("$buildDir/reports/allTests")
// Include the results from the `test` task in all subprojects
reportOn subprojects*.test
}
Fully working sample is available from samples/testing/testReport in the full Gradle distribution.
In addition to the subprojects block and testReport task suggested by Peter Niederwieser above, I would add another line to the build below those:
tasks('test').finalizedBy(testReport)
That way, if you run gradle test (or even gradle build), the testReport task will run after the subprojects' tests complete. Note that you have to use tasks('test') rather than just test.finalizedBy(...) because the test task doesn't exist in the root project.
If using kotlin Gradle DSL
val testReport = tasks.register<TestReport>("testReport") {
destinationDir = file("$buildDir/reports/tests/test")
reportOn(subprojects.map { it.tasks.findByPath("test") })
}
subprojects {
tasks.withType<Test> {
useJUnitPlatform()
finalizedBy(testReport)
ignoreFailures = true
testLogging {
events("passed", "skipped", "failed")
}
}
}
And execute gradle testReport. Source: How to generate an aggregated test report for all Gradle subprojects
I am posting an updated answer on this topic. I am using Gradle 7.5.1.
TestReport task
In short, I'm using the following script to set up test aggregation from subprojects (based on Peter's answer):
subprojects {
apply plugin: 'java'
}
task testReport(type: TestReport) {
destinationDir = file("$buildDir/reports/allTests")
// Include the results from the `test` task in all subprojects
testResults.from = subprojects*.test
}
Note that the reportOn method is deprecated (or soon will be) and replaced with testResults, while testResults itself is still incubating as of 7.5.1.
I got the following warning in the IDE:
The TestReport.reportOn(Object...) method has been deprecated. This is scheduled to be removed in Gradle 8.0.
Hint: subprojects*.test is an example of the spread-dot operator in Groovy; it collects the test task from each subproject in the list. The equivalent would be subprojects.collect{it.test}.
TestReport#reportOn (Gradle API documentation)
TestReport#testResults (Gradle API documentation)
reportOn replacement for gradle 8 (Gradle Forum)
test-report-aggregation plugin
There is also an alternative option for aggregating tests (since Gradle 7.4): the test-report-aggregation plugin.
If your projects already apply the java plugin, they come with jvm-test-suite, so all you have to do is apply the plugin.
plugins {
id 'test-report-aggregation'
}
Then you will be able to produce the aggregated report through the testSuiteAggregateTestReport task. Personally I didn't use the plugin, but I think it makes sense to use it if you have multiple test suites configured with jvm-test-suite.
Example project can be found in https://github.com/gradle-samples/Aggregating-test-results-using-a-standalone-utility-project-Groovy
For 'connectedAndroidTest's there is an approach published by Google (https://developer.android.com/studio/test/command-line.html#RunTestsDevice, "Multi-module reports" section).
Add the 'android-reporting' plugin to your project's build.gradle.
apply plugin: 'android-reporting'
Execute the Android tests with the additional 'mergeAndroidReports' argument. It will merge all test results of the project's modules into one report.
./gradlew connectedAndroidTest mergeAndroidReports
FYI, I've solved this problem using the following subprojects config in my root project build.gradle file. This way no extra tasks are needed.
Note: this places each module's output in its own reports/<module_name> folder, so subproject builds don't overwrite each other's results.
subprojects {
// Combine all build results
java {
reporting.baseDir = "${rootProject.buildDir.path}/reports/${project.name}"
}
}
For a default Gradle project, this would result in a folder structure like
build/reports/module_a/tests/test/index.html
build/reports/module_b/tests/test/index.html
build/reports/module_c/tests/test/index.html

How can I not include a build task when I include a project in my settings.gradle file?

My settings.gradle file looks like:
include "serverTest", "shared"
And the serverTest build.gradle file looks like:
group = 'gradle'
version = '1.0'
defaultTasks 'build'
apply plugin: 'java'
apply plugin: 'eclipse'
sourceCompatibility = JavaVersion.VERSION_1_6
dependencies
{
compile project(':shared')
}
The directory structure is: the top level holds the settings.gradle file and contains the shared and serverTest folders. The serverTest directory then contains its build.gradle file.
When I run gradle at the top level it outputs:
:shared:compileJava UP-TO-DATE
:shared:processResources UP-TO-DATE
:shared:classes UP-TO-DATE
:shared:jar UP-TO-DATE
:serverTest:compileJava UP-TO-DATE
:serverTest:processResources UP-TO-DATE
:serverTest:classes UP-TO-DATE
:serverTest:jar UP-TO-DATE
:serverTest:assemble UP-TO-DATE
:serverTest:compileTestJava
:serverTest:processTestResources UP-TO-DATE
:serverTest:testClasses
:serverTest:test
I don't want it to execute the :serverTest:test task though. I tried changing my defaultTasks to just compileJava, but that didn't work. Does anyone else have any ideas?
Well, this question has been asked in different ways; the main theme of the question is:
How to exclude sub-tasks of the build task?
1. gradle build -x test
The above is helpful when executing the gradle build command from the CLI; it excludes the test task. But here we want to exclude the test task programmatically in the build.gradle file, which is what the next option does.
2. check.dependsOn -= test
Copy this small snippet into your build.gradle file.
Now when you execute gradle build from the CLI, your tests won't run at all.
Cheers !
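For context, a minimal sketch of where option 2 would live, using the serverTest project from the question:
// serverTest/build.gradle
apply plugin: 'java'

// drop the test task from the check lifecycle so a plain `gradle build` skips it
check.dependsOn -= test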
You could try to disable the task only if a build task is present... Something like:
project(":serverTest").test.onlyIf { !gradle.taskGraph.hasTask(":shared:build") }

Basic testing functionality in SBT

How do I create a simple unit test for my application using SBT's test feature?
I'm hoping the answer is that I can write a single file in src/test/scala for my project that imports some special testing package from SBT which makes writing tests as easy as writing a single method.
The tutorial ExampleSbtTest seems to be doing something more complicated than what I need, and I can't find anything simpler on the SBT GoogleCode page.
Testing with SBT
No matter which version of SBT you want to use, basically you have to do the following steps:
Include your desired testing framework as test-dependency in your project configuration.
Create a dedicated testing folder within your source tree, usually src/test/scala, if it isn't present already.
As always: Write your tests, specs ...
Those basic steps are identical for the sbt 0.7 branch (the one from Google Code) and the current sbt 0.10 branch (now developed and documented on GitHub). However, there are minor differences in how to define the testing dependencies, since 0.10 provides a new quick-configuration method not present in 0.7.
Defining the dependency for SBT 0.7
Here is how you create a basic test (based on scalacheck) with sbt 0.7. Create a new sbt 0.7 project by calling sbt in an empty folder. Change into the automatically created project folder and create a new build folder
# cd [your-project-root]/project
# mkdir build
Change into the newly created build folder and create your first project build file, Project.scala, with the following content:
class Project(info: ProjectInfo) extends DefaultProject(info) {
val scalacheck = "org.scala-tools.testing" %% "scalacheck" % "1.9" % "test"
}
Since for 0.7 the testing folder is created automatically, you can start writing your first test right away -- skip to the paragraph "Create a simple scalacheck test".
Defining the dependency for SBT 0.10
For 0.10 one can use the sbt console to add the dependency. Just start sbt in your project directory and enter the following commands:
set libraryDependencies += "org.scala-tools.testing" %% "scalacheck" % "1.9" % "test"
session save
You can then close the sbt console and have a look at your project's build.sbt file. As you can easily spot, the above libraryDependencies line was added to your project's quick configuration.
Since 0.10 doesn't create the source folders automatically, you have to create the testing folder on your own:
# cd [project-root]
# mkdir -p src/test/scala
That's all it takes to get started with 0.10. Moreover, the documentation about testing with 0.10 is far more detailed than the old one. See the testing wiki page for further details.
Create a simple scalacheck test
Create the following test file src/test/scala/StringSpecification.scala (taken from the scalacheck homepage):
import org.scalacheck._
object StringSpecification extends Properties("String") {
property("startsWith") = Prop.forAll((a: String, b: String) => (a+b).startsWith(a))
property("endsWith") = Prop.forAll((a: String, b: String) => (a+b).endsWith(b))
// Is this really always true?
property("concat") = Prop.forAll((a: String, b: String) =>
(a+b).length > a.length && (a+b).length > b.length
)
property("substring") = Prop.forAll((a: String, b: String) =>
(a+b).substring(a.length) == b
)
property("substring") = Prop.forAll((a: String, b: String, c: String) =>
(a+b+c).substring(a.length, a.length+b.length) == b
)
}
As already indicated, this basic check will fail for the "concat" specification, but those are the basic testing steps needed to get started with testing and sbt. Just adapt the included dependency if you want to use another testing framework.
Run your tests
To run your tests, open the sbt console and type
> test
That will run all tests present in your src/test tree, no matter whether they are Java- or Scala-based tests. So you can easily reuse your existing Java unit tests and convert them step by step to Scala.