IntelliJ displays error message when unit test is written in Kotlin

Context
I have a project with the following traits:
IntelliJ Ultimate 2020.1
Java 13 (with module-info.java)
Kotlin 1.3.72
JUnit (+ Truth)
Maven (I believe this to be unimportant)
The code base is mixed: some classes are written in plain Java, others in Kotlin, and the same is true for the tests. Everything works as expected, that is:
all code is compiled in the proper order and is fully interoperable between Kotlin and Java
all tests can be executed using either mvn test or IntelliJ "Run Test"
the resulting jar can be run (for the sake of providing context)
but...
apart from the fact that everything works, IntelliJ warns me about a non-declared module dependency, but only when the test class is written in Kotlin. This warning is not displayed for test classes written in plain Java.
The warning:
Error:(9, 6) Symbol is declared in module 'org.junit.jupiter.api' which current module does not depend on
That warning normally allows one to import / require the respective module / dependency, but no fix is offered in the [Alt]+[Enter] dialog.
Things I have tried so far:
upgrading from JUnit 4 to 5 didn't change the situation
googling to no avail :(
making sure tests written in Kotlin are really executed when mvn test is run by making a test fail
manually running test using IntelliJ "Run Test"
converting tests back and forth from / to Kotlin
explicitly requiring the JUnit API / Truth in module-info.java
The latter obviously prevents the warning but is no solution, since it produces a hard dependency. From what I found out while googling, the maven-surefire-plugin makes sure the test dependencies are included. Also: running mvn test works like a charm as stated above, so this does not seem to be part of the problem.
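(For illustration, the module-info.java change that silences the warning looks roughly like this; the module name com.example.app is a placeholder, while org.junit.jupiter.api is taken from the warning text.)

module com.example.app {
    // silences the IntelliJ warning, but turns the test API into a hard
    // dependency of the production module, which is exactly the problem
    requires org.junit.jupiter.api;
}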
Seeing all the red lines when writing tests is really annoying...
[Screenshot: the suspect behavior in a Kotlin test]
[Screenshot: the same test in Java - everything is fine]
Question:
How can I fix that warning for Kotlin test classes in IntelliJ?
Note
I have come to believe this is a bug in IntelliJ, but I'd be happy to be shown what I overlooked.
Since everything from compiling to running with Maven works like a charm, I excluded details regarding the project structure and so on. The issue at hand is the warning in IntelliJ, not a broken build or non-functional jars. I'll gladly add those details in case they turn out to be necessary.
Also, since everything actually works (apart from the annoying warning), I really don't know where to continue researching, and hence created this question.

This is a bug in the IDEA Kotlin plugin's error highlighting: https://youtrack.jetbrains.com/issue/KT-26037
Workaround: add @file:Suppress("JAVA_MODULE_DOES_NOT_DEPEND_ON_MODULE") to the test file before the package declaration.
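For example, the top of a Kotlin test file using that workaround would look roughly like this (package, class, and test names are placeholders):

@file:Suppress("JAVA_MODULE_DOES_NOT_DEPEND_ON_MODULE")

package com.example.tests

import org.junit.jupiter.api.Assertions.assertTrue
import org.junit.jupiter.api.Test

class SomeKotlinTest {
    @Test
    fun somethingWorks() {
        // the file-level annotation above suppresses the false-positive
        // module-dependency error for this whole file
        assertTrue(true)
    }
}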

Related

All but one test class not running Visual Studio 2019

I'm attempting to write unit and integration tests for my application.
I'm using Microsoft.VisualStudio.TestTools.UnitTesting;
The generated class that I refactored works just fine and all the tests run.
I have added additional classes to the project and none of the tests in those files are detected by the test explorer.
In the test explorer, DatabaseConnectionTests is running just fine, but the CourseInformation tests aren't.
I have checked to make sure the test methods have [TestMethod] above them, return void, and take no parameters, just like the methods in DatabaseConnectionTests that do run.
This is quite embarrassing, but after looking at the image, I realised my classes were not public.
Going to leave this up in case anyone else overlooks this simple mistake.
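For reference, a minimal sketch of the fix (class and method names are placeholders); MSTest only discovers tests declared in public classes:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CourseInformationTests // 'public' was the missing piece
{
    [TestMethod]
    public void LoadsCourseInformation() // returns void, takes no parameters
    {
        Assert.IsTrue(true);
    }
}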

Grails test-app classpath

I'm trying to use test support classes within my tests. I want these classes to be available for all different test types.
My directory structure is as follows;
/test/functional
/test/integration
/test/unit
/test/support
I have test helper classes within the /test/support folder that I would like to be available to each of the different test types.
I'm using GGTS and I've added the support folder to the classpath, but whenever I run my integration tests via test-app I get a compiler error: unable to resolve class mypackage.support.MyClass
When I run my unit tests from within GGTS the support classes are found and used. I presume this is because the integration tests run my app in its own JVM.
Is there any way of telling grails to include my support package when running any of my tests?
I don't want my test support classes to be in my application source folders.
The reason that it works for your unit tests inside the IDE is that all source folders get compiled into one directory, and that is added to your classpath along with the jars GGTS picks up from the project dependencies. This is convenient but misleading, because it doesn't take into account that Grails uses different classpaths for run-app and each of the test phases, which you see when you run the integration tests. GGTS doesn't really run the tests; it runs the same grails test-app process that you do from the commandline, and captures its output and listens for build events so it can update its JUnit view.
It's possible to add extra jar files to the classpath for tests because you can hook into an Ant event and add it to the classpath before the tests start. But the compilation process is a lot more involved and it looks like it would be rather ugly/hackish to get it working, and would likely be brittle and stop working in the future when the Grails implementation changes.
Here are some specifics about why it'd be non-trivial. I was hoping that you could call GrailsProjectTestCompiler.compileTests() for your extra directory, but you need to compile it along with the test/unit directory for unit tests and the test/integration directory for integration tests, and the compiler (GrailsProjectTestCompiler) presumes that each test phase only needs to compile that one directory. That compiler uses Gant, and each test phase has its own Grailsc subclass (org.grails.test.compiler.GrailsTestCompiler and org.grails.test.compiler.GrailsIntegrationTestCompiler) registered as taskdefs. So it should be possible to subclass them and add logic to compile both the standard directory and the shared directory, and register those as replacements, but that also requires subclassing and reworking GrailsProjectTestRunner (which instantiates GrailsProjectTestCompiler), and hooking into an event to replace the projectTestRunner field in _GrailsTest.groovy with your custom one, and at this point my brain hurts and I don't want to think about this anymore :)
So instead of all this, I'd put the code in src/groovy and src/java, but in test-specific packages that make it easy to exclude the compiled classes from your WAR files. You can do that with a grails.war.resources closure in BuildConfig.groovy, e.g.
grails.war.resources = { stagingDir ->
    println '\nDeleting test classes\n'
    delete(verbose: true) {
        // adjust as needed to only delete test-specific classes
        fileset dir: stagingDir, includes: '**/test/**/*.class'
    }
    println '\nFinished deleting test classes\n'
}

Haskell program coverage not highlighting a module that has no tests at all

I've got the start of a Haskell application and I want to see how the build tools behave. One of the things I would like to see is the Haskell coverage reports, through hpc (Haskell Program Coverage; as a side note, I didn't find this tag on SO, where hpc points to high-performance computing).
The structure of my application is
Main
src/
    ModuleA
    ModuleB
tests/
    ModuleBTest
I have unit tests for moduleB, and I run those unit tests through cabal test. Beforehand, I configure cabal to spit out the hpc data through
cabal configure --ghc-options=-fhpc --enable-tests
I then build and test,
cabal build
cabal test unit-tests (that's the name of the test suite in the cabal file)
and I indeed see a report, and all seems well. However, moduleA is not referenced from within moduleB; it's only referenced from Main. I don't have tests (yet) for the Main module.
The thing is, I expected moduleA to pop up in the hpc output, highlighted completely in yellow and really waving at me that there are no tests for this module, but that doesn't seem to be the case. I noticed that the .mix files are created for this 'unused' module, so I suspect the build step went ok but something goes wrong in the cabal test step.
If I go through ghci and compile the unit tests while explicitly putting moduleA on the list of modules to compile, then I do get hpc to show me that this module has no tests at all. So I suspect cabal optimizes moduleA away (as it's 'unused') somewhere, but I don't really see how or where.
Now, I do realize that this might not be a real-life situation, as moduleA is only referenced from within the main method, moduleB doesn't reference moduleA, and I don't test the Main module (yet), but still I would have felt a lot better if it had at least shown up in the program coverage as a hole in my tests the size of a battleship. Does anybody have an idea?
Note: I realize that my question might boil down to "How do I tell cabal not to optimize unused modules away?", but I wanted to present the complete problem.
Kasper
First, make sure all of your modules are listed in the other-modules cabal field.
Even though in my experience applications sometimes seem to work without specifying everything there, it can often cause mysterious linking issues, and I assume it could cause situations like yours.
Now, other than that, I don't think cabal would optimize your modules away; that would be GHC's dead code elimination. So if your code is not used at all, GHC won't even care for it; a single actual usage per module is enough to keep it.
Unfortunately I haven't seen a flag to change that. You may want to add a meaningless usage for every module to your test project, just to get things visible.
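A sketch of the first suggestion in the .cabal file; the stanza layout is an assumption based on the project structure described above:

test-suite unit-tests
  type:           exitcode-stdio-1.0
  hs-source-dirs: tests, src
  main-is:        ModuleBTest.hs
  -- listing every module keeps hpc (and the linker) aware of all of them
  other-modules:  ModuleA
                , ModuleB
  build-depends:  base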
2.1 Dead code elimination
Does GHC remove code that you're not actually using?
Yes and no. If there is something in a module that isn't exported and
isn't used by anything that is exported, it gets ignored. (This makes
your compiled program smaller.) So at the module level, yes, GHC does
dead code elimination.
On the other hand, if you import a module and use just 1 function from
it, all of the code for all of the functions in that module get linked
in. So in this sense, no, GHC doesn't do dead code elimination.
(There is a switch to make GHC spit out a separate object file for
each individual function in a module. If you use this, only the
functions that are actually used will get linked into your executable.
But this tends to freak out the linker program...)
If you want to be warned about unused code (Why do you have it there
if it's unused? Did you forget to type something?) you can use the
-fwarn-unused-binds option (or just -Wall).
- GHC optimisations - HaskellWiki
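As a small illustration of that flag (module and binding names are made up):

-- Unused.hs: compile with ghc -fwarn-unused-binds Unused.hs (or -Wall)
module Unused (used) where

used :: Int
used = 42

-- not exported and never used, so GHC reports
-- "Defined but not used: 'unused'"
unused :: Int
unused = 0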

Tests won't run: "The executable for the test bundle at '../DerivedData/../*.xctest' could not be found"

I have an Xcode 5.1 project (projA) which included a private framework (frameworkB) via CocoaPods. Everything was working, building and testing, and even the weather was nice. But, because frameworkB is being developed in parallel with projA, I decided to include the project of frameworkB (projB) in projA, again via CocoaPods but as a reference with :path => 'path/to/projB'
As a result, projB compiles, builds and runs on the device; the test target also compiles and builds but doesn't run. The simulator starts and this message is displayed:
2014-04-14 11:08:34.990 xctest[98973:303] The executable for the test bundle at
/Users/myNameHere/Library/Developer/Xcode/DerivedData/projB-manyLettersHere/Build
/Products/Debug-iphonesimulator/projB.xctest could not be found.
Program ended with exit code: 1
Also the weather is not so nice anymore.
Google didn't help. Other Stack Overflow questions are more about the transition from SenTest to XCTest.
Any hint that will put me on the right path will be greatly appreciated.
The cause and solution to this problem is (as usual) very simple:
The framework being built contained a bundle which was required but (of course) was not generated by the Pod project, even though I specified the spec.resources param in the podspec. The solution was to create a spec.resource_bundles entry with the required name and resource files.
Why it throws this error rather than a compile-time or runtime error, I still don't know :|
(the projB.xctest was present at the path in error)
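A sketch of the relevant podspec change; the bundle name and file patterns are placeholders:

Pod::Spec.new do |spec|
  # ...
  # replaces the plain spec.resources entry; the named bundle is what
  # the test bundle was missing
  spec.resource_bundles = {
    'FrameworkBResources' => ['Resources/**/*.{png,xib,storyboard}']
  }
end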

TFS UnitTesting not deploying local copy assembly to test dir when on build server

I have an assembly that needs to be in the test output dir for my tests to run.
The assembly is referenced in the project with Copy Local, but on the build server this gets ignored.
The two ways I have found to fix this are:
Add a special attribute to the test method that makes sure the file is there for each test:
[DeploymentItem("my assembly")]
This is not very practical, as this assembly is required for almost every test in the assembly.
Add a test run config file with a special deployment section. I am using a TestContainer in my build scripts to run the tests; I think this may be the reason my included test run config does not get picked up and the assembly is not copied. I would prefer not to have a .vsmdi test list file, as I am trying to run all tests and I feel this would be a violation of DRY.
Any suggestions on how I can get my tests running?
As my assembly was being dynamically loaded, the unit test framework was not copying it.
I added an explicit reference to it by calling typeof on one of the types in the assembly, and all is fine.
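A sketch of that kind of forced reference (the type name is a placeholder for any type in the dynamically loaded assembly):

using System;

internal static class AssemblyAnchor
{
    // typeof creates a compile-time reference, so the build copies the
    // otherwise only dynamically loaded assembly to the test output dir
    public static readonly Type Anchor = typeof(SomeTypeFromThatAssembly);
}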
Thanks Jerome Laban for your help with this one.