When you run ExUnit tests, they usually run in the "test" environment (Mix.env() == :test). Is there a way to have them run in a different environment, like "unittest"? I'd like to use "test" for configuring our QA testing environment.
Have you tried the MIX_ENV variable?
$ MIX_ENV=yourenv mix test
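If you want plain mix test to default to a custom environment, a minimal mix.exs sketch (the app name and the :unittest env are illustrative assumptions) using preferred_cli_env could look like this:

# mix.exs - minimal sketch; :my_app and :unittest are assumptions
def project do
  [
    app: :my_app,
    version: "0.1.0",
    # make plain mix test run in :unittest instead of :test,
    # leaving :test free for the QA configuration
    preferred_cli_env: [test: :unittest],
    deps: deps()
  ]
end

A one-off MIX_ENV=unittest mix test should work too; Mix normally only complains about running tests outside :test when MIX_ENV isn't set explicitly.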
Scenario
I'm working on an app that has fast unit/functional jest tests along with slower end-to-end jest-puppeteer tests. I would like to split those up so that I can run the faster tests as part of a git pre-commit hook and leave the end-to-end tests to be run on CI after the code is eventually pushed to origin.
Question
How can I define specific tests to run at pre-commit? Specifically via a regex similar to Jest's moduleNameMapper, e.g. <rootDir>/__tests__/[a-z]+\.unit\.test\.js
Best idea so far:
in package.json, add a test:pre script that uses bash's find . -regex together with a for loop to run the desired "pre-commit" tests
I've added
"test:pre": "PRE=1 npm test -- test/pre-*test.js"
# everything after -- is treated as a regex pattern matched against test file paths
to my package.json scripts and in my jest-puppeteer global-setup.js I'm using
if(+process.env.PRE) return;
before all the puppeteer extras are started. So now I can
$ npm run test:pre
and voilà
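An alternative sketch that avoids the PRE flag: keep a separate Jest config that selects only the unit tests via testRegex (the file name and pattern below are illustrative), and point the pre-commit script at it:

// jest.unit.config.js - hypothetical config that only matches *.unit.test.js files
module.exports = {
  testRegex: '/__tests__/[a-z]+\\.unit\\.test\\.js$',
};

and in package.json:

"test:pre": "jest --config jest.unit.config.js"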
Say I have the following in application.properties:
quarkus.datasource.url=${db_url:jdbc:postgresql://localhost:5432/ekycapi}
%dockerrun.quarkus.datasource.url=${db_url:jdbc:postgresql://postgres:5432/anotherdb}
When running in dev mode, I run as "mvn quarkus:dev -Dquarkus.profile=dockerrun"
But what if I want to use the same profile while running the tests? What is the correct syntax for it? Something like "mvn test -Dquarkus.profile=dockerrun"?
Use quarkus.test.profile to set the desired test profile. See more config options on the all-config page
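For example (assuming the property is picked up both from application.properties and as a -D system property), something like this should run the tests with the dockerrun profile:

# application.properties
quarkus.test.profile=dockerrun

or on the command line:

mvn test -Dquarkus.test.profile=dockerrun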
I want to add unit testing for my Ansible playbook. I am new to this and have tried a few things but didn't understand much. How can I start on this and write a test case properly?
Following is a simple example:
yum:
  name: httpd
  state: present
Ansible is not a programming language but a tool that checks that the state you describe is aligned with the actual state of the node you run it against. So you cannot unit test your tasks; in a certain way, they are already tests by themselves. The underlying ansible binary that runs those tasks has unit tests itself, used during its development.
Your example above asks Ansible to check whether httpd is present on the target machine; it will return ok if that is the case, changed if it had to install the package to fulfill the requirement, or an error if something went wrong.
Meanwhile, the fact that you cannot unit test your Ansible code does not mean that no tests are possible at all. You can perform basic static checks with yamllint and ansible-lint. To go further, you will have to run your playbook/role/collection against a test target node.
This has become quite easy with CI, which lets you spawn virtual machines or Docker containers from scratch and run your script to check that no error is fired, that the --check option passes successfully, that idempotency is respected (i.e. nothing should change on a second run with the same parameters), and that everything works as expected (e.g. in your case above, port 80 is open and you get the default Apache web page).
You can write those kinds of tests yourself (running against localhost in a test VM, for example); a rough sketch follows below. This Mac App Store CLI role by geerlingguy uses such tests through travis-ci as an example.
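A rough sketch of what such a CI job could run (the playbook name and inventory are illustrative):

yamllint .
ansible-lint playbook.yml
ansible-playbook -i inventory playbook.yml --syntax-check
ansible-playbook -i inventory playbook.yml --check
ansible-playbook -i inventory playbook.yml            # first run: applies changes
ansible-playbook -i inventory playbook.yml            # second run: should report changed=0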
You can also use existing tools that help you write those tests in a more structured way, like molecule (a minimal configuration sketch follows the list below). Here are some example roles using it if you are interested:
Redis role by geerlingguy
nexus3-oss role by ThoTeam [1]
[1] Note for transparency: I am the maintainer of this example repository
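For reference, a minimal molecule scenario sketch (assuming the Docker driver; the platform name and image are illustrative):

# molecule/default/molecule.yml
dependency:
  name: galaxy
driver:
  name: docker
platforms:
  - name: instance
    image: centos:7
provisioner:
  name: ansible
verifier:
  name: ansible

Running molecule test then creates the instance, applies the role, checks idempotence, runs the verifier, and destroys the instance.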
I'm using jest to write tests in my ReactJS application.
So far, to run my test suite, I need to type 'npm test'.
Here's the snippet from package.json:
"scripts": {
"test": "./node_modules/.bin/jest",
(other stuff)
},
"jest": {
"unmockedModulePathPatterns": ["<rootDir>/node_modules/react"],
"scriptPreprocessor": "<rootDir>/node_modules/babel-jest",
"testFileExtensions": [
"es6",
"js"
],
"moduleFileExtensions": [
"js",
"json",
"es6"
]
},
Is it possible to run those tests directly within my IDE (IDEA/WebStorm), preserving the configuration? I'm not a JS guy, but WebStorm works perfectly fine with Karma, for example. Shouldn't this also be possible with jest-cli?
To show Jest test results in a tree view (like Karma, etc.), special integration is needed. WebStorm doesn't support Jest yet. Please vote for WEB-14979 to be notified of any progress.
EDIT: as of March 2017, the first version of Jest integration in WebStorm has been released.
In WebStorm 9+ you can set this up as follows:
Install Jest CLI: npm install --save-dev jest-cli
Create a Node run configuration with the JavaScript file set to node_modules/.bin/jest and the application parameters set to --runInBand. runInBand tells Jest to run in a single process; otherwise there's a port conflict when running multiple Node processes in debug mode.
Create some tests and run the configuration in Debug mode (Ctrl-D/Cmd-D). If you set breakpoints in your test or app code, they should be hit.
It would be great though if you could click on file:line numbers in the output to go directly to the code.
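If the IDE run configuration gives you trouble, a rough command-line fallback (the script name is illustrative) is to start Jest under the Node inspector yourself and attach the IDE or Chrome DevTools to it:

"scripts": {
  "test:debug": "node --inspect-brk ./node_modules/.bin/jest --runInBand"
}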
app_sciences's answer is awesome, but does not work on Windows.
On Windows, you can use the following configuration:
The provided configuration is taken from here.
For IDEA I'm using https://confluence.jetbrains.com/display/IDEADEV/Run+Configurations for that purpose. For WebStorm it seems you can add the configuration yourself: https://www.jetbrains.com/webstorm/help/creating-and-editing-run-debug-configurations.html . The configuration you are talking about is at the software level. If you configure your IDE to run it, it will definitely run with the given environment variables and paths; you just need to add the needed global paths and the commands to run.
I'm trying to setup code coverage for a Java application project.
Project name : NewApp
Project structure:
src/java/** (source code)
src/java-test (unit tests - JUnit)
test/it-test (integration test)
test/at-tests (acceptance tests)
tomcat/* (contain tomcat start/stop scripts)
xx/.. etc folders which are required for a usual application.
Gradle version : 1.6
Environment : Linux
I have a working Gradle build script that fetches the application (NewApp) dependencies (i.e. service jars used by the app during the build process) from a build artifact repository (Artifactory/Maven, for example) and builds the app.
Now at this point, I wanted to get code coverage using JaCoCo plugin for my NewApp application project.
I followed the Gradle/JaCoCo documentation, but it doesn't seem to create any reports/... folder for JaCoCo where I can find what the coverage report produced.
My questions:
1. For getting code coverage using unit tests (JUnit), I assume all I have to do is the following, and it will NOT require me to start/stop Tomcat before running the unit tests (test task, i.e. "gradle test") to get coverage from the unit tests. Please advise/correct. The code (just the Gradle JaCoCo unit test part) I'm using is:
apply plugin: 'jacoco'
test {
    include 'src/java-test/**'
}
jacocoTestReport {
    group = "reporting"
    description = "Generate Jacoco coverage reports after running tests."
    reports {
        xml.enabled true
        html.enabled true
        csv.enabled false
    }
    //classDirectories = fileTree(dir: 'build/classes/main', include: 'com/thc/**')
    //sourceDirectories = fileTree(dir: 'scr/java', include: 'com/thc/**')
    additionalSourceDirs = files(sourceSets.main.allJava.srcDirs)
}
and for Integration tests:
task integrationTest(type: Test) {
    include 'test/java/**'
}
As jacocoTestReport depends on the test task(s), those will be called first and then jacocoTestReport will finally report what it found for the code coverage.
2. For getting code coverage for the integration tests, I assume I must start Tomcat first (i.e. before running/calling the test target for integration tests), then call the "gradle integrationTest" or "gradle test" task, and then stop Tomcat -- to get the code coverage report. From other blog posts I also found that one should set a JAVA_OPTS variable to attach the JaCoCo agent before Tomcat starts.
For example, setting the JAVA_OPTS variable like:
export JACOCO="-Xms256m -Xmx512m -XX:MaxPermSize=1024m -javaagent:/production/jenkinsAKS/jobs/NewApp/workspace/jacoco-0.6.3.201306030806/lib/jacocoagent.jar=destfile=/production/jenkinsAKS/jobs/NewApp/workspace/jacoco/jacoco.exec,append=true,includes=*"
export JAVA_OPTS="$JAVA_OPTS $JACOCO"
Being new to Gradle/Groovy, I'm not sure what code I should write within build.gradle (the build script) to get the above integration/unit tests working if it involves starting/stopping Tomcat. If someone can provide a sample script to do that, I'll try it.
I'm not getting any code coverage right now when I publish the JaCoCo code coverage in Jenkins (using the Jenkins post-build action for publishing JaCoCo reports). The Jenkins build dashboard shows 0% for code coverage (i.e. bars showing all red, no green for actual coverage).
Need your advice to get some traction on this.
Question 1: I assume that your unit tests don't depend on Tomcat. In this case you're right, you must not start Tomcat upfront.
To create the coverage report you need to execute
gradle jacocoTestReport
Without jacocoTestReport, Gradle won't trigger JaCoCo to generate the reports.
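If you prefer a single command, an optional one-liner in build.gradle (just a convenience, not required) ties the two tasks together so that gradle jacocoTestReport runs the unit tests first:

// optional convenience: jacocoTestReport will run the unit tests first
jacocoTestReport.dependsOn test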
One additional thing regarding your snippet: I assume that you have changed the default main source set to src/java. In this case you don't have to set additionalSourceDirs.
Integration tests: Yes, you need to start Tomcat first, or at least you have to ensure that Tomcat is running. You should have a look at Gradle 1.7: it has a new task ordering rule called finalizedBy.
With this you could do something like
task integrationtests(type: Test) {
    dependsOn startTomcat
    finalizedBy stopTomcat
}
where startTomcat/stopTomcat are custom tasks. If you have to stay on Gradle 1.6 you have to build a dependsOn chain:
stopTomcat -dependsOn-> integrationtests -dependsOn-> startTomcat
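Expressed as a small build.gradle sketch (task names as above; the wiring is illustrative):

// Gradle 1.6 fallback: ordering expressed with dependsOn only
integrationtests.dependsOn startTomcat
stopTomcat.dependsOn integrationtests
// running gradle stopTomcat then executes startTomcat -> integrationtests -> stopTomcat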
I assume that the blog article is right; I don't have any experience with that.
Starting/Stopping Tomcat: You could do it in a way like this
task startTomcat() << {
    def tomcatStartScript = "${project.rootDir}/tomcat/startScript"
    tomcatStartScript.execute()
}
The stop script can be written in a similar way. (Some info from the Groovy docs: Executing.)
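For completeness, a matching stop task sketch (the script path is an assumption, mirroring the start task above):

task stopTomcat() << {
    // assumed path to a Tomcat shutdown script inside the project
    def tomcatStopScript = "${project.rootDir}/tomcat/stopScript"
    tomcatStopScript.execute()
}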
Jenkins & Jacoco : Should be fixed when executing jacocoTestReport
Got it working.
Gradle 1.7
- Download the .zip which contains the binaries/sources and documentation.
- Go to the sample folder: if you unzipped the above .zip at C:\gradle-1.7, it is at
C:\gradle-1.7\samples\testing\jacoco\quickstart
Run:
gradle build jacocoTestReport
You’ll see a new folder “build” after the build.
- The jacoco folder gets created with class dumps and a .exec file if only the build task is called.
- The jacoco and jacocoHtml folders get created if both build and jacocoTestReport are called.
have fun.
I also saw that it's better to include the following section in build.gradle:
tasks.withType(Compile) {
    options.debug = true
    options.compilerArgs = ["-g"]
}