How to run all test files from separate folders - jest - unit-testing

I am able to test each file using Jest, but I cannot figure out how to test multiple test files from separate folders.
I have a folder named test and it has two subfolders: e2e and unit.
I have these scripts to run these tests individually:
"test": "jest",
"test:e2e": "jest --config ./test/e2e/jest-e2e.json",
"test:unit": "jest --config ./test/unit/jest-unit.json"
When I run npm test it only runs the spec file inside the src directory.
The other two scripts run the tests inside e2e and unit respectively.
Is there a way to run all of these tests with one script only?

You can change the folder name to __tests__, or add a jest.config.js at the repo root with a testMatch entry like this:
testMatch: [
  '**/test/**/*.js'
]
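If you want to keep the two separate config files from the question, another option is Jest's projects option, which runs several configurations in one invocation. A minimal sketch, assuming the config paths from your scripts (note: projects entries are typically project directories or globs; recent Jest versions also accept paths to config files, so verify against your Jest version):

```javascript
// jest.config.js at the repo root
// Each entry points at one of the existing per-suite configs,
// so a plain `jest` (or `npm test`) runs both suites together.
module.exports = {
  projects: [
    '<rootDir>/test/e2e/jest-e2e.json',
    '<rootDir>/test/unit/jest-unit.json',
  ],
};
```

With this in place, "test": "jest" alone covers both folders and the per-suite scripts remain available for running each suite on its own.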

Related

Disable generation of cypress folders when running tests

I'm trying to do a bunch of unit tests with Cypress. Here's the npm script that runs them:
cypress run --project tests/unit/ --headless
When I run them, it generates the typical plugin/support/videos folders, but I don't need them. Is there any flag that disables the generation of these 3 folders when running the tests?
Thanks!
Just add these generated reports to a .gitignore file in the project's root like so:
# Cypress generated files #
######################
cypress.env.json
cypress.meta.json
cypress/logs/
cypress/videos/*
cypress/screenshots/*
cypress/integration/_generated/*
cypress/data/migration/generated/*.csv
cypress/fixtures/example.json
cypress/build/*
Now, these files will never be version-controlled.
You can also disable video recording with proper configuration in your cypress.json file like so: "video": false.
You can also do it with CLI by overriding your cypress.json.
Currently, there's no way to disable the generation of those files. However, you could remove them when launching Cypress with an npm script like so:
"clean:launch:test": "rm -rf cypress/videos && rm -rf cypress/screenshots && cypress run --project tests/unit/ --headless"
Then you can run it like so: npm run clean:launch:test. It should remove those folders & launch Cypress's unit tests.
I suggest just adding them to .gitignore or configuring Cypress to trash the assets before each run. You can read about it here.
In your cypress.json file:
"trashAssetsBeforeRuns": true
To disable the creation of the videos and screenshots folders, you can use the following command:
cypress run --config video=false,screenshotOnRunFailure=false
As for the plugin/support folders: I don't think they are generated by current Cypress versions, so you can just remove them and add them to .gitignore.
Video recording can be turned off entirely by setting video to false from within your configuration.
"video": false
https://docs.cypress.io/guides/guides/screenshots-and-videos#Videos
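Putting the suggestions above together, a minimal cypress.json sketch (key names as documented for the cypress.json-era configuration file; verify against your Cypress version):

```json
{
  "video": false,
  "screenshotOnRunFailure": false,
  "trashAssetsBeforeRuns": true
}
```

With this config, no videos are recorded, no screenshots are saved on failure, and any assets that do exist are trashed before each run, so the generated folders stay empty or disappear entirely.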

Testing - How to split up pre-commit unit tests and CI end-to-end tests

Scenario
I'm working on an app that has fast unit/functional jest tests along with slower end-to-end jest-puppeteer tests. I would like to split those up so that I can run the faster tests as part of a git pre-commit hook and leave the end-to-end tests to be run on CI after the code is eventually pushed to origin.
Question
How can I define specific tests to run at pre-commit? Specifically via regex similar to jest moduleNameMapper eg <rootDir>/__tests__/[a-z]+\.unit\.test\.js
Best idea so far:
in package.json, add a test:pre script that uses bash find . -regex with a for loop to run the desired "pre-commit" tests
I've added
"test:pre": "PRE=1 npm test -- test/pre-*test.js"
# everything after -- is a kind of regex filter for matching file names
to my package.json scripts and in my jest-puppeteer global-setup.js I'm using
if(+process.env.PRE) return;
before all the puppeteer extras are started. So now I can
$ npm run test:pre
and voilà
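An alternative sketch that skips the env-var flag entirely: Jest's built-in --testPathPattern flag takes a regex, so the pre-commit script can select unit tests by filename, assuming a *.unit.test.js naming scheme like the one in the question (husky 4-style hook config shown as one common way to wire the git hook; adjust to your hook manager):

```json
{
  "scripts": {
    "test:pre": "jest --testPathPattern '\\.unit\\.test\\.js$'"
  },
  "husky": {
    "hooks": {
      "pre-commit": "npm run test:pre"
    }
  }
}
```

The CI job then runs the full npm test (or the e2e script alone), so the slow jest-puppeteer suite never blocks a local commit.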

Jenkins JUnit plugin gives error "ERROR: No test report files were found. Configuration error?"

I have Jenkins master container running in a Kubernetes cluster. I have a separate VM configured as a build slave so that it can build containers.
I am using a Multibranch Pipeline with the Jenkinsfile in the git repo.
The pipeline builds the Docker image which is a Django app. I have installed django_nose, so it can produce xunit files with the test results.
The Django settings has the following options to enable xunit test results.
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'
NOSE_ARGS = [
    '--with-xunit',
    '--xunit-file=/tmp/tests/results/results.xml',
]
In the pipeline, I have the following stage:
stage("Test") {
    sh("docker run --rm -i \
        -v '${env.WORKSPACE}/tests/results/':/tmp/tests/results \
        ${image} python3 manage.py test")
    junit '${env.WORKSPACE}/tests/results/*.xml'
}
It is mounting a directory in the Jenkins workspace as a volume in the container so that the saved test results can be viewed by Jenkins after the container completes.
When I run the build, I get the following error:
[Pipeline] { (Test)
[Pipeline] sh
[my_project-test-7T6G6Z...QQBYGFA] Running shell script
+ docker run --rm -i -v /home/jenkins/jenkins_home/workspace/my_project-test-7T6G6Z...QQBYGFA/tests/results/:/tmp/tests/results my-registry/username/my_project python3 manage.py test
nosetests --with-xunit --xunit-file=/tmp/tests/results/results.xml --verbosity=1
Creating test database for alias 'default'...
............
----------------------------------------------------------------------
Ran 12 tests in 1.421s
OK
Destroying test database for alias 'default'...
[Pipeline] junit
Recording test results
No test report files were found. Configuration error?
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
I can see that the Django tests are running and passing. If I look on the Jenkins slave VM, I can see that the results.xml file is in the location that it is supposed to be, and it contains the XML results of the tests.
jenkins@jenkins-slave01:~/jenkins_home/workspace$ ls -al /home/jenkins/jenkins_home/workspace/my_project-test-7T6G6Z...QQBYGFA/tests/results/results.xml
-rw-r--r-- 1 root root 1329 Oct 2 09:45 /home/jenkins/jenkins_home/workspace/my_project-test-7T6G6Z...QQBYGFA/tests/results/results.xml
jenkins@jenkins-slave01:~/jenkins_home/workspace$
Why is Jenkins not able to get the test results, since I can see that the file is created?
I figured out the answer. The junit step needs a path relative to the workspace; you don't need to include the workspace path itself.
junit 'tests/results/*.xml'
I had this error because I had a leading dot in the path to the test results, e.g.:
junit './build/outputs/androidTest-results/connected/flavors/DEV/*.xml'
I'm not sure why, but this fixed it:
junit 'build/outputs/androidTest-results/connected/flavors/DEV/*.xml'

Cannot view code coverage on angular-cli new project

I'm trying to figure out how to get code coverage working with @angular/cli but so far I'm not having much luck.
I started a new project using the Angular CLI. Basically all I did was ng new test-coverage, and once everything was installed in my new project folder, I ran ng test --code-coverage. The tests ran successfully but nothing resembling code coverage was displayed in the browser.
Am I missing some dependencies or something else? Any help will be appreciated.
EDIT:
R. Richards and Rachid Oussanaa were right, the file does get generated and I can access it by opening the index.html.
Now i'm wondering, is there a way I could integrate that into a node command so that the file opens right after the tests are run?
here's what you can do:
install opn-cli which is a cli for the popular opn package which is a cross-platform tool used to open files in their default apps.
npm install -D opn-cli to install it as a dev dependency.
in package.json add a script under scripts as follows
"scripts": {
  ...
  "test-coverage": "ng test --code-coverage --single-run && opn ./coverage/index.html"
}
now run npm run test-coverage
This will run the script we defined. Here is an explanation of that script:
ng test --code-coverage --single-run will run tests, with coverage, only ONCE, hence --single-run
&& basically executes the second command if the first succeeds
opn ./coverage/index.html will open the file regardless of platform.
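A note for newer CLI versions (an assumption to verify against your Angular version): the --single-run flag was removed around Angular CLI 6 in favor of --watch=false, and the coverage report may land under coverage/<project-name>/ rather than coverage/ directly, so the script would become something like:

```json
{
  "scripts": {
    "test-coverage": "ng test --code-coverage --watch=false && opn ./coverage/index.html"
  }
}
```

If the opn step opens a blank page, check where your karma coverage reporter actually writes index.html and adjust the path accordingly.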

golang test for sub directories fails

In my use case I am setting up a single go test command which runs all _test.go files in all packages in the project folder. I tried to achieve this using $ go test ./... from the src folder of the project
/project-name
/src
/mypack
/dao
/util
When I try to run the tests, it asks me to install packages that are used by the imported packages. For example, if I import "github.com/go-sql-driver/mysql", it may itself use another package such as github.com/golang/protobuf/proto, which I never imported directly. The application runs without my manually importing the inner package, but when I run the tests they fail, even though each individual package's tests succeed. Do I have to manually install all the packages listed in the $ go test ./... error?
Could anyone help me on this?
You need to run go get -t ./... first to get all the test dependencies.
From go help get:
The -t flag instructs get to also download the packages required to
build the tests for the specified packages.