Codecov: coverage and complexity rate - JaCoCo

I'm testing a project and I use Codecov to publish the coverage rate of my tests. Codecov uses the report generated by JaCoCo, and so far it works fine. Codecov displays not only the coverage rates, but also the complexity rates of the tests.
I have two questions about this complexity rate that I couldn't find answers for in the documentation:
1/ What exactly is the complexity rate, and how does Codecov measure it?
2/ The project under test is a Maven multi-module project. When I activate the report-aggregate goal of the JaCoCo plugin in my POM, in order to aggregate the reports of each module, no complexity is displayed on Codecov.
But when I don't activate the plugin, the complexity is calculated.
Why is it so?
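For context, the complexity figure comes from JaCoCo itself: its XML report carries a cyclomatic-complexity counter per class and method, which Codecov then reads. A typical report-aggregate setup looks like the following sketch, placed in a dedicated aggregator module's pom.xml; the plugin version is an assumption, so check what your build actually uses:

```xml
<!-- Hedged sketch of the JaCoCo aggregation goal; the version
     number is a placeholder, not taken from the question. -->
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.11</version>
  <executions>
    <execution>
      <id>report-aggregate</id>
      <phase>verify</phase>
      <goals>
        <goal>report-aggregate</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

If the aggregated XML report lacks the complexity counters (or Codecov is handed a different report file than expected), the complexity column can come out empty, which may explain the behavior described above.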

Related

How to disable check in codeclimate?

Can you please guide me on how to reduce the full test coverage threshold in the config file in the repository? I had 33% full coverage in some project, my edits reduced it to 0.2, and I would like to tell Code Climate that this is acceptable.
I have my Golang repo and .codeclimate.yml in the root:
version: "2" # required to adjust maintainability checks
checks:
  return-statements:
    enabled: false
As pointed out here, there is no configuration to modify that.
Extracted from the docs:
If the overall test coverage percentage of your repository will decrease (by 0.1% or more) by merging the PR, Code Climate will send a failed status to GitHub.
If I'm understanding correctly, you would like to increase that threshold to, let's say, 0.5%, so that Code Climate doesn't raise a flag for your edits. If that's the case, then again, that's not configurable.
Under your Repo -> Settings, you can enable or disable sending pass/fail statuses on your total coverage.
Check this document. You can disable the Enforce Diff Coverage and Enforce Total Coverage checks in Code Climate so that these reports are not run for your commits.

Pytest coverage with line coverage and minimum limits like karma/Istanbul

In the Istanbul coverage module for Karma you can set thresholds for different kinds of coverage. If some coverage doesn't meet its minimum, then Istanbul throws an error. This is very useful when building the project with Jenkins and you have to keep such limits. Is it possible to get similar functionality with pytest-cov or any other module?
https://ibb.co/y4J3JrG
pytest-cov generates only statements coverage. Is it possible to get line/code coverage as well?
Coverage.py (which is the engine for pytest-cov) has thresholds for total coverage, but not separate thresholds for different measurements. Look at the --fail-under option.
Coverage.py can measure statement coverage and branch coverage. You mention "line" coverage and "code" coverage: I don't know how those differ from statement coverage.
You can find the option you need as follows:
pytest --help
--cov-fail-under=MIN Fail if the total coverage is less than MIN.
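To tie the two answers together, the threshold can also live in a .coveragerc that Coverage.py (and therefore pytest-cov) reads; a minimal sketch, where the 80% value is a placeholder, not something from the question:

```ini
; .coveragerc -- hedged example; 80 is a placeholder threshold
[run]
branch = True

[report]
fail_under = 80
```

With branch = True, Coverage.py measures branch coverage in addition to statement coverage, and fail_under makes the report step exit non-zero below the threshold, which is what a Jenkins build needs to fail the job.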

Is it possible to merge test coverage on sonarqube level?

For Java I know the possibility to merge test coverage results on build level by specifying the same path of JaCoCo reports (see SonarQube: Multiple unit test and code coverage result files). This might be transported to SonarQube.
But is it possible to make this on SonarQube level?
I mean: from different build servers or different jobs, build and test the software and combine the coverage results on the SonarQube side (perhaps by marking the SW version or any kind of given label)?
For me it would be useful to combine integration and unit tests.
You can combine the result of multiple jobs. You can create two coverage folders, e.g.
- coverage-unit
- coverage-integration
and use the resulting lcov files, e.g.
sonar.javascript.lcov.reportPaths=coverage-unit/lcov.info,coverage-integration/lcov.info
Currently it is not possible to "amend" coverage to an existing analysis. You have to orchestrate your build pipeline so that all kinds of coverage reports are produced before you actually start the SonarQube analysis.
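For the Java/JaCoCo case raised in the question, the same pattern applies: collect the reports from all jobs first, then point a single analysis at them. A minimal sonar-project.properties sketch, assuming a SonarQube version that imports JaCoCo XML reports; the project key and paths are placeholders:

```properties
# sonar-project.properties -- hedged sketch; key and paths are placeholders
sonar.projectKey=my-project
sonar.sources=src/main/java
# comma-separated JaCoCo XML reports produced by different jobs
sonar.coverage.jacoco.xmlReportPaths=build/unit/jacoco.xml,build/it/jacoco.xml
```

SonarQube merges the listed reports into one coverage figure during that single analysis, which is as close as you can get to combining unit and integration coverage "at the SonarQube side".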

SonarQube 5.6.1 and VSTS - No count for Unit Tests

We use VSTS with the newest SonarQube tasks and SonarQube 5.6.1.
In SonarQube we see all the unit test coverage results, except for one item: the number of unit tests. How/what do we need to configure to have the number of unit tests also shown in SonarQube?
Per the SonarC# documentation, you need to import the Unit Test Execution Results, using the applicable property (for example sonar.cs.vstest.reportsPaths). The trick is to set the appropriate value, which is not always straightforward in automated environments (e.g. VSTS).
Pending planned improvements with SONARMSBRU-231, you may want to try the workaround mentioned in that ticket:
/d:sonar.cs.vstest.reportsPaths=..\**\TestResults\**\*.trx
(under Advanced, Additional Settings, in the Prepare the SonarQube analysis build step)

Branch coverage with JaCoCo, Emma from IntelliJ

I am trying to measure branch coverage of unit tests for a large Grails application. I am using JaCoCo, Emma and IDEA to collect the metrics from inside IntelliJ, and I am getting the following:
- JaCoCo: no metrics are shown, even for line coverage
- Emma: produces method and line coverage
- IDEA: produces class, method and line coverage
I am mostly interested in JaCoCo as it should give me Branch Coverage by default. Could someone point me to some tips on how to troubleshoot this?
Actually, the IntelliJ code coverage tool supports branch coverage, though it does not show the results in the summary. Check this article to see how it can be configured and how you can check your branch coverage: https://confluence.jetbrains.com/display/IDEADEV/IDEA+Coverage+Runner
The key is to use Tracing instead of Sampling.