I'm having some trouble testing my code with Codeception in Yii2, and I hope one of you can help me.
First of all, my authentication doesn't work as expected.
In my class PagesUrl the user isn't logged in, but in my template file the user is logged in. Whenever the page is accessed without Codeception, everything works fine.
Code used to check whether the user is logged in: var_dump(Yii::$app->getUser()->getIsGuest());
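For context, a functional Codeception request starts out as a guest unless the test logs a user in explicitly; recent versions of the Yii2 module expose amLoggedInAs() for that. A minimal sketch (the Cest class name and the User::findByUsername() lookup are assumptions, not taken from the project above):
// tests/functional/PagesUrlCest.php (hypothetical test class)
class PagesUrlCest
{
    public function _before(\FunctionalTester $I)
    {
        // Without this, the functional request runs as a guest even if you
        // are logged in when opening the same page in a browser.
        $I->amLoggedInAs(\common\models\User::findByUsername('admin'));
    }

    public function homePageResolvesForLoggedInUser(\FunctionalTester $I)
    {
        $I->amOnPage('/');
        $I->seeResponseCodeIs(200);
    }
}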
Config of UrlManager:
'urlManager' => [
    'enablePrettyUrl' => true,
    'showScriptName' => false,
    'rules' => [
        [
            'class' => 'common\routes\PagesUrl',
            'pattern' => '',
            'route' => 'site/index',
        ],
        // Some additional rules
    ],
]
Second, I can't seem to generate XML reports.
Command for test output: codecept run functional LoginFormCest --xml -vvv
Gives this output:
Codeception PHP Testing Framework v2.1.0
Powered by PHPUnit 4.8.35-1-g912b8c1e9 by Sebastian Bergmann and contributors.
Functional Tests (5) ----------------------------------------------------------------------------------------------------------------------------------------------------------
Modules: Filesystem, Yii2
-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Printing JUNIT report into report.xml
[PHPUnit_Framework_Exception]
Undefined index: log_incomplete_skipped
Exception trace:
() at C:\xampp\htdocs\stoneart-v2\vendor\codeception\base\src\Codeception\Subscriber\ErrorHandler.php:75
Codeception\Subscriber\ErrorHandler->errorHandler() at C:\xampp\htdocs\stoneart-v2\vendor\codeception\base\src\Codeception\PHPUnit\Runner.php:145
Codeception\PHPUnit\Runner->applyReporters() at C:\xampp\htdocs\stoneart-v2\vendor\codeception\base\src\Codeception\PHPUnit\Runner.php:91
Codeception\PHPUnit\Runner->doEnhancedRun() at C:\xampp\htdocs\stoneart-v2\vendor\codeception\base\src\Codeception\SuiteManager.php:157
Codeception\SuiteManager->run() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\src\Codeception\Codecept.php:200
Codeception\Codecept->runSuite() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\src\Codeception\Codecept.php:172
Codeception\Codecept->run() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\src\Codeception\Command\Run.php:184
Codeception\Command\Run->execute() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Command\Command.php:264
Symfony\Component\Console\Command\Command->run() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Application.php:846
Symfony\Component\Console\Application->doRunCommand() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Application.php:191
Symfony\Component\Console\Application->doRun() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Application.php:122
Symfony\Component\Console\Application->run() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\codecept:28
Command for test coverage: codecept run functional LoginFormCest --coverage-xml
Gives this output:
FAILURES!
Tests: 5, Assertions: 3, Failures: 5.
[yii\base\ErrorException]
Undefined index: quiet
Exception trace:
() at C:\xampp\htdocs\stoneart-v2\vendor\codeception\base\src\Codeception\Coverage\Subscriber\Printer.php:61
::call_user_func:{C:\xampp\htdocs\stoneart-v2\vendor\symfony\event-dispatcher\EventDispatcher.php:184}() at C:\xampp\htdocs\stoneart-v2\vendor\symfony\event-dispatcher\EventDispatcher.php:184
Symfony\Component\EventDispatcher\EventDispatcher->doDispatch() at C:\xampp\htdocs\stoneart-v2\vendor\symfony\event-dispatcher\EventDispatcher.php:46
Symfony\Component\EventDispatcher\EventDispatcher->dispatch() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\src\Codeception\Codecept.php:218
Codeception\Codecept->printResult() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\src\Codeception\Command\Run.php:204
Codeception\Command\Run->execute() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Command\Command.php:264
Symfony\Component\Console\Command\Command->run() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Application.php:846
Symfony\Component\Console\Application->doRunCommand() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Application.php:191
Symfony\Component\Console\Application->doRun() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\symfony\console\Application.php:122
Symfony\Component\Console\Application->run() at C:\Users\j-rub\AppData\Roaming\Composer\vendor\codeception\codeception\codecept:28
codeception.yml:
actor: Tester
paths:
    tests: tests
    log: tests/_output
    data: tests/_data
    helpers: tests/_support
settings:
    bootstrap: _bootstrap.php
    memory_limit: 1024M
    colors: false
modules:
    config:
        Yii2:
            configFile: 'tests/config/test.php'
            cleanup: false
coverage:
    enabled: true
    whitelist:
        include:
            - common/modules/news/*
Updating to the newest version did the trick for me, although there seems to be a bug displaying the right version of Codeception. I would recommend checking your composer.json. Thanks.
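For reference, the update usually comes down to raising the constraint in composer.json and updating the package; the constraint below is only illustrative, not a required version:
"require-dev": {
    "codeception/codeception": "^2.3"
}
and then running:
composer update codeception/codeception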
I am trying to configure and run my first unit test for Vue.js, but I can't get past the configuration issues. I have tried installing the libraries, but for some reason I keep getting errors.
Here is what my code looks like:
My directory structure:
hello/
  dist/
  node_modules/
  src/
    components/
      hello.vue
  test/
    setup.js
    test.spec.js
  .babelrc
  package.json
  webpack.config.js
The contents of my files:
src/components/hello.vue
<template> <div> {{message}} </div> </template>
<script>
export default {
  name: 'hello',
  data () {
    // data() must return an object
    return { message: 'Hi' }
  },
  created () {
    // ...
  }
}
</script>
test/setup.js
// setup JSDOM
require('jsdom-global')()
// make expect available globally
global.expect = require('expect')
test/test.spec.js
// '@vue/test-utils' is the package name; shallow is kept for later component tests
import { shallow } from '@vue/test-utils'
// hello.vue is a default export, and test/ sits next to src/
import hello from '../src/components/hello.vue'

describe('hello', () => {
  // just testing simple data to see if it works
  it('runs a trivial assertion', () => {
    expect(1).toBe(1)
  })
})
.babelrc
{
"env": {
"development": {
"presets": [
[
"env",
{
"modules": false
}
]
]
},
"test": {
"presets": [
[
"env",
{
"modules": false,
"targets": {
"node": "current"
}
}
]
],
"plugins": [
"istanbul"
]
}
}
}
package.json
...
"scripts": {
"build": "webpack -p",
"test": "cross-env NODE_ENV=test nyc mocha-webpack --webpack-config webpack.config.js --require test/setup.js test/**/*.spec.js"
},
"devDependencies": {
"babel-core": "^6.26.0",
"babel-loader": "^7.1.2",
"babel-preset-env": "^1.6.1",
"cross-env": "^5.1.1",
"css-loader": "^0.28.7",
"file-loader": "^1.1.5",
"node-sass": "^4.7.2",
"sass-loader": "^6.0.6",
"vue-loader": "^13.5.0",
"vue-template-compiler": "^2.5.9",
"webpack": "^3.10.0",
"webpack-dev-server": "^2.9.7",
"jsdom": "^11.3.0",
"jsdom-global": "^3.0.2",
"mocha": "^3.5.3",
"mocha-webpack": "^1.0.0-rc.1",
"nyc": "^11.4.1",
"expect": "^21.2.1",
"#vue/test-utils": "^1.0.0-beta.12"
},
...
"nyc": {
"include": [
"src/**/*.(js|vue)"
],
"instrument": false,
"sourceMap": false
}
and finally my webpack.config.js
...
if(process.env.NODE_ENV == "test") {
module.exports.externals = [ require ('webpack-node-externals')()]
module.exports.devtool = 'inline-cheap-module-source-map'
}
Now when I run npm test from my root folder hello/, I get this error:
> hello@1.0.0 test C:\Users\john\vue-learn\hello
> npm run e2e
> hello@1.0.0 e2e C:\Users\john\vue-learn\hello
> node test/e2e/runner.js
Starting selenium server... started - PID: 12212
[Test] Test Suite
=====================
Running: default e2e tests
× Timed out while waiting for element <#app> to be present for 5000 milliseconds. - expected "visible" but got: "not found"
at Object.defaultE2eTests [as default e2e tests] (C:/Users/john/Google Drive/lab/hello/test/e2e/specs/test.js:13:8)
at _combinedTickCallback (internal/process/next_tick.js:131:7)
FAILED: 1 assertions failed (20.281s)
_________________________________________________
TEST FAILURE: 1 assertions failed, 0 passed. (20.456s)
× test
- default e2e tests (20.281s)
Timed out while waiting for element <#app> to be present for 5000 milliseconds. - expected "visible" but got: "not found"
at Object.defaultE2eTests [as default e2e tests] (C:/Users/john/Google Drive/lab/hello/test/e2e/specs/test.js:13:8)
at _combinedTickCallback (internal/process/next_tick.js:131:7)
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! hello@1.0.0 e2e: `node test/e2e/runner.js`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the hello@1.0.0 e2e script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm ERR! A complete log of this run can be found in:
npm ERR! C:\Users\john\AppData\Roaming\npm-cache\_logs\2018-04-03T23_53_15_976Z-debug.log
npm ERR! Test failed. See above for more details.
I don't know why this happens. When I first set up my webpack project with npm init, I didn't install a testing library, so there should be no conflicts, but I still get that error.
Update (after bounty)
I'm just trying to test my Vue.js application, preferably with Jasmine/Karma. If anyone knows how to integrate these into a simple app and run the first test, I can take it from there. My problem is not writing tests but configuring them.
First things first: you don't need to enable end-to-end testing in your project. I would say start fresh:
$ npm install -g vue-cli
$ vue init webpack vue-testing
? Project name vue-testing
? Project description A Vue.js project
? Author Tarun Lalwani <tarun.lalwani@payu.in>
? Vue build standalone
? Install vue-router? Yes
? Use ESLint to lint your code? Yes
? Pick an ESLint preset Standard
? Set up unit tests Yes
? Pick a test runner karma
? Setup e2e tests with Nightwatch? No
? Should we run `npm install` for you after the project has been created? (recommended) yarn
Say N to "Setup e2e tests with Nightwatch?" and choose Karma for "Pick a test runner".
$ npm test
> vue-testing@1.0.0 test /Users/tarun.lalwani/Desktop/tarunlalwani.com/tarunlalwani/workshop/ub16/so/vue-testing
> npm run unit
> vue-testing@1.0.0 unit /Users/tarun.lalwani/Desktop/tarunlalwani.com/tarunlalwani/workshop/ub16/so/vue-testing
> cross-env BABEL_ENV=test karma start test/unit/karma.conf.js --single-run
07 04 2018 21:35:28.620:INFO [karma]: Karma v1.7.1 server started at http://0.0.0.0:9876/
07 04 2018 21:35:28.629:INFO [launcher]: Launching browser PhantomJS with unlimited concurrency
07 04 2018 21:35:28.645:INFO [launcher]: Starting browser PhantomJS
07 04 2018 21:35:32.891:INFO [PhantomJS 2.1.1 (Mac OS X 0.0.0)]: Connected on socket M1HeZIiOis3eE3mLAAAA with id 44927405
HelloWorld.vue
✓ should render correct contents
PhantomJS 2.1.1 (Mac OS X 0.0.0): Executed 1 of 1 SUCCESS (0.061 secs / 0.041 secs)
TOTAL: 1 SUCCESS
=============================== Coverage summary ===============================
Statements : 100% ( 2/2 )
Branches : 100% ( 0/0 )
Functions : 100% ( 0/0 )
Lines : 100% ( 2/2 )
================================================================================
Now your npm test will work fine.
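For your own first spec, you can follow the pattern of the generated HelloWorld test under test/unit/specs/. Roughly (the exact scaffold file may differ slightly; the Karma setup it generates uses Mocha with Chai's expect):
// test/unit/specs/HelloWorld.spec.js
import Vue from 'vue'
import HelloWorld from '@/components/HelloWorld' // '@' is the scaffold's alias for src/

describe('HelloWorld.vue', () => {
  it('should render correct contents', () => {
    // mount the component and assert on the rendered DOM
    const Constructor = Vue.extend(HelloWorld)
    const vm = new Constructor().$mount()
    expect(vm.$el.querySelector('.hello h1').textContent)
      .to.equal('Welcome to Your Vue.js App')
  })
})
Additional *.spec.js files placed next to it should be picked up automatically by the test/unit entry point.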
According to the error logs you provide here, the failing tests are the end-to-end ones. Indeed, npm test ends up running npm run e2e, so you are testing with Nightwatch. Look under test/e2e/specs: there you should have a default test file checking that your Vue application properly creates a DOM element identified as app.
The test should be the following:
// For authoring Nightwatch tests, see
// http://nightwatchjs.org/guide#usage
module.exports = {
'default e2e tests': function (browser) {
// automatically uses dev Server port from /config/index.js
// default: http://localhost:8080
// see nightwatch.conf.js
const devServer = browser.globals.devServerURL
browser
.url(devServer)
.waitForElementVisible('#app', 5000)
.assert.elementPresent('.hello')
.assert.containsText('h1', 'Welcome to Your Vue.js App')
.assert.elementCount('img', 1)
.end()
}
}
In your case this test is failing because you have probably removed the App.vue file that is generated by the vue-cli scaffolding. The error you get is because the above test checks, with a 5-second timeout, whether a DOM node with the id app is rendered (i.e. .waitForElementVisible('#app', 5000)).
Basically it is failing because you no longer provide this div in your application (due to the App.vue removal, maybe).
So you have two options here:
restoring the App.vue file (i.e. create a div with the id app where you mount a Vue instance), as sketched right after this list;
editing the end-to-end test according to your needs.
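A minimal sketch of that mount point, assuming the default index.html and entry file of the webpack template (your actual files may differ):
<!-- index.html must contain the element the e2e test waits for -->
<div id="app"></div>

// src/main.js
import Vue from 'vue'
import App from './App.vue'

new Vue({
  el: '#app',          // mounts App into the #app div above
  render: h => h(App)
})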
Hope this helps!
We have a total of 2016 unit test cases written with Jasmine and are using Karma to run them. The tests run for 1 min 30 sec to 2 min and then Karma suddenly disconnects from the browser. Here is a screenshot of the console logs.
The problem is that I am not able to diagnose why that happens or which test case causes the disconnect. I have tried different Karma reporters to identify the test case that forces the disconnect, but I have been unsuccessful so far.
I have also tried running the tests in short batches to be able to drill down to the error test case (in case it is a test case error and not Karma configuration) but so far, the error has been thrown for all batches.
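(A batch run here can be done by temporarily narrowing the files patterns in karma.conf.js so only part of the suite loads; a sketch based on the config shown further down, with which slice to keep chosen arbitrarily:)
files: [
  'bower_components/jquery/dist/jquery.js',
  'node_modules/angular/angular.js',
  'src/app/app.js',
  // load one slice of the suite at a time to isolate the spec that kills the browser
  'src/app/pack1-components/**/*.js',
  'src/**/*.html'
],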
As per this post, I have tried setting browserNoActivityTimeout as high as 10 minutes (600000 ms), but still no resolution. The post also mentions that insufficient memory might be a problem, so I have tried running the cases on one 8 GB RAM and one 16 GB RAM system (Windows 10 on both).
Here's the complete stack trace:
[02:06:48] Error: MyApp Chromebook Unit tests failed with exitCode: 1
at formatError (C:\Users\barnadeep.bhowmik\AppData\Roaming\npm\node_modules\gulp\bin\gulp.js:169:10)
at Gulp.<anonymous> (C:\Users\barnadeep.bhowmik\AppData\Roaming\npm\node_modules\gulp\bin\gulp.js:195:15)
at emitOne (events.js:96:13)
at Gulp.emit (events.js:188:7)
at Gulp.Orchestrator._emitTaskDone (C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\node_modules\orchestrator\index.js:264:8)
at C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\node_modules\orchestrator\index.js:275:23
at finish (C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\node_modules\orchestrator\lib\runTask.js:21:8)
at cb (C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\node_modules\orchestrator\lib\runTask.js:29:3)
at C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\build\tasks\test.js:18:13
at removeAllListeners (C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\node_modules\karma\lib\server.js:336:7)
at Server.<anonymous> (C:\Users\barnadeep.bhowmik\Desktop\Projects\MyProject\test-player-15-may\myapp-chrome\node_modules\karma\lib\server.js:347:9)
at Server.g (events.js:291:16)
at emitNone (events.js:91:20)
at Server.emit (events.js:185:7)
at emitCloseNT (net.js:1555:8)
at _combinedTickCallback (internal/process/next_tick.js:71:11)
at process._tickCallback (internal/process/next_tick.js:98:9)
Here's my config file:
module.exports = function(config) {
config.set({
basePath: '',
frameworks: ['jasmine'],
files: [
'bower_components/jquery/dist/jquery.js',
'node_modules/angular/angular.js',
'other_dependencies/**.*.js',
'src/app/app.js',
'src/app/pack1-components/**/*.js',
'src/app/pack2-components/**/*.js',
'src/**/*.html'
],
exclude: [
'src/some-folder/*',
],
port: 8081,
logLevel: config.LOG_INFO,
autoWatch: false,
browsers: ['ChromeNoSandbox'],//temp fix for Chrome Browser 'Chrome'
customLaunchers: {
ChromeNoSandbox: {
base: 'Chrome',
flags: ['--no-sandbox']
}
},
reporters: ["spec","progress","coverage","html"],
specReporter: {
  maxLogLines: 5,              // limit the number of lines logged per test
  suppressErrorSummary: false, // print the error summary
  suppressFailed: false,       // print information about failed tests
  suppressPassed: false,       // print information about passed tests
  suppressSkipped: true,       // do not print information about skipped tests
  showSpecTiming: false,       // do not print the time elapsed for each spec
  failFast: true               // the run finishes with an error when the first fail occurs
},
preprocessors: {
'src/**/*.js':['coverage'],
'src/**/*.html':['ng-html2js']
},
coverageReporter: {
type: 'lcov',
dir: 'qualityreports/testresults/unit/coverage/'
},
htmlReporter: {
outputFile: 'qualityreports/testresults/unit/testresults.html'
},
browserNoActivityTimeout: 600000,
captureTimeout: 60000,
browserDisconnectTimeout : 60000,
browserDisconnectTolerance : 1,
ngHtml2JsPreprocessor: {
},
plugins: [
'karma-jasmine','karma-chrome-launcher','karma-coverage','karma-htmlfile-reporter','karma-ng-html2js-preprocessor',"karma-spec-reporter"],
singleRun: true
});
};
Here is a similar post but it did not have all details, hence posting mine. Any help would be deeply appreciated.
I have some C++ code which I build on Jenkins. I run UnitTest++ 1.4 to test the C++ code, and this generates some TestResults*.xml files.
This works nicely as long as I configure the Jenkins build job using the web frontend.
For a new build job I have to use the Jenkins Pipeline plugin instead, so I have to write a Jenkinsfile. For evaluating my TestResults*.xml, I only found two alternatives:
junit 'TestResults*.xml'
step([$class: 'JUnitResultArchiver', testResults: 'TestResults*.xml'])
But neither of them works. It seems that the JUnit XML format is different from the UnitTest++ 1.4 output.
Does anyone know which Jenkinsfile statement is required to correctly consume the UnitTest++ 1.4 XML test results?
The solution is to use the xUnit plugin instead of the JUnit plugin, like this:
step([
$class: 'XUnitBuilder', testTimeMargin: '3000', thresholdMode: 1,
thresholds: [
[$class: 'FailedThreshold', failureNewThreshold: '', failureThreshold: '0', unstableNewThreshold: '', unstableThreshold: ''],
[$class: 'SkippedThreshold', failureNewThreshold: '', failureThreshold: '', unstableNewThreshold: '', unstableThreshold: '']
],
tools: [[
$class: 'UnitTestJunitHudsonTestType',
deleteOutputFiles: true,
failIfNotNew: true,
pattern: 'result.xml',
skipNoTestFiles: false,
stopProcessingIfError: true
]]
])
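For context, in a scripted pipeline this step would typically sit after the stage that runs the tests; a minimal sketch (the build/test command is a placeholder, and the pattern is set to the TestResults*.xml files from the question):
node {
    stage('Build and test') {
        // build the C++ code and run the UnitTest++ binary,
        // which writes the TestResults*.xml files
        bat 'build_and_run_tests.bat'   // placeholder; use sh '...' on Linux agents
    }
    stage('Publish test results') {
        step([
            $class: 'XUnitBuilder',
            thresholds: [[$class: 'FailedThreshold', failureThreshold: '0']],
            tools: [[
                $class: 'UnitTestJunitHudsonTestType',
                pattern: 'TestResults*.xml',
                skipNoTestFiles: false,
                failIfNotNew: true,
                deleteOutputFiles: true,
                stopProcessingIfError: true
            ]]
        ])
    }
}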
Update: This issue has been resolved. Resolution in comments below.
When running Calabash tests either in the terminal or through RubyMine, the output does not display until the test is finished. With WebDriver tests, we get output in real time. Is there a way to display console output in real time with Calabash?
Additional Details
>xcode-select --print-path
/Applications/XCode.app/Contents/Developer
>xcodebuild -version
Xcode 5.1.1
Build version 5B1008
>calabash-ios version
0.9.169
irb(main):002:0> server_version
(I removed the app name)
{
"outcome" => "SUCCESS",
"app_id" => "com.<redacted>",
"simulator_device" => "iPhone",
"version" => "0.9.169",
"app_name" => "<redacted>",
"iphone_app_emulated_on_ipad" => false,
"4inch" => true,
"git" => {
"remote_origin" => "git#github.com:calabash/calabash-ios-server.git",
"branch" => "master",
"revision" => "ca62f6e"
},
"app_version" => "1.0",
"iOS_version" => "7.1",
"system" => "x86_64",
"simulator" => "iPhone Simulator 463.9.41, iPhone OS 7.1 (iPhone Retina (4-inch)/11D167)"
}
One scenario I have seen where Calabash stops displaying test output logs is when the user specifies one of the output formats while running the tests, e.g.:
--format json -out cucumber.json
--format html -out TestReport.html
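If you need such a report but still want live console output, Cucumber generally lets you combine a console formatter with a file formatter, for example (adjust to your own profile and report names):
cucumber --format pretty --format json --out cucumber.json
Each --out applies to the formatter specified just before it, so the pretty output still streams to the console while the JSON report is written to the file.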
Also, if you can specify how you are running the tests and what your project structure looks like, maybe I can help you better.
I've just started with Zend Framework 2, with Doctrine. I want to set up unit testing for my Album module.
When I run phpunit from the command prompt in c:\wamp\www\zf2-tutorial\module\Album\test, I get the following error:
PHPUnit 3.7.10 by Sebastian Bergmann.
Configuration read from C:\wamp\www\zf2-tutorial\module\Album\test\phpunit.xml.dist
.FFE
Time: 2 seconds, Memory: 8.25Mb
There was 1 error:
1) AlbumTest\Controller\AlbumControllerTest::testIndexActionCanBeAccessed
Zend\ServiceManager\Exception\ServiceNotFoundException: Zend\ServiceManager\ServiceManager::get was unable to fetch or create an instance for doctrine.entitymanager.orm_default
C:\wamp\www\zf2-tutorial\vendor\zendframework\zendframework\library\Zend\ServiceManager\ServiceManager.php:452
C:\wamp\www\zf2-tutorial\module\Album\src\Album\Controller\AlbumController.php:25
C:\wamp\www\zf2-tutorial\module\Album\src\Album\Controller\AlbumController.php:33
C:\wamp\www\zf2-tutorial\vendor\zendframework\zendframework\library\Zend\Mvc\Controller\AbstractActionController.php:88
C:\wamp\www\zf2-tutorial\vendor\zendframework\zendframework\library\Zend\EventManager\EventManager.php:464
C:\wamp\www\zf2-tutorial\vendor\zendframework\zendframework\library\Zend\EventManager\EventManager.php:208
C:\wamp\www\zf2-tutorial\vendor\zendframework\zendframework\library\Zend\Mvc\Controller\AbstractController.php:107
C:\wamp\www\zf2-tutorial\module\Album\test\AlbumTest\Controller\AlbumControllerTest.php:71
--
There were 2 failures:
1) AlbumTest\Controller\AlbumControllerTest::testDeleteActionCanBeAccessed
Failed asserting that 302 matches expected 200.
C:\wamp\www\zf2-tutorial\module\Album\test\AlbumTest\Controller\AlbumControllerTest.php:54
2) AlbumTest\Controller\AlbumControllerTest::testEditActionCanBeAccessed
Failed asserting that 302 matches expected 200.
C:\wamp\www\zf2-tutorial\module\Album\test\AlbumTest\Controller\AlbumControllerTest.php:64
FAILURES!
Tests: 4, Assertions: 3, Failures: 2, Errors: 1.
The root of the issue seems to be:
Zend\ServiceManager\Exception\ServiceNotFoundException: Zend\ServiceManager\ServiceManager::get was unable to fetch or create an instance for doctrine.entitymanager.orm_default from the test.
I don't understand this, please help!
Okay, finally got it!
I changed my TestConfig.php.dist inside module/Album/test to:
<?php
return array(
    'modules' => array(
        'DoctrineModule',
        'DoctrineORMModule',
        'Album',
    ),
    'module_listener_options' => array(
        'config_glob_paths' => array(
            '../../../config/autoload/{,*.}{global,local}.php',
        ),
        'module_paths' => array(
            'module',
            'vendor',
        ),
    ),
);
So basically I just added the two missing modules, DoctrineModule and DoctrineORMModule, and the phpunit command showed no errors!
Hope it helps others.