AngularJS: unit testing services that are broken out into separate files

My services.js file was getting quite large so I decided it'd be best to split out the individual services into separate files (service1.js, service2.js, etc).
Unfortunately, this broke all my unit tests. I'm no longer able to import the service dependencies into my tests. I'm seeing errors like this when I run my unit tests:
Error: [$injector:unpr] Unknown provider: Service1Provider <- Service1
I can't find any article on the web that addresses these issues. My app structure is pretty standard and OOTB, nothing really different from angular-seed (except of course the separate files for each service).
Please let me know if you need more info.

I currently work with #mtical, and it turns out the error was indeed in karma.conf.js. As he said, we broke our services out into multiple files, and our main service file was named "services.js". Karma loads the files matched by each glob pattern in karma.conf.js in recursive alphabetical order.
This was causing our "services.js" file to be loaded after all of the other service files, which sort before it alphabetically. Unfortunately, all of those other services had "services.js" as a dependency, so when our tests ran, they weren't able to find the services we needed.
The solution was to explicitly list "services.js" before the recursive glob for the other files in our karma.conf.js, as follows:
...
files : [
  'app/lib/angular/angular.js',
  'app/lib/angular/angular-*.js',
  'test/lib/angular/angular-mocks.js',
  'app/js/services/services.js',
  'app/js/**/*.js',
  'test/unit/**/*.js'
],
...

Related

Testing in Go - code coverage within project packages

I have a question about generating code coverage in Go(lang) projects. I have this simple structure:
ROOT/
  config/
  handlers/
  lib/
  models/
  router/
  main.go
config contains the configuration in JSON and one simple config.go that reads and parses the JSON file and fills the Config struct, which is then used when initializing the DB connection. handlers contains the controllers (i.e. the handlers for the respective METHOD+URL combinations described in router/routes.go). lib contains some DB, request responder and logger logic. models contains the structs and their funcs for mapping to and from JSON and the DB. Finally, router contains the router and the routes definition.
Basically, just by testing one single handler I ensure that my config, logger, DB, responder, router and the corresponding model are invoked (and tested to some extent as well).
Now, if this were PHP or Java or some other language, running that single handler test would also produce code coverage for all the other invoked parts of the code, even if they live in different folders (i.e. packages here). Unfortunately, this is not the case in Go.
And there is very little code in most of my lib files, often just one function (like InitDB(), ReadConfig(), NewRouter() or Logger()), so to get code coverage for them I have to create trivial tests in those packages as well, even though they are already invoked and tested through the main URL-handling packages.
Is there any way to get code coverage for the other packages a package pulls in within one project?
You can import your packages within a single test file and write tests for them based on how they'll be used within your application. For example, you can have one test file, main_test.go, which imports all the other packages, and then write tests there for the methods in the imported packages.
Like this (in main_test.go):
package main

import (
    "testing"

    // These import paths follow the structure in the question; in practice
    // they are relative to your project's import path (e.g. "yourproject/lib"),
    // so adjust them to match your setup.
    "lib"
    "models"
    "handlers"
    // etc.
)

// testing code here
However, in my personal opinion, it might be better to make sure that each package does what it should, and only that. The best way to do that is by testing the individual package itself, with its own test suite.

Maven: utility to generate unit tests

I need to write unit tests for an existing Java REST server.
The GET methods are very similar and I am thinking that I can write a small unit test generator that will use reflection to introspect the GET methods and the POJOs they consume to generate (boilerplate) unit tests.
Each test will be generated with a small syntax error so that it cannot be run as is, but must be examined by a developer and the syntax error corrected. I am hoping that this will at least ensure that the tests are sane and look reasonable.
The generator will be run from the command line, passing in the class-under-test, the output directory for the unit tests, etc.
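As a rough sketch of that idea (assuming the endpoints use JAX-RS @GET/@Path annotations; the class and method names below are made up for illustration), the core of such a generator could look something like this:
import java.lang.reflect.Method;
import javax.ws.rs.GET;
import javax.ws.rs.Path;

public class TestGenerator {

    // Scans a resource class for JAX-RS GET endpoints and prints a boilerplate
    // test stub for each one (writing to files, inserting the deliberate syntax
    // error, inspecting the consumed POJOs, etc. is left out of this sketch).
    public static void generateFor(Class<?> resourceClass) {
        for (Method method : resourceClass.getMethods()) {
            if (method.isAnnotationPresent(GET.class)) {
                Path path = method.getAnnotation(Path.class);
                System.out.println("// test stub for " + resourceClass.getSimpleName()
                        + "." + method.getName()
                        + (path != null ? " mapped to " + path.value() : ""));
                System.out.println("@Test public void test"
                        + capitalize(method.getName()) + "() { /* TODO */ }");
            }
        }
    }

    private static String capitalize(String s) {
        return Character.toUpperCase(s.charAt(0)) + s.substring(1);
    }

    public static void main(String[] args) throws ClassNotFoundException {
        // Class-under-test is passed on the command line by fully qualified name.
        generateFor(Class.forName(args[0]));
    }
}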
I don't want the class files for the generator to be added to the WAR file, but the generator needs to have access to the class files for the REST server.
My project directory is a "standard" Maven hierarchy: project/src/main/java, project/target, etc.
Where is the best place to put the generator source code? Under project/src/main/java? Under project/src/generator/java? Somewhere else?
I know how to exclude the generated class files from the WAR file if they all are included under a specific package (e.g. com.example.unit_test_generator).
This scenario sounds like a Maven plugin to me. Furthermore, the usual place for generated code is under target/generated..., which means the target folder; take a look at the maven-antlr3-plugin or the maven-jaxb-plugin to see where they usually put generated code. Never put generated code into the src/ structure. You may have to change the location and put it into project/src/main/..., but if these classes are some kind of tests, they have to be located under project/src/test instead.

Cobertura report has 100% coverage everywhere

I'm running my webApp using Jetty with my instrumented classes.
After the shutdown of Jetty, I'm taking the generated .ser file and creating a Cobertura report using the command line tool.
I always get 100% coverage results for every class.
It seems that Cobertura takes into account only the lines that were executed during testing, and doesn't get the full class data.
I've tried to add source files to the reports - no help.
I also tried to take the .ser file created after the instrumentation and merge it with the .ser file created after the Jetty shutdown (it is actually the same file, but before running Jetty I backed up the .ser that was created after instrumentation) - no help here either.
Can someone please help??
Thanks
100% coverage is a clear indicator that the sources are missing for the report. You should check your configuration for creating the report.
Make sure that:
you give the report task the right source folder
the source folder is structured like the packages, and not just all classes in one dir
As explained at http://cobertura.sourceforge.net/faq.html, in the answer to the question "When I generate coverage reports, why do they always show 100% coverage everywhere?",
"Cobertura is probably using the wrong .ser file when generating the reports. When you instrument your classes, Cobertura generates a .ser file containing basic information about each class. As your tests run, Cobertura adds additional information to this same data file. If the instrumented classes can not find the data file when running then they will create a new one. It is important that you use the same cobertura.ser file when instrumenting, running, and generating reports."
In my case, I experienced this issue when the instrumented classes were recorded in one .ser file and during execution I was generating another one. Generating the HTML report "just" from the second .ser showed the problem mentioned in the question. Merging the two datafiles (.ser) and regenerating the report solved the issue.
Refer to http://cobertura.sourceforge.net/commandlinereference.html for "Merging Datafiles" information.

Recommended way to structure rspec modules?

I have a Rails app, plus code in lib. I have the spec directory under RAILS_ROOT.
How should I organize my tests under spec?
Currently, I am thinking of the following:
spec/lib
spec/controllers
spec/models
Further, I do some common setup / use some common steps (e.g., generating an invalid user) in many tests. Where do you recommend I put the modules that do the common setup / steps in my RSpec tests?
Your proposed directory structure is fine.
As for your helper modules, a common idiom is for these to go in the spec/support directory. You can include them all automatically by placing the following code into your spec_helper.rb file:
Dir[File.expand_path('../support/**/*.rb', __FILE__)].each { |f| require f }
You could just place the code directly in spec_helper.rb itself, but that can get messy, and it could be wiped out if the helper file is regenerated.

Best practices for file system dependencies in unit/integration tests

I just started writing tests for a lot of code. There's a bunch of classes with dependencies on the file system; that is, they read CSV files, read/write configuration files, and so on.
Currently the test files are stored in the test directory of the project (it's a Maven2 project) but for several reasons this directory doesn't always exist, so the tests fail.
Do you know best practices for coping with file system dependencies in unit/integration tests?
Edit: I'm not looking for an answer to the specific problem I described above. That was just an example. I'd prefer general recommendations on how to handle dependencies on the file system, databases, etc.
First, one should try to keep the unit tests away from the filesystem - see this Set of Unit Testing Rules. If possible, have your code work with streams, which can be in-memory buffers in the unit tests and file streams in the production code.
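As a rough illustration of that approach (the CsvParser class here is made up for the example), the parsing logic depends on a Reader, so the unit test can feed it an in-memory StringReader while production code passes a FileReader:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

// Hypothetical parser that depends on a Reader, not on a File.
class CsvParser {
    List<String[]> parse(Reader source) throws IOException {
        List<String[]> rows = new ArrayList<>();
        BufferedReader reader = new BufferedReader(source);
        String line;
        while ((line = reader.readLine()) != null) {
            rows.add(line.split(","));
        }
        return rows;
    }
}

// In the unit test, no file is touched at all:
//   List<String[]> rows = new CsvParser().parse(new StringReader("a,b\nc,d"));
// In production code you would pass new FileReader("data.csv") instead.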
If this is not feasible, you can have your unit tests generate the files they need. This makes the test easy to read, as everything is in one file. This may also prevent permission problems.
You can mock the filesystem/database/network access in your unit tests.
You can treat unit tests that rely on a DB or the file system as integration tests.
Dependencies on the filesystem come in two flavours here:
files that your tests depend upon; if you need files to run the test, then you can generate them in your tests and put them in a /tmp directory.
files that your code is dependent upon: config files, or input files.
In this second case, it is often possible to restructure your code to remove the dependency on a file (e.g. java.io.File can be replaced with java.io.InputStream and java.io.OutputStream, etc.). This may not be possible, of course.
You may also need to handle 'non-determinism' in the filesystem (I had a devil of a job debugging something on an NFS once). In this case you should probably wrap the file system in a thin interface.
At its simplest, this is just a helper method that takes a File and forwards the call on to that file:
InputStream getInputStream(File file) throws IOException {
    return new FileInputStream(file);
}
You can then replace this with a mock that you can direct to throw an exception, or return a ByteArrayInputStream, or whatever.
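For example, here is a hand-rolled sketch of that idea, assuming the helper is pulled up into a small interface (the FileSystem names below are made up for illustration):
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

// Thin wrapper around the filesystem, as described above.
interface FileSystem {
    InputStream getInputStream(File file) throws IOException;
}

// Production implementation: simply forwards to the real file.
class RealFileSystem implements FileSystem {
    public InputStream getInputStream(File file) throws IOException {
        return new FileInputStream(file);
    }
}

// Test double: serves canned bytes; a variant could throw IOException
// to exercise the error-handling path.
class InMemoryFileSystem implements FileSystem {
    public InputStream getInputStream(File file) {
        return new ByteArrayInputStream("canned file contents".getBytes());
    }
}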
The same can be said for URLs and URIs.
There are two options for testing code that needs to read from files:
Keep the files related to the unit tests in source control (e.g. in a test data folder), so anyone who checks out the latest code and runs the tests always has the relevant files in a known folder relative to the test binaries (see the sketch after this list). This is probably the "best practice".
If the files in question are huge, you might not want to keep them in source control. In this case, a network share that is accessible from all developer and build machines is probably a reasonable compromise.
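One common way to implement the first option in Java, assuming the test data is checked in under the test resources folder (e.g. src/test/resources in a Maven project; the file name below is hypothetical), is to load it from the classpath:
import static org.junit.Assert.assertNotNull;

import java.io.InputStream;
import org.junit.Test;

public class CsvReportTest {

    @Test
    public void readsFixtureFromClasspath() {
        // The fixture lives in src/test/resources/testdata/sample.csv, so it is
        // versioned with the code and ends up on the test classpath on every
        // developer and build machine.
        InputStream fixture = getClass().getResourceAsStream("/testdata/sample.csv");
        assertNotNull("fixture missing from test classpath", fixture);
        // ... feed the stream to the code under test ...
    }
}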
Obviously most well-written classes will not have hard dependencies on the file system in the first place.
Usually, file system tests aren't very critical: the file system is well understood, easy to set up and easy to keep stable. Accesses are also usually pretty fast, so there is no reason per se to shun it or to mock it in your tests.
I suggest that you find out why the directory doesn't exist and make sure that it does. For example, check the existence of a file or directory in setUp() and copy the files if the check fails. This only happens once, so the performance impact is minimal.
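A minimal sketch of that kind of guard in a JUnit setUp() method (the paths below are made up for the example):
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import org.junit.Before;

public class CsvImportTest {

    private final Path testDataDir = Paths.get("target/test-data");

    @Before
    public void setUp() throws IOException {
        // Make sure the directory the tests rely on exists, and copy the
        // fixture into place if it is missing; the copy only happens once.
        Files.createDirectories(testDataDir);
        Path fixture = testDataDir.resolve("sample.csv");
        if (!Files.exists(fixture)) {
            Files.copy(Paths.get("src/test/resources/sample.csv"), fixture);
        }
    }
}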
Give the test files, both input and output, names that are structurally similar to the unit test name.
In JUnit, for instance, I'd use:
File reportFile = new File("tests/output/" + getClass().getSimpleName() + "/" + getName() + ".report.html");