Is there any way to perform a basic test/unit test for an XML file included in a module?
For example, I would like to "lint" my persistence.xml or check whether an XML file is valid against its schema.
What about using the xml-maven-plugin?
The old links may become invalid because Codehaus is shutting down its services, so take a look at https://github.com/mojohaus/xml-maven-plugin instead.
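For example, a minimal configuration for the plugin's validate goal could look like this (the version and paths are assumptions; point the validation set at wherever your persistence.xml lives):

<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>xml-maven-plugin</artifactId>
    <version>1.0.2</version>
    <executions>
        <execution>
            <goals>
                <goal>validate</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <validationSets>
            <validationSet>
                <dir>src/main/resources/META-INF</dir>
                <includes>
                    <include>persistence.xml</include>
                </includes>
            </validationSet>
        </validationSets>
    </configuration>
</plugin>

By default a validation set only checks that the files are well-formed; adding a <systemId> that points to an XSD makes the plugin validate them against that schema.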
I need to document tests that execute a program which takes an XML file as input and generates multiple .c and .xml files as output.
The existing 300 tests are implemented as JUnit test cases and I started documenting them with doxygen.
The documentation should help save time when it comes to questions like:
Which test cases have to be modified when a specific feature is modified?
Are all features of the program tested in at least one test case?
My first idea was to use the classification tree method. The result would look like this example picture:
The program under test takes only a few boolean parameters, but also an XML file, which is the main input. This XML file contains several lists of nodes, and the program has to check many invariants when generating the output .c and .xml files. As part of the tests, the input XML file and the generated files are also parsed and compared.
To apply the classification tree method, equivalence classes have to be found. In my case, all possible contents of the input XML file have to be classified.
What could a structured way of working through all of these possible XML nodes look like? This seems like a complicated task, and I want to proceed efficiently.
Maybe the classification tree method is not the best choice for this task. Are there other or better options?
I have a .log file which gives minimal information about the unit test execution. The log contains the date, pass/fail status, and the error message, if any. I need to upload this to SonarQube and see the unit test results: just the total number of test cases and their execution results. From this I also need to get the unit test coverage. How can I get a dashboard in SonarQube from this .log file? Should I convert the .log file to an XML file in the NUnit format, or is there a simpler format for SonarQube?
What you're looking for is the Generic Test Data format. Convert your log file into the XML format described in the docs, then use the sonar.testExecutionReportPaths property to point the analysis at your test results (sonar.coverageReportPaths is the corresponding property for a generic coverage report).
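For the test results themselves, the generic test execution report described in the docs looks roughly like this (paths and names below are placeholders):

<testExecutions version="1">
    <file path="src/test/MyTestFile.java">
        <testCase name="testThatPasses" duration="250"/>
        <testCase name="testThatFails" duration="100">
            <failure message="expected true but was false">optional stack trace</failure>
        </testCase>
    </file>
</testExecutions>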
You should have a look at the Generic Test Data feature.
My services.js file was getting quite large, so I decided it'd be best to split the individual services into separate files (service1.js, service2.js, etc.).
Unfortunately, this broke all my unit tests. I'm no longer able to import the service dependencies into my tests. I'm seeing errors like this when I run my unit tests:
Error: [$injector:unpr] Unknown provider: Service1Provider <- Service1
I can't find any article on the web that addresses these issues. My app structure is pretty standard and OOTB, nothing really different from angular-seed (except of course the separate files for each service).
Please let me know if you need more info.
I currently work with @mtical, and it turns out the error was indeed in karma.conf.js. As he said, we broke our services apart into multiple files, and our main service file was named "services.js". Karma loads files in the order they appear in the files list, and files matched by a glob pattern are included in recursive alphabetical order.
This caused our "services.js" file to be loaded after all of our other service files, which sort before it alphabetically. Unfortunately, all of those other services had the module defined in "services.js" as a dependency, so when our tests ran, they weren't able to find the services we needed.
The solution was to explicitly list "services.js" before the recursive glob patterns in our karma.conf.js file, as follows:
...
files: [
    'app/lib/angular/angular.js',
    'app/lib/angular/angular-*.js',
    'test/lib/angular/angular-mocks.js',

    // the main services module must load first ...
    'app/js/services/services.js',

    // ... before the globs pull in the remaining files alphabetically
    'app/js/**/*.js',
    'test/unit/**/*.js'
],
...
I have a C++ application which has been developed for years. There are a lot of classes.
Every time we want to change some parameter values, we have to update them in the code manually and recompile, which is inconvenient given the growing demands of the users. We would like the values of the classes to be configurable outside of the application. Is reading from an XML file the best option? For each user, we could create an XML configuration file and ship it together with the application. Is this a good approach?
For every class, e.g. ClassA, should we create another class called ConfigClassA, so that ClassA uses the configuration settings from ConfigClassA? We don't want to make a lot of changes to the current implementation.
Suppose the XML file has the following structure:
<classes>
    <class name="ClassA">
        <variable id="isUp" value="true" />
    </class>
</classes>
From the XML, we can extract the ClassA portion and parse it into a ConfigClassA, so that ClassA holds an instance of ConfigClassA. A sketch of what I have in mind is shown below.
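For instance (TinyXML-2 is used here only as an example, any XML library would do, and ConfigClassA is reduced to a string map for brevity):

#include <map>
#include <string>
#include <tinyxml2.h>

// Simplified configuration holder for ClassA; the real one would
// expose typed getters for each known variable.
struct ConfigClassA {
    std::map<std::string, std::string> values;

    bool isUp() const {
        auto it = values.find("isUp");
        return it != values.end() && it->second == "true";
    }
};

// Parse the <class name="ClassA"> portion of the config file.
bool loadConfigClassA(const char* path, ConfigClassA& config) {
    tinyxml2::XMLDocument doc;
    if (doc.LoadFile(path) != tinyxml2::XML_SUCCESS)
        return false;

    tinyxml2::XMLElement* classes = doc.FirstChildElement("classes");
    if (!classes)
        return false;

    for (tinyxml2::XMLElement* cls = classes->FirstChildElement("class");
         cls; cls = cls->NextSiblingElement("class")) {
        const char* name = cls->Attribute("name");
        if (!name || std::string(name) != "ClassA")
            continue;
        // Collect every <variable id="..." value="..."/> of this class.
        for (tinyxml2::XMLElement* var = cls->FirstChildElement("variable");
             var; var = var->NextSiblingElement("variable")) {
            const char* id = var->Attribute("id");
            const char* value = var->Attribute("value");
            if (id && value)
                config.values[id] = value;
        }
    }
    return true;
}

ClassA would then receive a ConfigClassA in its constructor, so the only change to the existing code is where the values used to be hard-coded.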
Or does anybody have a better approach?
Thanks in advance.
In general, I think the whole configuration should be loaded when the application is launched, so you can immediately notify the user in case of an invalid configuration. To validate the XML file, you can use an XML Schema.
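For your example structure, a minimal schema could look like this (element and attribute names are taken from your snippet; the types are assumptions):

<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="classes">
        <xs:complexType>
            <xs:sequence>
                <xs:element name="class" maxOccurs="unbounded">
                    <xs:complexType>
                        <xs:sequence>
                            <xs:element name="variable" maxOccurs="unbounded">
                                <xs:complexType>
                                    <xs:attribute name="id" type="xs:string" use="required"/>
                                    <xs:attribute name="value" type="xs:string" use="required"/>
                                </xs:complexType>
                            </xs:element>
                        </xs:sequence>
                        <xs:attribute name="name" type="xs:string" use="required"/>
                    </xs:complexType>
                </xs:element>
            </xs:sequence>
        </xs:complexType>
    </xs:element>
</xs:schema>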
However, if you don't want to make a lot of changes to the current implementation, your idea could be a valid solution.
Using JSON or YAML would be more lightweight than XML, since parsing the config file would be simpler. That said, XML is also feasible.
It's actually quite common to have configuration files. The format in which they are stored is not important. They are usually loaded only once, at the beginning of the program, not queried each time a method requests something.
Also, you should make a tool available for editing these files (such as a "Preferences..." panel).
I'm running my web app using Jetty with my instrumented classes.
After shutting down Jetty, I take the generated .ser file and create a Cobertura report using the command-line tool.
I always get 100% coverage results on every class.
It seems that Cobertura only takes into account the lines that were executed during testing and doesn't pick up the full class data.
I've tried to add source files to the reports - no help.
I also tried to take the .ser file created after instrumentation and merge it with the .ser file created after the Jetty shutdown (it is actually the same file, but before running Jetty I backed up the .ser created by the instrumentation) - no help here either.
Can someone please help?
Thanks
100% coverage everywhere is a clear indicator that the sources are missing from the report. You should check your configuration for creating the report.
Make sure that:
you point to the right source folder
the source folder is structured like the packages, and not just all classes in one directory
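For example (paths are illustrative; the last argument is the source root under which the package directories live):

cobertura-report.sh --format html --datafile cobertura.ser --destination coverage-report src/main/java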
As explained at http://cobertura.sourceforge.net/faq.html, in the answer to the question "When I generate coverage reports, why do they always show 100% coverage everywhere?",
"Cobertura is probably using the wrong .ser file when generating the reports. When you instrument your classes, Cobertura generates a .ser file containing basic information about each class. As your tests run, Cobertura adds additional information to this same data file. If the instrumented classes can not find the data file when running then they will create a new one. It is important that you use the same cobertura.ser file when instrumenting, running, and generating reports."
In my case, I experienced this issue when the instrumented classes were recorded in one .ser file and execution generated another one. Generating the HTML report "just" from the second .ser showed the problem mentioned in the question. Merging the two datafiles (.ser) and regenerating the report solved the issue.
Refer to http://cobertura.sourceforge.net/commandlinereference.html for information on merging datafiles, for example:
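A sketch of the merge and report steps (file names are just examples):

cobertura-merge.sh --datafile merged.ser instrumented-cobertura.ser runtime-cobertura.ser
cobertura-report.sh --format html --datafile merged.ser --destination coverage-report src/main/java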