Compile unit tests in Adobe CQ5 CRXDE that reference Felix OSGI bundle JUnit code - unit-testing

I want to write some unit tests that run within Adobe CQ 5.4. I am doing what is described in this article for testing within CQ:
http://jtoee.com/2011/09/799/
However, after I create the unit test class in my Java code, it won't compile within CRXDE because it can't resolve the org.junit namespaces. I installed and activated the JUnit bundle in Felix as described (Apache Sling JUnit Core), but I am guessing there is something else I need to do in order for this active Felix bundle to be found in CRXDE. The Felix bundle in the CQ5 instance I am connected to shows these exported packages:
junit.framework,version=4.8.2
org.apache.sling.junit,version=1.0.7.SNAPSHOT
org.apache.sling.junit.annotations,version=1.0.7.SNAPSHOT
org.junit,version=4.8.2
org.junit.matchers,version=4.8.2
org.junit.rules,version=4.8.2
org.junit.runner,version=4.8.2
org.junit.runner.manipulation,version=4.8.2
org.junit.runner.notification,version=4.8.2
org.junit.runners,version=4.8.2
org.junit.runners.model,version=4.8.2
In this sample unit test code below, the last three import statements "cannot be resolved."
import static org.junit.Assert.fail;

import org.apache.sling.api.resource.*;
import org.junit.*;
import org.junit.runner.*;
import org.apache.sling.junit.annotations.*;

@RunWith(SlingAnnotationsTestRunner.class)
public class MyUnitTest {

    public ResourceResolver getResourceResolver() {
        try {
            // getResourceResolverFactory() is expected to return a
            // ResourceResolverFactory obtained elsewhere in the test class.
            return getResourceResolverFactory()
                    .getAdministrativeResourceResolver(null);
        } catch (LoginException e) {
            fail(e.toString());
        }
        return null;
    }
}
It is my novice understanding that the OSGI bundle installed in Felix should be accessible for me to reference in my Java classes using CRXDE, but it isn't happening for the JUnit bundle I installed. Why not? What do I need to do to get CRXDE to find the OSGI bundle reference and compile within CRXDE?

What you're doing looks correct at first sight.
Did you try restarting CQ after installing the required bundles? In theory that should not be required but I'm wondering if the bundle compiler is picking up the newly available packages correctly.
I have uploaded a content package with a similar simple example at http://dl.dropbox.com/u/715349/cq5-examples/junit-tests-1.0.zip (md5 2915123ad581aa225bd531247ea02878). After installing this package on a fresh CQ 5.4 instance, the example test is executed correctly via http://localhost:4502/system/sling/junit/
You might want to try my sample and compare with yours.
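For reference, here is a minimal sketch of the kind of server-side test that the Sling JUnit servlet executes. It assumes the @TestReference injection offered by the Sling JUnit Core annotations; the class name and assertion are illustrative only:

import static org.junit.Assert.assertNotNull;

import org.apache.sling.api.resource.ResourceResolverFactory;
import org.apache.sling.junit.annotations.SlingAnnotationsTestRunner;
import org.apache.sling.junit.annotations.TestReference;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(SlingAnnotationsTestRunner.class)
public class ExampleServerSideTest {

    // Injected by the test runner from the OSGi service registry.
    @TestReference
    private ResourceResolverFactory resourceResolverFactory;

    @Test
    public void resourceResolverFactoryIsAvailable() {
        // Runs inside the server when invoked via /system/sling/junit/
        assertNotNull(resourceResolverFactory);
    }
}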

Short Answer
The problem is not with CQ, the problem is with CRXDE. CRXDE automatically downloads and caches required jar files on your local machine so they don't have to be retrieved constantly from CQ.
If you switch to the 'Package Explorer' navigation and then expand the project '{SERVER}{PORT}{HASH}', you should see a folder called Referenced Libraries. Right-click and select Build Path >> Configure Build Path. From there you can add any dependencies you want to the project.
Long Answer
CRXDE is not a good tool for creating bundles. It is much better to create bundles in a full-fledged IDE such as Eclipse and to use Apache Maven as the build tool. Maven can automatically manage your dependencies, run tests on your code, and separate test from runtime dependencies.
That way you avoid having to load dependencies that you don't really need, such as JUnit, into your OSGi console, and you have more control over how your bundle is built and deployed.
Day has a really nice guide to getting you set up with building CQ projects with Eclipse.
http://dev.day.com/docs/v5_2/html-resources/cq5_guide_developer/ch04s02.html

Related

unit test cases for wizard in odoo 10

I am new to Odoo. I am using Odoo 10. I would like to write test cases for a new wizard I created under a module A. I put all my wizard code (views + models) inside the wizards directory, and I created unit test cases under the path <<module/tests>>, following all the file/class/method naming conventions. When I try to upgrade the module (with tests enabled) to run the unit test cases, the test scripts of all the other modules are run, but not those of the newly created module A. Please suggest what additional changes might be needed to enable test scripts for a newly created module with a wizard.
Thank you.
I believe the structure you have followed is the standard one. You can find it here: Testing Module Structure.
Also, please check the naming of the folder and the file in which you wrote the code, for example tests/test_todo.py. And don't forget to add the import in tests/__init__.py: from . import test_todo
This is because Odoo expects the test module names to start with test_ when it searches for tests belonging to a module (Code Reference).
Command to run the test cases:
python ./odoo.py -i module_to_test --log-level=test -d your_database --db-filter=your_database --test-enable --stop-after-init

Generated Service with Axis outside Liferay, can't use it inside plugins

Using Liferay 6.2 CE GA6 here.
In my current project I am forced to use a SOAP-RPC web service generated outside Liferay. I have the WSDL file, and the procedure I used to get the client was:
Downloaded Axis 1.4 (the same version that Liferay 6.2 CE GA6 uses, I think).
Used WSDL2Java to generate Java classes that use the remote service.
Did a local test with those classes - everything fine.
Included those classes in an EXT plugin and deployed that plugin to Liferay.
Created an Arquillian test and ran that test FROM ECLIPSE - everything works.
And now, thinking I had this covered, I created a new Service Plugin and added a test method:
public Boolean canICallMyService() {
    MyWsCaller wsClient = new MyWsCaller();
    return true;
}
And bam, exception:
09:33:21,394 ERROR [http-bio-8080-exec-18][JSONWebServiceServiceAction:87] Unresolved compilation problems:
The import javax.xml.rpc cannot be resolved
The method X() from the type XServiceXmlCCServiceLocator refers to the missing type ServiceException
ServiceException cannot be resolved to a type
[Sanitized]
I know this must be related to both my code and Liferay using Axis libs, but I can't figure out how to overcome this classpath error. The import in question comes from jaxrpc.jar, which is included in Liferay's lib (webapps\ROOT\WEB-INF\lib). I have ALSO included it in my EXT plugin's lib directory. If I remove it from the EXT's lib, the Arquillian test also fails with the same error.
I have tried putting the required libs in Tomcat's lib/ext; this just makes Liferay give up at startup with multiple Axis cast errors on all portlets.
Does anyone have any idea what the correct procedure is and how I can replicate it?
I just want to use an Axis 1.4 generated web service within a portlet.
My current solution, which I consider kind of lame, is to deploy the web service as a servlet in the same container and just make calls to localhost/service?q=parametersThatINeed. It works, with no Axis lib conflicts... but it is not really ideal.
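For what it's worth, a rough sketch of that HTTP workaround, using plain HttpURLConnection; the service URL and query parameter here are hypothetical and should be replaced with whatever the deployed servlet actually expects:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class LocalServiceCaller {

    // Hypothetical endpoint of the service deployed as a plain servlet
    // in the same container; adjust host, port, path, and parameters.
    private static final String SERVICE_URL =
            "http://localhost:8080/service?q=parametersThatINeed";

    public String callService() throws Exception {
        HttpURLConnection connection =
                (HttpURLConnection) new URL(SERVICE_URL).openConnection();
        connection.setRequestMethod("GET");

        StringBuilder response = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(connection.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                response.append(line);
            }
        } finally {
            connection.disconnect();
        }
        return response.toString();
    }
}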

Grails test-app classpath

I'm trying to use test support classes within my tests. I want these classes to be available for all different test types.
My directory structure is as follows;
/test/functional
/test/integration
/test/unit
/test/support
I have test helper classes within the /test/support folder that I would like to be available to each of the different test types.
I'm using GGTS and I've added the support folder to the classpath. But whenever I run my integration tests with 'test-app' I get a compiler error: 'unable to resolve class mypackage.support.MyClass'.
When I run my unit tests from within GGTS the support classes are found and used. I presume this is because the integration tests run my app in its own JVM.
Is there any way of telling grails to include my support package when running any of my tests?
I don't want my test support classes to be in my application source folders.
The reason that it works for your unit tests inside the IDE is that all source folders get compiled into one directory, and that is added to your classpath along with the jars GGTS picks up from the project dependencies. This is convenient but misleading, because it doesn't take into account that Grails uses different classpaths for run-app and each of the test phases, which you see when you run the integration tests. GGTS doesn't really run the tests; it runs the same grails test-app process that you do from the commandline, and captures its output and listens for build events so it can update its JUnit view.
It's possible to add extra jar files to the classpath for tests because you can hook into an Ant event and add it to the classpath before the tests start. But the compilation process is a lot more involved and it looks like it would be rather ugly/hackish to get it working, and would likely be brittle and stop working in the future when the Grails implementation changes.
Here are some specifics about why it'd be non-trivial. I was hoping that you could call GrailsProjectTestCompiler.compileTests() for your extra directory, but you need to compile it along with the test/unit directory for unit tests and the test/integration directory for integration tests, and the compiler (GrailsProjectTestCompiler) presumes that each test phase only needs to compile that one directory. That compiler uses Gant, and each test phase has its own Grailsc subclass (org.grails.test.compiler.GrailsTestCompiler and org.grails.test.compiler.GrailsIntegrationTestCompiler) registered as taskdefs. So it should be possible to subclass them and add logic to compile both the standard directory and the shared directory, and register those as replacements, but that requires also subclassing and reworking GrailsProjectTestRunner (which instantiates GrailsProjectTestCompiler), and hooking into an event to replace the projectTestRunner field in _GrailsTest.groovy with your custom one, and at this point my brain hurts and I don't want to think about this anymore :)
So instead of all this, I'd put the code in src/groovy and src/java, but in test-specific packages that make it easy to exclude the compiled classes from your WAR files. You can do that with a grails.war.resources closure in BuildConfig.groovy, e.g.
grails.war.resources = { stagingDir ->
    println '\nDeleting test classes\n'
    delete(verbose: true) {
        // adjust as needed to only delete test-specific classes
        fileset dir: stagingDir, includes: '**/test/**/*.class'
    }
    println '\nFinished deleting test classes\n'
}
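For example, a hypothetical shared helper placed in a test-specific package; the 'test' package segment is what lets the fileset pattern above ('**/test/**/*.class') strip it from the WAR:

// Hypothetical package name; the 'test' segment matches the delete pattern above.
package com.example.test.support;

/**
 * Shared test helper compiled along with the application sources,
 * but deleted from the WAR by the grails.war.resources closure.
 */
public class TestDataHelper {

    // Simple illustrative method used by unit and integration tests.
    public static String sampleUserName() {
        return "test-user";
    }
}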

Log4net cannot find configuration file when run from Visual Studio/Microsoft Test Framework

We are writing unit tests for our business layer running under .NET 4.0. The business layer is a straightforward C# class library that usually runs within SOAP and REST web services. Our application uses log4net within a separate wrapper assembly for logging. The C# code in the logging assembly has an assembly-level attribute that tells log4net the name of the configuration file, a la:
[assembly: log4net.Config.XmlConfigurator(ConfigFile="AcmeLogging.config", Watch=true)]
Initializing log4net through the wrapper works fine in the web services. When we initialize it from within our unit test assembly, it does not appear to see the configuration file. The configuration file is set (through its properties) to be copied to the execution directory, and we do see it in the bin\Debug directory. A quick console test program that uses the logging assembly and runs from within that same folder works fine. The curious thing is that the problems are intermittent; they pop up on different developers' machines at different times and cannot be cured in any deterministic way.
Stepping through the wrapper assembly code, the log4net LogManager.GetLogger() call appears to return correctly, but the list of appenders returned by log.Logger.Repository.GetAppenders() is empty. Since this incorrect behavior is the same whether the file is in the bin\Debug folder or not, we believe that log4net is not seeing the file.
Any clues as to what we're missing about running log4net in the Microsoft Test Framework would be greatly appreciated.
Unit tests are unique in that if you need configuration files in your unit tests, you have to include them as deployment items. Here is an example of how I do this within my test class:
[TestClass]
[DeploymentItem("hibernate.cfg.xml")]
public class AsyncForwardingAppenderTest
{
}
In addition to the DeploymentItem attribute, you need to enable deployment in your test settings. To do this, go to Test -> Edit Test Settings, click on the Deployment area on the right, and check the Enable Deployment checkbox.
After doing this and running your test, your config file should be located in your test results folder: TestResults\username_machine date stamp\Out. If your config file is not located in this folder, it will not work. This is what the DeploymentItem attribute does: it puts the file in that Out folder.
If you don't want to include the DeploymentItem attribute on every test class, you can do what I did: create a base test class, marked with the DeploymentItem attribute, that all of the tests using log4net inherit from.

Gradle Jetty plugin locking files

Is there a way to fix the file locking issue caused by jetty entirely from gradle?
Some clarification:
When using the Gradle Jetty plugin by running gradle jettyRun, jetty causes the static resource files (html, css, js, etc.) to be locked when using Windows.
You can see a description of the problem in Files locked on Windows.
The same article also describes how you can fix that. Basically you have to either:
Disable the use of file mapped buffer
Not use NIO at all.
Both options require adding some Jetty-specific configuration files to the project, which I do not want to do - the Jetty plugin is used only for convenience, and maintaining configuration for it does not feel right.
I do not need NIO for testing on the local machine, so any solution works.
Edit:
For now, I picked the option where you set useFileMappedBuffer to false. This is how to do it:
Specify a path to your webdefault.xml, like:
[jettyRun, jettyRunWar, jettyStop]*.with {
    // other configs
    webDefaultXml = file("${project.webAppDir}/WEB-INF/jetty-webdefault.xml")
}
Get the file from the latest 6.1.x distribution of Jetty (the plugin seems to support only Jetty 6). You can locate it at jetty-6.1.26\etc\webdefault.xml. Obviously, you have to place it at the path specified in the previous step.
Change the default servlet init parameter useFileMappedBuffer to false.
I will research the option of using embedded Jetty instead of the plugin.
I found a plugin that seems to be a better alternative:
https://github.com/akhikhl/gretty
Positives
Does not lock your files and supports hot deployment (even something Gretty calls "fast reload").
Gretty 1.2.0 uses Jetty 9.2.9.v20150224; the Jetty plugin provided by Gradle 2.2.1 uses Jetty 6.1.25.
The same task name is used: jettyRun (or, more simply, run).
"Press any key to stop the server" - the Jetty plugin required CTRL+C then Y.
From what I can tell, the documentation seems to be awesome (Gradle's, not so much).
Negatives
A bit more bloated code to set up the buildscript's classpath dependency, or apply the plugin directly from a URL (see the docs).
Gretty crashes unless you explicitly apply plugin: 'war' (the Jetty plugin extends the War plugin).
Kiril answered his own question, many thanks. You should follow Kiril's instructions, and the following will help you find the appropriate webdefault.xml.
To find out what version of Jetty is started by Gradle, execute
gradle jettyRun -i
And you'll see something like this:
...
Tmp directory = determined at runtime
Web defaults = org/mortbay/jetty/webapp/webdefault.xml
Web overrides = none
Webapp directory = C:\dev\my-project\src\main\webapp
Starting jetty 6.1.25 ...
jetty-6.1.25
...
It took me a while to find a copy of Jetty 6.1.25 as it is no longer listed on the Jetty download page (not even in the archive section!).
You can then grab the appropriate copy of webdefault.xml from here, adjusting the version number as appropriate for your needs:
http://grepcode.com/file/repo1.maven.org/maven2/org.mortbay.jetty/jetty/6.1.25/org/mortbay/jetty/webapp/webdefault.xml