Is there a not too dirty way to detect at runtime, whether the code was started with lein test? I just want to select a different redis database, so solutions like environ or using different resource files seem to be a bit overkill.
For example, leiningen automatically enables the test profile, but I haven't found a way to get a list of currently enabled profiles.
There is no simple way to do it. Neither lein test nor clojure.test exposes such information. Even if you found a way to hack into some private var of lein test or clojure.test to check whether your code is running as part of lein test, it would have a very big issue: your production code would need to require testing library code (e.g. clojure.test) or, even worse, your build tool's code (the lein test plugin code).
You might try to define such a configuration var (dynamic or not) in your production code and set it in your tests using fixtures.
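For example, a minimal sketch (the *redis-db* var and the database numbers are made up):
;; production code
(def ^:dynamic *redis-db* 0)
;; test namespace
(require '[clojure.test :refer [use-fixtures]])
(use-fixtures :once
  (fn [f]
    (binding [*redis-db* 1]
      (f))))
Every test in the namespace then runs against the test database, and the production code never has to reference clojure.test.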
The best solution would be to configure your application dynamically based on an external variable like a system property or an environment variable (e.g. by using the suggested environ). This way you can have as many different configuration sets as you need (e.g. prod vs unit test vs integration test vs performance test and so on), not just two (prod vs test).
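As a minimal sketch of that approach (the redis.db property name is made up), you could exploit the fact that lein test automatically activates the :test profile:
;; project.clj -- the :test profile sets a system property
:profiles {:test {:jvm-opts ["-Dredis.db=1"]}}
;; production code -- reads it with a default for normal runs
(def redis-db
  (Integer/parseInt (System/getProperty "redis.db" "0")))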
It might seem like overkill, but component, for instance, was invented for exactly this kind of use case. Or dependency injection in general.
I know that feeling, it's just a private project, no need for difficult stuff and so on. That's why I put together my own template, so that all I need to get started is to run lein new ...
This is my solution for exactly the "just want to select a different redis database" kind of use case.
Edit: It is a template for a web framework: https://github.com/sveri/closp, but a lot of these parts are not specific to web development, especially the components part: https://github.com/sveri/closp/tree/master/resources/leiningen/new/closp/clj/components
There is also an integration test where I make use of test components specifically: https://github.com/sveri/closp/blob/master/resources/leiningen/new/closp/integtest/clj/web/setup.clj
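To illustrate the component idea, here is a minimal sketch using Stuart Sierra's component library (the Redis record and the database numbers are made up; a real component would open an actual connection):
(require '[com.stuartsierra.component :as component])
(defrecord Redis [db conn]
  component/Lifecycle
  (start [this]
    ;; connect to the configured database here
    (assoc this :conn {:db db}))
  (stop [this]
    (assoc this :conn nil)))
(defn new-system [redis-db]
  (component/system-map :redis (map->Redis {:db redis-db})))
Production code starts (new-system 0) while tests start (new-system 1), both via component/start, so the database choice is injected rather than detected at runtime.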
I found a way with Cprop. Set a var in your "env/{dev|test|prod}/config.clj" file:
(System/setProperty "lein.profile" "dev")
then you can read the value:
(require '[cprop.source :as source])
(str "from-system-props: >> " (:lein-profile (source/from-system-props)))
Another option is to search for the key :conf in the system props:
:conf "test-config.edn"
because the config file changes according to the profile.
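Putting the snippets together, a sketch of selecting the redis database from the detected profile (assuming each env config.clj sets the "lein.profile" property as above; the database numbers are made up):
(require '[cprop.source :as source])
(def redis-db
  (if (= "test" (:lein-profile (source/from-system-props)))
    1
    0))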
Related
I need to find out how I can instruct Vite to replace references to local/relative modules at runtime. The use case here is the test runner mocha-vite-puppeteer, which uses Vite to run tests, but stubbing of modules then of course does not work with Node machinery such as proxyquire or rewire.
So I basically need either a tip about some existing software that can help me do this, or some tips on how to create my own "vite-proxyquire" using import.meta and friends.
A typical use for temporarily stubbing out ./my-ugly-module might be that you want to avoid loading a sub-dependency whose ugly transitive dependencies suck the entire application tree into your little test, or one that has ugly side effects on global state.
Existing solutions
Modern Web, a refreshing "bundler- and frameworkless" approach to web development using standard tools, talks a bit about how the immutable nature of ES Modules prevents the usual stubbing patterns. They present a solution in the form of Import Maps, which is essentially similar to the alias config in Vite (Rollup really): mapping the path of a module to some other file. The problem with a static solution like this is that it replaces all imports of a given module, not just those for a single test.
Modern Web solves this by using a custom HTML page for each such test. To make this less of a hassle to run, they have a custom test runner that deals with all these extra test HTML files. Doing something like that could be one way of fixing it, but it would require developing quite a bit of middleware/plugin code, IMHO, to make it work transparently with Vite. Without advanced tooling it would also introduce a lot of extra files, which seems a downside compared to today's imperative mocking of dependencies with proxyquire, Jest or Test Double from inside the test files.
I'm looking for a piece (or a set) of software that lets me store the outcome (ok/failed) of an automatic test along with additional information (the test protocol, to see the exact reason for a failure, and the device state at the end of a test run as a compressed archive). The results should be accessible via a web UI.
I don't need fancy pie charts or colored graphs. A simple table is enough. However, the user should be able to filter for specific test runs and/or specific tests. The test runs should have a sane name (like the version of the software that was tested, not just some number).
Currently the build system includes unit tests based on cmake/ctest whose results should be included. Furthermore, integration testing will be done in the future, where the actual tests will run on embedded hardware controlled via network by a shell script or similar. The format of the test results is therefore flexible and could be something like subunit or TAP, if that helps.
I have played around with Jenkins, which is said to be great for automatic tests, but the plugins I tried to make that work don't seem to interact well. To be specific: the test results analyzer plugin doesn't show tests imported with the TAP plugin, and the names of the test runs are just a meaningless build number, although I used the Job Name Setter plugin to set a sensible job name. The filtering options are limited, too.
My somewhat uneducated guess is that I'll stumble over similar issues if I try other random tools of the same class as Jenkins.
Is anyone aware of a solution for my described testing scenario? Lightweight/open source software is preferred.
Is there a way to get Gradle (1.12) to list all of the available unit test classes in a project?
I'm considering putting a front-end on a series of tests we use in my company, and since new tests are always being added, I need a way to get a list of available tests.
I realize that I could scan the actual project for classes that reside in the test sources tree, but I was hoping for something easily parsed from Gradle. I just don't know if that's really an option and I'm having trouble getting decent search results since "test" is such a generic word.
Any help would be appreciated.
There is no official API in Gradle that exposes this information. You can check whether ClassScanner.java is what you need: either look at the Gradle sources, or see how it is used in EclipseTestExecuter.java. Keep in mind that it is an implementation detail.
A simpler approach is to run the tests with logging enabled so that the names of executed tests are printed (the Test task has hooks such as beforeTest for this). There is an example of how to do this in the Gradle documentation, I think.
I've been trying to write a JUnit test case for one of my Java classes which creates a page with some given properties in CQ. For that, it needs to get references to SlingRepository and ResourceResolverFactory. I was using this to get an idea of how to achieve that. The document says that a POST to the "http://$HOST:$PORT/system/sling/junit/" path is used to execute tests on the server side, but in CQ I get a 404 error for this path.
Is there an alternative URL in CQ for this? I would also really appreciate it if anyone could suggest a better approach.
Thanks
One approach is to use a Sling test runner to execute the JUnit tests via a browser; this is the approach you mention. We had to first install the code in this JAR (org.apache.sling.junit.core) to make the URL you listed work. Once that code is there, you can run tests using the test runner's built-in page at http://localhost:4502/system/sling/junit/. My team did this for a while, but we soon moved to a different approach: using the IntelliJ IDE to develop the Java code for CQ and write the JUnit tests, then executing them within the IDE using the built-in JUnit test runner. The same approach works in Eclipse. For our team this approach was superior because it allowed developers to remain in context in the IDE without having to switch to a browser to run the tests.
The key is being able to resolve the references to classes that are installed/available via CQ, such as the SlingRepository and ResourceResolverFactory classes--and other stuff we commonly used, such as the Resource, ResourceResolver, Node, and Session classes. We use a CQ extension (http://helpx.adobe.com/experience-manager/kb/HowToUseCQ5AsMavenRepository.html) to allow our CQ instance to act like a Maven repository. This allows us to export the CQ JARs so we can then reference them as dependencies in the Java projects we create whenever we may need to use some of the classes available via CQ itself.
Once we set up the project dependencies, we were able to write code--and corresponding unit tests--within the IntelliJ IDE. We were able to run the tests within the IDE, allowing developers to remain in context and work on the code that will run in CQ just like they work on any Java code (including things like running tests in debug mode or with code coverage, running single tests, running all tests in a class, using keyboard shortcuts to kick off tests, etc.). For us this approach had many advantages over the browser-based Sling test runner, so I recommend this approach.
Some potential considerations:
Exporting from CQ as a Maven repo may not offer the best performance--you may want to add things to your own Maven repo for faster access
You may want to script some of the steps so adding project dependencies is not a manual process, but rather is something done via an automated process
You could even export all CQ JARs--or add some scripting to parse out and repackage only the public classes--and make any CQ class available to your Java projects
I'm just getting started with a project that combines GWT, Google App Engine and the Google Eclipse plugin. Where is the best place to store my tests? I normally keep my code organized Maven-style, with src/main/java, and tests in src/test/java. The default setup I get from the plugin dumped my source directly into src, which I'm not too fond of, but I'd prefer not to fight against the tools. What's the "standard" place to put unit tests in such a project?
Solution:
create src/main/java, move the existing code under there
create src/test/java, add your tests here
go to Project -> Properties -> Java Build Path, add the new locations as Source Folders.
I've faced a problem with GAE testing: some tests require appengine-testing.jar, which conflicts with the main appengine-api-xxx.jar of the project. That way I was able to run tests for GAE, but it conflicted with a normal run/debug launch. To be able to run the app on my local machine, I had to remove appengine-testing.jar, and then a lot of compilation errors appeared in my test/ classes.
If you want my advice, put your test classes in another project (where you can use the JARs without conflict).
Otherwise, if you manage to make it work, please tell me how you did it.
Thanks a lot.
Put it where it pains you least.
GWT on Google App Engine is pretty new at this point; you are optimistic to expect there is a "standard" place, especially since you've already found an inconsistency in what the tools do.
Since you've already accepted the source starting at "src/", why not put the test source in "test/"? This is certainly standard in many contexts.