How do I run a specific set of Appium Java TestNG test cases in AWS Device Farm? I can see that Device Farm ignores all TestNG annotations, including groups and enabled. Is there any way around this?
To run only a subset of tests, the project needs to include a testng.xml file in the root of the *-tests.jar. Here is a GitHub pull request I've authored showing how to do that.
https://github.com/aws-samples/aws-device-farm-appium-tests-for-sample-app/pull/14
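For illustration, a testng.xml that runs only one group might look like this (the group and package names are placeholders for your own):

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
    <suite name="SmokeSuite">
      <test name="SmokeTests">
        <groups>
          <run>
            <include name="smoke"/>
          </run>
        </groups>
        <packages>
          <package name="com.example.tests"/>
        </packages>
      </test>
    </suite>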
In the standard environment, tests are parsed and executed individually. As a result, some TestNG features such as priority and nested groups are not honored. Tests also run more slowly because the Appium server is restarted between tests.
https://docs.aws.amazon.com/devicefarm/latest/developerguide/test-environments.html#test-environments-standard
If these features are needed, the project will need to use a custom environment in Device Farm.
https://docs.aws.amazon.com/devicefarm/latest/developerguide/custom-test-environments.html
This produces a single set of logs and a single video covering all the tests, since the test package is not parsed.
Hope this helps.
-James
Related
Can I run my e2e tests developed using Protractor on AWS Device Farm?
I want to complete mobile testing of my project using AWS Device Farm, but I don't really understand whether I can do that or not. I found three threads about this on the AWS forum, but they are old, from 2018.
First forum discussion
Second forum discussion
Third forum discussion
Maybe something has changed?
I have Protractor e2e tests written for the desktop browser and want to use them for the mobile browser too.
I will answer this for both mobile browsers and desktop testing.
Mobile Browsers
AWS Device Farm has 2 execution modes: Standard Mode and Custom Mode.
Standard mode gives you granular reporting on Device Farm, which is useful if you don't generate a report for your tests locally. It splits up the artifacts for each individual test.
Custom mode gives you execution behavior and results as close as possible to what you would get locally. It does not give you granular reporting, which is fine for most customers, since you already get reports locally and those are available on Device Farm as well. Customers are recommended to use custom mode, as it is the most up to date and adds support for the latest frameworks, unless they absolutely need granular reporting.
Protractor on Device Farm
It is not officially supported today.
However, Device Farm supports Appium Node.js in custom mode. You get a YAML file in which you define shell commands to run on the host machine where the tests are executed. So in the case of Protractor, you could select this test type (Appium Node.js), install the missing dependencies needed for the tests, start your server, and run your tests; see the sketch below.
One point to evaluate: since Device Farm takes your tests as input, you will have to upload a zip file of your tests. I would highly recommend checking the instructions for Node.js tests and following the same approach. Alternatively, you can download your tests on the fly using the YAML file.
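As a rough sketch, the custom-mode YAML for a Protractor run could look something like the following. This is not a tested spec; the phase layout follows Device Farm's test spec format, but the commands and paths are assumptions you would adapt:

    version: 0.1
    phases:
      install:
        commands:
          # Install Protractor inside the uploaded test package
          - cd $DEVICEFARM_TEST_PACKAGE_PATH
          - npm install protractor
      test:
        commands:
          # Start your server (if needed) and run the suite
          - cd $DEVICEFARM_TEST_PACKAGE_PATH
          - ./node_modules/.bin/protractor protractor.conf.js
    artifacts:
      - $DEVICEFARM_LOG_DIR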
Desktop Browsers
Device Farm has a Selenium grid that you can connect to from your local machine to run your tests. Chrome and Firefox run on the Windows platform; Safari is not supported today. If you already run your tests against a Selenium grid on your local machine, you should most likely be able to run the same tests against the Selenium grid on Device Farm, pending validation of course.
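For illustration, once you have a grid URL from Device Farm, the client side is plain Selenium; everything below, including the URL, is a placeholder sketch rather than a working endpoint:

    import java.net.URL;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeOptions;
    import org.openqa.selenium.remote.RemoteWebDriver;

    public class GridSmokeTest {
        public static void main(String[] args) throws Exception {
            // Placeholder: substitute the grid URL Device Farm gives you
            URL gridUrl = new URL("https://testgrid-devicefarm.example.com/wd/hub");
            WebDriver driver = new RemoteWebDriver(gridUrl, new ChromeOptions());
            try {
                driver.get("https://example.com");
                System.out.println("Title: " + driver.getTitle());
            } finally {
                driver.quit();
            }
        }
    }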
If you need more help on any of these items, feel free to reach out to aws-devicefarm-support#amazon.com and I can help you further.
You can test in Chrome with an emulated mobile mode: add mobileEmulation in a new protractor.conf-mobile.js, for example:
capabilities: {
  chromeOptions: {
    args: ['--disable-infobars', '--headless', '--disable-gpu', '--window-size=1920,1080'],
    mobileEmulation: { deviceName: 'Galaxy S5' }
  }
}
I configured JaCoCo in WebSphere as a Java agent (refer to http://www.jacoco.org/jacoco/trunk/doc/agent.html).
I restarted the server, ran a series of automated tests on the application (to give it some load), and then stopped the server.
I can see jacoco.exec being generated on the server (configured to go to the /tmp/ location).
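For reference, the agent is enabled with a JVM argument along these lines (the jar location below is illustrative, not my exact path):

    -javaagent:/opt/jacoco/jacocoagent.jar=destfile=/tmp/jacoco.exec,output=file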
Now, how do I generate the HTML report?
Before voting this question down or marking it as a duplicate, here is why I'm posting it: I went through the JaCoCo documentation, such as http://www.jacoco.org/jacoco/trunk/doc/maven.html, and multiple Stack Overflow questions, but I'm still confused.
What I understood is that the Maven plugin lets you run unit tests and integration tests and then generate a report.
What I'm looking for is a report based on the load I applied to my application deployed in WebSphere. I can see the jacoco.exec file that was generated, but it's not clear from the documentation how to produce the HTML report from it.
Thanks in advance.
You could use the jacoco:report-aggregate goal with Maven.
See http://www.eclemma.org/jacoco/trunk/doc/report-aggregate-mojo.html for details.
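As a rough sketch, and assuming a multi-module build with a dedicated reporting module that declares the code modules as dependencies, the plugin configuration might look like this:

    <plugin>
      <groupId>org.jacoco</groupId>
      <artifactId>jacoco-maven-plugin</artifactId>
      <executions>
        <execution>
          <id>report-aggregate</id>
          <phase>verify</phase>
          <goals>
            <goal>report-aggregate</goal>
          </goals>
        </execution>
      </executions>
    </plugin>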
P.S.: When I ran into the same issue, I used Sonar to read the generated exec file instead. It gives you much more than just code coverage.
I was able to generate the JaCoCo report as follows:
Configure JaCoCo as a Java agent.
Restart the server and do some transactions / give it some load (in my case, I ran a series of automated tests).
Stop the server (this is what actually generates the jacoco.exec file).
Create/configure an Ant script and run it (this reads the jacoco.exec file and generates the HTML report). Refer to http://www.eclemma.org/jacoco/trunk/doc/ant.html; a sketch follows below.
Even though my project is a Maven project, I used an Ant script for the report generation. I automated all of the above steps using Bamboo, which made running and maintaining this job easier.
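A minimal sketch of such an Ant script, assuming the compiled classes live under build/classes and the sources under src (adjust the paths and the jacocoant.jar location to your project):

    <project name="jacoco-report" xmlns:jacoco="antlib:org.jacoco.ant">
      <taskdef uri="antlib:org.jacoco.ant" resource="org/jacoco/ant/antlib.xml"
               classpath="lib/jacocoant.jar"/>
      <target name="report">
        <jacoco:report>
          <executiondata>
            <file file="/tmp/jacoco.exec"/>
          </executiondata>
          <structure name="My Application">
            <classfiles>
              <fileset dir="build/classes"/>
            </classfiles>
            <sourcefiles encoding="UTF-8">
              <fileset dir="src"/>
            </sourcefiles>
          </structure>
          <html destdir="coverage-report"/>
        </jacoco:report>
      </target>
    </project>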
I am attempting to set up Java code coverage for a fairly complex app that:
combines multiple large modules, only one of which I need to check coverage on
uses a combination of ant and Maven for builds
cannot be run except as an installed application on a server, with configuration
the automated tests to be analyzed for coverage are not part of the application build and make use of API calls to the application server from a remote client
The examples given in the JaCoCo documentation and in the online sources I have found assume the app under test is not previously installed and that the tests are unit/integration tests run as part of the build. The documentation does not cover the details of how the JaCoCo instrumentation is done, or when a call to a particular line of code is recorded. If I use Ant or Maven to instrument a particular module, use that module to build the full app, install it on a server, and configure it, will my remote tests then generate the .exec file?
Any advice on how to achieve the end goal (knowing how much of our code is covered by the tests) is greatly appreciated, including better search terms than "jacoco for installed app" which as you can imagine is ... not very useful. My google-fu is humbled.
I am currently using Jenkins to build a number of different modules for my project, triggering the builds with Maven. I have SonarQube installed on the server and set up correctly, so that when a module builds it is displayed in SonarQube, including basic details such as lines of code, technical debt, etc. The modules all have JUnit tests that run against them, and SonarQube reflects this: the Unit Test Success field shows 100%, and the number of tests run in each module is reported. However, I cannot get the Unit Test Coverage field to display anything; it is blank for all of the modules.
Here is an excerpt (one module) from my pom.xml:
customer.sonar.projectBaseDir=.
customer.sonar.sources=D:/TFS/WorkSpace/DEV_2_HYBRID/APP_FO/application/customer/src/main/java
customer.sonar.Hybrid=Customer
customer.sonar.tests=D:/TFS/WorkSpace/DEV_2_HYBRID/APP_FO/application/customer/target/surefire-reports
customer.sonar.junit.reportsPath=D:/TFS/WorkSpace/DEV_2_HYBRID/APP_FO/application/customer/target/surefire-reports
The versions of the software I am using are as follows:
SonarQube v5.0,
Jenkins SonarQube plugin v2.1,
Maven v3.2.5
As I said at the beginning, the unit test success rate does show correctly, so I believe only a small change is needed to get the unit test coverage field working.
Any help would be really appreciated!
You need to execute the coverage engine of your choice and provide the report to SonarQube via the appropriate property.
If you are using JaCoCo, the report importer is embedded in the Java plugin; for other coverage engines (Clover, Cobertura...), you have to install the dedicated plugin.
For more information, see the dedicated documentation page.
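For illustration, following the property naming already used in the question, the missing piece would be something along these lines (the path is a guess at your layout; SonarQube 5.x imports the binary .exec file via the jacoco.reportPath property):

    customer.sonar.jacoco.reportPath=D:/TFS/WorkSpace/DEV_2_HYBRID/APP_FO/application/customer/target/jacoco.exec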
I have a server running a proprietary language, on which I'm able to run "unit tests" written in this language. I cannot install a Hudson slave on this machine, but I would like to have these test results appear in a Hudson job (to have at least some monitoring of the code quality for this server code).
I'm currently trying to use web services to fetch the results and store them in the Hudson workspace, but I fear this is not the right solution.
What solutions can you advise?
I finally got there via the web services path, although it was not easy.
There are a few steps in this path:
I created a Maven mojo in Groovy (see GMaven for more info) which, using GroovyWS, calls a web service that builds the JUnit report from the test results.
Armed with this mojo, I created a Maven project that calls the web service and stores the junit.xml file in an output folder.
Finally, I created a Maven job in Hudson for this project and scheduled it to run regularly. Thanks to the JUnit reporting integration in Maven builds, my test results are visible as a graph in Hudson, and users can drill down to failing tests.
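For reference, the junit.xml the mojo stores just needs to follow the usual JUnit XML shape; the names and counts below are purely illustrative:

    <?xml version="1.0" encoding="UTF-8"?>
    <testsuite name="ServerUnitTests" tests="2" failures="1" errors="0" time="1.200">
      <testcase classname="server.OrderTests" name="testCreateOrder" time="0.700"/>
      <testcase classname="server.OrderTests" name="testCancelOrder" time="0.500">
        <failure message="expected CANCELLED but was OPEN"/>
      </testcase>
    </testsuite>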
Not sure if these are possible, but...
Maybe one option is, when the build job finishes, to execute a second build target or script to scp the test results from the remote server to the local build server so they appear in Hudson.
Or, if the platform allows, map a directory on the remote machine onto the local file system using something like sshfs.
Karl
Yup, you can scp (or similar) the results, in JUnit XML format, into the current workspace directory using a script task. Then add a "Publish JUnit test result report" post-build task and point it at the copied-in files.
Obviously, if the results are not in a JUnit-compatible format, you'll have to convert them first.
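For example, the copy step could be as simple as this (host and paths are hypothetical; $WORKSPACE is set by Hudson for the job):

    scp builduser@testserver.example.com:/var/tests/results/*.xml "$WORKSPACE/test-reports/"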
Sounds like you're on the right path, though.