Is it possible to add memory/performance profiling of unit tests as a TeamCity build step?
I am specifically interested in doing it for our .Net apps (test cases are written in NUnit). Any elaboration on this will be highly appreciated.
The TeamCity version is "TeamCity Enterprise 7.1.2 (build 24170)".
Thanks.
As far as I know, TeamCity does not support profiling .NET apps yet, at least not in a direct way.
There is an open feature request, TW-20190, to integrate dotTrace (a .NET profiler, also made by JetBrains) into TeamCity.
However, dotTrace can be called from the command line, so you can write an MSBuild/NAnt script that executes it. The remaining problem is how to display the profiling results. On the "General Settings" page of your build configuration, you can point the artifacts path at the dotTrace results folder, so that TeamCity publishes the profiling results as artifacts for you to download.
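For illustration, a minimal MSBuild sketch of that idea might look like the following. The dotTrace console profiler's executable name and switches differ between versions, so the paths and the DotTraceArgs value below are placeholders to fill in from your dotTrace command-line documentation:

    <!-- profile-tests.proj: sketch only; the dotTrace executable name and its
         command-line switches are assumptions, check your dotTrace version's docs -->
    <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="ProfileTests">
      <PropertyGroup>
        <DotTraceConsole>C:\Tools\dotTrace\ConsoleProfiler.exe</DotTraceConsole>
        <NUnitConsole>C:\Tools\NUnit\nunit-console.exe</NUnitConsole>
        <ResultsDir>$(MSBuildProjectDirectory)\dotTraceResults</ResultsDir>
        <!-- Fill in the profiling switches (snapshot path, profiling type, ...) here -->
        <DotTraceArgs></DotTraceArgs>
      </PropertyGroup>

      <Target Name="ProfileTests">
        <MakeDir Directories="$(ResultsDir)" />
        <!-- Run the NUnit console runner under the dotTrace console profiler -->
        <Exec Command="&quot;$(DotTraceConsole)&quot; $(DotTraceArgs) &quot;$(NUnitConsole)&quot; MyApp.Tests.dll" />
      </Target>
    </Project>

With something like this as a build step, pointing the artifacts path at dotTraceResults/** makes the snapshots downloadable from the build's Artifacts tab.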
Jenkins has a nice plugin for test results: the Test Results Analyzer Plugin (plugin url).
How can I attach a similar dashboard (with test results) to GitLab?
Here is what my googling turned up:
GitLab has a "Pages" feature, but it depends on artifact lifetime and only the latest one is shown.
https://docs.gitlab.com/ee/ci/junit_test_reports.html
On the other hand, there is a long-running open issue about this requirement for GitLab, besides lots of closed ones.
https://gitlab.com/gitlab-org/gitlab-ce/issues/34102
Here is GitLab's own note on this comparison page:
https://about.gitlab.com/comparison/gitlab-vs-jenkins.html
"Many languages use frameworks that automatically run tests on your code and create a report: one example is the JUnit format that is common to different tools. GitLab supports browsing artifacts and you can download reports, but we’re still working on a proper way to integrate them directly into the product."
I found a project that is a complete, fully supported solution. I just started with it and so far it is unbelievable: you just upload your reports to it, and there is support for a lot of plug-ins.
http://reportportal.io/
I am attempting to set up Java code coverage for a fairly complex app that
combines multiple large modules, only one of which I need to check coverage on
uses a combination of Ant and Maven for builds
cannot be run except as an installed application on a server, with configuration
the automated tests to be analyzed for coverage are not part of the application build and make use of API calls to the application server from a remote client
The examples given in the JaCoCo documentation and in the online sources I have found assume the app under test is not previously installed and that the tests are unit/integration tests run as part of the build. The documentation does not cover the details of how the JaCoCo instrumentation is done or when a call to a particular line of code is recorded. If I use Ant or Maven to instrument a particular module, use that module to build the full app, install it on a server, and configure it, will my remote tests then generate the .exec file?
Any advice on how to achieve the end goal (knowing how much of our code is covered by the tests) is greatly appreciated, including better search terms than "jacoco for installed app" which as you can imagine is ... not very useful. My google-fu is humbled.
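For reference, the approach I am currently leaning towards (unverified, so please correct me) is to skip build-time instrumentation entirely: start the installed server's JVM with the JaCoCo agent in TCP server mode, for example

    -javaagent:/path/to/jacocoagent.jar=output=tcpserver,address=*,port=6300

run the remote API tests, and then pull the execution data and build a report with the JaCoCo Ant tasks. A rough sketch (host names, ports and directories are placeholders):

    <!-- coverage-report.xml: sketch only; adjust paths to the one module I care about -->
    <project name="coverage-report" xmlns:jacoco="antlib:org.jacoco.ant">

      <taskdef uri="antlib:org.jacoco.ant"
               resource="org/jacoco/ant/antlib.xml"
               classpath="lib/jacocoant.jar"/>

      <!-- Pull the execution data from the agent running inside the app server -->
      <target name="dump">
        <jacoco:dump address="appserver.example.com" port="6300"
                     destfile="jacoco.exec" reset="false"/>
      </target>

      <!-- Generate an HTML report restricted to the module's classes and sources -->
      <target name="report" depends="dump">
        <jacoco:report>
          <executiondata>
            <file file="jacoco.exec"/>
          </executiondata>
          <structure name="My Module">
            <classfiles>
              <fileset dir="mymodule/build/classes"/>
            </classfiles>
            <sourcefiles>
              <fileset dir="mymodule/src/main/java"/>
            </sourcefiles>
          </structure>
          <html destdir="coverage-report"/>
        </jacoco:report>
      </target>
    </project>

If I understand the docs correctly, the report only needs the same class files the agent saw at runtime, so the module itself would not have to be instrumented at build time, but I would welcome confirmation.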
So after much hunting I failed to find a continuous testing tool for IntelliJ 14.
I stumbled across a post that describes using Eclipse and Ant to simulate this: on save, Ant runs the tests for any test classes that were modified.
I've tried to replicate this, but alas, I've never used Ant before and am finding it extremely difficult. I've set up and configured a generic Ant build file in IntelliJ but simply cannot figure out how to achieve my task.
Any help or pointers in the right direction are very much appreciated. I've searched, but only found information that needs to be deciphered first.
Eclipse has the builder feature: you create an Ant builder for your project; see also https://stackoverflow.com/a/15075732/130683.
IntelliJ has a trigger feature that might serve the purpose.
Also, Infinitest, which provides a continuous testing plugin for Eclipse and IntelliJ, might be helpful.
Ant is a build tool. IntelliJ can build your project for you, but then you need IntelliJ to do it, which means you can't build and distribute your application without IntelliJ.
Ant uses a dependency matrix for building. This is sometimes difficult for developers to understand, but it basically means that you define the steps and how the steps depend upon each other, and let the build tool figure out exactly how to do its job. Ant is to Java what Make is to C and C++ applications.
Ant uses targets, which are the steps you specify to do. For example, you might have a target called package that builds your jar or war. That target might depend upon another target called compile to compile the code. That target might depend upon a code generation phase (for example, if you had WSDL files).
Each target is a set of tasks. For example, the compile target is likely to have the <javac> task in it. It might also need the <mkdir> task to create the work directories where your class files are stored.
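For instance, a stripped-down build.xml illustrating that structure might look like this (directory names are only examples):

    <project name="myapp" default="package">

      <property name="src.dir"     value="src"/>
      <property name="classes.dir" value="build/classes"/>
      <property name="dist.dir"    value="dist"/>

      <!-- compile: create the work directory, then compile the sources into it -->
      <target name="compile">
        <mkdir dir="${classes.dir}"/>
        <javac srcdir="${src.dir}" destdir="${classes.dir}" includeantruntime="false"/>
      </target>

      <!-- package depends on compile, so Ant compiles first, then builds the jar -->
      <target name="package" depends="compile">
        <mkdir dir="${dist.dir}"/>
        <jar destfile="${dist.dir}/myapp.jar" basedir="${classes.dir}"/>
      </target>
    </project>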
There are plenty of books on Ant, and there's a tutorial on the Ant Website. You didn't explain the issues you were having, so it's hard to be more specific than this.
Ant can also run your unit tests. There's a <junit> task which can run the tests, and you specify whether you want to run almost all of your tests via the <batchtest> sub-element or, if you have a test-suite driver class, name it via the <test> element.
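Continuing the sketch above, a test target using <junit> and <batchtest> could look roughly like this (it assumes a junit.jar in lib and test classes named *Test; adjust to your layout):

    <!-- test: run every compiled *Test class and write one XML report per class -->
    <target name="test" depends="compile">
      <mkdir dir="build/test-reports"/>
      <junit printsummary="yes" haltonfailure="no" fork="yes">
        <classpath>
          <pathelement location="${classes.dir}"/>
          <pathelement location="lib/junit.jar"/>
        </classpath>
        <formatter type="xml"/>
        <batchtest todir="build/test-reports">
          <fileset dir="${classes.dir}" includes="**/*Test.class"/>
        </batchtest>
      </junit>
    </target>

The XML files this writes are in the JUnit report format that continuous integration servers such as Jenkins consume directly, which is what makes the next step so straightforward.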
Once you get an Ant script that can build and run your tests outside of IntelliJ, you can now get a Continuous Integration tool like Jenkins. A continuous integration tool watches your repository for changes, and if a change occurs, will then build your application. It's a great way to catch errors early on.
What does this have to do with Continuous Testing? Well, if you have your Ant script able to run unit tests, the Continuous Integration engine not only can build your app, but then run the unit tests with each and every change that occurs.
Jenkins is nice because it's very simple to use. You download a jenkins.war and you can launch the Jenkins webpage via the java -jar jenkins.war command. This brings up a web server on port 8080 on your machine. Obviously, Jenkins can be configured to run on different ports and under Tomcat if you so desire. It can integrate with Windows Active Directory, LDAP, and many other user verification systems.
Jenkins will show you charts and graphs of your tests, let you know which tests failed or passed, and will notify you of any problems via email, tweets, IM, Jabber, and even Facebook posts. People have even setup a traffic light in their offices that turns red when builds or tests fail.
Take it one step at a time. Get a good book on Ant. Read the tutorial on the Ant website. Then try to get a working Ant script just to build your app. If you are having specific issues, you can ask for help.
Once you have the build going, extend the script to run your unit tests. Once that is done, download Jenkins and try to get that up and running.
In VS2013 you can run the compiler for native code with the /analyze flag, which generates .xml files holding the analysis output. These are interpreted by the Visual Studio UI and shown to the developer.
Is there a way to integrate this into a Jenkins build, or is there any tool which can read such .xml files (like vc.nativecodeanalysis.all.xml) and display them as a web page?
Jenkins is essentially a dashboard that outsources tasks to other tools to "do their thing".
With .NET builds your only option is a freestyle build with heavy use of Windows Batch command build steps or MSBuild steps. Jenkins only knows the path to MSBuild that you tell it in "Manage Jenkins > Configure System".
After that, it hands your MSBuild build steps off to MSBuild with whatever parameters you pass it.
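For the native analysis in the original question, the /analyze pass itself still has to be switched on in the projects that the MSBuild step builds. A hedged .vcxproj sketch, where RunCodeAnalysis is my assumption for the VS2013 property behind "Enable Code Analysis on Build" (verify it on the project's Code Analysis property page before relying on it):

    <!-- Fragment of a .vcxproj (sketch only): enable code analysis for the Release|Win32 build -->
    <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
      <RunCodeAnalysis>true</RunCodeAnalysis>
    </PropertyGroup>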
Jenkins can consume JUnit test results and many other tools are written to convert test results into JUnit for Jenkins to consume. The mountain is brought to Jenkins rather than the other way around.
If MSBuild doesn't produce a graph, Jenkins unfortunately won't be able to.
You might keep an eye on the Static Analysis plugin over time to see if it adds support for this.
At the moment, we are running our automated (not CI as such) builds via FinalBuilder, through a very simple homegrown Apache interface that just launches the FB scripts on our server. (I like FinalBuilder and will keep it, but its CI server, FinalBuilder Server, just doesn't cut it IMHO; in particular, it doesn't support any "agent" concept at the moment to distribute builds across machines.)
We are doing native C++ development on Windows with a bit of .NET mixed in where it's needed and makes sense.
Our current FinalBuilder scripts do everything quite well, from creating nightly builds to full releases (build / automated translation / build / unit test / create setup / put created artifacts on a network share / ...), but our webinterface, queuing abilities, user traceability and reporting is pretty limited.
I have looked around and it seems that TeamCity and Bamboo tick similar boxes, but most descriptions I can find cover only Java and/or .NET simple builds.
So my specific question is, given
several (20-30) complicated FinalBuilder Scripts that work to my satisfaction and that I will have to integrate into ("call" from) the new automation/"CI" server
Native Windows C++ and .NET projects
The actual build (= compiler invocation(s)) is done via a few Visual Studio solution files at the moment
Currently one build server machine, wishing to scale to 2-3
Using JIRA as issue tracker
using AccuRev as SCM
which tool is better suited, and why: TeamCity (currently 6.5) or Bamboo (currently 3.1)?
(Note that I also hope to get some highly subjective answers on the TeamCity and Bamboo forums.)
On the TeamCity side: it integrates with Jira, has an AccuRev plugin, and has good support for Visual Studio/C++ projects. It can also run arbitrary scripts.
You can trigger builds and obtain build results via the HTTP-based API. In the UI, you can see which changes have been built and in which build configurations. You can easily integrate any custom HTML reports into the TeamCity UI (no coding) and publish artifacts.
You should probably try both solutions and see which one is more suitable for you (with TeamCity, you can use a fully functional server for free; the only limits are the number of build agents and the number of build configurations).
Disclaimer: I'm a TeamCity developer
I found Bamboo more credible than TeamCity. Here are my reasons:
The Jira plugins for VS or Eclipse are Bamboo plug-ins too. :) No extra add-ins needed.
Better support for Jira integration.
Nice user interface, like the one you use for Jira.
Better integration with other Atlassian tools, such as FishEye.
Cheaper: a $10 license will be enough for your company.
More add-ons for Bamboo than for TeamCity, lots of plug-ins.
For completeness' sake: I ended up using Jenkins + Finalbuilder. :-)
I worked in a similar environment, using FinalBuilder for build automation, AccuRev for source control, and native Windows projects.
I ended up selecting Electric Commander as the best CI solution for the job. It is possible to reuse parts of the FinalBuilder scripts and call them from Electric Commander, but simply calling an FB script as a single build step would mean missing out on some of Electric Commander's key advantages: real-time log file processing, the ability to parallelize right down to the individual step level, and data collection and reporting.
Electric Commander has an API that exposes all product functionality which can be used in combination with AccuRev triggers to achieve a very flexible solution.
Disclaimer - I liked Electric Commander so much I joined the company and am currently employed by Electric Cloud.
You can try Electric Commander by going to www.electric-cloud.com and clicking on "Try It!"