How to handle different versions of dependencies?

I have a makefile project that uses several tools such as cppcheck and asn1c. Several developers use this project on their own Linux machines, and every machine has a different version of the required tools. For example, one developer has cppcheck 1.8 and another has 1.6 installed. Now I run into trouble because the different versions of the tools behave differently: for some developers cppcheck runs through successfully and for others it does not.
So I am asking: how do you handle different versions of dependencies?
I have some ideas:
Add the source code of the tools to the project and compile the tools before running the build itself.
Compile the tools statically and add the binaries to the project, so every developer uses the exact same binary.
Give every developer a virtual machine or remote access to a shared machine, so everyone uses the same environment.
Tell all developers which Linux distribution to use and to keep their systems up to date.
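For illustration, a minimal stopgap that combines with any of these options is to have the makefile fail fast when a developer's tool versions differ from pinned ones; a rough sketch (the pinned version string and the tool list are assumptions, not real project requirements):

```
#!/usr/bin/env python3
"""Fail the build early when a developer's tool versions differ from pinned ones."""
import re
import subprocess
import sys

# tool name -> (command that prints the version, pinned version we expect)
PINNED = {
    "cppcheck": (["cppcheck", "--version"], "1.88"),
    # asn1c and other tools would get analogous entries here
}

def installed_version(cmd):
    """Run the version command and pull the first x.y[.z] number from its output."""
    try:
        out = subprocess.run(cmd, capture_output=True, text=True)
    except OSError:
        return None                      # tool not installed at all
    match = re.search(r"\d+\.\d+(\.\d+)?", out.stdout + out.stderr)
    return match.group(0) if match else None

ok = True
for tool, (cmd, wanted) in PINNED.items():
    have = installed_version(cmd)
    if have != wanted:
        print(f"{tool}: expected {wanted}, found {have}", file=sys.stderr)
        ok = False
sys.exit(0 if ok else 1)
```

The makefile could run this as its first target, so a version mismatch aborts the build with a clear message instead of producing different cppcheck results on different machines.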

Set up one or more dedicated servers to handle builds after changes are committed to your version control system.
This way you can make sure everyone builds against the same versions, and you can set up multiple build servers with different sets of libraries to check against different versions of dependencies or OSes.
This is a pretty common setup for software development.
See this SO post for the rationale behind it: https://stackoverflow.com/a/1099146/2186184

Related

build C++ projects in Maven with maven-nar-plugin

Has anyone used maven-nar-plugin to build C++ code for different platforms, using different compilers? If so, please give me more information about it.
I am just wondering how a NAR file can be built for different platforms with Maven. I know that in order to build a project for a specific platform you should run the build on that platform and use that platform's compilers and linkers. But my experience is with Java projects and Maven, and as you know Java is fairly platform-portable, so I have not run into problems like this until now.
So, any help and details about how to build projects with maven-nar-plugin would be appreciated!
Thanks
Currently, the most surefire way to build multiple platform binaries with maven-nar-plugin is to actually do it on different platforms, rather than attempting a cross-compilation-based solution.
For example, the ImageJ project uses maven-nar-plugin to build a small native launcher for Linux, OS X and Windows, 32-bit and 64-bit versions.
To accomplish this, the project has a Jenkins CI instance on a Linux server, a Windows 7 64-bit VM in VirtualBox running a Jenkins slave, and an OS X desktop Jenkins slave, each of which performs the Maven build for its respective platform(s).
It was quite involved to set up; there are detailed instructions in this ticket of the ImageJ issue tracker.
On a related note, a group of interested developers have recently resurrected maven-nar-plugin, migrating the official repository to a new maven-nar organization. One of the items of interest is cross-compilation, which would make it easier to build native binaries for multiple AOLs on the same platform without resorting to Jenkins slaves. But there are many challenges (e.g., GCC changes behavior often), and it is not yet easy to do. We invite interested developers to join the discussion on the new maven-nar-plugin mailing list!

Build server / continuous integration recommendation for C++ / Qt-based projects

I'm looking to implement a build server for Qt-based C++ projects. The server needs to check out the necessary code / assets from Subversion, build the executable files, assemble the artifacts for installation projects, and build the installation media files. The target platforms and (rough) toolchains are:
Windows (32- and 64-bit): qmake, nmake, msbuild, wix toolchain. The end result is an installer EXE and DVD image.
Mac OS X: qmake, make, custom bash scripts to assemble package. The end result is an application bundle within a disk image and a DVD image.
Ubuntu (32- and 64-bit): qmake, make, debuild-based scripts. The end result is a collection of DEB files and a DVD image.
Fedora (32- and 64-bit): qmake, make, rpmbuild-based scripts. The end result is a collection of RPM files and a DVD image.
So that's at least 4 build agents (maybe more if 32- and 64-bit can't be done on the same box) and 7 configurations. Open-source projects are preferred, but that is not an absolute requirement.
Most of the tools I'm seeing seem to be geared toward Java (Jenkins, CruiseControl, etc.) or .NET (CruiseControl.NET, etc.). Can those be used with a C++ toolchain, or will I constantly be fighting the system? Is there anything you have used in the past and found works well with Qt / C++?
I use Jenkins for building and packaging many C++ projects, based on qmake, cmake, and makefiles.
There are plugins for cmake, qmake, and msbuild, but any command line scripts can be run as well.
I have done packaging using Jenkins with no problems, as it is just another command line step in a project.
There are good plugins for monitoring the number of warnings/errors produced by the compiler (I normally use GCC).
It also has matrix builds, which allow you to build a project several times with different combinations of compiler flags, preprocessor variables, platform, etc. One project I set up is a matrix build with 5 boolean preprocessor flags on two platforms, which gives 2 × 2^5 = 64 builds. These can take a bit of setting up to get right.
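To make the combinatorics concrete, this is roughly what those matrix axes expand to; Jenkins does the expansion for you, the sketch below only enumerates it, and the flag and platform names are invented:

```
"""Illustration of a matrix build: every combination of the boolean axes is one build."""
import itertools

FLAGS = ["USE_SSL", "ENABLE_GUI", "WITH_LOGGING", "BIG_ENDIAN", "EMBEDDED"]
PLATFORMS = ["linux", "windows"]

combos = list(itertools.product(PLATFORMS, *[(0, 1)] * len(FLAGS)))
print(len(combos), "builds")             # 2 * 2**5 = 64

for platform, *values in combos:
    defines = " ".join(f"-D{name}={value}" for name, value in zip(FLAGS, values))
    print(f"[{platform}] CXXFLAGS += {defines}")
```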
Here you can read a quick example:
Continuous Integration Server - Hudson
I think that Hudson, Jenkins and Buildbot are worth a try. Spending a day or two evaluating them with a quick example will help you choose confidently.
Most of the tools I'm seeing seem to be geared toward Java (Jenkins, CruiseControl, etc.) or .NET (CruiseControl.NET, etc.). Can those be used with a C++ toolchain, or will I constantly be fighting the system? Is there anything you have used in the past and found works well with Qt / C++?
Any reasonably capable CI system will have a piece that will allow you to execute any program you want for your build command.
Here's what I would consider:
Does the CI system run on your system(s) of choice
Does it allow you an easy way to view your logs
Does it integrate with your test runner
Does it integrate with your code coverage reports (e.g. BullseyeCoverage w/C++ & Qt)
Will it publish your files in a manner sensible for your needs
Will it provide an archive/store of files, if necessary (e.g. pdbs & lib*.so.debug)
If the CI system doesn't support feature X, will you have to write it for each supported OS/system
Is the CI system / UI easy for you to use?
I did the above using CruiseControl and most things were pretty easy. I wrote everything in make or qmake and simply called out to the command that I needed executed. For unit test and code coverage integration I output stuff to XML and transformed it to something supported by CruiseControl.
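For example, the conversion step can be as small as mapping a plain pass/fail log onto JUnit-style XML, which CruiseControl (and most CI servers) can fold into its report; the input format here ('name PASS|FAIL [message]') is an assumption for the sketch:

```
"""Turn a plain pass/fail test log on stdin into JUnit-style test-results.xml."""
import sys
import xml.etree.ElementTree as ET

def convert(lines):
    suite = ET.Element("testsuite", name="unit-tests")
    tests = failures = 0
    for line in lines:
        parts = line.strip().split(None, 2)
        if len(parts) < 2:
            continue
        name, status = parts[0], parts[1]
        tests += 1
        case = ET.SubElement(suite, "testcase", name=name)
        if status != "PASS":
            failures += 1
            message = parts[2] if len(parts) > 2 else status
            ET.SubElement(case, "failure", message=message)
    suite.set("tests", str(tests))
    suite.set("failures", str(failures))
    return ET.ElementTree(suite)

if __name__ == "__main__":
    convert(sys.stdin).write("test-results.xml", encoding="utf-8", xml_declaration=True)
```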
My recommendation: take a look at the CI systems suggested here and evaluate them against the criteria above.
I'm using buildbot for this. I've been using it for 4 years, and I feel very happy with it.
It is an application written in python, that runs on a server and can manage multiple clients on various OSes. I'm currently using Windows XP, Windows 7, Debian, Ubuntu and CentOS build slaves. My projects are C++, and one of them (the end user GUI) is made in Python. But we've also integrated with other frameworks, for other features than GUI.
What is really good about buildbot is that it works by running command lines on the slaves. With this you can do whatever you want, even compile with Visual Studio on Windows systems. From these command lines, all the output is centralized on the server and accessible there.
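A rough fragment of what that looks like in a master.cfg (0.8-era buildbot API; the repository URL, commands and slave names are placeholders):

```
# Fragment of a buildbot master.cfg: a build is just a list of commands,
# which is why any C++/Qt toolchain works.
from buildbot.process.factory import BuildFactory
from buildbot.steps.source import SVN
from buildbot.steps.shell import ShellCommand, Compile
from buildbot.config import BuilderConfig

c = BuildmasterConfig = {}   # the rest of master.cfg is omitted here

f = BuildFactory()
f.addStep(SVN(svnurl="https://example.org/svn/myapp/trunk", mode="update"))
f.addStep(ShellCommand(command=["qmake", "myapp.pro"], description="qmake"))
f.addStep(Compile(command=["make", "-j4"]))
f.addStep(ShellCommand(command=["make", "check"], description="tests"))

c['builders'] = [
    BuilderConfig(name="linux-qt", slavenames=["debian-slave"], factory=f),
    BuilderConfig(name="win7-qt", slavenames=["win7-slave"], factory=f),
]
```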
You may also find alternatives on this site, which references many of them.
Disclaimer: I looked at it 3 years ago; I don't know if it is still accurate.
Hudson or Jenkins is pretty good.
Jenkins is indeed still pretty popular for building such a custom service, even after all these years - the question is already 7 years old.
Felgo also offers a Continuous Integration and Delivery (CI/CD) service for Qt. It supports desktop platforms as well as iOS, Android and embedded targets. The full feature set is described in the blog post.
Disclaimer: I am a software developer at Felgo

What tool should I use to create my buildmachine?

I am working in my free time on a multiplatform/multi-architecture library written in C++.
Before every release, I have to boot up several computers (one on Windows, one on Linux, another on Mac OS, ...) just to make sure the code compiles and runs fine on every platform.
So I decided to create my own buildmachine but I really don't know what tools exist to do this. I'd like my buildmachine to run on Linux but any other solution will be accepted.
Ideally, I would just have to click on a "Build all" button, and it would compile my library for the different platforms/architectures, generate archives from the result and/or report potential errors.
My project "constraints" are:
It is written in C++
It compiles on Windows using SConstruct/MinGW and Visual Studio 2010
It compiles on Linux and Mac OS using SConstruct/g++
The sources are stored into Subversion (svn)
Do you know any tool/set of tools that could help me achieving my goal ?
Thank you very much.
I would set up 3 VMs (VirtualBox is free), one for each platform.
Install TeamCity (or Hudson) on Linux and agents on the other VMs and then it's just a matter of configuring the build system.
At the most basic level you need two tasks: one to check out the sources from Subversion and another to invoke scons.
I'm not too familiar with Hudson, but TeamCity is certainly capable of generating build reports, displaying progress, etc.
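Whichever server you pick, the build step it runs on each VM boils down to something like this (repository URL and the packaging target are placeholders):

```
"""What the CI agent actually runs: an svn checkout followed by an SCons build."""
import subprocess

def run(cmd, cwd=None):
    print("+", " ".join(cmd))
    subprocess.check_call(cmd, cwd=cwd)

run(["svn", "checkout", "https://example.org/svn/mylib/trunk", "mylib"])
run(["scons", "-j4"], cwd="mylib")      # compile for whatever platform the agent runs on
run(["scons", "dist"], cwd="mylib")     # package, assuming a 'dist' target exists
```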

Continuous Integration server for C++ - What about library dependencies?

I am currently researching a good setup for a continuous integration server which would build various C++ applications for several Linux distributions.
My primary question is how other users here have handled the differences in system libraries between Linux distributions?
While it might be relatively easy to build direct dependencies such as UI libraries along with an application, "indirect" dependencies such as glibc look like a big pain if they have to be built alongside the application every time. I am therefore thinking of moving the actual build execution into a separate virtual machine for each distribution, e.g. using rlogin to run the commands. My goal is to prevent binary incompatibilities between build-machine library versions and those deployed in the target distributions.
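A sketch of what that dispatch could look like (using ssh rather than rlogin here; host names, paths and the build command are placeholders):

```
"""Run the same build inside one VM per target distribution."""
import subprocess

BUILD_VMS = {
    "debian7":  "build-debian7.example.org",
    "centos6":  "build-centos6.example.org",
    "ubuntu12": "build-ubuntu12.example.org",
}

BUILD_CMD = "cd ~/myapp && svn update && make clean all package"

for distro, host in BUILD_VMS.items():
    print(f"=== building on {distro} ({host}) ===")
    subprocess.check_call(["ssh", host, BUILD_CMD])
```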
Does anyone here have any experience with such a process and could tell if the above sounds like a feasible approach?
We use Jenkins (Continuous Integration) and CMake (build system) for this purpose. Jenkins is similar to Buildbot, i.e. it also has a build master and build slaves. Currently I have set up 8 slaves to build for 4 different platforms (FC8, FC10, FC12 and Windows 7). We build both debug and release binaries, so I dedicated one slave to each platform and build type.
As for the third party libraries like Qt & Boost, I compiled them on each platform and checked them into a separate repository.
#esavard: We use CMake 2.8 to do cross-compilation. I have not used MinGW, but a quick Google search indicates that it is possible. Here is a link to a tutorial on cross-compiling for Windows on Linux using CMake and MinGW.
I have not used Buildbot and cannot comment on its features but thought I should mention an alternative that we are currently using.
Hope this helps.
Buildbot has the notion of buildmasters and buildslaves.
A buildmaster takes care of displaying the web GUI, sending email, triggering builds, and other housekeeping. The buildslaves wait on the buildmaster and when commanded perform builds.
We have buildbot set up to build on a number of different platforms, some of them VMs, and it's working well for us.
Certainly buildbot and many virtual machines are the way to go with this. We have a VMware ESX server hosting many build slaves which compile our application overnight. The application is then tested on another virtual machine (not the build slave, just a default OS install) to verify that it works and that all dependencies are packaged.
What I would still like to do is make the runtime testing phase an automated step, but I haven't been given the time to do that yet.

Building C++ on both Windows and Linux

I'm involved in a C++ project targeting Windows and Linux (RHEL) platforms. Until now development was done purely in Visual Studio 2008. For Linux compilation we used a third-party Visual Studio plugin, which read the VS solution/project files and compiled remotely on a Linux machine.
Recently it was decided to abandon the third-party plugin.
Now my big concern is the build system. I have been looking around for cross-platform build tools, so that I don't need to maintain two sets of build files (e.g. vcproj/solution for Windows and makefiles for Linux).
I found the following candidates:
a. Scons
b. cmake
What do you think about these tools for cross-platform development?
Another point that bothers me is that Visual Studio (+ Visual Assist) will lose a lot of functionality without vcproj files - how do you handle that issue with these tools?
Thanks
Dima
PS 1: Something that I like about SCons is that it
(a) uses Python and hence is flexible, while CMake uses a proprietary language (I understand that this is not a deciding feature for a build system), and (b) is self-contained (no need to generate makefiles on Linux as with CMake).
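To illustrate point (a), a rough SConstruct sketch (target names and flags are invented; the whole file is ordinary Python):

```
# SConstruct -- platform-aware SCons build sketch.
env = Environment()                      # Environment/Glob are provided by SCons itself

sources = Glob("src/*.cpp")

if env["PLATFORM"] == "win32":           # MSVC or MinGW on Windows
    env.Append(CPPDEFINES=["WIN32_LEAN_AND_MEAN"])
else:                                    # g++ on Linux / Mac OS
    env.Append(CCFLAGS=["-Wall", "-O2"])

env.SharedLibrary("mylib", sources)
env.Program("demo", ["examples/demo.cpp"], LIBS=["mylib"], LIBPATH=["."])
```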
So why not SCons? Why did your projects decide to use CMake?
CMake will allow you to keep using Visual Studio solutions and project files. CMake doesn't build the source code itself; rather, it generates build files for you. For Linux these can be Code::Blocks projects, KDevelop projects, plain makefiles, or other more esoteric choices. For Windows they can be, among others, Visual Studio project files, and there are still other generators for Mac OS.
So Visual Studio solutions and projects are created from your CMakeLists.txt. This works fine for big projects. E.g. the current Ogre3D uses CMake for all platforms (Windows, Linux, Mac OS and iPhone) and it works really well.
I don't know much about SCons in that regard, though; I have only used it to build one library, and only on Linux, so I can't compare the two fairly. But for our multi-platform projects CMake is strong enough.
I haven't used Scons before, so can't say how that works, but CMake works pretty well.
It works by producing the build files needed for the platform you're targeting.
When used to target VC++, it produces solution and project files so from VS, it appears as if they were native VS projects. The only difference is, of course, that if you edit the project or solution directly through VS, the changes will be erased the next time you run CMake, as it overwrites your project/solution files.
So any changes have to be made to the CMake files instead.
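In practice the per-platform difference is mostly which generator you ask CMake for; a rough sketch of driving it from a script (generator names are examples for CMake 2.8-era versions):

```
"""CMake only generates native build files; the native tool then builds them."""
import os
import subprocess
import sys

os.makedirs("build", exist_ok=True)

generator = "Visual Studio 10" if sys.platform.startswith("win") else "Unix Makefiles"
subprocess.check_call(["cmake", "-G", generator, ".."], cwd="build")  # writes .sln or Makefile
subprocess.check_call(["cmake", "--build", "build"])                  # invokes msbuild or make
```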
We have a large number of core libraries and applications based on those libraries. We maintain a Makefile-based build system on Linux and Visual Studio solutions for each project or library on Windows.
We find this works well for our needs; each library or app is developed either on Linux or on Windows with cross-compilation in mind (e.g. don't use platform-specific APIs). We use Boost for things like file paths and threads. In specific cases we use templates/#defines to select a platform-specific solution (for example events). When the code is ready we move to the other system (Linux or Windows), recompile, fix warnings/errors, and test.
Instead of spending time figuring out tools that can cross compile on both platforms we use system that is best for each platform and spend time fixing specific issues and making the software better.
We have GUI apps only on Windows at the moment, so there is no GUI to cross-compile. Most of our development that is shared between Windows and Linux is server-side networking (sockets, TCP/IP, UDP ...), plus client-side tools on Linux and GUI apps on Windows.
Using Perforce for source code version management, we find in quite a few cases that the Linux Makefile system is much more flexible for what we need than the Windows VS solutions, especially when using multiple workspaces (views of source code versions) where we need to point to common directories and so on. On Linux this can be done automatically by running a script that updates environment variables; in Visual Studio, referencing environment variables is very inflexible because it is hard to update them automatically between views/branches.
Re sync question:
I assume you are asking how to make sure that the two build systems stay synchronized between Linux and Windows. We are actually using Hudson on Linux and CruiseControl on Windows (we had Windows first with CruiseControl; when I went to set up the Linux version I figured Hudson was better, so now we have a mixed environment). Our systems are running all the time. When something is updated it is tested and released (either the Windows or the Linux version), so you know right away if it does not work. During testing we make sure all the latest features are there and fully functional. I guess that's it, no dark magic involved.
Oh, you mean build scripts... Each application has its own solution, and in the solution you set up dependencies. On the Linux side I have a makefile for each project and a build script in the project directory that takes care of all dependencies; this mostly means building the core libraries and a couple of specific frameworks required for the given app. As you can see this is different for each platform: it is easy to add a line to the build script that changes to a directory and makes the required project.
It helps to have projects setup in consistent way.
On Windows you open the project and add a dependency project. Again, no magic involved. I see these kinds of tasks as development-related; for example, when you add new functionality to a project you have to link in the frameworks and headers. So from my point of view there is no reason to automate this - it is part of what developers do when they implement features.
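A sketch of the kind of per-project Linux build script described above (directory names are invented):

```
"""Build the core libraries a project depends on, then the project itself."""
import subprocess

DEPENDENCIES = ["../core/utils", "../core/network", "../frameworks/messaging"]

def make(directory):
    print(f"=== make in {directory} ===")
    subprocess.check_call(["make", "-j4"], cwd=directory)

for dep in DEPENDENCIES:
    make(dep)       # build each dependency in place
make(".")           # finally build this project
```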
Another option is Premake. It's like CMake in that it generates solutions from definition files. It's open source and the latest version is highly customizable using Lua scripting. We were able to add custom platform support without too much trouble. For your situation it supports both Visual Studio and standard GNU makefiles.
See Premake 4.0 Homepage
CruiseControl is a good choice for continuous integration. We have it running on Linux using Mono with success.
Here is an article about the decision made by the KDE developers to choose CMake over SCons. However, I have to point out that this article is almost three years old, so SCons may have improved since then.
Here is a comparison of SCons with other build tools.
I had to do this a lot in the past. What we did was use GNU make for virtually everything, including Windows at times.
You can use the project files under Windows if you prefer and use GNU make for Linux.
There isn't really a nice way to write cross-platform makefiles because, among other things, the target files will differ (and there are pathname issues, \ vs / etc.). In general you will probably be tweaking the code across the various platforms anyway to account for subtle differences, so tweaking a makefile and checking on the other platforms would have to happen regardless.
Many open-source projects maintain makefiles for different platforms, such as zlib, where they are named like Makefile.win, Makefile.linux etc. You could follow their lead.
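If you go that route, a tiny wrapper can pick the right makefile per platform (the file names follow the zlib-style naming above; the nmake branch and Makefile.macos name are assumptions):

```
"""Pick the per-platform makefile and hand everything else to make/nmake."""
import subprocess
import sys

if sys.platform.startswith("win"):
    cmd = ["nmake", "/f", "Makefile.win"]     # MSVC's make
elif sys.platform == "darwin":
    cmd = ["make", "-f", "Makefile.macos"]
else:
    cmd = ["make", "-f", "Makefile.linux"]

sys.exit(subprocess.call(cmd + sys.argv[1:]))
```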