I'm interested in using Azure Functions to build release packages for Windows, mainly centering around Electron/Node. Specifically, I need to be able to do binary compiles of some modules (sqlite3, namely), which means a VC++ build environment + Windows SDK, Python ^2.7, Node and related tooling...
Has anyone done this via Azure Functions? I know there are similar questions about getting NuGet packages into an Azure Function, and it does seem that there is an option for the C++ tools, but I'm not sure where to even start.
The main hope is to do this for less than the cost of a full VM, for something that is likely to run no more than 2-3 times a month.
I'm sorry - this is not a scenario suitable for Azure Functions.
The only way to include "binaries" in a Function is by referencing assemblies you uploaded upfront. You do not have access to other binaries such as the VC++ build tools.
If you do not want to pay for a full-fledged VM, consider using VSTS with its build and release features. With these you essentially use the MS VC++ build tools (and related tooling) in a shared, and presumably cheaper, way. You can then trigger those builds from outside via REST or service hooks.
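For the triggering part, VSTS exposes a REST endpoint for queuing builds. Here is a minimal sketch, assuming a hypothetical account, project, build definition id, and personal access token:

```python
# Hypothetical sketch: queue a VSTS build definition through the REST API.
# Account, project, definition id, and the PAT below are all placeholders.
import requests

ACCOUNT = "myaccount"        # https://myaccount.visualstudio.com
PROJECT = "MyProject"
DEFINITION_ID = 42           # id of the build definition that runs the VC++ build
PAT = "personal-access-token"

url = (f"https://{ACCOUNT}.visualstudio.com/DefaultCollection/"
       f"{PROJECT}/_apis/build/builds?api-version=2.0")

# A PAT is sent as the password of a basic-auth pair with an empty user name.
response = requests.post(
    url,
    json={"definition": {"id": DEFINITION_ID}, "sourceBranch": "refs/heads/master"},
    auth=("", PAT),
)
response.raise_for_status()
print("Queued build:", response.json()["id"])
```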
I have a makefile project that uses several tools, such as cppcheck and asn1c. Several developers use this project on their own custom Linux machines. The problem is that every Linux machine has a different version of the needed tools. For example, one developer has cppcheck 1.8 and another has 1.6 installed. Now I run into trouble because the different versions of the tools behave differently: some developers run successfully through cppcheck and some do not.
So my question is: how do I handle different versions of these dependencies?
I have some ideas:
Adding the source code of the tools to the project and compiling the tools before running the build process itself.
Compiling the tools statically and adding the binaries to the project, so every developer would use the exact same binary.
Giving every developer a virtual machine or remote access to a shared machine, so everyone uses the same environment.
Instructing all developers which Linux distribution to use and to keep their systems up to date.
Set up one or more dedicated build servers that handle the builds after changes are committed to your version control system.
This way you can make sure the builds all use the same tool versions, and you can set up multiple build servers with different sets of libraries to check against different versions of dependencies or operating systems.
This is a pretty common setup for software development.
See this SO post for the rationale behind it: https://stackoverflow.com/a/1099146/2186184
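Whichever option you pick, it also helps to fail fast when a machine's tools drift from what the build server uses. Here is a minimal sketch of such a pre-build check that a makefile could invoke; the tool names and version pins are hypothetical examples:

```python
# Hypothetical pre-build check: verify that the required tools report the
# pinned versions before the build is allowed to run.
import re
import subprocess
import sys

# tool name -> (version flag, pinned version); both are hypothetical examples,
# and the flag may need adjusting per tool.
PINNED = {
    "cppcheck": ("--version", "1.8"),
    "asn1c": ("--version", "0.9.28"),
}

def reported_version(tool, flag):
    try:
        out = subprocess.run([tool, flag], capture_output=True, text=True)
    except FileNotFoundError:
        return ""
    text = out.stdout + out.stderr        # some tools print the version to stderr
    match = re.search(r"\d+(\.\d+)+", text)
    return match.group(0) if match else ""

ok = True
for tool, (flag, pin) in PINNED.items():
    found = reported_version(tool, flag)
    if found != pin:
        print(f"{tool}: expected {pin}, found {found or 'nothing'}")
        ok = False

sys.exit(0 if ok else 1)
```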
BACKGROUND
Over the course of my career I have been surprised by how many projects I've seen where it is a real challenge to compile and execute the project in Visual Studio. The problem generally stems from missing dependencies, lack of documentation, broken project references, and so on.
To avoid these headaches I try to automate projects/solutions such that:
the run-time environment is automatically set up when a project is compiled on the developer machine (e.g. using batch scripts to import missing Windows Registry keys)
when compiling a project, the correct dependencies are automatically retrieved (on both the build machine & the developer machines)
THE PROBLEM
To date, I have had a fair amount of success with this approach. However, I have recently been handed a native C++ project that has a dependency on the Microsoft Windows SDK. At compile time, the project uses Windows environment variables to locate its dependencies (e.g. the Microsoft Windows SDK).
I understand that using environment variables is how things used to be done. However, by relying on the software developer to configure the development environment:
you are assuming that they will configure the environment properly
the developer is wasting time on configuration when their time could be better spent developing
I do not want to debate the merits of having a developer configure the development environment, but rather, I would like to know:
Given the technology (e.g. TFS) that exists today, what is a reliable and repeatable approach to handling large dependencies (e.g. Windows SDK) for C++ projects in a team environment?
POTENTIAL SOLUTIONS
continue to use environment variables
Adv: once the dependencies are installed, it is very easy for the build machine to compile projects
Dis: you have to spend time documenting the setup to ensure that you can configure the build machine from scratch (e.g. step 1: install dependency A, step 2: install dependency B, etc.)
Dis: you are relying on magic environment variables pointing at the right target.
Dis: the developer is wasting time configuring when they should be developing
check dependencies into TFS
Adv: everything is kept in one centralized location
Adv: by design, source control keeps a history
Adv: in a sense, source control makes things self-documenting
Dis: compiling on the build machine now takes considerably longer, as the build machine workspace has to repeatedly retrieve the Windows SDK from TFS
Other?
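A middle ground between the two is to keep the environment variables but verify them before every build, in the spirit of the batch scripts mentioned above. Here is a minimal sketch, assuming the Windows SDK 8.0 registry location and a hypothetical in-house variable name:

```python
# Hypothetical pre-build sanity check for the Windows SDK dependency.
# The registry path below is where the Windows SDK 8.0 installer records
# itself; adjust it for the SDK version your project actually targets.
import os
import sys
import winreg

SDK_KEY = r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0"
REQUIRED_VARS = ["INHOUSE_LIB_ROOT"]   # hypothetical in-house library variable

def sdk_install_folder():
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SDK_KEY) as key:
            value, _ = winreg.QueryValueEx(key, "InstallationFolder")
            return value
    except OSError:
        return ""

errors = []
folder = sdk_install_folder()
if not folder or not os.path.isdir(folder):
    errors.append("Windows SDK not found; install it or fix the registry key.")
for var in REQUIRED_VARS:
    if var not in os.environ:
        errors.append(f"Environment variable {var} is not set.")

for message in errors:
    print("error:", message)
sys.exit(1 if errors else 0)
```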
CONTEXT
Programming Language: unmanaged C++
Source Control: TFS 2012
Dependencies:
Microsoft Windows SDK (~416 MB)
in house libraries
I have limited knowledge of how to administer/configure the TFS build machine.
REFERENCES
Microsoft: Team Development with Visual Studio TFS (Chapter 6)
I remember that while working for a security company, the team had a script that copied all dependencies to a specific folder for you as soon as you hit compile. It was set up in the build properties of an MFC project; however, it was confusing to me at the time.
The reference seemed very helpful, thank you.
I'm looking to implement a build server for Qt-based C++ projects. The server needs to check out the necessary code / assets from Subversion, build the executable files, assemble the artifacts for installation projects, and build the installation media files. The target platforms and (rough) toolchains are:
Windows (32- and 64-bit): qmake, nmake, msbuild, wix toolchain. The end result is an installer EXE and DVD image.
Mac OS X: qmake, make, custom bash scripts to assemble package. The end result is an application bundle within a disk image and a DVD image.
Ubuntu (32- and 64-bit): qmake, make, debuild-based scripts. The end result is a collection of DEB files and a DVD image.
Fedora (32- and 64-bit): qmake, make, rpmbuild-based scripts. The end result is a collection of RPM files and a DVD image.
So that's at least 4 build agents (maybe more if 32- and 64-bit can't be done on the same box) and 7 configurations. Open-source projects are preferred, but that is not an absolute requirement.
Most of the tools I'm seeing seem to cater to Java (Jenkins, CruiseControl, etc.) or .NET (CruiseControl.NET, etc.). Can those be used with a C++ toolchain, or will I constantly be fighting the system? Is there anything you have used in the past and found works well with Qt / C++?
I use Jenkins for building and packaging many C++ projects, based on qmake, cmake, and makefiles.
There are plugins for cmake, qmake, and msbuild, but any command line scripts can be run as well.
I have done packaging using Jenkins with no problems, as it is just another command line step in a project.
There are good plugins for monitoring the number of warnings/errors produced by the compiler (I normally use GCC).
It also has matrix builds, which allow you to build a project several times with different combinations of compiler flags, preprocessor variables, platform, etc. One project I set up is a matrix build with 5 boolean preprocessor flags on two platforms, which comes to 2^5 x 2 = 64 builds. These can take a bit of setting up to get correct.
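To give a feel for what such a matrix expands to, here is a small sketch that enumerates the same kind of combinations and the qmake invocation each cell would run; the flag and platform names are made up:

```python
# Hypothetical sketch: enumerate a build matrix of boolean preprocessor flags
# and platforms, and print the qmake DEFINES each cell would build with.
from itertools import product

FLAGS = ["FEATURE_A", "FEATURE_B", "FEATURE_C", "FEATURE_D", "FEATURE_E"]
PLATFORMS = ["linux", "windows"]

cells = list(product(PLATFORMS, product([False, True], repeat=len(FLAGS))))
print(f"{len(cells)} builds")   # 2 platforms * 2^5 flag combinations = 64

for platform, bits in cells:
    defines = [flag for flag, on in zip(FLAGS, bits) if on]
    print(platform, "qmake", 'DEFINES+="%s"' % " ".join(defines))
```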
Here you can read a quick example:
Continuous Integration Server - Hudson
I think that Hudson, Jenkins and buildbot are worth a try. Spending a day or two evaluating and trying them with a quick example will help you choose confidently.
Most of the tools I'm seeing seem to cater to Java (Jenkins, CruiseControl, etc.) or .NET (CruiseControl.NET, etc.). Can those be used with a C++ toolchain, or will I constantly be fighting the system? Is there anything you have used in the past and found works well with Qt / C++?
Any reasonably capable CI system will have a mechanism that allows you to execute any program you want for your build command.
Here's what I would consider:
Does the CI system run on your system(s) of choice
Does it allow you an easy way to view your logs
Does it integrate with your test runner
Does it integrate with your code coverage reports (e.g. BullseyeCoverage w/C++ & Qt)
Will it publish your files in a manner sensible for your needs
Will it provide an archive/store of files, if necessary (e.g. PDBs & lib*.so.debug)
If the CI system doesn't support feature X, will you have to write it for each supported OS/system
Is the CI system / UI easy for you to use?
I did the above using CruiseControl and most things were pretty easy. I wrote everything in make or qmake and simply called out to the command that I needed executed. For unit test and code coverage integration, I output results to XML and transformed them into something supported by CruiseControl.
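The transformation itself was nothing exotic. Here is a sketch of the idea, converting a home-grown results file into JUnit-style XML that most CI servers can ingest; the input schema is invented for illustration:

```python
# Hypothetical sketch: convert a home-grown test-results XML file into the
# JUnit-style XML that most CI servers (CruiseControl, Jenkins, ...) can read.
# The input schema <results><test name="..." passed="true|false" message="..."/>
# </results> is invented for illustration.
import xml.etree.ElementTree as ET

def to_junit(in_path, out_path):
    tests = ET.parse(in_path).getroot().findall("test")
    suite = ET.Element("testsuite", name="unit-tests", tests=str(len(tests)))
    for test in tests:
        case = ET.SubElement(suite, "testcase", name=test.get("name", "unnamed"))
        if test.get("passed") != "true":
            failure = ET.SubElement(case, "failure")
            failure.text = test.get("message", "")
    ET.ElementTree(suite).write(out_path, encoding="utf-8", xml_declaration=True)

to_junit("results.xml", "junit-results.xml")
```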
My recommendation: take a look at the recommended CI systems and evaluate them against the criteria above.
I'm using buildbot for this. I've been using it for 4 years, and I feel very happy with it.
It is an application written in Python that runs on a server and can manage multiple clients on various OSes. I'm currently using Windows XP, Windows 7, Debian, Ubuntu and CentOS build slaves. My projects are C++, and one of them (the end-user GUI) is written in Python. But we've also integrated with other frameworks, for features other than the GUI.
What is really good about buildbot is that it works by running command lines on the slaves. With this, you can do whatever you want, even compiling with Visual Studio on Windows systems! From these command lines, you get all the output centralized on the server, where it is accessible.
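To make that concrete, here is a minimal master configuration sketch using the current buildbot plugins API (this answer predates it and used the older buildslave terminology); the repository URL, worker name, and password are hypothetical:

```python
# master.cfg -- hypothetical minimal buildbot configuration: one worker,
# one builder that checks out from Subversion and runs make.
from buildbot.plugins import schedulers, steps, util, worker

c = BuildmasterConfig = {}

c["workers"] = [worker.Worker("linux-worker", "secret-password")]
c["protocols"] = {"pb": {"port": 9989}}

factory = util.BuildFactory()
factory.addStep(steps.SVN(repourl="svn://example.org/project/trunk",
                          mode="incremental"))
factory.addStep(steps.ShellCommand(command=["make", "all"]))

c["builders"] = [util.BuilderConfig(name="linux-build",
                                    workernames=["linux-worker"],
                                    factory=factory)]
c["schedulers"] = [schedulers.ForceScheduler(name="force",
                                             builderNames=["linux-build"])]
```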
You may also find alternatives on this site, which references many of them.
Disclaimer: I looked at it 3 years ago; I don't know if it is still accurate.
Hudson or Jenkins is pretty good.
Jenkins is indeed still a popular choice for building such a custom service, even after all these years; the question is already 7 years old.
Felgo also offers a Continuous Integration and Delivery (CI/CD) service for Qt. It supports desktop platforms as well as iOS, Android and embedded targets. The full feature set is described in the blog post.
Disclaimer: I am a software developer at Felgo
I am working in my free time on a multiplatform/multi-architecture library written in C++.
Before every release, I have to boot up several computers (one on Windows, one on Linux, another one on Mac OS, ...) just to make sure the code compiles and runs fine on every platform.
So I decided to create my own build machine, but I really don't know what tools exist for this. I'd like the build machine to run on Linux, but any other solution will be accepted.
Ideally, I would just have to click on a "Build all" button, and it would compile my library for the different platforms/architectures, generate archives from the result and/or report potential errors.
My project "constraints" are:
It is written in C++
It compiles on Windows using SConstruct/MinGW and Visual Studio 2010
It compiles on Linux and Mac OS using SConstruct/g++
The sources are stored into Subversion (svn)
Do you know any tool or set of tools that could help me achieve my goal?
Thank you very much.
I would setup 3 VMs (VirtualBox is free), one for each platform.
Install TeamCity (or Hudson) on the Linux VM and agents on the other VMs; then it's just a matter of configuring the build system.
At the most basic level you should have two tasks: one to check out the sources from Subversion and another to invoke scons.
I'm not too familiar with Hudson but TeamCity is certainly capable of generating reports of a build, display progress etc.
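As a sketch of those two tasks as an agent would run them, independent of any particular CI server (the repository URL and paths are hypothetical):

```python
# Hypothetical agent-side sketch of the two basic tasks: check out (or update)
# the sources from Subversion, then invoke scons to build.
import os
import subprocess

REPO = "svn://example.org/mylib/trunk"   # hypothetical repository URL
WORKDIR = "mylib"

def run(*cmd, cwd=None):
    print("+", " ".join(cmd))
    subprocess.run(cmd, cwd=cwd, check=True)

if os.path.isdir(os.path.join(WORKDIR, ".svn")):
    run("svn", "update", WORKDIR)
else:
    run("svn", "checkout", REPO, WORKDIR)

run("scons", "-j4", cwd=WORKDIR)   # scons reads the SConstruct in WORKDIR
```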
I am currently researching a good setup for a continuous integration server which would build various C++ applications for several Linux distributions.
My primary question is how other users here have handled the differences in system libraries between Linux distributions?
While it might be relatively easy to build direct dependencies such as UI libraries along with an application, "indirect" dependencies such as glibc look like a big pain if they had to be built alongside the application every time. I am therefore thinking of moving the actual build execution into a separate virtual machine for each distribution, e.g. using rlogin to run the commands. My goal is to prevent binary incompatibilities between build-machine library versions and those deployed in the target distributions.
Does anyone here have any experience with such a process and could tell if the above sounds like a feasible approach?
We use Jenkins (continuous integration) and CMake (build system) for this purpose. Jenkins is similar to Buildbot, i.e. it also has a buildmaster and buildslaves. Currently I have set up 8 slaves to build for 4 different platforms (FC8, FC10, FC12 and Windows 7). We build both debug and release binaries, so I dedicated one slave to each platform and build type.
As for the third party libraries like Qt & Boost, I compiled them on each platform and checked them into a separate repository.
#esavard: We use CMake 2.8 to do cross-compilation; I have not used MinGW, but a quick Google search indicates that it is possible. Here is a link to a tutorial on cross-compiling for Windows on Linux using CMake and MinGW.
I have not used Buildbot and cannot comment on its features but thought I should mention an alternative that we are currently using.
Hope this helps.
Buildbot has the notion of buildmasters and buildslaves.
A buildmaster takes care of displaying the web GUI, sending email, triggering builds, and other housekeeping. The buildslaves wait on the buildmaster and when commanded perform builds.
We have buildbot set up to build on a number of different platforms, some of them VMs, and it's working well for us.
Certainly buildbot and many virtual machines are the way to go with this. We have a VMware ESX server hosting many build slaves which compile our application overnight. The application is then tested on another virtual machine (not the build slave, just a machine with a default OS install) to verify that it works and that all dependencies are packaged.
The nice thing I would like to do is make that testing phase an automated step, but I haven't been given the time to do it yet.