Jenkins - Executing a script before any build job is built

I'd like to execute a script before I execute any project/build on a Jenkins server. I've checked out the BuildWrapper script and a couple of others.
These all seem to be tied to a specific job, but I'd like this to work across every job/build on this server.

You could add an upstream job that all jobs depend on.
Specifically, in the build steps of your job, you can choose "Trigger/Call builds on other projects", add ParentJob, and select "Block until the triggered projects finish their builds" before invoking the rest of your job's build steps.

Related

Can gcloud builds submit run in background?

I'm trying to automate my builds using Google Cloud Build. For various reasons the primary use flow will be to manually kick off builds rather than using a GitHub trigger.
Right now it looks like when you run
gcloud builds submit .
it kicks off the build process but the gcloud command must stay running for the build to continue. If I Ctrl-C it then it stops the entire build process.
I could run it in the background but presumably if I get disconnected from the network or my laptop goes to sleep that will also interrupt the build process(?)
Clearly the build can run in the background since that's what a GitHub trigger does; is there any way I can do the same thing from the command line?
If you add --async to your gcloud builds ... command, the job will be run asynchronously, as a long-running operation.
You may query the state of this operation in order to determine the state of the build.
https://cloud.google.com/sdk/gcloud/reference/builds/submit
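A minimal sketch of that asynchronous flow might look like this (BUILD_ID stands for whatever ID the submit command prints; it is not from the question):

# submit without blocking; gcloud prints the build ID and returns immediately
gcloud builds submit --async .

# later, check on that build
gcloud builds describe "$BUILD_ID" --format='value(status)'

# or list recent builds and their states
gcloud builds list --limit=5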
Alternatively, you may use something like Linux screen to keep any job running after you detach.

Writing unit tests for Jenkins jobs

I have many Jenkins jobs which do things like
execute myProgram.exe to convert input.txt to output.txt
if (the conversion is successful) {
    trigger another Jenkins job
} else {
    send an e-mail to notify someone that the build fails
}
All of them are Freestyle projects.
I want to write unit test code to test both the success and failure cases of my Jenkins jobs.
If the build succeeds, the test code should check if output.txt's content is correct and avoid triggering another jenkins job.
If the build fails, the test code should check if the e-mail was successfully sent to the receiver.
Is there any test framework for doing things like this?
It seems like I can find a solution here, but I couldn't find examples in that tutorial showing how to write unit tests that use existing Jenkins jobs.
Or should I use another tool (not Jenkins) for this kind of job?
How to test Jenkins Pipelines is an ongoing issue; see JENKINS-33925.
In that thread you'll see that a couple of people have been working on solutions, e.g. https://github.com/macg33zr/pipelineUnit
You can use Jenkins Job Builder and describe your jobs in YAML files.
Make your configuration changes in a branch and continuously deploy them to a test Jenkins server; this is straightforward with separate config files in Jenkins Job Builder.
Run your jobs on the test Jenkins master and merge the branch to master after running the jobs there.
Jenkins Job Builder Docs
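As a rough sketch of that workflow (the directory and config file names here are illustrative assumptions, not something from the question):

# render the YAML job definitions locally to catch errors before deploying anything
jenkins-jobs test -o output/ jobs/

# deploy the same definitions to the test Jenkins master
jenkins-jobs --conf test-jenkins.ini update jobs/

# once the jobs have run cleanly there, deploy to the production master
jenkins-jobs --conf prod-jenkins.ini update jobs/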

Run CUnit exes generated on a Jenkins build on other machines automatically

I'm currently running a build job on Jenkins that generates a bunch of CUnit testing exes. What I'd like to do is take those binaries and run them automatically on a bunch of other machines upon successful completion of the build.
For example: Run the build -> success -> trigger copy of EXEs to other machines -> run said EXEs -> gather output.
My question is whether or not this is possible to automate with Jenkins. I'm not entirely sure which direction I should be going in. My best guess is to configure a bunch of other jobs that trigger on successful completion of the build job. These jobs would retrieve the files in question from somewhere, run them, and report back.
Any input would be greatly appreciated.
In the post-build actions of your build job, mark the generated executables as artifacts, then you can use the Copy Artifact plugin to distribute the test executables to another test job (or more than one) that runs a Jenkins build slave on the test machine(s). As you've mentioned, you can configure a successful build to trigger the test jobs. Based on other answers, it looks like CUnit generates an XML report of the test output that Jenkins can parse, so in the test job's post-build actions, configure the location of the test results.
From a management perspective, it is easier if there is one test job because you don't have to figure out how to partition the executables and you can read the results in one report. But depending on your use case, it might make more sense to have separate test jobs if the tests require different environments or if it makes sense to partition the test results.
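As an illustration only, the test job's build step on the test machine could be a small shell loop like the one below; the directory name is a placeholder, and it assumes the CUnit binaries are built to use the automated XML output mode:

# run every copied CUnit executable; in automated mode each one writes an XML results file
for exe in copied_artifacts/*.exe; do
    "$exe"
done

The job's post-build action would then point Jenkins (e.g. the xUnit plugin) at the generated XML files.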

Jenkins distributed builds

I need information regarding distributed builds with Jenkins. The distribution I need is not the normal Jenkins distributed build (master/slave config) where it acts like a load balancer so that the job gets executed on an available node.
For C++ projects, there are tools like distcc, netcc, etc. that distribute the build across several machines on the network so that compilation is faster. Are there any similar tools or approaches we can use in order to reduce the build time?
Thanks in advance.
Jenkins is not a compiler - it is merely a coordinator for software build activities.
There is nothing stopping you from using distcc or similar in a build script that Jenkins starts, and the compiler nodes do not need to be aware of the fact that Jenkins started it.
If you have a distributed compiler and can make use of it from your command prompt, it can be called from a Jenkins job as well.
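For instance, a build step in the Jenkins job could be a plain shell script along these lines (the host names are placeholders, and it assumes distcc is installed and distccd is running on those machines):

# "Execute shell" build step
export DISTCC_HOSTS="localhost buildbox1 buildbox2"
make clean
make -j12 CC="distcc gcc" CXX="distcc g++"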

How to configure Jenkins to detect SVN changes and execute a job?

I have configured Jenkins and created a job to checkout, build and run a simple project from SVN. I could configure this job to run periodically, like once every five minutes. But I'd like it to build the project only when something has changed on the SVN repository. I read the "Builds by source changes" section of this document, but could not figure out what exactly I am meant to do! Any help would be appreciated!
When you configure your job you have to do this:
In the Source Code Management section, specify the source management system you use (for instance SVN) and fill in all required fields (URL, authentication, ...) (this is probably already done since you are able to do a checkout).
In the Build Triggers section, choose Poll SCM with a schedule of */10 * * * * to check the repository every 10 minutes.
Go to the configuration of your project and scroll down to the Build Triggers section, directly under the Source Code Management section. There you have to enable Poll SCM and give it a schedule.
The syntax to schedule the job is the crontab format; take a look here.
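For reference, a couple of example schedules in that syntax (Jenkins also accepts an H token that spreads the exact minute to avoid load spikes):

# poll every 10 minutes
*/10 * * * *
# poll every 15 minutes, letting Jenkins pick the exact minute
H/15 * * * *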
What you are looking for is a Subversion post-commit hook that will execute whatever you script in your hook. Take a look at the following example by Mike West:
Mike West - Subversion Post-commit-hook
Good luck!
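For what it's worth, a rough sketch of such a hook is below. This is not Mike West's exact script, just the general shape, and it assumes the Jenkins Subversion plugin's commit notification endpoint; the Jenkins URL is a placeholder.

#!/bin/sh
# Subversion post-commit hook: tell Jenkins which repository changed so that
# jobs watching that repository are scheduled right away instead of waiting for the next poll
REPOS="$1"
REV="$2"
UUID=$(svnlook uuid "$REPOS")
/usr/bin/wget \
  --header "Content-Type:text/plain;charset=UTF-8" \
  --post-data "$(svnlook changed --revision "$REV" "$REPOS")" \
  --output-document "-" --timeout=2 \
  "http://your-jenkins-server/subversion/${UUID}/notifyCommit?rev=$REV"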