A Build Flow Plugin script can call build.setDescription() to set the build's description. Can something similar be done in a Job DSL script? Or would the script have to resort to injecting an environment variable?
The Build Flow Plugin and the Job DSL Plugin are not really comparable; they address different use cases. The Job DSL describes the static configuration of jobs, whereas the Build Flow DSL describes dynamic flow control across jobs.
That said, the Job DSL can configure the Description Setter Plugin as a post-build action:
job {
    ...
    publishers {
        ...
        buildDescription('', '${BRANCH}')
    }
}
See the Job DSL wiki for details: https://github.com/jenkinsci/job-dsl-plugin/wiki/Job-reference#build-description-setter
To set the description of the seed job (the job that runs the Job DSL scripts), you can print something to the console log using println and then use the Description Setter Plugin to parse the log and set the description. Alternatively, you can use the Jenkins API directly from the DSL script:
// Get the build that is currently running this DSL script
def build = hudson.model.Executor.currentExecutor().currentExecutable
build.description = 'whatever'
Related
Is there a way to specify ignoreExisting on pipelineJob? I don't see it listed in plugin/job-dsl/api-viewer/index.html, but maybe I'm missing a way to do it.
In my setup, all jobs are defined using Job DSL through the Configuration as Code module. All jobs defined by Job DSL are used to load pipelines, and all of the jobs' configuration lives in those pipelines. Since all of the configuration is stored in the pipeline, I'd like to define each job once and have Job DSL not modify it again unless the job is removed.
The current behavior is that Job DSL overwrites any changes the pipeline makes to the job, which is not what I want. Is there any way around this? I thought ignoreExisting would do the trick, but it doesn't seem to be available in pipelineJob.
I'm writing a Cloud Function that processes a Pub/Sub message containing a gRPC message. At install/(re)deploy time, I would like the Cloud Function to perform some actions: pull the protobuf definition from a GitHub repository and generate the corresponding Python code with grpcio-tools, roughly along the lines of this.
So far I can only find in the documentation how to add dependencies; however, I'm looking for some sort of on-install "hook": something that would allow me to perform some actions before the function is actually deployed.
Is this possible? Any advice?
Cloud Functions has no such hook. You can perform the work in a script on the machine that performs the deployment. It's not uncommon to write scripts to automate work like this.
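For example, the generation step can live in a small wrapper script that runs before the deploy command; a minimal sketch, assuming the protos live in a separate repository (the repo URL, proto file, function name, topic and runtime below are all placeholders, not taken from the question):

```shell
#!/bin/sh
# Hypothetical pre-deploy script: fetch protos, generate Python stubs,
# then deploy the function. Run this instead of calling gcloud directly.
set -e

# 1. Pull the protobuf definitions
git clone --depth 1 https://github.com/example/protos.git protos

# 2. Generate the Python gRPC code next to the function source
python -m grpc_tools.protoc -I protos \
    --python_out=. --grpc_python_out=. protos/message.proto

# 3. Deploy the function with the generated code included
gcloud functions deploy my-function \
    --runtime python311 --trigger-topic my-topic --entry-point handler
```

The generated files are then simply part of the source directory that gets uploaded, so the function itself needs no install-time hook.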
I have many jenkins jobs which do things like
execute myProgram.exe to convert input.txt to output.txt
if (the conversion is successful) {
trigger another jenkins job
} else {
send a e-mail to notify someone that the build fails
}
All of them are Freestyle projects.
I want to write unit test code to test both the success and failure cases of my jenkins jobs.
If the build succeeds, the test code should check if output.txt's content is correct and avoid triggering another jenkins job.
If the build fails, the test code should check if the e-mail was successfully sent to the receiver.
Is there any test framework for doing things like this?
It seems like I can find a solution here, but I couldn't find examples in that tutorial showing how to write unit tests that use existing Jenkins jobs.
Or should I use another tool (not Jenkins) for this kind of job?
How to test Jenkins Pipelines is currently an ongoing issue; see JENKINS-33925.
Though in that thread, you'll see that a couple of people have been working on solutions, e.g. https://github.com/macg33zr/pipelineUnit
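For illustration, a test with a library like JenkinsPipelineUnit typically looks roughly like this (a sketch only: the class name, the Jenkinsfile path, and the stubbed mail step are assumptions based on the job described above, and the library must be on your test classpath):

```groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class ConvertJobTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Stub out steps the pipeline calls so the test runs outside Jenkins
        helper.registerAllowedMethod('mail', [Map]) { /* no-op */ }
    }

    @Test
    void conversionPipelineSucceeds() {
        loadScript('Jenkinsfile')   // loads and executes the pipeline under test
        assertJobStatusSuccess()
        printCallStack()            // prints the steps that were invoked
    }
}
```

Note that this tests a pipeline script, not a Freestyle project, so it would mean converting your jobs to pipelines first.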
You can use Jenkins Job Builder and describe your jobs in YAML files.
Make your configuration changes in a branch and continuously deploy them to a test Jenkins server.
This is quite simple with different config files in Jenkins Job Builder.
Run your jobs on the test Jenkins master, and merge to master after they pass.
Jenkins Job Builder Docs
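For illustration, a minimal Jenkins Job Builder definition for a job like the one described above might look like this (the job name, command, and recipient are placeholders):

```yaml
- job:
    name: convert-input
    project-type: freestyle
    builders:
      - shell: "myProgram.exe input.txt output.txt"
    publishers:
      - email:
          recipients: someone@example.com
```

Because the jobs are plain YAML files, the success and failure paths can be reviewed and diffed like any other code before they reach the production Jenkins.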
I want to access the config of the currently executing seed job from within my DSL script.
For example, I want to use the same SCM settings as my seed job for the jobs that I am creating.
How do I do this?
There is no built-in DSL way to do that. You need to have a look at the Jenkins API. To obtain the SCM settings of the currently executing job, do this:
// The executor that is running this DSL script
hudson.model.Executor executor = hudson.model.Executor.currentExecutor()
// The seed job's current build
hudson.model.FreeStyleBuild build = executor.currentExecutable
// The seed job itself, and its SCM configuration
hudson.model.FreeStyleProject project = build.project
hudson.scm.SCM scm = project.scm
I have configured Jenkins and created a job to check out, build and run a simple project from SVN. I can configure this job to run periodically, e.g. once every five minutes, but I'd like it to build the project only when something has changed in the SVN repository. I read the "Builds by source changes" section of this document, but could not figure out what exactly I am meant to do. Any help would be appreciated!
When you configure your job, you have to do this:
In the Source Code Management section, specify the source management system you use (for instance SVN) and fill in all required fields (URL, authentication, ...) (probably already done, since you are able to do a checkout).
In the Build Triggers section, choose Poll SCM with a schedule of */10 * * * * to check the repository every 10 minutes.
Go to the configuration of your project and scroll down to Build Triggers, directly under the Source Code Management section; that is where the polling is configured.
The syntax for scheduling the job is the crontab format; take a look here.
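As a refinement, Jenkins also accepts an H token in these schedules, which spreads the actual polling minute across jobs so that they don't all hit the repository at the same moment; for example:

```
# Poll roughly every 10 minutes, with Jenkins picking the exact minute
H/10 * * * *
```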
What you are looking for is a Subversion post-commit hook that executes whatever you put in your hook script. Take a look at the following example by Mike West:
Mike West - Subversion Post-commit-hook
good luck!
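If you would rather have Subversion notify Jenkins directly instead of polling, the Jenkins Subversion plugin also exposes a notifyCommit endpoint that a post-commit hook can call; a minimal sketch (the Jenkins URL below is a placeholder, and your jobs still need SCM polling enabled for the notification to be accepted):

```shell
#!/bin/sh
# Subversion post-commit hook (hooks/post-commit in the repository).
# Subversion passes the repository path and the new revision number.
REPOS="$1"
REV="$2"

# The repository UUID tells Jenkins' Subversion plugin which repo changed
UUID=$(svnlook uuid "$REPOS")

# POST the list of changed paths; Jenkins then checks only affected jobs
svnlook changed --revision "$REV" "$REPOS" | \
  curl -s --data-binary @- \
    -H "Content-Type: text/plain;charset=UTF-8" \
    "http://jenkins.example.com/subversion/${UUID}/notifyCommit?rev=${REV}"
```

This way builds start within seconds of a commit rather than waiting for the next polling interval.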