I've been searching the web since yesterday and can't find a proper answer, so I was wondering if someone here might help me with my problem, or at least tell me whether it's currently impossible to retrieve this information.
As the title says, I have a pipeline in Jenkins which chains 3-4 jobs, and everything runs sequentially and without problems.
It is made like this, just to be clear: JOB1 -> JOB2 -> JOB3.
All I want to know, and can't find out, is whether there is a way to check the status of the build pipeline itself. Does Jenkins maintain this information?
Like... I would like to be able to know when the pipeline is finished:
Pseudo: if pipeline is finished then do something ... end
Just an idea: you can add a step at the end of the last build script in the chain; when that build finishes, that step runs, and that tells you the whole pipeline has completed.
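Alternatively, if the chain can be driven from a Pipeline job (assuming the Pipeline plugin is available), you get a single "pipeline finished" point by having that job trigger JOB1-JOB3 in order and putting the "do something" in a post block, which only runs once the whole chain has completed. A minimal sketch; the echo steps are just placeholders for whatever you want to do:

// Orchestrating Jenkinsfile: run the three existing jobs in sequence and
// perform a final action once the whole chain has finished.
pipeline {
    agent any
    stages {
        stage('JOB1') { steps { build job: 'JOB1' } }
        stage('JOB2') { steps { build job: 'JOB2' } }
        stage('JOB3') { steps { build job: 'JOB3' } }
    }
    post {
        success {
            echo 'Pipeline finished successfully - do something here'
        }
        failure {
            echo 'Pipeline failed - do something else here'
        }
    }
}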
I've got a job that's failing, and I think the problem is that I've misunderstood the layout of the directory structure for the running job.
How can I see what's actually on disk so I can diagnose what's happening?
Can I do it from the GoCD UI, or am I going to have to connect to the agent box and look at things that way?
In Jenkins, I'd just use the "workspace" link to eyeball the layout.
Currently, I'm adding directory-listing commands to the jobs themselves and then inspecting the output in the logs.
A comprehensive way to inspect the structure is to define an "artifact" of * - this declares the entire pipeline's working directory as an artifact, then you can inspect it in the UI.
This is probably a very bad plan, though: it will use up tons of disk space, and creating the artifact takes a long time, so it slows down your pipeline a lot.
Maybe the question sounds stupid, but I was wondering: once a job has successfully finished and has an ID, is it possible to start the same job again?
Or is it necessary to create another one?
Because otherwise I would end up with several jobs with the same name throughout the list.
I just want to know if there is a way to restart it without recreating it again.
It's not possible to run the exact same job again, but you can create a new job with the same name that runs the same code. It will just have a different job ID and show up as a separate entry in the job list.
If you want to make running repeated jobs easier, you can create a template. This will let you create jobs from that template via a gcloud command instead of having to run your pipeline code.
Cloud Dataflow does have a restart function; see the SDK here. One suggested pattern (to help with deployment) is to create a template for the graph you want to run repeatedly, and then execute that template.
I have many Jenkins jobs which do things like this:
execute myProgram.exe to convert input.txt to output.txt
if (the conversion is successful) {
trigger another jenkins job
} else {
send an e-mail to notify someone that the build failed
}
All of them are Freestyle projects.
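For reference, if the same logic were written as a Pipeline script instead of a Freestyle project, I imagine it would look roughly like this (only a sketch to show the flow; the downstream job name and e-mail address are placeholders):

// Sketch of the conversion job as a scripted Pipeline: convert the file, then
// either trigger the downstream job or mail someone that the build failed.
node {
    try {
        bat 'myProgram.exe input.txt output.txt'   // the conversion step
        build job: 'another-jenkins-job'           // placeholder downstream job
    } catch (err) {
        mail to: 'someone@example.com',            // placeholder address
             subject: 'Conversion build failed',
             body: 'myProgram.exe failed to convert input.txt to output.txt.'
        throw err                                  // keep the build marked as failed
    }
}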
I want to write unit-test code to test both the success and failure cases of my Jenkins jobs.
If the build succeeds, the test code should check that output.txt's content is correct, without actually triggering the other Jenkins job.
If the build fails, the test code should check that the e-mail was successfully sent to the recipient.
Is there any test framework for doing things like this?
It seems like I can find a solution here, but I couldn't find examples in that tutorial showing how to write unit tests that use existing Jenkins jobs.
Or should I use another tool (not Jenkins) for this kind of job?
How to test Jenkins Pipelines is currently an ongoing issue; see JENKINS-33925.
Though in that thread, you'll see that a couple of people have been working on solutions, e.g. https://github.com/macg33zr/pipelineUnit
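Note that these tools test Pipeline scripts rather than Freestyle jobs, so the job logic would first need to be expressed as a Pipeline (as in the sketch in the question). As a rough idea of what such a test could look like with the related JenkinsPipelineUnit library (API details vary between versions; the Jenkinsfile path, the stubbed bat/build/mail steps and the class name are assumptions based on that sketch):

import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class ConversionJobTest extends BasePipelineTest {

    @Override
    @Before
    void setUp() throws Exception {
        super.setUp()
        // Stub out the steps the Jenkinsfile calls so the test never runs
        // myProgram.exe, triggers a real downstream job, or sends real mail.
        helper.registerAllowedMethod('bat', [String], { cmd -> 0 })
        helper.registerAllowedMethod('build', [Map], { args -> })
        helper.registerAllowedMethod('mail', [Map], { args -> })
    }

    @Test
    void successful_conversion_triggers_downstream_job() {
        runScript('Jenkinsfile')   // assumed path of the pipeline script
        printCallStack()
        // The downstream trigger should appear exactly once in the recorded calls.
        assert helper.callStack.findAll { it.methodName == 'build' }.size() == 1
    }

    @Test
    void failed_conversion_sends_a_mail() {
        // Make the stubbed conversion step fail for this test.
        helper.registerAllowedMethod('bat', [String], { cmd ->
            throw new Exception('conversion failed')
        })
        try {
            runScript('Jenkinsfile')
        } catch (ignored) {
            // the pipeline rethrows the error to mark the build as failed
        }
        assert helper.callStack.findAll { it.methodName == 'mail' }.size() == 1
    }
}

This only exercises the job's control flow; checking the actual content of output.txt would still be an integration test against the real myProgram.exe.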
You can use Jenkins Job Builder and describe your jobs in YAML files.
Make your configuration changes in a branch and continuously deploy them to a test Jenkins server.
This is pretty simple to do with different config files in Jenkins Job Builder.
Run your jobs on the test Jenkins master, and merge your branch to master after the jobs have run.
Jenkins Job Builder Docs
I am working on a test-automation project and am able to automate end-to-end test cases, but in my scope I have to enable functionality that runs a test case when a workflow is checked in to the repository. For this I need a trigger carrying information such as which workflow was checked in and into which folder. Unfortunately I haven't been able to get anything so far, and my admins are also not sure whether this can be achieved. Any lead would be a great help.
Even if I could just get the logs of workflow check-ins, I could process them in real time to extract this information.
Is it possible to trigger a Hudson/Jenkins build only when a certain string appears in a commit-message?
For instance, I want to trigger a build that rolls out my application to the dev environment by writing a commit message like:
MYPROJECT-123 Fixed NPE in MyClass.java #deploy:DEV
The general idea is described in this great talk on Continuous Deployment, but I couldn't find any information on how to do this in Hudson.
I would prefer to have this behavior in Hudson itself and not in an external system like commit-hooks or web-hooks.
I don't know of an out-of-the-box way to parse the SCM message as part of the trigger. You have a couple of options that might achieve what you want, though:
Write your own Hudson SCM plugin
Chain your jobs together into a build pipeline. The first job could simply look for that message in the changelog.xml to determine whether or not the next build is triggered (a sketch of this idea is below).
If you are looking at building a pipeline of build jobs, check out the build-pipeline-plugin. http://www.centrumsystems.com.au/blog/?p=121
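For the second option, in today's Jenkins the equivalent of grepping changelog.xml is to read currentBuild.changeSets from a Pipeline job and only trigger the deployment when the marker is present. A rough sketch (the deploy-to-dev job name is just a placeholder; the #deploy:DEV marker is the one from the question):

// Scripted Pipeline: look for a deploy marker in the commit messages of this
// build and trigger the downstream job only if it is found.
node {
    checkout scm

    def shouldDeploy = false
    for (changeSet in currentBuild.changeSets) {
        for (entry in changeSet.items) {
            if (entry.msg.contains('#deploy:DEV')) {
                shouldDeploy = true
            }
        }
    }

    if (shouldDeploy) {
        build job: 'deploy-to-dev'   // placeholder downstream deployment job
    } else {
        echo 'No #deploy:DEV marker found, skipping deployment'
    }
}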
Anyone got a more elegant solution??
Cheers,
Geoff
There is a plugin called Commit Message Trigger Plugin, but it has only had a 0.1 release.
Maybe the easiest way is to use a version-control post-commit (or push) trigger to start a Hudson job. You'd need one anyway to automatically start your build.