In Jenkins "Job DSL Plugin", how to specify alternate location of pom.xml in 'mavenJob'? - jenkins-job-dsl

I was looking at the instructions here and cannot figure out how to set an alternate pom.xml location for the Root POM, other than the default.
https://jenkinsci.github.io/job-dsl-plugin/#path/mavenJob
Does anyone out there know how to set that?

You can use the rootPOM DSL method.
mavenJob('example') {
    rootPOM('sub-module/pom.xml')
}

Try using the following code:
mavenJob('JobXXX') {
    scm {
        github('Repository/Project', 'master')
    }
    goals('clean compile')
    rootPOM('projectname/pom.xml')
}

Related

CakePHP 3.7 Shell commands inside a plugin can't be executed

namespace Admin\Shell;

use Cake\Console\Shell;

class AdminAlertShell extends Shell {
    ...
    ...
}
Here 'Admin' is the plugin, so I created this file inside the plugin's folder structure.
File path: /plugins/Admin/src/Shell/AdminAlertShell.php
I tried to run this in the CLI:
bin/cake admin_alert
But it throws an exception:
Exception: Unknown command cake admin_alert. Run cake --help to get the list of valid commands. in [localpath/vendor/cakephp/cakephp/src/Console/CommandRunner.php, line 346]
It was working before, but I don't know what happened. I upgraded CakePHP from 3.5 to 3.7, but I am not sure whether that caused the issue.
I just tracked down the source of the issue in my project.
Inside my plugin there was a file: src/Plugin.php
Inside this class there were the following lines of code:
/**
 * @inheritDoc
 */
public function console(CommandCollection $commands): CommandCollection
{
    // Add console commands here.
    return $commands;
}
This was probably generated via bake.
I saw that the parent implementation was not called, and it is the parent that adds the plugin's path to the command collection. Change this method to look like this:
/**
 * @inheritDoc
 */
public function console(CommandCollection $commands): CommandCollection
{
    // Add console commands here.
    $commands = parent::console($commands);
    return $commands;
}
Now the parent is called and the path is added to the command collection.
As a side note, I also see that the middleware method is not calling its parent.
I think it would be a good idea to fix that one as well.
As an alternative, you can just clear out the class, and all the defaults should be used.
Hope this saves someone the hours it cost me to figure this one out.

How can I configure Jenkins using .groovy config file to set up 'Build strategies -> Tags' in my multi-branch pipeline?

I want something similar for the 'Basic Branch Build Strategies' plugin: https://plugins.jenkins.io/basic-branch-build-strategies
I figured it should be something like this, but it's not working:
def traits = it / sources / data / 'jenkins.branch.BranchSource' / source / traits
traits << 'com.cloudbees.jenkins.plugins.bitbucket.TagDiscoveryTrait' {
    strategyId(3)
}
traits << 'jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl' {
    strategyId(1)
}
Here you can find full config file: https://gist.github.com/sobi3ch/170bfb0abc4b7d91a1f757a9db07decf
The first trait, 'TagDiscoveryTrait', is working fine, but the second one (my change), 'TagBuildStrategyImpl', doesn't get applied on a Jenkins restart.
How can I configure 'Build strategies -> Tags' in .groovy config for my multibranch pipeline using 'Basic Branch Build Strategies' plugin?
UPDATE: Maybe I don't need to use traits at all. Maybe there is a simpler solution. I'm not an expert in Jenkins Groovy configuration.
UPDATE 2: This is scan log for my code https://gist.github.com/sobi3ch/74051b3e33967d2dd9dc7853bfb0799d
I am using the following Groovy init script to set up a Jenkins job with a "tag" build strategy.
import hudson.util.PersistedList
import jenkins.branch.BranchSource
import jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl
import com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMSource
import org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject
// 'instance' is the Jenkins singleton, e.g. jenkins.model.Jenkins.get() in an init script
def job = instance.createProject(WorkflowMultiBranchProject.class, "<job-name>")
PersistedList sources = job.getSourcesList()
// I am using Bitbucket, you need to replace this with your source
def pullRequestSource = new BitbucketSCMSource("<repo-owner>", "<repo-name>")
def source = new BranchSource(pullRequestSource)
source.setBuildStrategies([new TagBuildStrategyImpl(null, null)])
sources.add(source)
If I am recognizing the syntax correctly, the question is about the Job DSL plugin.
The problem with the attempted solution is that TagBuildStrategyImpl is not a Trait (known as a Behavior in the UI) but a Build Strategy. The error confirms this:
java.lang.ClassCastException: jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl cannot be cast to jenkins.scm.api.trait.SCMSourceTrait
The class cannot be cast because TagBuildStrategyImpl does not extend SCMSourceTrait; it extends BranchBuildStrategy.
The best way to discover the JobDSL syntax applicable for a specific installation of Jenkins is to use the built-in Job DSL API Viewer. It is available under <jenkins-location>/plugin/job-dsl/api-viewer/index.html, e.g. https://ci.jenkins.io/plugin/job-dsl/api-viewer/index.html
On the version I am running, what you are trying to achieve would look approximately like this:
multibranchPipelineJob('foo') {
    branchSources {
        branchSource {
            source {
                bitbucket {
                    ...
                    traits {
                        bitbucketTagDiscovery()
                    }
                }
            }
            buildStrategies {
                buildTags { ... }
            }
        }
    }
}
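For completeness, here is a fuller sketch of the same job; treat it as an approximation only. The Bitbucket settings are placeholders, and the atLeastDays/atMostDays parameter names inside buildTags are assumptions to be verified in your own API Viewer:
multibranchPipelineJob('foo') {
    branchSources {
        branchSource {
            source {
                bitbucket {
                    id('foo-bitbucket')          // arbitrary but stable source id (placeholder)
                    repoOwner('<owner>')         // placeholder
                    repository('<repository>')   // placeholder
                    traits {
                        bitbucketTagDiscovery()
                    }
                }
            }
            buildStrategies {
                buildTags {
                    // assumed parameter names; check the API Viewer for your plugin version
                    atLeastDays('-1')   // no lower bound on tag age
                    atMostDays('7')     // only build tags created within the last 7 days
                }
            }
        }
    }
}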

How to load AWS credentials in Jenkins job DSL?

I have the following DSL structure:
freeStyleJob {
    wrappers {
        credentialsBinding {
            [
                $class: "AmazonWebServicesCredentialsBinding",
                accessKeyVariable: "AWS_ACCESS_KEY_ID",
                credentialsId: "your-credential-id",
                secretKeyVariable: "AWS_SECRET_ACCESS_KEY"
            ]
        }
    }
    steps {
        // ACCESS AWS ENVIRONMENT VARIABLES HERE!
    }
}
However, this does not work. What is the correct syntax to do so? For Jenkins pipelines, you can do:
withCredentials([[
    $class: "AmazonWebServicesCredentialsBinding",
    accessKeyVariable: "AWS_ACCESS_KEY_ID",
    credentialsId: "your-credential-id",
    secretKeyVariable: "AWS_SECRET_ACCESS_KEY"]]) {
    // ACCESS AWS ENVIRONMENT VARIABLES HERE!
}
but this syntax does not work in normal Job DSL Groovy.
tl;dr: how can I export AWS credentials defined by the AmazonWebServicesCredentialsBinding plugin into environment variables in Groovy Job DSL? (NOT pipeline plugin syntax!)
I found a solution to this problem:
wrappers {
    credentialsBinding {
        amazonWebServicesCredentialsBinding {
            accessKeyVariable("AWS_ACCESS_KEY_ID")
            secretKeyVariable("AWS_SECRET_ACCESS_KEY")
            credentialsId("your-credentials-id")
        }
    }
}
This will lead to the desired outcome.
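For illustration, a minimal sketch of how the bound variables can then be consumed in a build step; the job name and the shell command are placeholders, not part of the original answer:
freeStyleJob('aws-example') {
    wrappers {
        credentialsBinding {
            amazonWebServicesCredentialsBinding {
                accessKeyVariable("AWS_ACCESS_KEY_ID")
                secretKeyVariable("AWS_SECRET_ACCESS_KEY")
                credentialsId("your-credentials-id")
            }
        }
    }
    steps {
        // the binding exports both variables into the build environment,
        // so any step can read $AWS_ACCESS_KEY_ID / $AWS_SECRET_ACCESS_KEY
        shell('aws s3 ls')
    }
}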
I'm not able to re-use Miguel's solution (even with the aws-credentials plugin installed), so here is another approach with a DSL configure block:
configure { project ->
    def bindings = project / 'buildWrappers' / 'org.jenkinsci.plugins.credentialsbinding.impl.SecretBuildWrapper' / 'bindings'
    bindings << 'com.cloudbees.jenkins.plugins.awscredentials.AmazonWebServicesCredentialsBinding' {
        accessKeyVariable("AWS_ACCESS_KEY_ID")
        secretKeyVariable("AWS_SECRET_ACCESS_KEY")
        credentialsId("credentials-id")
    }
}
This is the full, detailed version of @bitbrain's answer, with a possible fix for the issue reported by @Viacheslav:
freeStyleJob {
    wrappers {
        credentialsBinding {
            amazonWebServicesCredentialsBinding {
                accessKeyVariable("AWS_ACCESS_KEY_ID")
                secretKeyVariable("AWS_SECRET_ACCESS_KEY")
                credentialsId("your-credentials-id")
            }
        }
    }
}
Ensure this is on the classpath for compilation:
compile "org.jenkins-ci.plugins:aws-credentials:1.23"
If you have tests running you might also need to add the plugin to the classpath:
testPlugins "org.jenkins-ci.plugins:aws-credentials:1.23"
I believe this is why there are reports of people needing to manually modify the XML to get this to work. Hint: if you can pass the compile stage (or compile in the IDE) but are not able to compile the tests, then this is the issue.
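Put together, the relevant dependency block of such a Job DSL build script might look roughly like this; the testPlugins configuration is assumed to come from a seed-job test harness (for example the job-dsl-gradle-example setup), and the version is simply the one quoted above:
dependencies {
    // lets amazonWebServicesCredentialsBinding resolve when the DSL scripts are compiled
    compile "org.jenkins-ci.plugins:aws-credentials:1.23"
    // lets the spec tests load the plugin as well
    testPlugins "org.jenkins-ci.plugins:aws-credentials:1.23"
}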

Clean task does not clean up specified outputs.file

I wrote a build.gradle script to automatically download Hazelcast from a given URL. Afterwards the file is unzipped, and only the mancenter.war as well as the original zip file remain in the destination directory. Later on, this war file is referenced for a Jetty run.
Nevertheless, although I defined outputs.file for two of my tasks, the files do not get cleaned up when I execute gradle clean. Thus I'd like to know what I have to do so that the downloaded and unzipped files get removed when I execute gradle clean. Here is my script:
By the way, if you have any recommendations on how to enhance the script, please don't hesitate to tell me!
apply plugin: "application"

dependencies {
    compile "org.eclipse.jetty:jetty-webapp:${jettyVersion}"
    compile "org.eclipse.jetty:jetty-jsp:${jettyVersion}"
}

ext {
    distDir = "${projectDir}/dist"
    downloadUrl = "http://download.hazelcast.com/download.jsp?version=hazelcast-${hazelcastVersion}"
    zipFilePath = "${distDir}/hazelcast-${hazelcastVersion}.zip"
    warFilePath = "${distDir}/mancenter-${hazelcastVersion}.war"
    mainClass = "mancenter.MancenterBootstrap"
}

task downloadZip() {
    outputs.file file(zipFilePath)
    logging.setLevel(LogLevel.INFO)
    doLast {
        ant.get(src: downloadUrl, dest: zipFilePath)
    }
}

task extractWar(dependsOn: downloadZip) {
    outputs.file file(warFilePath)
    logging.setLevel(LogLevel.INFO)
    doLast {
        ant.unzip(src: zipFilePath, dest: distDir, overwrite: "true") {
            patternset() {
                include(name: '**/mancenter*.war')
            }
            mapper(type: "flatten")
        }
    }
}

task startMancenter(dependsOn: extractWar, type: JavaExec) {
    main mainClass
    classpath = sourceSets.main.runtimeClasspath
    args warFilePath
}
UPDATE
I found this link, which describes how to provide additional locations to delete when the clean task is invoked. Basically, you can do something like this:
clean {
    delete zipFilePath
    delete warFilePath
}
I got confirmation from the source code that the clean task simply deletes the build directory. It assumes that you want to clean everything, and that all the task outputs are somewhere in this build directory.
So, the easiest and best practice is to only store outputs somewhere under the build directory.
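For example, pointing distDir at the build directory in the script above (a sketch; only the ext block changes) makes gradle clean remove the downloaded zip and the extracted war automatically:
ext {
    // generated artifacts now live under build/, which the clean task deletes
    distDir = "${buildDir}/dist"
    downloadUrl = "http://download.hazelcast.com/download.jsp?version=hazelcast-${hazelcastVersion}"
    zipFilePath = "${distDir}/hazelcast-${hazelcastVersion}.zip"
    warFilePath = "${distDir}/mancenter-${hazelcastVersion}.war"
    mainClass = "mancenter.MancenterBootstrap"
}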
You can add tasks to clean like this:
clean.dependsOn(cleanExtractWar)
clean.dependsOn(cleanDownloadZip)
clean<TaskName> is a rule-based task that will clear all outputs of <TaskName> (for example, cleanExtractWar deletes the outputs declared by extractWar).

using phpunit without composer

I'm trying to install PHPUnit on an old system and I'm dealing with several phar issues.
So far I've managed to get PHPUnit running and my autoloader working, but now it is trying to call Composer.
I had to add the extension "PHPUnit/Extensions/Story", which is also working, but now I've got to deal with Composer...
I tried adding the phar, extracting the phar, ... but nothing seems to work (if "Composer\Autoload\ClassLoader.php" works, then "Instantiator\Instantiator.php" is missing...).
So, is it possible to run PHPUnit without Composer?
I just solved the problem:
Although I called spl_autoload_register for my own framework after including PHPUnit's and Composer's autoloaders, mine was sometimes called first, so I just added a whitelist to my autoloader (see $tabLibCommunPrefixes):
function phpunit_bootstrap_autoload($class_name) {
    $prefixe = substr($class_name, 0, strpos($class_name, '_'));
    $tabLibCommunPrefixes = array('Smarty', 'Zend', 'Bvb', 'Composer', 'domxml-php4-compat', 'FirePhp', 'Mobile', 'Nusoap', 'Pear', 'phing', 'PhpMailer', 'phpThumb', 'Sitra', 'Smarty3', 'smarty', 'test', 'upload', );
    if (in_array($prefixe, $tabLibCommunPrefixes)) {
        require_once str_replace('_', '/', $class_name) . '.php';
        return true;
    }
    return false;
}
One can simply use Composer to handle only PHPUnit and its dependencies.
So the easiest way is to simply use Composer. There is nothing wrong with using Composer for just a small part of your dependencies. In fact, for some (small) projects I even use it for no dependencies at all (only to handle the autoloading).
You can use it in the test subdirectory, or more conventionally at the root of the project.