Gradle clean task does not clean up specified outputs.file

I wrote a build.gradle script to automatically download Hazelcast from a given URL. Afterwards the file is unzipped, and only the mancenter.war as well as the original zip file remain in the destination directory. Later on this war file is referenced for a Jetty run.
Nevertheless, although I defined outputs.file for two of my tasks, the files do not get cleaned up when I execute gradle clean. Thus I'd like to know what I have to do so that the downloaded and unzipped files get removed when I execute gradle clean. Here is my script:
By the way, if you have any recommendations on how to enhance the script, please don't hesitate to share them!
apply plugin: "application"

dependencies {
    compile "org.eclipse.jetty:jetty-webapp:${jettyVersion}"
    compile "org.eclipse.jetty:jetty-jsp:${jettyVersion}"
}

ext {
    distDir = "${projectDir}/dist"
    downloadUrl = "http://download.hazelcast.com/download.jsp?version=hazelcast-${hazelcastVersion}"
    zipFilePath = "${distDir}/hazelcast-${hazelcastVersion}.zip"
    warFilePath = "${distDir}/mancenter-${hazelcastVersion}.war"
    mainClass = "mancenter.MancenterBootstrap"
}

task downloadZip() {
    outputs.file file(zipFilePath)
    logging.setLevel(LogLevel.INFO)
    doLast {
        ant.get(src: downloadUrl, dest: zipFilePath)
    }
}

task extractWar(dependsOn: downloadZip) {
    outputs.file file(warFilePath)
    logging.setLevel(LogLevel.INFO)
    doLast {
        ant.unzip(src: zipFilePath, dest: distDir, overwrite: "true") {
            patternset() {
                include(name: '**/mancenter*.war')
            }
            mapper(type: "flatten")
        }
    }
}

task startMancenter(dependsOn: extractWar, type: JavaExec) {
    main mainClass
    classpath = sourceSets.main.runtimeClasspath
    args warFilePath
}
UPDATE
I found this link, which describes how to provide additional locations to delete when the clean task is invoked. Basically you can do something like this:
clean {
    delete zipFilePath
    delete warFilePath
}

I got confirmation from the source code that the clean task simply deletes the build directory. It assumes that you want to clean everything, and that all the task outputs are somewhere in this build directory.
So, the easiest and best practice is to only store outputs somewhere under the build directory.
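For example, the question's script could be adjusted so the downloaded artifacts land under the build directory (a sketch; only distDir changes, and the dist subfolder name is arbitrary):

```groovy
ext {
    // Everything under buildDir is deleted by the standard clean task,
    // so the zip and war no longer need any special clean-up handling.
    distDir = "${buildDir}/dist"
    zipFilePath = "${distDir}/hazelcast-${hazelcastVersion}.zip"
    warFilePath = "${distDir}/mancenter-${hazelcastVersion}.war"
}
```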

You can add tasks to clean like this:
clean.dependsOn(cleanExtractWar)
clean.dependsOn(cleanDownloadZip)
cleanTaskName is a rule-based task (added by the base plugin) that deletes all the outputs of TaskName.

Related

How can I configure Jenkins using .groovy config file to set up 'Build strategies -> Tags' in my multi-branch pipeline?

I want something similar for 'Basic Branch Build Strategies' plugin https://plugins.jenkins.io/basic-branch-build-strategies
I figured out to make it something like this, but it's not working:
def traits = it / sources / data / 'jenkins.branch.BranchSource' / source / traits
traits << 'com.cloudbees.jenkins.plugins.bitbucket.TagDiscoveryTrait' {
    strategyId(3)
}
traits << 'jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl' {
    strategyId(1)
}
Here you can find full config file: https://gist.github.com/sobi3ch/170bfb0abc4b7d91a1f757a9db07decf
The first trait, 'TagDiscoveryTrait', works fine, but the second one (my change), 'TagBuildStrategyImpl', doesn't get applied on Jenkins restart.
How can I configure 'Build strategies -> Tags' in .groovy config for my multibranch pipeline using 'Basic Branch Build Strategies' plugin?
UPDATE: Maybe I don't need to use traits at all. Maybe there is a simpler solution. I'm not an expert in Jenkins Groovy configuration.
UPDATE 2: This is scan log for my code https://gist.github.com/sobi3ch/74051b3e33967d2dd9dc7853bfb0799d
I am using the following Groovy init script to setup a Jenkins job with a "tag" build strategy.
import com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMSource
import hudson.util.PersistedList
import jenkins.branch.BranchSource
import jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl
import org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject

def job = instance.createProject(WorkflowMultiBranchProject.class, "<job-name>")
PersistedList sources = job.getSourcesList()
// I am using Bitbucket, you need to replace this with your source
def pullRequestSource = new BitbucketSCMSource("<repo-owner>", "<repo-name>")
def source = new BranchSource(pullRequestSource)
source.setBuildStrategies([new TagBuildStrategyImpl(null, null)])
sources.add(source)
If I am recognizing the syntax correctly, the question is about Job DSL plugin.
The problem with the attempted solution is that TagBuildStrategyImpl is not a trait (known as a Behavior in the UI) but a build strategy. The error confirms this:
java.lang.ClassCastException: jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl cannot be cast to jenkins.scm.api.trait.SCMSourceTrait
Class cannot be cast because TagBuildStrategyImpl does not extend SCMSourceTrait, it extends BranchBuildStrategy.
The best way to discover the JobDSL syntax applicable for a specific installation of Jenkins is to use the built-in Job DSL API Viewer. It is available under <jenkins-location>/plugin/job-dsl/api-viewer/index.html, e.g. https://ci.jenkins.io/plugin/job-dsl/api-viewer/index.html
On the version I am running, what you are trying to achieve would look approximately like this:
multibranchPipelineJob('foo') {
    branchSources {
        branchSource {
            source {
                bitbucket {
                    ...
                    traits {
                        bitbucketTagDiscovery()
                    }
                }
            }
            buildStrategies {
                buildTags { ... }
            }
        }
    }
}

How to load AWS credentials in Jenkins job DSL?

I have the following DSL structure:
freeStyleJob {
    wrappers {
        credentialsBinding {
            [
                $class: "AmazonWebServicesCredentialsBinding",
                accessKeyVariable: "AWS_ACCESS_KEY_ID",
                credentialsId: "your-credential-id",
                secretKeyVariable: "AWS_SECRET_ACCESS_KEY"
            ]
        }
    }
    steps {
        // ACCESS AWS ENVIRONMENT VARIABLES HERE!
    }
}
However, this does not work. What is the correct syntax to do so? For Jenkins pipelines, you can do:
withCredentials([[
    $class: "AmazonWebServicesCredentialsBinding",
    accessKeyVariable: "AWS_ACCESS_KEY_ID",
    credentialsId: "your-credential-id",
    secretKeyVariable: "AWS_SECRET_ACCESS_KEY"
]]) {
    // ACCESS AWS ENVIRONMENT VARIABLES HERE!
}
but this syntax does not work in normal DSL job groovy.
tl;dr how can I export AWS credentials defined by the AmazonWebServicesCredentialsBinding plugin into environment variables in Groovy job DSL? (NOT PIPELINE PLUGIN SYNTAX!)
I found a solution to solve this problem:
wrappers {
    credentialsBinding {
        amazonWebServicesCredentialsBinding {
            accessKeyVariable("AWS_ACCESS_KEY_ID")
            secretKeyVariable("AWS_SECRET_ACCESS_KEY")
            credentialsId("your-credentials-id")
        }
    }
}
This will lead to the desired outcome.
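With that binding in place, the exported variables are available to build steps; for example (a sketch, the shell step is just illustrative):

```groovy
steps {
    // AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY are in the environment here
    shell('aws s3 ls')
}
```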
I'm not able to re-use Miguel's solution (even with the aws-credentials plugin installed), so here is another approach with a DSL configure block:
configure { project ->
    def bindings = project / 'buildWrappers' / 'org.jenkinsci.plugins.credentialsbinding.impl.SecretBuildWrapper' / 'bindings'
    bindings << 'com.cloudbees.jenkins.plugins.awscredentials.AmazonWebServicesCredentialsBinding' {
        accessKeyVariable("AWS_ACCESS_KEY_ID")
        secretKeyVariable("AWS_SECRET_ACCESS_KEY")
        credentialsId("credentials-id")
    }
}
This is the full detailed answer that @bitbrain gave, with a possible fix for the issue reported by @Viacheslav:
freeStyleJob {
    wrappers {
        credentialsBinding {
            amazonWebServicesCredentialsBinding {
                accessKeyVariable("AWS_ACCESS_KEY_ID")
                secretKeyVariable("AWS_SECRET_ACCESS_KEY")
                credentialsId("your-credentials-id")
            }
        }
    }
}
Ensure this is on the classpath for compilation:
compile "org.jenkins-ci.plugins:aws-credentials:1.23"
If you have tests running you might also need to add the plugin to the classpath:
testPlugins "org.jenkins-ci.plugins:aws-credentials:1.23"
I believe this is why there are reports of people needing to manually modify the XML to get this to work. Hint: if you can pass the compile stage (or compile in the IDE) but are not able to compile the tests, then this is the issue.

CakePHP3 Plugin cell is missing

My Process
Make a plugin cell:
$ bin/cake bake plugin Abc
$ bin/cake bake cell Abc.New
The process above creates 3 files:
plugins/Abc/src/View/Cell/NewCell.php
plugins/Abc/src/Template/Cell/New/display.ctp
and a test file.
Then I inserted the following code into layout/default.ctp:
<?php $cell = $this->cell('Abc.New'); ?>
Error:
Cell class Abc.New is missing.
Cake\View\Exception\MissingCellException
I can't find a solution. Please help me!
This post is a bit old, but in case someone else stumbles on this thread...
Cells rely on namespaces to load and render the correct [cell].ctp file. In other words, even though you have done the required Plugin::loadAll(); in your bootstrap.php file, you still need to modify the composer.json file and add the plugin. For example, my plugin is called 'Metronic'; notice the extra 2 lines in autoload and autoload-dev:
"autoload": {
"psr-4": {
"App\\": "src",
"Metronic\\": "./plugins/Metronic/src"
}
},
"autoload-dev": {
"psr-4": {
"App\\Test\\": "tests",
"Cake\\Test\\": "./vendor/cakephp/cakephp/tests",
"Metronic\\Test\\": "./plugins/Metronic/tests"
}
},
See CakePHP manual here http://book.cakephp.org/3.0/en/plugins.html#autoloading-plugin-classes.
My suggestion is that you use the Bake command to create plugins in the future. The manual does not explicitly say this, but this is what happens when you use the Bake command:
It creates the basic directory structure for the Plugin
It inserts a line in bootstrap.php, e.g. Plugin::load('Metronic', ['bootstrap' => false, 'routes' => true]);
It inserts 2 lines in the composer.json file (as per my example above)
The only thing you then need to do is tell Composer to refresh its autoloading cache:
$ bin\cake bake plugin Metronic
$ php composer.phar dumpautoload
Hope this helps.

Dojo build to single file

I want to build my Dojo JavaScript code that I have carefully structured into packages into a single JavaScript file. I'm a little confused as to how to do it.
For now I have this:
var profile = {
    ...
    layers: {
        'app': {
            include: [
                'dojo/module1',
                'dojo/module2',
                ...,
                'dojo/moduleN',
                'package2/module1',
                'package2/module2',
                ...,
                'package2/moduleN'
            ]
        }
    }
    ...
};
Do I really have to manually add all the modules to the app layer? Can't I just say "all", or better yet, "all referenced"? I don't want to include the dojo/something module if I don't use it. Also, in my release folder, that's all I would like to have: one file.
So - can this even be achieved? Clean Dojo automatic build of only referenced modules into a single (minified and obfuscated of course) JavaScript file?
Take a look at the examples in the Layers section of this build tutorial:
It’s also possible to create a custom build of dojo.js; this is particularly relevant when using AMD, since by default (for backwards compatibility), the dojo/main module is added automatically by the build system to dojo.js, which wastes space by loading modules that your code may not actually use. In order to create a custom build of dojo.js, you simply define it as a separate layer, setting both customBase and boot to true:
var profile = {
    layers: {
        "dojo/dojo": {
            include: [ "dojo/dojo", "app/main" ],
            customBase: true,
            boot: true
        }
    }
};
You can include an entire "app" in a single layer by including the root of that app (or module). Note that if a module in that app is not explicitly required by that app, it would have to be included manually. See the second example in the Layers section in the above tutorial for an illustration of that.
You can also define packages to include in your layers, if you want to change or customize the layout of your project:
packages: [
    {name: 'dojo', location: 'other/dojotoolkit/location/dojo'},
    /* ... */
],
layers: {
    'dojo/dojo': { include: ['dojo/dojo'] },
    /* ... */
}
You don't have to specify all the modules if the module you add already has dependencies on others. For example, if you include 'app/MainApplication' in a layer, the builder will include all the modules that app/MainApplication depends on. If your MainApplication.js touches everything in your project, everything will be included.
During the build of a layer, Dojo parses require() and define() calls in every module. Then it builds the dependency tree. NLS resources are also included.
In your code, you should name your layer as a file in an existing package. In my build, it caused errors when I named a layer with a single word. You should write:
var profile = {
    layers: {
        'existingPackage/fileName': {
            ...
        }
    }
};
If you want to have exactly one file, you have to include 'dojo/dojo' in your layer and specify the customBase and boot flags.
Dojo always builds every package before building layers. You will always have dojo and dijit folders in your release directory, containing minified versions of the dojo files.
Just copy the layer file you need and delete everything else.

Where should I put Velocity template files for a command line utility built with Maven?

I have a small command line utility project that I'm using Maven to manage. The utility is a very simple app to populate a Velocity template and dump the results to a new file. My problem is where to put my Velocity templates. When I put them in src/test/resources/foo/bar/baz, mvn test fails because it can't find the referenced template, even though it is clearly there in target/classes/foo/bar/baz, which is where the .class file for the test and the class under test are located. If I put the template in the top-level directory of the project, the test passes, but then I'm not following the Maven project structure, and I suspect that the actual packaged .jar file wouldn't function. What am I missing?
UPDATE:
Method under test:
public final void mergeTemplate(final String templateFileName, final Writer writer)
        throws ResourceNotFoundException, ParseErrorException, MethodInvocationException, IOException, Exception {
    Velocity.init();
    Velocity.mergeTemplate(templateFileName, Charset.defaultCharset().name(), context(), writer);
}
Test method:
@Test
public void testMergeTemplate() throws Exception {
    final FooGenerator generator = new FooGenerator();
    final StringWriter writer = new StringWriter();
    generator.mergeTemplate("foo.yaml", writer);
    Assert.assertEquals("Something went horribly, horribly wrong.", EXPECTED_RESULT, writer.toString().trim());
}
The only place I can place foo.yaml and have the tests pass is in the root directory of the project, i.e., as a peer of src and target.
You can programmatically configure TEMPLATE_ROOT as follows:
Properties props = new Properties();
props.put("file.resource.loader.path", templateRootDir);
VelocityEngine engine = new VelocityEngine();
engine.init(props);
engine.evaluate(...);
You should put them in src/main/resources/foo/bar/baz because they need to be included in the main jar file.
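Assuming the test class lives in package foo.bar.baz, the expected layout would be (a sketch):

```
src/main/resources/foo/bar/baz/foo.yaml   ->  /foo/bar/baz/foo.yaml inside the packaged jar
src/test/resources/foo/bar/baz/foo.yaml   ->  target/test-classes/foo/bar/baz/foo.yaml during tests
```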
So it turns out that instead of using something like
generator.mergeTemplate("foo.yaml", writer);
I should use something like
InputStream fooStream = getClass().getResourceAsStream("foo.yaml");
generator.mergeTemplate(fooStream, writer);
You could just configure Velocity to use the ClasspathResourceLoader, instead of the default FileResourceLoader.
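A minimal sketch of the properties involved (assuming Velocity 1.x property names):

```properties
resource.loader = classpath
classpath.resource.loader.class = org.apache.velocity.runtime.resource.loader.ClasspathResourceLoader
```

With the classpath loader, the template is resolved from target/classes or target/test-classes at test time and from inside the jar after packaging, so the same template path works in both cases.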
I have tried Velocity.setProperty() to set the properties, similar to what was said above by @Jin Kim, and was able to run it:
VelocityEngine ve = new VelocityEngine();
ve.setProperty(RuntimeConstants.RESOURCE_LOADER, "file");
ve.setProperty("file.resource.loader.path", templaterootdir);
ve.init();