Gradle project with only configuration and no sources - build

I'd like to create a new Gradle project without any sources. I'm going to put some configuration files there, and I want to generate a zip file when I build.
With Maven I'd use the assembly plugin. I'm looking for the easiest and lightest way to do this with Gradle. I wonder whether I need to apply the java plugin even though I don't have any sources here, just because it provides some basic and useful tasks like clean, assemble and so on. Generating a zip is pretty straightforward, and I know how to do that, but I don't know where and how the zip generation fits into the Gradle world.

I've done this manually until now. In other words, for projects where all I want to do is create some kind of distro and I need the basic lifecycle tasks like assemble and clean, I've simply created those tasks myself, along with the needed dependencies.
But there is the 'base' plugin (mentioned under "Base plugins" in the "Standard Gradle Plugins" section of the user guide) that seems to fit the bill nicely for this. Note, though, that the user guide mentions that this and the other base plugins are not yet considered part of the Gradle API and are not really documented.
The results are pretty much identical to yours, the only difference being that there are no confusing Java-specific tasks that always remain UP-TO-DATE.
apply plugin: 'base'
// zip the project's solr directory, placing its contents under solr/ inside the archive
task dist(type: Zip) {
    from('solr')
    into('solr')
}
assemble.dependsOn(dist)
Sample run:
$ gradle clean assemble
:clean
:dist
:assemble
BUILD SUCCESSFUL
Total time: 2.562 secs

It might sound strange, but as far as I understood I need to apply the java plugin in order to create a zip file. Furthermore, it's handy to have some common tasks available, like for example clean. The following is my build.gradle:
apply plugin: 'java'
task('dist', type: Zip) {
    from('solr')
    into('solr')
}
assemble.dependsOn dist
I applied the java plugin and defined my dist task, which creates a zip file containing a solr directory with the content of the solr directory from my project. The last line is handy to have the task executed when I run the common gradle build or gradle assemble, since I don't want to call the dist task explicitly.
This way, if I work with multiple projects, I just need to execute gradle build on the parent to generate all the artifacts, including the configuration zip; see the sketch below.
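For the multi-project case, a minimal sketch (the project names are made up) is to list the configuration-only project in the parent's settings.gradle so the root build picks it up:
// settings.gradle in the parent project; 'app' and 'solr-config' are hypothetical names
include 'app', 'solr-config'
Running gradle build from the parent then builds every subproject, including the zip produced by the dist task above.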
Please let me know if you have better solutions and add your own answer!

You could just apply the groovy plugin and use Ant. I did something like the following. I also like javanna's answer.
task jars(dependsOn: ['dev_jars']) << {
    def fromDir = file('/database-files/non_dev').listFiles().sort()
    File dist = new File("${project.buildDir}/dist")
    dist.mkdir()
    fromDir.each { File dir ->
        // one jar per database directory, named database-connection-<dirname>.jar
        File destFile = new File("${dist.absolutePath}/database-connection-${dir.name}.jar")
        println destFile.absolutePath
        ant.jar(destfile: destFile, update: false, basedir: dir)
    }
}
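If you'd rather stay inside Gradle than shell out to Ant, a rough equivalent is to create one Jar task per directory. This is an untested sketch, assuming the same /database-files/non_dev layout and the old-style Gradle DSL used above:
// sketch: one Jar task per database directory
file('/database-files/non_dev').listFiles().sort().each { File dir ->
    task("jar_${dir.name}", type: Jar) {
        baseName = "database-connection-${dir.name}"      // archive name
        destinationDir = file("${project.buildDir}/dist") // where the jar lands
        from dir                                          // jar the directory contents
    }
}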

Related

PHPStan, exclude all and specify files to check

I'm trying to set up PHPStan on an older, bigger codebase. How can I exclude everything and then define via config what to analyse?
I have thought about using a separate folder for it, but that would mean constantly moving files, which might lead to breaking the code. So I am hoping to exclude everything and then add files to the analyser file by file.
At the moment the only solution I was able to find is defining a script in composer.json:
"scripts": {
"phpstan": "./vendor/bin/phpstan analyse --memory-limit=1G --no-progress --level 1 `cat phpstan_analyse_files`"
}
and keeping a list of files to analyse in the file phpstan_analyse_files.
The best way to do what you need is the excludePaths section, available as of PHPStan 1.0:
# phpstan.neon
parameters:
    excludePaths:
        - 'old-code/OldClass.php'
        - 'another-old-code/*'
See the docs or this real project's phpstan.neon setup for inspiration.
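If you also want to whitelist what does get analysed, rather than passing a file list on the command line, the paths parameter combines with excludePaths. A sketch, with made-up file names:
# phpstan.neon
parameters:
    paths:
        - 'src/FirstFileToAnalyse.php'
        - 'src/SecondFileToAnalyse.php'
    excludePaths:
        - 'old-code/*'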

Bamboo plan using dependent's build plan number when pulling down artifact from parent plan

I have two Bamboo plans, the first one produces a shared artifact (a library) and the second one attempts to download it. The first plan puts the build number into the artifact name, the copy pattern is defined this way:
release-x64-b${bamboo.buildNumber}-runtime.zip
So I get a number of artifacts in the plan directory:
release-x64-b671-runtime.zip
....
release-x64-b678-runtime.zip
The dependent plan is instructed to simply download the artifact. I think it's using the copy pattern from the parent plan, because I'm running into an issue where the dependent plan substitutes its own build number when downloading the artifact; here's a log excerpt:
Preparing to download plan result PROJECT-WVN-678 artifact: Shared artifact: [x64 Nightly Runtime], pattern: [release-x64-b207-runtime.zip]
(The dependent build number is 207 while the parent build number is 678). Is there a way for me to work around this 'feature'?
I assume your dependent plan runs as a subsequent stage. In that case, in the parent build you can save the build number into a variable mvn_version using a PowerShell script:
$buildnum=$env:bamboo_buildNumber
Then put the value into a text file
echo "`nmvn_version=$buildnum" | out-file -encoding utf8 mvn_version.txt
Then add an "Inject Bamboo variables" task, where you set the path of the file to ./mvn_version.txt and the namespace to inject. Choose the Result scope radio button so the value will be accessible in following stages, dependent plans, and release plans, as shown below.
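Assuming Bamboo's usual ${bamboo.<namespace>.<name>} variable syntax, the dependent plan can then reference the injected value in its artifact copy pattern instead of its own build number:
release-x64-b${bamboo.inject.mvn_version}-runtime.zip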

Get scons to generate a new build number

I'd like to get scons to read a previous version number from a file, update a source file with a new version number and the current date, and then write the number back to the original file, ready for the next build.
This needs to happen only when the target is out of date. In other words, the version number doesn't change if no build takes place. The original file is source-controlled and isn't a source file, else it could trigger another build on check-in (due to CI). CLARIFICATION: From scons' point of view the code will always be out of date due to the auto-generated source file, but scons will only be run from a Continuous Integration job (Jenkins) when an SCM change is detected.
I've looked into AddPostMethod, but this seems to fire for all files within the list of source files.
Command and Builder methods use the VARIANT_DIR, so I can't edit these files and then check them back in, as they no longer map to the repo.
I'm hoping I'm just misunderstanding some of the finer details of scons, else I'm running out of ideas!
Update
Thinking this through some more, Tom's comment is correct. Although I have two files, one version-controlled text file (non-source code) and one non-version-controlled source file, there is no way to check one file in and prevent a continuous build/check-in cycle. Jenkins will see the new text file and spin off a build, and scons will see the new generated file. So I would have to delete the generated file at some point, although this seems to go against the workflow of both tools.
Does anyone have any method for achieving this? It seems pretty straightforward. Ultimately I just want to generate build numbers each time a build is started.
From the SCons User Guide, section 8 "Order-Only Dependencies", you can use the Requires method:
import time

# put whatever text you want in your version.c; this is just regular Python
version_c_text = """
char *date = "%s";
""" % time.ctime(time.time())
open('version.c', 'w').write(version_c_text)

version_obj = Object('version.c')
hello = Program('hello.c',
                LINKFLAGS = str(version_obj[0]))
Requires(hello, version_obj)
Two things to note: first, you have to add the explicit Requires dependency. Second, you can't make version_obj a source of the Program builder; you have to cheat (here we pass it as a linkflag), otherwise you'll get an automatic full dependency on it.
This will update version.c on every run, but won't rebuild just because version.c changed.
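If you want an incrementing build number rather than a timestamp, the same always-regenerate pattern works. Here is a minimal sketch (the file name build_number.txt is an assumption) that reads the previous number and writes the next one back for the following build:
# SConstruct sketch: bump a build number kept in build_number.txt
num_file = 'build_number.txt'
try:
    build_num = int(open(num_file).read().strip()) + 1
except (IOError, ValueError):
    build_num = 1  # first build, or the file is missing/corrupt
open(num_file, 'w').write(str(build_num))
# regenerate version.c with the new number; the Requires() trick above keeps it order-only
open('version.c', 'w').write('int build_number = %d;\n' % build_num)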

How can I use gradle to build my project both with and without our profiling aspects?

We've developed some profiling aspects that we would like to include in a testing build, but not in our production build. I'm looking for a best-practices way of structuring the build.gradle file and the source directories.
My initial thought was to create a compileJavaAJ task, and a jarAJ task which depends on compileJavaAJ. compileJavaAJ would look awfully similar to the compileJava defined in the aspectJ plugin, http://github.com/breskeby/gradleplugins/raw/0.9-upgrade/aspectjPlugin/aspectJ.gradle. The problem with just applying this plugin is that it completely replaces compileJava (i.e. the one using javac). I need two build targets - one that uses javac, the other that uses ajc. I welcome suggestions if there's a better approach though.
Next, I need to decide where to put the aspectJ code. I don't want to put it in src/main/java, because the java compiler will choke on it. So, I'm thinking of defining a new SourceSet, src/main/aspectJ, which only compileJavaAJ knows about. A SourceSet is supposed to model java code though, so I'm not quite sure if this is the correct approach.
Any input is greatly appreciated. Thanks!
I would use a property like withAspectj to differentiate between compiling with and without your aspects. Have a look at the following snippet:
if (project.hasProperty('withAspectj') && project.getProperty('withAspectj')) {
    sourceSets {
        main {
            java {
                srcDir 'src/main/aspectj'
            }
        }
    }
}
This snippet adds the directory src/main/aspectj to your main source set if a property named withAspectj evaluates to true. Now you can put all your aspects into this specific directory. If you don't pass the withAspectj property, the replaced compileJava task will compile your code without weaving the aspects into it.
But if you run your build from the command line:
gradle build -PwithAspectj=true
all aspects located in src/main/aspectj will be woven into your code.
hope that helped,
regards,
René

How would I produce a JUnit test report for Groovy tests, suitable for consumption by Jenkins/Hudson?

I've written several XMLUnit tests (which fit into the JUnit framework) in Groovy and can execute them easily on the command line, as per the Groovy documentation, but I don't quite understand what else I have to do for them to produce the XML output that Jenkins/Hudson (or another tool) needs to display the pass/fail results (like this) and a detailed report of the errors etc. (like this). (apologies to image owners)
Currently, my kickoff script is this:
import junit.framework.TestSuite

def allSuite = new TestSuite('The XSL Tests')
// looking in package xsltests.rail.*
allSuite.addTest(AllTestSuite.suite("xsltests/rail", "*Tests.groovy"))
junit.textui.TestRunner.run(allSuite)
and this produces something like this:
Running all XSL Tests...
....
Time: 4.141
OK (4 tests)
How can I make this create a JUnit test report xml file suitable to be read by Jenkins/Hudson?
Do I need to kick off the tests with a different JUnit runner?
I have seen this answer but would like to avoid having to write my own test report output.
After a little hackage, I have taken Eric Wendelin's suggestion and gone with Gradle.
To do this I have moved my Groovy unit tests into the requisite directory structure, src/test/groovy/, with the supporting resources (input and expected output XML files) going into the src/test/resources/ directory.
All the required libraries have been configured in the build.gradle file, described (in its entirety) here:
apply plugin: 'groovy'
repositories {
    mavenCentral()
}
dependencies {
    testCompile group: 'junit', name: 'junit', version: '4.+'
    groovy module('org.codehaus.groovy:groovy:1.8.2') {
        dependency('asm:asm:3.3.1')
        dependency('antlr:antlr:2.7.7')
        dependency('xmlunit:xmlunit:1.3')
        dependency('xalan:serializer:2.7.1')
        dependency('xalan:xalan:2.7.1')
        dependency('org.bluestemsoftware.open.maven.tparty:xerces-impl:2.9.0')
        dependency('xml-apis:xml-apis:2.0.2')
    }
}
test {
    jvmArgs '-Xms64m', '-Xmx512m', '-XX:MaxPermSize=128m'
    testLogging.showStandardStreams = true // not sure about this one, was in official user guide
    outputs.upToDateWhen { false } // makes it run every time even when Gradle thinks it is "Up-To-Date"
}
This applies the Groovy plugin, sets up Maven Central for grabbing the specified dependencies, and then adds some extra values to the built-in test task.
One extra thing in there is the last line, which makes Gradle run all of my tests every time rather than just the ones it thinks are new or changed; this makes Jenkins play nicely.
I also created a gradle.properties file to get through the corporate proxy/firewall etc:
systemProp.http.proxyHost=10.xxx.xxx.xxx
systemProp.http.proxyPort=8080
systemProp.http.proxyUser=username
systemProp.http.proxyPassword=passwd
With this, I've created a 'free-style' project in Jenkins that polls our Mercurial repo periodically, and whenever anyone commits an updated XSL to the repo, all the tests are run.
One of my original goals was being able to produce the standard Jenkins/Hudson pass/fail graphics and the JUnit reports, which is a success: Pass/Fail with JUnit Reports.
I hope this helps someone else with similar requirements.
I find the fastest way to bootstrap this stuff is with Gradle:
// build.gradle
apply plugin: 'groovy'
task initProjectStructure() << {
    // create every source directory the Groovy plugin expects
    project.sourceSets.all*.allSource.sourceTrees.srcDirs.flatten().each { dir ->
        dir.mkdirs()
    }
}
Then run gradle initProjectStructure and move your sources into src/main/groovy and your tests into src/test/groovy.
It seems like a lot (really it's less than 5 minutes of work), but you get lots of stuff for free. Now you can run gradle test and it'll run your tests and produce JUnit XML, which you can pick up from build/test-reports in your project directory.
Since you're asking for the purposes of exposing the report to Jenkins/Hudson, I'm assuming you have a Maven/Ant/etc. build that you're able to run. If so, the solution is simple.
First of all, there's practically no difference between Groovy and Java JUnit tests. So all you need to do is add the Ant/Maven junit task/plugin to your build and have it execute your Groovy JUnit tests (just as you would if they were written in Java). That execution will create the test reports. From there, you can simply configure your Hudson/Jenkins build to look at the directory where the test reports are created during the build process.
You can write your own custom RunListener (or SuiteRunListener). It still requires you to write some code, but it's much cleaner than the script you linked to. If you'd like, I can send you the code for a JUnit reporter I've written in JavaScript for Jasmine, and you can 'translate' it into Groovy.
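As a starting point, here is a minimal Groovy sketch of such a listener, using only the standard JUnit 4 RunListener API. The class name and the println reporting are placeholders; a real reporter would collect results and write JUnit-style XML instead:
import junit.framework.TestSuite
import org.junit.runner.JUnitCore
import org.junit.runner.Description
import org.junit.runner.notification.Failure
import org.junit.runner.notification.RunListener

// hypothetical listener; swap the printlns for XML generation
class ReportingListener extends RunListener {
    void testStarted(Description d) { println "started: ${d.displayName}" }
    void testFailure(Failure f) { println "FAILED: ${f.testHeader} - ${f.message}" }
}

def allSuite = new TestSuite('The XSL Tests')
allSuite.addTest(AllTestSuite.suite("xsltests/rail", "*Tests.groovy"))

def core = new JUnitCore()
core.addListener(new ReportingListener())
core.run(allSuite) // JUnitCore also accepts JUnit 3-style suites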