Jenkins Job DSL - get Jenkins to check out a specific commit

I simply need to put together a job that checks out the same commit every time.
Checking the docs, I can't seem to find a way to do it:
https://jenkinsci.github.io/job-dsl-plugin/#path/javaposse.jobdsl.dsl.jobs.FreeStyleJob.scm-git
// checkout repo1 to a sub directory and clean the workspace after checkout
job('example-1') {
    scm {
        git {
            remote {
                name('remoteB')
                url('git@server:account/repo1.git')
            }
            extensions {
                cleanAfterCheckout()
                relativeTargetDirectory('repo1')
            }
        }
    }
}
I'd like to do something like commit('hash') somewhere in that block.

You can specify the commit using the branch setting.
// checkout repo1 to a sub directory and clean the workspace after checkout
job('example-1') {
    scm {
        git {
            remote {
                name('remoteB')
                url('git@server:account/repo1.git')
            }
            branch('hash')
            extensions {
                cleanAfterCheckout()
                relativeTargetDirectory('repo1')
            }
        }
    }
}

Related

Testing Gradle Plugin That Uses project.gradle.projectsEvaluated

I am trying to test a Gradle plugin that uses the BuildAdapter#projectsEvaluated callback to allow custom configurations and dependencies to be added to my project. The problem I'm having is that if I evaluate the project using ProjectInternal#evaluate in the JUnit test, the projectsEvaluated callback is never called. If I switch to the more robust GradleRunner, I do not appear to be able to inspect the project(s) afterwards to see whether my plugin actually works. I have provided sample code below that shows a sample plugin and a JUnit test. Does anyone know how I could test this functionality?
class MyPlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        project.extensions.create("myPlugin", MyPluginExtension)
        project.gradle.projectsEvaluated {
            /*
             custom logic that needs to evaluate prior
             to gradle adding dependencies to project
            */
        }
    }
}
The JUnit test is provided below:
class MyPluginPluginTest {

    private Project project

    @Before
    void setup() {
        project = ProjectBuilder.builder().build()
        project.repositories.mavenCentral()
        project.apply plugin: 'java'
        project.apply plugin: MyPlugin
    }

    /*
     Test to check whether or not the correct dependency
     was added to the project when the plugin was evaluated
    */
    @Test
    void projectHasCheckerFrameworkDependencies() {
        ((ProjectInternal) project).evaluate()
        Set<File> files = project.configurations.getByName('myPlugin').resolve()
        assertNotEquals(0, files.size())
        assertTrue(files.any { it.name.endsWith("myDependency-${project.jarName.version}.jar") })
    }
}
In order to gain access to the information I required, I added the projectsEvaluated closure to the project's extensions as follows:
class MyPlugin implements Plugin<Project> {
    @Override
    void apply(Project project) {
        project.extensions.create("myPlugin", MyPluginExtension)
        def projectsEvaluatedClosure = {
            /*
             custom logic that needs to evaluate prior
             to gradle adding dependencies to project
            */
        }
        project.extensions.add("myPluginProjectsEvaluatedClosure", projectsEvaluatedClosure)
        project.gradle.projectsEvaluated projectsEvaluatedClosure
    }
}
Now that the projectsEvaluated closure is accessible in the project, I executed the closure manually from the test case. This isn't an ideal solution, but I was able to verify that the code was working properly.
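For reference, a minimal sketch of what that manual invocation could look like inside the JUnit test shown above, assuming the "myPluginProjectsEvaluatedClosure" extension name from the plugin (the test method name is illustrative):
@Test
void projectsEvaluatedLogicRuns() {
    ((ProjectInternal) project).evaluate()
    // The plugin stored the closure as an extra extension, so look it up by name
    // and invoke it by hand, because ProjectBuilder never fires projectsEvaluated.
    def evaluatedClosure = project.extensions.getByName("myPluginProjectsEvaluatedClosure")
    evaluatedClosure.call()
    // ...then assert on whatever the closure is supposed to configure, as in the original test.
}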

How to test Terraform files

I'm defining my infrastructure in Terraform files. I like Terraform a lot, but I'm having trouble figuring out how to test. I have awspec, which is really nice and runs RSpec-like tests against the result of your build via the AWS API. But is there a way to do unit tests, like on the results of terraform plan? What kind of workflow are others using with Terraform?
I'm going to expand on Begin's answer with more information about Kitchen-Terraform.
Kitchen-Terraform is a set of open-source plugins that run within Test-Kitchen; they are meant to live in your Terraform module repository and test that module's functionality before it is used in a repository that creates the resources. Please feel free to check the documentation of those two projects for more details, but I will go through my recommendations for integration-testing your Terraform code.
Install Ruby and Terraform.
For this example, the Terraform module repo will be called: my_terraform_module
mkdir -p my_terraform_module
cd my_terraform_module
mkdir -p test/integration/kt_suite/controls \
test/fixtures/tf_module/
Create a Gemfile:
source "https://rubygems.org/" do
gem "kitchen-terraform"
end
Install the necessary components (this uses the Gemfile to pull in the dependencies of kitchen-terraform):
gem install bundler
bundle install
Create the Test-Kitchen file .kitchen.yml - this ties together the testing framework, Test-Kitchen, and Kitchen-Terraform:
---
driver:
  name: terraform
  root_module_directory: test/fixtures/tf_module
  parallelism: 4

provisioner:
  name: terraform

transport:
  name: ssh

verifier:
  name: terraform
  groups:
    - name: basic
      controls:
        - file_check
        - state_file

platforms:
  - name: terraform

suites:
  - name: kt_suite
Your Terraform code should be at the root of the Terraform module repository, for example:
my_terraform_module/
|-- main.tf
Example code that can go in main.tf
resource "null_resource" "create_file" {
provisioner "local-exec" {
command = "echo 'this is my first test' > foobar"
}
}
Then we reference the Terraform module just as we would in a live Terraform repo - but in a test fixture instead, in the file test/fixtures/tf_module/main.tf:
module "kt_test" {
source = "../../.."
}
Then from there you can run a Terraform apply, but it's done a little differently with Kitchen-Terraform and Test-Kitchen: you run a converge, which helps keep track of state and a couple of other items.
bundle exec kitchen converge
Now that you've seen your Terraform code do an apply, we need to test it. We can test the actual resources that were created, which is like an integration test, and we can also test the state file, which is closer to a unit test; however, I am not aware of anything that can currently run unit tests against Terraform's HCL code itself.
Create a default InSpec profile file: test/integration/kt_suite/inspec.yml
---
name: default
Create an InSpec control for your integration testing in test/integration/kt_suite/controls/basic.rb - this test matches the example Terraform code used earlier in main.tf:
# frozen_string_literal: true

control "file_check" do
  describe file('.kitchen/kitchen-terraform/kt-suite-terraform/foobar') do
    it { should exist }
  end
end
And this is an example test that pulls information from the state file and checks whether something exists in it. This is a basic one, but you can definitely expand on this example.
# frozen_string_literal: true

terraform_state = attribute "terraform_state", {}

control "state_file" do
  describe "the Terraform state file" do
    subject do
      json(terraform_state).terraform_version
    end

    it "is accessible" do
      is_expected.to match /\d+\.\d+\.\d+/
    end
  end
end
Then run the InSpec controls with Test-Kitchen and Kitchen-Terraform:
bundle exec kitchen verify
I took a lot of this from the getting started guide and some of the tutorials over here: https://newcontext-oss.github.io/kitchen-terraform/getting_started.html
We recently open-sourced Terratest, our Swiss Army knife for testing infrastructure code.
Today, you're probably testing all your infrastructure code manually by deploying, validating, and undeploying. Terratest helps you automate this process:
Write tests in Go.
Use helpers in Terratest to execute your real IaC tools (e.g., Terraform, Packer, etc.) to deploy real infrastructure (e.g., servers) in a real environment (e.g., AWS).
Use helpers in Terratest to validate that the infrastructure works correctly in that environment by making HTTP requests, API calls, SSH connections, etc.
Use helpers in Terratest to undeploy everything at the end of the test.
Here's an example test for some Terraform code:
terraformOptions := &terraform.Options{
    // The path to where your Terraform code is located
    TerraformDir: "../examples/terraform-basic-example",
}

// This will run `terraform init` and `terraform apply` and fail the test if there are any errors
terraform.InitAndApply(t, terraformOptions)

// At the end of the test, run `terraform destroy` to clean up any resources that were created
defer terraform.Destroy(t, terraformOptions)

// Run `terraform output` to get the value of an output variable
instanceUrl := terraform.Output(t, terraformOptions, "instance_url")

// Verify that we get back a 200 OK with the expected text
// It can take a minute or so for the Instance to boot up, so retry a few times
expected := "Hello, World"
maxRetries := 15
timeBetweenRetries := 5 * time.Second
http_helper.HttpGetWithRetry(t, instanceUrl, 200, expected, maxRetries, timeBetweenRetries)
These are integration tests and, depending on what you're testing, they can take 5-50 minutes. It's not fast (though using Docker and test stages, you can speed some things up), and you'll have to work to make the tests reliable, but it is well worth the time.
Check out the Terratest repo for docs and lots of examples of various types of infrastructure code and the corresponding tests for them.
From my research, this is a tough issue. Since Terraform is not meant to be a full-featured programming language, and you declare what resources you want rather than how to build them, unit testing doesn't really give you assurance that you are building resources the way you'd like without actually running an apply. That makes attempts at unit testing feel more like linting to me.
However, you could parse your HCL files with something like pyhcl, or parse your plan files; in my experience this was a lot of work for little benefit (but I could be missing an easier method!).
Here are some alternatives if you want to test the results of your terraform apply runs:
kitchen-terraform is a tool for writing Test Kitchen specs for your infrastructure.
kitchen-verifier-awspec helps bring together awspec and kitchen-terraform, although I have not used it personally.
If you are using AWS, I have found AWS Config able to provide a lot of the same benefits as other infrastructure testing tools, without as much setup/maintenance, although it is fairly new and I have not used it extensively.
Also, if you are paying for Terraform Premium, you get access to Sentinel, which seems to provide a lot of similar benefits to AWS Config; however, I have not used it personally.
In addition to the other answers, I will add my two cents. I was not very happy using Go with Terratest, although it works perfectly well; Go is just not my favorite programming language. I looked for frameworks in Java and found terraform-maven. At first glance I only found examples in Groovy, but since Groovy runs on the JVM, it is feasible to implement the same examples in Java.
I translated part of the S3PreProvisionSpec.groovy to Java. It is testing this main.tf file.
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class S3PreProvisionTest {

    private final String TF_CHANGE = "change";
    private final String TF_AFTER = "after";
    private final String TF_TAGS = "tags";
    private final Map<String, String> mandatoryTags = Map.of(
            "application_id", "cna",
            "stack_name", "stacked",
            "created_by", "f.gutierrez@yieldlab.de"
    );

    private Terraform terraform;
    private TfPlan tfplan;

    @BeforeAll
    void setup() {
        terraform = new Terraform().withRootDir("s3_pre_post_demo")
                // .withProperties(Map.of("noColor", "true"))
                ;
        tfplan = terraform.initAndPlan();
    }

    @AfterAll
    void cleanup() {
        terraform.destroy();
    }

    @Test
    void mandatoryTagsForS3Resources() {
        List<Map> s3Bucket = tfplan.getResourcesByType("aws_s3_bucket");
        System.out.println("=========================");

        s3Bucket.forEach(map -> {
            Map tfChangeMap = (Map) map.get(TF_CHANGE);
            Map tfAfterMap = (Map) tfChangeMap.get(TF_AFTER);
            Map tfTagsMap = (Map) tfAfterMap.get(TF_TAGS);

            assertEquals(3, tfTagsMap.size());
            mandatoryTags.forEach((k, v) -> {
                assertEquals(v, tfTagsMap.get(k));
            });

            try {
                JSONObject jsonObject = new JSONObject(map);
                JSONObject jsonChange = jsonObject.getJSONObject(TF_CHANGE);
                JSONObject jsonAfter = jsonChange.getJSONObject(TF_AFTER);
                JSONObject jsonTags = jsonAfter.getJSONObject(TF_TAGS);

                System.out.println(">>>>>>>>>>>>>>>>>>>> " + jsonTags.toString());

                mandatoryTags.forEach((k, v) -> {
                    try {
                        assertEquals(v, jsonTags.getString(k));
                    } catch (JSONException e) {
                        e.printStackTrace();
                    }
                });
            } catch (JSONException e) {
                e.printStackTrace();
            }
        });
    }
}
One approach is to write the plan to a file using -out=tempfile, then run a script to validate whatever you're trying to check, and if everything passes, pass that file into the apply command (see the sketch after the link below).
Look at -out here:
https://www.terraform.io/docs/commands/plan.html
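For example, a minimal Groovy sketch of that workflow, assuming Terraform 0.12+ (where terraform show -json can render a saved plan) and a hypothetical check that the plan touches at least one aws_s3_bucket; the plan file name and the check itself are illustrative:
import groovy.json.JsonSlurper

// Write the plan to a file, then render it as JSON so a script can inspect it.
"terraform plan -out=terraform.plan".execute().waitForProcessOutput(System.out, System.err)
def planJson = "terraform show -json terraform.plan".execute().text
def plan = new JsonSlurper().parseText(planJson)

// Illustrative validation: fail unless the plan includes at least one aws_s3_bucket change.
def buckets = (plan.resource_changes ?: []).findAll { it.type == "aws_s3_bucket" }
if (!buckets) {
    throw new IllegalStateException("plan does not touch any aws_s3_bucket resources")
}

// If every check passes, apply exactly what was planned.
"terraform apply terraform.plan".execute().waitForProcessOutput(System.out, System.err)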
You can use github.com/palantir/tfjson to parse a .plan file to json.
There is an issue at the moment that gives an "unknown plan file version: 2" error. This is because the vendored version of terraform is too old.
The fix is:
go get github.com/palantir/tfjson
cd $GOPATH/src/github.com/palantir/tfjson
rm -rf vendor
go get -v ./...
There is then an error in ../../hashicorp/terraform/config/testing.go. To fix it, just change the line
t.Helper()
to
//t.Helper()
Run go get again and then go install
go get -v ./...
go install ./...
You should then be able to do the following, which will produce JSON output:
terraform plan --out=terraform.plan
tfjson terraform.plan

Gradle TestNG results output

I can't seem to figure out how to configure log output and results folders for Gradle TestNG runs.
First, Gradle picks $project_dir/build/reports/tests by default for HTML report output. How do I change this default location? I tried setting testReportDir in build.gradle to no avail. Specifying a map in the call to useTestNG(), e.g.
test {
    if (runTest) {
        // enable TestNG support (default is JUnit)
        useTestNG {
            outputDirectory = file("$buildDir/test-output")
        }
    }
}
does not work either.
Second, I tried using TestNG's Reporter for some additional logging, e.g.:
@Test
public void testMethod() {
    parseFile();
    Reporter.log("File parsed");
    Assert.assertTrue(...);
}
But I can't seem to find the logs! Can someone please help?
The testReportDir property is read-only. You'll need to set Project.testReportDirName instead (see the sketch after the following snippet). You should be able to enable test logging like so:
test {
    testLogging.showStandardStreams = true
}
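For the report location itself, a minimal sketch of what setting that property could look like, assuming an older Gradle version where the Java plugin exposes testReportDirName (and testResultsDirName) as project properties; the directory names are illustrative:
// build.gradle -- directory names are illustrative
testReportDirName = 'my-test-reports'     // HTML reports, resolved against the reports dir (build/reports by default)
testResultsDirName = 'my-test-results'    // test results, resolved against the build dir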

Automatically restart GulpJS watch when gulp file changes

Let's say I have gulp watch running and I add a file to a task. Is there any way to have the gulp watch process automatically restart when a task changes?
gulp.watch isn't working for new or deleted files.
In order to accomplish that, you can use the gulp-watch plugin: https://www.npmjs.org/package/gulp-watch
var gulp = require('gulp'),
    watch = require('gulp-watch');

gulp.task('live', function () {
    watch({glob: 'global/files/**/*.extension'}, function(files) {
        // Do stuffs
    });
});
This solution comes from this question:
Gulps gulp.watch not triggered for new or deleted files?

Is it possible to configure system properties dynamically for a Gradle test task?

Is it possible to configure system properties dynamically for a Gradle test task? I haven't found a way to make it work.
I am working within my own Gradle plugin, GwtPlugin. The apply() method looks like this:
/** Apply the plugin. */
void apply(Project project) {
    project.plugins.apply(JavaPlugin)
    project.plugins.apply(WarPlugin)
    project.extensions.create("gwt", GwtPluginExtension, project)
    project.extensions.create("testSuite", TestSuitePluginExtension, project)
    project.convention.plugins.gwt = new GwtPluginConvention(project)
    applyGwt(project)
    applyTestSuite(project)
}
In the applyTestSuite() method, I create tasks for my test suites. The definition for the integrationtest task looks like this:
// Run integration tests, assumed to be found in a class suites/IntegrationTestSuite.
project.task("integrationtest", type: org.gradle.api.tasks.testing.Test, dependsOn: project.tasks.buildApplication) {
    workingDir = { project.testSuite.getWorkingDir() == null ? project.projectDir : project.testSuite.getWorkingDir() }
    scanForTestClasses = false
    enableAssertions = false
    outputs.upToDateWhen { false }
    include "suites/IntegrationTestSuite.class"
    systemProperty "integration.test.server.wait", project.gwt.getServerWait()
    beforeSuite { descriptor ->
        if (descriptor.className == "suites.IntegrationTestSuite") {
            project.convention.plugins.gwt.rebootDevmode()
        }
    }
    afterSuite { descriptor ->
        if (descriptor.className == "suites.IntegrationTestSuite") {
            project.convention.plugins.gwt.killDevmode()
        }
    }
}
I want to get configuration for the integration.test.server.wait system property from project.gwt.getServerWait(). I can't figure out how to make this work, and I'm beginning to think it's not possible.
If I hardcode the system property, everything works as expected:
systemProperty "integration.test.server.wait", 10
The problem seems to be that the system property is set when the task is defined, but my extension doesn't have any values at that point. I can't figure out how to work around this.
For instance, I tried putting the project.gwt.getServerWait() call in a closure, but in that case the system property gets set to a string like:
com.me.gradle.GwtPlugin$_applyTestSuite_closure10_closure42#320de756
I also tried moving the systemProperty line to a doFirst block. In that case, the doFirst block gets a sensible value from my extension (I can print it), but the assignment is apparently too late to influence the test runner.
Is there any way for me to accomplish this? If not, is there another way to pass dynamic configuration to my test runner?
I have found a way to make this work. The trick is to set the system property on the test task sometime later, when the property is certain to be available. The simplest way to make that happen seems to be via a dummy dependency:
project.task("integrationtestconfig") << {
outputs.upToDateWhen { false }
project.tasks.integrationtest.systemProperty("integration.test.server.wait", project.gwt.getServerWait())
}
project.task("integrationtest", type: org.gradle.api.tasks.testing.Test,
dependsOn: project.tasks.buildApplication, project.tasks.integrationtestconfig)
...
It's not the elegant solution I was hoping for, but it does work and it's not too difficult to follow.
What I do is:
ext {
    // Sets a sensible default
    myProperty = project.properties.myProperty
}
task preTest << {
    // Computes the property...
    project.ext.myProperty = ...
}

task myTest {
    ...
    doFirst {
        systemProperty 'myProperty', project.ext.myProperty
    }
}
I define a sensible default value in the gradle.properties file:
myProperty = A sensible default value
In a multi-module environment, it can be trickier. I then use rootProject.ext.myProperty from the test task, as sketched below.
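A minimal sketch of that multi-module variant, assuming the property is computed by a preTest task in the root project and consumed from a subproject's test task (the names follow the example above and are otherwise illustrative):
// root build.gradle
ext {
    myProperty = project.properties.myProperty   // sensible default from gradle.properties
}

task preTest << {
    rootProject.ext.myProperty = 'computed value'
}

// a subproject's build.gradle
test {
    dependsOn rootProject.tasks.preTest
    doFirst {
        systemProperty 'myProperty', rootProject.ext.myProperty
    }
}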
Dunno what version you were using to do this, but this seems the most convenient way to me:
task doSomethingBeforeTest {
    doLast {
        // some stuff to do
        test {
            systemProperties['some.property'] = 'prop'
        }
    }
}
Basically, just put that test block in your task and set the property. (This works as of Gradle 4.0 - not sure about previous versions, but I imagine it would.)
I'm not sure if this works, but have you tried doing it this way
doLast {
    systemProperty "integration.test.server.wait", project.gwt.getServerWait()
}
within your plugin script?
Maybe this has to do with the phase (configuration, ...) in which things are resolved in Gradle.