Using Jenkins Job DSL to set "Polling ignores commits in certain paths" for Git plugin - jenkins-job-dsl

I have a Jenkins job that uses MultiScm to clone 2 git repositories.
During polling, I want it to ignore one of the 2 repos. I can set "Polling ignores commits in certain paths" manually in the configuration to make that work (using ".*" as path to exclude).
I want to enable this through job-dsl as the job is created through it; however, I can't find which part of the config changes. The job's config.xml is identical with or without the "Polling ignores..." setting.
Any idea on how to enable this through job-dsl?

When I add the "Polling ignores commits in certain paths" behaviour, the following elements are added to the config XML:
<project>
  ...
  <scm class="org.jenkinsci.plugins.multiplescms.MultiSCM" plugin="multiple-scms@0.5">
    <scms>
      <hudson.plugins.git.GitSCM plugin="git@2.4.0">
        ...
        <extensions>
          <hudson.plugins.git.extensions.impl.PathRestriction>
            <includedRegions>foo</includedRegions>
            <excludedRegions>bar</excludedRegions>
          </hudson.plugins.git.extensions.impl.PathRestriction>
        </extensions>
      </hudson.plugins.git.GitSCM>
    </scms>
    ...
  </scm>
  ...
</project>
You can use a Configure Block within the git context to add this config:
job('example') {
    multiscm {
        git {
            remote {
                github('jenkins/job-dsl-plugin')
            }
            configure { gitScm ->
                gitScm / 'extensions' << 'hudson.plugins.git.extensions.impl.PathRestriction' {
                    includedRegions('foo')
                    excludedRegions('bar')
                }
            }
        }
    }
}
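To get the behaviour from the question (polling ignores one of the repos entirely), you would presumably leave includedRegions empty and pass '.*' as the excluded region; a minimal sketch:
configure { gitScm ->
    gitScm / 'extensions' << 'hudson.plugins.git.extensions.impl.PathRestriction' {
        includedRegions('')
        excludedRegions('.*')
    }
}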

Related

Vendoring npm packages in deno

How does one vendor an npm package in Deno?
import_map.json:
{
  "imports": {
    "lume/": "https://deno.land/x/lume@v1.12.1/"
  }
}
Lume has some npm dependencies, like https://registry.npmjs.org/markdown-it/-/markdown-it-13.0.0.tgz.
deno.jsonc:
{
  "importMap": "import_map.json",
}
dev_deps.ts:
export * as lume from "https://deno.land/x/lume@v1.12.1/mod.ts";
command:
$ deno vendor --force --unstable dev_deps.ts
# ...
Download https://registry.npmjs.org/markdown-it-attrs/-/markdown-it-attrs-4.1.3.tgz
# ...
thread 'main' panicked at 'Could not find local path
for npm:markdown-it-attrs@4.1.3', cli/tools/vendor/mappings.rs:138:11
I tried adding export * as ma from "npm:markdown-it-attrs"; to dev_deps.ts, but it did nothing.
I found the following issue on GitHub; maybe it has something to do with this.
I didn't find anything about how to resolve the problem in the official Deno documentation or the Lume documentation.
Unfortunately, you currently cannot use an import map in your Deno project if your goal is to publish a module that aims to be used in other applications, simply because you don't control how the Deno runtime will be started.
From the application's point of view, the deno run command cannot search every import_map configuration in your dependencies and handle them properly.
The import_map feature should be used only at the end-application level.
The fallback is to use, by convention, a deps.ts source file to centralize all your dependencies.
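A minimal sketch of that convention, reusing the Lume URL from the question (the consumer file below is hypothetical):
// deps.ts - pin every external URL in one place
export * as lume from "https://deno.land/x/lume@v1.12.1/mod.ts";

// main.ts (hypothetical) - the rest of the project imports through deps.ts
import { lume } from "./deps.ts";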

Looking for documentation on setting up Jest for unit tests of an admin plugin

My project was created with swdc create-project ...
Is there documentation, a tutorial, or a description of the right setup/configuration for unit testing with Jest for a custom plugin in the administration?
This tutorial describes only how to write a test.
But I think there must be official setup documentation, because of versions etc.
EDIT: a tutorial with code is now available
Using the suggested solution and executing the test throws a configuration error:
● Test suite failed to run
Configuration error:
Could not locate module src/core/factory/module.factory mapped as:
undefined/src$1.
Please check your configuration for these entries:
{
  "moduleNameMapper": {
    "/^src(.*)$/": "undefined/src$1"
  },
  "resolver": undefined
}
...
Cause of error:
process.env.ADMIN_PATH is not set but is required in %Project%/custom/plugins/%MyPlugin%/src/Resources/app/administration/node_modules/@shopware-ag/jest-preset-sw6-admin/jest-preset.js
My solution:
set process.env.ADMIN_PATH in %Project%/custom/plugins/%MyPlugin%/src/Resources/app/administration/jest.config.js
// jest.config.js
...
const { join, resolve } = require('path');
process.env.ADMIN_PATH = resolve('../../../../../../../src/Administration/Resources/app/administration');
...
I think it is easiest to just copy and adapt from a plugin that already has jest tests set up. Look at the administration directory for SwagPayPal for example. Copy the dependency and script sections from their package.json. Also copy the entire jest.config.js. Then within the administration directory of your plugin you should be able to npm install followed by npm run unit or npm run unit-watch and it should find *.spec.js files within the test sub-directory.
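For reference, a minimal jest.config.js along these lines might look as follows; the preset reference is an assumption based on the package named in the error above, and the relative path must be adapted to your installation:
// jest.config.js - minimal sketch, not the official Shopware setup
const { resolve } = require('path');

// The @shopware-ag preset reads ADMIN_PATH at load time (see the error above).
process.env.ADMIN_PATH = resolve('../../../../../../../src/Administration/Resources/app/administration');

module.exports = {
    preset: '@shopware-ag/jest-preset-sw6-admin/jest-preset' // assumed preset reference
};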

How do I deploy monorepo code to AWS Lambda using lerna?

I am attempting to make two AWS Lambda functions (written in typescript). Both of these functions share the same code for interacting with an API. In order to not have to copy the same code to two different Lambdas, I would like to move my shared code to a local module, and have both my Lambdas depend on said module.
My initial attempt at sharing code between the two lambdas was to use a monorepo and lerna. My current project structure looks like this:
- lerna.json
- package.json
- packages
  - api
    - package.json
  - lambdas
    - funcA
      - package.json
    - funcB
      - package.json
lerna.json:
{
  "packages": [
    "packages/api",
    "packages/lambdas/*"
  ],
  "version": "1.0.0"
}
In each of my package.json for my Lambda functions, I am able to include my local api module as such:
"dependencies": {
"#local/api": "*"
}
With this, I've been able to move the common code to its own module. However, I'm now not sure how to bundle my functions to deploy to AWS Lambda. Is there a way for lerna to be able to create a bundle that can be deployed?
As cp -rL doesn't work on macOS, I had to come up with something similar.
Here is a workflow that works if all of your packages belong to one scope (@org):
In the package.json of your lerna repo:
"scripts": {
"deploy": "lerna exec \"rm -rf node_modules\" && lerna bootstrap -- --production && lerna run deploy && lerna bootstrap"
}
In the package that contains your lambda function:
"scripts":{
"deploy": "npm pack && tar zxvf packagename-version.tgz && rm -rf node_modules/#org && cp -r node_modules/* package/node_modules && cd package && npm dedupe"
}
Now replace "packagename-version" and "#org" with the respective values of your project. Also add all of the dependent packages to "bundledDependencies".
After running npm run deploy in the root of your lerna mono repo you end up with a folder "package" in the package that contains your lambda function. It has all the dependencies needed to run your function. Take it from there.
I had hoped that using npm pack would allow me to utilize .npmignore files but it seems that that doesn't work. If anyone has an idea how to make it work let me know.
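For illustration, the bundledDependencies entry in the lambda package's package.json could look like this (the package names are hypothetical):
"bundledDependencies": [
  "@org/api",
  "@org/utils"
]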
I have struggled with this same problem for a while now, and I was finally forced to do something about it.
I was using a little package named slice-node-modules, as found here in this similar question, which was good enough for my purposes for a while. As I have consolidated more of my projects into monorepos and begun using shared dependencies which reside as siblings rather than being externally published, I ran into shortcomings with that approach.
I've created a new tool called lerna-to-lambda which was specifically tailored to my use case. I published it publicly with minimal documentation, hopefully enough to help others in similar situations. The gist of it is that you run l2l in your bundling step, after you've installed all of your dependencies, and it copies what is needed into an output directory which is then ready to deploy to Lambda using SAM or whatever.
For example, from the README, something like this might be in your Lambda function's package.json:
"scripts": {
...
"clean": "rimraf build lambda",
"compile": "tsc -p tsconfig.build.json",
"package": "l2l -i build -o lambda",
"build": "yarn run clean && yarn run compile && yarn run package"
},
In this case, the compile step is compiling TypeScript files from a source directory into JavaScript files in the build directory. Then the package step bundles up all the code from build along with all of the Lambda's dependencies (except aws-sdk) into the directory lambda, which is what you'd deploy to AWS. If someone were using plain JavaScript rather than TypeScript, they could just copy the necessary .js files into the build directory before packaging.
I realize this question is over 2 years old, and you've probably figured out your own solutions and/or workarounds since then. But since it is still relevant to me, I assume it's still relevant to someone out there, so I am sharing.
Running lerna bootstrap will create a node_modules folder in each "package". This will include all of your lerna managed dependencies as well as external dependencies for that particular package.
From then on, your deployment of each lambda will be agnostic of the fact that you're using lerna. The deployment package will need to include the code for that specific lambda and the node_modules folder for that lambda - you can zip these and upload them manually, or use something like SAM or CloudFormation.
Edit: as you rightly point out you'll end up with symlinks in your node_modules folder which make things awkward to package up. To get around this, you could run something like this prior to packaging for deployment:
cp -rL lambdas/funcA/node_modules lambdas/funcA/packaged/node_modules
The -L will force the symlinked directories to be copied into the folder, which you can then zip.
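For example, assuming the funcA layout from the question (the zip file name is illustrative):
cp -rL lambdas/funcA/node_modules lambdas/funcA/packaged/node_modules
cd lambdas/funcA/packaged && zip -r ../funcA.zip .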
I have used a custom script to copy the dependencies during the install process. This allows me to develop and deploy the application with the same code.
Project structure
In the package.json file of the lambda_a, I have the following line:
"scripts": {
"install": "node ./install_libs.js #libs/library_a"
},
@libs/library_a can be used by the lambda code using the following statement:
const library_a = require('@libs/library_a')
For SAM builds, I use the following command from the lambdas folder:
export SAM_BUILD=true && sam build
install_libs.js
console.log("Starting module installation")
var fs = require('fs');
var path = require('path');
var {exec} = require("child_process");
if (!fs.existsSync("node_modules")) {
fs.mkdirSync("node_modules");
}
if (!fs.existsSync("node_modules/#libs")) {
fs.mkdirSync("node_modules/#libs");
}
const sam_build = process.env.SAM_BUILD || false
libs_path = "../../"
if (sam_build) {
libs_path = "../../" + libs_path
}
process.argv.forEach(async function (val, index, array) {
if (index > 1) {
var currentLib = libs_path + val
console.log(`Building lib ${currentLib}`)
await exec(`cd ${currentLib} && npm install` , function (error, stdout, stderr){
if (error) {
console.log(`error: ${error.message}`);
return;
}
console.log(`stdout: ${stdout}`);
console.log('Importing module : ' + currentLib);
copyFolderRecursiveSync(currentLib, "node_modules/#libs")
});
}
});
function copyFolderRecursiveSync(source, target) {
var files = [];
// Check if folder needs to be created or integrated
var targetFolder = path.join(target, path.basename(source));
if (!fs.existsSync(targetFolder)) {
fs.mkdirSync(targetFolder);
}
// Copy
if (fs.lstatSync(source).isDirectory()) {
files = fs.readdirSync(source);
files.forEach(function (file) {
var curSource = path.join(source, file);
if (fs.lstatSync(curSource).isDirectory()) {
copyFolderRecursiveSync(curSource, targetFolder);
} else {
copyFileSync(curSource, targetFolder);
}
});
}
}
function copyFileSync(source, target) {
var targetFile = target;
// If target is a directory, a new file with the same name will be created
if (fs.existsSync(target)) {
if (fs.lstatSync(target).isDirectory()) {
targetFile = path.join(target, path.basename(source));
}
}
fs.writeFileSync(targetFile, fs.readFileSync(source));
}
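Since the script is registered as the package's install lifecycle script, a plain npm install inside the lambda package runs it automatically; for SAM builds, the export SAM_BUILD=true && sam build command shown above accounts for the extra directory nesting.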

How to debug Vue CLI 3 unit tests with WebStorm? Debugger doesn't hit breakpoint

Question
What do I have to do to make WebStorm hit the breakpoint?
Is it necessary to set the %NODE_DEBUG_OPTION%? If yes, how do I do this in combination with vue-cli.service?
Steps to reproduce:
vue create myapp
Set options to:
? Please pick a preset: Manually select features
? Check the features needed for your project: Babel, Unit
? Pick a unit testing solution: Jest
? Where do you prefer placing config for Babel, PostCSS, ESLint, etc.? In package.json
? Save this as a preset for future projects? No
Open myapp in WebStorm
Open npm Tool Windows
Set breakpoint in tests/unit/example.spec.js
Right-click on test:unit -> Debug test:unit
console output:
To debug the "test:unit" script, make sure the %NODE_DEBUG_OPTION% string is specified as the first argument for the node command you'd like to debug.
For example:
"scripts": {
"start": "node %NODE_DEBUG_OPTION% server.js"
}
myapp@0.1.0 test:unit C:\Users\c-jay\myapp
vue-cli-service test:unit
PASS tests/unit/example.spec.js
Configuration:
WebStorm 2018.2.4
Vue CLI v3.0.5
According to cli-plugin-jest and npm tasks debug in WebStorm, I've edited the npm test:unit call in the scripts section of my package.json to:
"test:unit": "node %NODE_DEBUG_OPTION% node_modules/@vue/cli-service/bin/vue-cli-service.js test:unit"
WebStorm then hits the breakpoints as expected. This is for Windows; on macOS it should be:
"test:unit": "node $NODE_DEBUG_OPTION node_modules/@vue/cli-service/bin/vue-cli-service.js test:unit"

Gradle test task - Gradle 1.6 vs. Gradle 2.3 or later

I have a Java project.
PS: In my project, I don't have any Java source code in src/test/java; this folder just contains a blank.txt file.
I have two different Gradle versions:
Gradle 1.6 with Java 7 (as Java 8 is NOT compatible with Gradle 1.6, or with any version below 1.10 if I'm correct).
The other version is: Gradle 2.3 with Java 8.
Using either of the above combinations (Gradle 1.6 + Java 7 or Gradle 2.3 + Java 8), my project builds successfully.
Though I noticed one thing: while running the build, Gradle calls the "test" task automatically (by Gradle's design, the test task runs for free). During the Gradle 1.6 + Java 7 run I see the following output.
:jar
:assemble
:compileTestJava UP-TO-DATE
:processTestResources
:testClasses
:test
:check
As you'll notice, compileTestJava is marked UP-TO-DATE because I don't have any test source code (src/test/java contains no source code, or maybe there's nothing new for Gradle to compile since the last build).
But the :test task shows that it ran successfully. I use a jacoco (code coverage) section within the test { .. } task, and that part actually ran (as there is no UP-TO-DATE in front of the test task). The jacoco section is NOT defined in my project's build.gradle; it comes from a top-level $GRADLE_HOME/init.d/some-common-top-level.gradle file (where test { ... } has a jacoco { ... } section in it).
As I mentioned above, the test task didn't say UP-TO-DATE; therefore, after the Gradle build process was complete, I can see it created the following folder/file structure inside the build/tmp/expandedArchives/org.jacoco.... folder:
$ ls -ltr build/tmp/expandedArchives/
total 4
drwxr-xr-x+ 1 e020001 Domain Users 0 Jul 7 20:45 org.jacoco.agent-0.7.2.201409121644.jar_778m6tp3jrtvcetasufl59dmau
$ ls -ltr build/tmp/expandedArchives/org.jacoco.agent-0.7.2.201409121644.jar_778m6tp3jrtvcetasufl59dmau/
total 272
drwxr-xr-x+ 1 e020001 Domain Users 0 Jul 7 20:58 META-INF
-rwxr-xr-x 1 e020001 Domain Users 2652 Jul 7 20:58 about.html
-rwxr-xr-x 1 e020001 Domain Users 272311 Jul 7 20:58 jacocoagent.jar
drwxr-xr-x+ 1 e020001 Domain Users 0 Jul 7 20:58 org
The same is NOT happening when I run Gradle 2.3 and Java 8.
The build is successful, but I'm not getting the build/tmp/expandedArchives/org.jacoco.... folder containing the jacocoagent.jar file.
Any idea why Gradle 2.3 is not creating this jacoco-specific .jar file?
With Gradle 2.3 + Java 8, the following output shows UP-TO-DATE in front of both the :compileTestJava and :test tasks (which was not the case with Gradle 1.6 for the test task).
I ran "gradle clean build".
:compileTestJava UP-TO-DATE
:compileTestGroovy UP-TO-DATE
:processTestResources
:testClasses
:test UP-TO-DATE
:check
I need Gradle 2.3 to generate this jacocoagent.jar under the build/tmp/expandedArchives/org.jacoco..... folder so that I can use it in a downstream Jenkins job (which runs non-unit tests), as this project does have some integration tests. I fetch jacocoagent.jar from the parent main build job (which runs gradle clean build, including the test task) in the downstream job and pass it to the Tomcat JVM while starting Tomcat (so that I can get jacocoIT.exec code coverage for the IT tests). But after I switched to Gradle 2.3, in all projects where I don't have src/test/java, jacocoagent.jar is no longer created, and the Copy Artifact plugin fails while trying to copy the .jar file from the parent job.
One more point:
With Gradle 1.6 + Java 7, running gradle clean build successfully creates that jacocoagent.jar inside the build/tmp/expandedArchives/org.jacoco..... folder, but only when I run gradle clean build or "gradle clean; gradle test".
If I run gradle clean build, then remove the build/tmp folder and just run gradle test, it shows UP-TO-DATE in front of both the :compileTestJava and :test tasks and doesn't create the build/tmp/expandedArchives/org.jacoco.... folder containing jacocoagent.jar.
For more info, I'm attaching the profile run (i.e. using the --profile option) of the gradle test task for Gradle 1.6 + Java 7.
I see in the profile HTML file that when the test task is run, it first calls compileJava (as per Gradle's process logic) and then the test task, and it also resolves the :jacocoAgent dependency (as per the dependency resolution tab).
But with Gradle 2.3 + Java 8, the dependency resolution order and task execution steps are not the same as with Gradle 1.6: there is no reference to the jacocoAgent dependency, as it's not even called.
Running the Gradle 1.6 + Java 7 test task with the -i (or --info) option shows why it ran the test task even though I had no test source code; see the reason below:
Note: Recompile with -Xlint:unchecked for details.
:processResources
Skipping task ':processResources' as it has no source files.
:processResources UP-TO-DATE
:classes
Skipping task ':classes' as it has no actions.
:compileTestJava
Skipping task ':compileTestJava' as it has no source files.
:compileTestJava UP-TO-DATE
:processTestResources
Executing task ':processTestResources' due to:
No history is available.
:testClasses
Skipping task ':testClasses' as it has no actions.
:test
file or directory '/my/workspace/project/build/classes/test', not found
Executing task ':test' due to:
No history is available.
file or directory '/my/workspace/project/build/classes/test', not found
Finished generating test XML results (0.001 secs)
Generating HTML test report...
Finished generating test html results (0.012 secs)
BUILD SUCCESSFUL
You can force the test task to be executed no matter what the status of its inputs and outputs is:
test {
    outputs.upToDateWhen { false }
}
For earlier Gradle versions you can ensure the test classes directory exists with:
task createTestClassesDir << { sourceSets.test.output.classesDir.mkdirs() }
test.dependsOn createTestClassesDir
Summary:
With Gradle 2.3, if there is no valid .java/.groovy (etc.) test code, the test task won't even run, and thus no jacocoagent.jar is created deep inside the build/tmp/expandedArchives/org.jacoco.xxx.... folder.
The solution was to include the following (in a top-level $GRADLE_HOME/init.d/some-global-file.gradle) inside the allprojects { .... } section. All we are doing is: if src/test/java (standard) or any legacy folder structure (e.g. src/java, if your project is laid out that way) doesn't contain any valid test source code, we add a dummy test file (DummyTestXYZ.java or .groovy) and let the test task run, which generates jacocoagent.jar (which we can pass to the Tomcat JVM for generating jacoco reports for non-unit aka integration tests). This way, if your main build job calls a downstream/child job to run your IT tests, it won't fail: it can fetch jacocoagent.jar from the main build job's workspace, since the test task creates it in the build/tmp/expandedArchives/org.jacoco.xx.x.xx..x folder (you can get it using the Copy Artifact plugin in Jenkins).
PS: Change the if-statement logic according to your own folder setup, i.e. the folder in which you want to create the DummyTestXYZ.java file. In our case, all new projects use src/test/java (the standard Maven/Gradle folder structure), and during new project creation we check valid sample unit tests into source control. Thus, in the logic below, we skip creating DummyTestXYZ.java when src/test/java exists, and create it only if the project has the legacy structure: if test/java (the legacy folder for JUnit unit tests) contains no .java programs, or doesn't exist at all, we create it first and then create the dummy file. I know we could have uploaded jacocoagent.jar to some location on the Jenkins server and used that file while starting Tomcat for IT-test coverage. The dummy test file requires the junit:junit:4.10 or 4.11 library for the :compileTestJava task to succeed.
compileJava {
    doLast {
        def dirName = "${projectDir}/test/java"
        // Only act on projects with the legacy layout, i.e. no standard src/test/java
        if (!file("${projectDir}/src/test/java").exists()) {
            if (!file(dirName).exists()) {
                new File(dirName).mkdirs()
            }
            if (file(dirName).exists()) {
                def javaCnt = new FileNameByRegexFinder().getFileNames(dirName, /.*\.java/).size()
                if (javaCnt == 0) {
                    def f = new File(dirName, 'DummyTestXYZ.java')
                    def w = f.newPrintWriter()
                    w.println('import org.junit.Test;')
                    w.println('')
                    w.println('public class DummyTestXYZ {')
                    w.println('@Test')
                    w.println('public void test() {')
                    w.println('}')
                    w.println('}')
                    w.close()
                }
            }
        }
    }
}
test {
    doFirst {
        testResultsDirName = "test-results/UT"
        testReportDirName = "tests/UT"
    }
    maxParallelForks = 5
    forkEvery = 50
    //ignoreFailures = true
    // The following jacoco section is required only in Jenkins,
    // but a developer can enable it if they want this feature to work
    // for their local desktop Gradle builds too.
    jacoco {
        // The following vars work only with Gradle versions >= 1.7
        destinationFile = file("$buildDir/jacoco/UT/jacocoUT.exec")
    }
    doLast {
        if (file("${projectDir}/test/java/DummyTestXYZ.java").exists()) {
            println "++"
            println "++"
            println "++"
            println "======================================================="
            println "DEV Team – Please add valid Unit tests in this project."
            println "======================================================="
            println "++"
            println "++"
            println "++"
            sleep(30 * 1000)
            new File("${projectDir}/build/classes/test").deleteDir()
            new File("${buildDir}/jacoco/UT").deleteDir()
            new File("${buildDir}/test-results/UT").deleteDir()
            delete "${projectDir}/test/java/DummyTestXYZ.java"
        }
    }
}
// Do the same (as the test task above) for any other similar test tasks,
// like integrationTest, acceptanceTest, etc.
jacocoTestReport {
    // Clean any compile-time generated class files (e.g. JiBX class files) so the
    // jacoco task won't fail when it can't find the actual .java/.groovy sources
    // for those generated .class files.
    doFirst {
        delete fileTree(dir: "${buildDir}/classes", include: "**/JiBX_*.class")
    }
    group = "Reporting"
    description = "Generate Jacoco coverage reports after running tests."
    //ignoreFailures = true
    executionData = fileTree(dir: 'build/jacoco', include: '**/*.exec')
    reports {
        xml {
            enabled true
            // The following value is a file
            destination "${buildDir}/reports/jacoco/xml/jacoco.xml"
        }
        csv.enabled false
        html {
            enabled true
            // The following value is a folder
            destination "${buildDir}/reports/jacoco/html"
        }
    }
    sourceDirectories = files(['src/java', 'src/main/java', 'src/main/groovy'])
    classDirectories = files('build/classes/main')
    doLast {
        if (file("${projectDir}/test/java/DummyTestXYZ.java").exists()) {
            delete "${projectDir}/test/java/DummyTestXYZ.java"
        }
    }
}
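If you add a separate integration test task as the comment above suggests, a minimal sketch mirroring the UT settings could look like this (the integrationTest task name and the IT paths are assumptions, not part of the original build):
// Hypothetical integrationTest task; mirrors the UT jacoco settings above
integrationTest {
    doFirst {
        testResultsDirName = "test-results/IT"
        testReportDirName = "tests/IT"
    }
    jacoco {
        destinationFile = file("$buildDir/jacoco/IT/jacocoIT.exec")
    }
}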