We recently started migrating from a very old monolithic Ant build to Gradle. During the setup I realized that the Gradle wrapper task only provides a gradlew script for the root project, but not for the sub-modules.
Is there a way to also provide copies of this script to the modules, pointing to the root project's gradle directory for the wrapper jar?
I use a custom bash script to be able to run gradlew from subdirectories. It looks in the parent directories for a gradlew script and calls it. With this, you don't need to copy the gradlew script into the subproject directories; just put the script on your path. The script is available on GitHub: https://gist.github.com/breskeby/5913145
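As a minimal sketch of such a lookup script (the linked gist is the real thing; this just illustrates the idea, and the script name on your path is arbitrary):

#!/usr/bin/env bash
# Walk up from the current directory until a gradlew script is found,
# then invoke it, passing all arguments through.
dir="$PWD"
while true; do
  if [ -x "$dir/gradlew" ]; then
    exec "$dir/gradlew" "$@"
  fi
  [ "$dir" = "/" ] && break
  dir="$(dirname "$dir")"
done
echo "No gradlew found in any parent directory" >&2
exit 1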
There's no need to provide such copies. A single copy of the Gradle wrapper is enough, and it should be placed at the top of the project structure.
If you need to run a task that is located in a subproject, provide the path to that task, e.g.
gradlew :view:module1:jar
I am trying to modify my configuration file, dataSettings.json, located somewhere inside the build artifacts folder. Figuring out the correct access path to it is like working in the dark. Using "**/dataSettings.json" as a path doesn't work in my task, since I don't know the artifact folder's structure, nor whether dataSettings.json even exists.
Is there a way to quickly view the contents of a build artifacts folder within DevOps?
Add a script step in your shell scripting language of choice (Bash, PowerShell, Windows command prompt, etc.) that recursively outputs the directory structure. Specific commands are easy to Google, e.g. PowerShell would be gci -rec, DOS would be dir /s, and Bash would be ls -R.
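For example, on a Linux agent, a one-line Bash script step is enough. System.ArtifactsDirectory is the predefined release-pipeline variable for where artifacts are downloaded (in a build pipeline, Build.ArtifactStagingDirectory plays a similar role); adjust it to wherever your pipeline places the artifacts:

# Recursively list everything under the downloaded artifacts
ls -R "$SYSTEM_ARTIFACTSDIRECTORY"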
You can quickly view the contents of the artifacts in many of the tasks in your release pipeline.
For example, if you are using the File Transform task or the Azure App Service Deploy task, you can click the three dots at the right end of the Package or folder field to view the contents and folder structure of the artifacts.
The same goes for the Source Folder field of the Copy Files task, for example.
If the artifact is a zip file, you can navigate to its corresponding build pipeline run and download the artifact locally to check its contents. You can download the build artifacts from the build summary page.
When using the AWS SAM CLI to build a serverless application, it locates dependencies magically and installs them all as part of the "build" step. For example, with a NodeJS application:
$> sam build
Building resource 'HelloWorldFunction'
Running NodejsNpmBuilder:NpmPack
Running NodejsNpmBuilder:CopyNpmrc
Running NodejsNpmBuilder:CopySource
Running NodejsNpmBuilder:NpmInstall
Running NodejsNpmBuilder:CleanUpNpmrc
Build Succeeded
Built Artifacts : .aws-sam/build
Built Template : .aws-sam/build/template.yaml
Commands you can use next
=========================
[*] Invoke Function: sam local invoke
[*] Deploy: sam deploy --guided
$>
Looking at the official documentation, they're happy to simply treat it like magic, saying that it:
iterates through the functions in your application, looks for a manifest file (such as requirements.txt) that contains the dependencies, and automatically creates deployment artifacts that you can deploy to Lambda
But what if I have a dependency beyond just those specified in the manifest file? What if my code depends on a compiled binary file, or a static data file?
I would like to add additional build steps so that when I run sam build it compiles these files or copies them appropriately. Is there any way to do this?
Under the hood, sam build runs npm install. So if you hook your own script into an npm lifecycle step such as preinstall in package.json, sam build will execute that step too.
package.json
{
  ...
  "scripts": {
    "preinstall": "cp -r ../../../common ./"
  }
  ...
}
The preinstall script above is a hack that embeds the common directory from the root folder of the sam init project into the zip of each Lambda handler, so that it can be referenced from each of them.
You should also create a symbolic link in the local Lambda handler directory, e.g. ln -s ../common ./common, so that local runs and the deployed Lambda work with the same code.
You will need to wrap this command in another custom command and add the steps you need to it.
You can create a makefile with multiple targets that satisfy your requirements. I haven't used sam build before; I usually have a make target for that purpose, along the lines of the sketch below.
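A minimal sketch of such a makefile (the data/ directory, the native/ subproject, and the hello-world/ handler folder are hypothetical placeholders for whatever extra inputs your functions need):

# Run the extra steps, then delegate to sam build
build:
	cp -r data/ hello-world/data/   # hypothetical static data files
	$(MAKE) -C native               # hypothetical compiled dependency
	sam build

.PHONY: build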
You can give it a try with this bootstrap template, https://github.com/healthbridgeltd/nodejs-sam-bootstrap, which is more efficient than using sam build.
I am new to Jenkins, especially to using Python scripts in Jenkins. The problem I am facing is as follows:
I am trying to run a Python script from a Python file in a post-build step in Jenkins. I have added all the plugins required for that purpose, to my understanding, i.e. I have included the PostBuildScript plugin, the Python plugin, etc.
Now when I build, the console output shows that an invalid script command caused the failure. I have attached the results below. Can anybody help me with that, please?
In the post-build step I am providing the full, absolute path to the Python script file, i.e.
(screenshot: the Execute Python Script post-build step showing the script path)
(screenshot: the console output results showing the failure)
It may be useful to mention that I have also tried using just the path without writing python before it, and with forward as well as backward slashes in the path, without any success.
I have managed to resolve the issue. There are two parts to the solution:
First, if you want to run a simple Python script in a post-build step, add a post-build step of type Execute Python Script (this requires installing the post-build plugin). In the window created after adding the post-build step, you can simply put any Python command to run.
Second, if you would like to run a list of commands from a Python script file from that same post-build step window, you have to make sure to put all the required Python files you want to execute into the Jenkins workspace's project directory (the project for which Jenkins is running).
Moreover, for Python 2.7, in order to execute that Python script file you simply need to write:
execfile('file.py')
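Note that execfile was removed in Python 3; if your script runs under Python 3, the equivalent one-liner is:

exec(open('file.py').read())  # Python 3 replacement for execfile('file.py')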
One more thing to remember: add the path to python.exe to your environment variables.
I have an application that is built with a build script named linuxApp.gradle. We have specified the following in settings.gradle:
rootProject.name = "JobThreader"
As long as the root project folder is also named "JobThreader", when we execute the installApp task from the application plugin, the application is built to
JobThreader/build/install/JobThreader
However, if the root project folder is any other name, like "workspace" in the case of Jenkins, then the application is built to
workspace/build/install/workspace
We have verified this behavior both on our Linux Jenkins server and our local Windows machine.
We have attempted the following commands, with identical results:
gradlew clean installApp -b linuxApp.gradle
gradlew clean installApp -b linuxApp.gradle -c settings.gradle
How can we get the application to install to workspace/build/install/JobThreader in our Jenkins example?
When -b is used, any settings script will be ignored. (-b can be useful for experimentation, but isn't typically used for real builds.) When -b is not used and a settings script is found or passed explicitly via -c, it's up to that settings script to configure the names of build scripts. For example:
rootProject.buildFileName = "linuxApp.gradle"
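Putting the two settings together, a settings.gradle along these lines should keep the project name stable regardless of the checkout directory, as long as the build is invoked without -b:

// settings.gradle
// Name the root project explicitly so it does not default to the
// directory name ("workspace" on Jenkins)...
rootProject.name = 'JobThreader'
// ...and point Gradle at the non-default build script, replacing -b.
rootProject.buildFileName = 'linuxApp.gradle'

Invoking gradlew clean installApp without -b (and without -c, which is only needed when the settings script is in a non-default location) should then install to workspace/build/install/JobThreader.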
I'm using Jenkins to build a Maven 2 project. As part of the build, a couple of jar files get generated in the target directory. I would like Jenkins to archive/copy a specific jar from the target location to a custom folder.
How can I achieve this? I've tried using the 'Archive the artifacts' post-build option, but it does not allow me to select the file under target. I get an error message saying such a location does not exist.
I'm new to Jenkins so any help is appreciated.
Thanks.
Sirius
You may have your file specification or the base directory for the artifacts wrong. From the help text:
Can use wildcards like 'module/dist/**/*.zip'. See the includes attribute of Ant fileset for the exact format. The base directory is the workspace.
So you'll need to figure out where your target directory is relative to the workspace directory.
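For a typical Maven checkout at the workspace root, a pattern like one of these (relative to the workspace) usually matches the generated jars; the first is for a single-module build, the second matches jars in any module:

target/*.jar
**/target/*.jar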
The archive feature copies/saves your build artifacts out of the workspace into the build's individual directory. You cannot specify where it puts them. That said, I would probably leave archiving turned on if you'll ever need to refer back to a previous version.
You can use a script build step to create the dir if it does not exist and perform the copy.
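For instance, an Execute shell step along these lines (the destination folder and jar name are illustrative placeholders):

# Create the custom folder if it does not exist, then copy the jar
mkdir -p /path/to/custom/folder
cp target/myapp-*.jar /path/to/custom/folder/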
But you have not said why you want to move the artifacts around. If it is to make them available to other projects, you should look instead at the Copy Artifact build step.