I'm using Jenkins to build a Maven 2 project. As part of the build, a couple of jar files get generated in the target directory. I would like Jenkins to archive/copy a specific jar from the target location to a custom folder.
How can I achieve this? I've tried using the 'Archive the artifacts' post-build option, but it does not let me select the file under target. I get an error message saying such a location does not exist.
I'm new to Jenkins so any help is appreciated.
Thanks.
Sirius
You may have your file specification or the base directory for the artifacts wrong. From the help text:
Can use wildcards like 'module/dist/**/*.zip'. See the includes attribute of Ant fileset for the exact format. The base directory is the workspace.
So you'll need to figure out where your target directory is relative to the workspace directory.
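For example, assuming a standard Maven layout (module name hypothetical), patterns along these lines should match the jars:

    target/*.jar
    mymodule/target/*.jar

The pattern **/target/*.jar will match a target directory at any depth, which is handy for multi-module builds.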
The archive feature copies/saves your build artifacts out of the workspace into the build's individual directory. You cannot specify where it puts them. That said, I would probably leave archiving turned on if you'll ever need to refer back to a previous version.
You can use a script build step to create the dir if it does not exist and perform the copy.
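A minimal sketch of such a step, assuming a bash shell build step and a hypothetical destination folder:

    # Create the custom folder if it does not exist, then copy the jar.
    # Both paths are hypothetical - adjust to your job's layout.
    mkdir -p /var/artifacts/myproject
    cp target/myproject-*.jar /var/artifacts/myproject/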
But you have not said why you want to move the artifacts around. If it is to make them available to other projects, you should look instead at the Copy Artifact build step.
Related
I am running a build through a CodeBuild pipeline. I am uploading artifacts at each stage, as documented, which is working fine. As you know, each time a build runs, a new folder is created in the artifact folder (all in S3) for the new set of artifacts to be uploaded. What I want to do is retrieve the name of that new folder in my buildspec so I can use it as a variable. Does anyone have a link or a way I can reference this? I would settle for the entire URL, which I could then parse.
I am trying to modify my configuration file, dataSettings.json, located somewhere inside the build artifacts folder. Figuring out the correct access path to it is like working in the dark. Using "**/dataSettings.json" as a path doesn't work in my task, since I don't know the artifact's folder structure, nor whether dataSettings.json even exists.
Is there a way to quickly view the contents of a build artifacts folder within DevOps?
Add a script step in your shell scripting language of choice (bash, PowerShell, Windows command prompt, etc.) that recursively outputs the directory structure. Specific commands are easy to Google, e.g. PowerShell would be gci -rec, DOS would be dir /s, and bash would be ls -R.
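For instance, a minimal bash script step (the variable below is the standard Azure Pipelines working-directory variable; adjust the root if your artifacts land elsewhere):

    # Recursively list everything under the default working directory,
    # then search for the specific file in question.
    ls -R "$SYSTEM_DEFAULTWORKINGDIRECTORY"
    find "$SYSTEM_DEFAULTWORKINGDIRECTORY" -name 'dataSettings.json'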
You can quickly view the contents of the artifacts in many of the tasks in your release pipeline.
For example, if you are using the File Transform task or the Azure App Service Deploy task, you can click the three dots at the right end of the Package or folder field to view the contents and folder structure of the artifacts.
The Source Folder field of the Copy Files task works the same way.
If the artifact is a zip file, you can navigate to the corresponding build pipeline run and download the artifact locally to check its contents. You can download the build artifacts from the Build summary page.
I am trying to switch over to using CodeBuild to build my code so I can then easily push it to my EC2 instances instead of manually building and copying.
I can manually run ant on my station and all will build as it should.
I am now trying to use the AWS CodeBuild console to try this.
I zipped up my source code files, put the zip in an S3 bucket, and put its location in the source fields of AWS CodeBuild. I have the build.xml in this same bucket, and I also put the build.xml at the base of the code's zip file. In the build commands I put "ant".
I assume that the build.xml needs to go somewhere else?
Do I need more than just "ant" in the build commands? That is all I use when I manually build the project.
From what I have read, I should be able to zip up my code, put it in the S3 location, and CodeBuild will extract it and build it, correct?
Also, under "Environment: How to Build" - what is the "Output files" section for? It's not for the artifacts that are built, correct?
Any other tips or tricks? I am very new to all of this, so any help is appreciated! I just learned about Ant this week. This is building a rather large project with many classes - will this cause an issue? Like I stated earlier, I do have it building fine when I run it manually on my system.
Here is the error I get when I build through Code Build:
[Container] 2019/03/21 15:32:27 Entering phase BUILD
[Container] 2019/03/21 15:32:27 Running command ant
Buildfile: build.xml does not exist!
Build failed
I figured out my issue - I zipped the build files from the folder level and not the root level. I re-zipped and it can now see the build.xml.
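In other words, the archive has to be created from inside the project folder so that build.xml sits at the root of the zip. A sketch, with a hypothetical project folder name:

    # Wrong: this nests everything under myapp/ inside the zip,
    # so CodeBuild cannot find build.xml at the root.
    # zip -r source.zip myapp/

    # Right: zip the contents from inside the folder.
    cd myapp
    zip -r ../source.zip .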
I built again with these changes and it looks like I am close! It failed for the following -
[Container] 2019/03/21 20:57:13 Expanding myapp.jar
[Container] 2019/03/21 20:57:13 Skipping invalid artifact path myapp.jar
[Container] 2019/03/21 20:57:13 Phase complete: UPLOAD_ARTIFACTS Success: false
[Container] 2019/03/21 20:57:13 Phase context status code: CLIENT_ERROR Message: no matching artifact paths found
Isn't myapp.jar what the build is creating?
I am very confused as to what the Artifact/name should be - isn't this what is being created from the build? It is asking for an ARN - how can there be an ARN for it when it has not been created yet?
Also very confused as to what the Environment/Output files field is. It is required, but I have no idea what should go in it. It states that output files cannot be empty. Does this mean it wants all the class files that are being built? If so, this build is creating over 30 class files in multiple locations - that is a ton to list.
Thanks
Ernie
I have it working! I will post my findings for others that might be struggling -
So I figured out that "Output files" means all the files and/or directories that you want to go into your final artifact after everything is built.
I have two directories that I want in the final jar artifact. One is WebContent and the other is build. They both have multiple sub-directories. I put "WebContent/*,build/*" in the Output files field. It gave me a jar artifact, but when I opened the jar it did not have any sub-directories. To include all the sub-directories, I had to set the Output files field to "WebContent/**/*,build/**/*". All sub-directories are now in the archive, and it appears the build was successful.
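One quick way to double-check the result after downloading the artifact (file name hypothetical), since a jar is just a zip:

    # List the entries to confirm the sub-directories were included.
    jar tf myapp.jar
    # or equivalently:
    unzip -l myapp.jar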
Hopefully this can help others out.
Now on to creating a script for this and also getting this to work from GitLab.
I am using the S3 plugin in Jenkins to copy my project from Git to S3.
It's working fine, except that it copies only the top-level files. It doesn't copy the subdirectories or the files within them.
How can I achieve a full copy?
It depends on the OS where the Jenkins job is executed: JENKINS issue 27576 seems to indicate it was an issue, but PR 55 also shows the right syntax to use for a recursive upload:
We had the S3 plugin configured with the source parameter as trunk/build/resources/**/* on Windows builders.
So in your case, make sure your path to upload finishes with /**/* in order to consider all files.
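To illustrate the difference with standard Ant-style glob semantics:

    build/*        matches only files directly under build/
    build/**/*     matches all files under build/, including every sub-directory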
Ant -- copying files and subdirectories from only one subdirectory on a tree
This helped me a lot
If you only want to upload the whole folder to S3, use: foldername/**/
I used this to host a Nuxt project in S3 with the generated dist folder.
I'd like to build my images on Bluemix and not locally.
Can I use COPY to include my own files in the image?
If so, where can I store the files?
Or maybe, if I had the files on GitHub, I could pull them from there?
Yes, you can!
When you run cf ic build or docker build, the contents of your current directory (your "build context") are sent along with the build request. You can use the COPY or ADD instructions to add any file or folder within the build context into your container image, just as you can locally.
If you want to use an online source such as GitHub anyway, you can ADD the file from a URL. For more information, see the Dockerfile reference.
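A minimal Dockerfile sketch showing both options, where the file paths and the GitHub URL are hypothetical:

    FROM ubuntu:16.04
    # COPY takes a path relative to the build context
    COPY config/app.conf /etc/myapp/app.conf
    # ADD can also fetch a remote file by URL
    ADD https://raw.githubusercontent.com/youruser/yourrepo/master/other.conf /etc/myapp/other.conf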
Bonus: you can now use cf ic cp to copy files to and from your containers once they are up and running.