Can I use COPY to add files when building images on Bluemix?

I'd like to build my images on Bluemix and not locally.
Can I use COPY to include my own files in the image?
If so, where can I store the files?
Or, if I had the files on GitHub, could I pull them from there?

Yes, you can!
When you run cf ic build or docker build, the contents of your current directory (your "build context") are sent along with the build request. You can use the COPY or ADD instructions to add any file or folder within the build context into your container image, just as you can locally.
If you would rather use an online source such as GitHub, you can ADD the file from a URL. For more information, see the Dockerfile reference.
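As a minimal sketch (the file names, directory, and image tag below are made up for illustration), a build context with a local file copied into the image might look like this:

```shell
# Minimal sketch: a build context containing a file that COPY puts
# into the image. "mybuild", "app.conf", and "myimage" are hypothetical.
mkdir -p mybuild
echo "some setting" > mybuild/app.conf

cat > mybuild/Dockerfile <<'EOF'
FROM ubuntu:20.04
# COPY takes paths relative to the build context (the directory you
# run the build from) and places them in the image.
COPY app.conf /etc/myapp/app.conf
EOF

# The whole directory is sent as the build context:
# cf ic build -t myimage mybuild/    # on Bluemix
# docker build -t myimage mybuild/   # locally
```

The build commands are commented out since they need a configured CLI; the point is that anything inside the context directory is available to COPY and ADD.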

As a bonus, you can now use cf ic cp to copy files to and from your containers once they are up and running.


How do I view the contents of my build artifact folder in Azure DevOps?

I am trying to modify my configuration file, dataSettings.json, located somewhere inside the build artifacts folder. Figuring out the correct access path to it is like working in the dark. Using "**/dataSettings.json" as a path doesn't work in my task, since I don't know the artifact folder's structure, nor whether dataSettings.json even exists.
Is there a way to quickly view the contents of a build artifacts folder within DevOps?
Add a script step in your shell scripting language of choice (Bash, PowerShell, Windows command prompt, etc.) that recursively outputs the directory structure. Specific commands are easy to look up, e.g. gci -rec in PowerShell, dir /s at the Windows command prompt, or ls -R in Bash.
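For example, a Bash script step that dumps the artifact tree might look like this (the folder and file names below are stand-ins for your real staging directory):

```shell
# Sketch of a pipeline script step that prints the artifact folder tree.
# "artifacts" and the files inside it are placeholders for illustration.
mkdir -p artifacts/config artifacts/bin
touch artifacts/config/dataSettings.json artifacts/bin/app.dll

# Recursively list everything so you can see the real paths:
ls -R artifacts

# Or one path per line, which is easier to grep:
find artifacts -type f
```

Once you can see the real layout in the build log, you can replace the blind "**/dataSettings.json" glob with the exact path.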
You can quickly view the contents of the artifacts in many of the tasks in your release pipeline.
For example, if you are using the File transform task or the Azure App Service deploy task, you can click the browse (...) button at the right end of the Package or folder field to view the contents and folder structure of the artifacts.
The same applies to the Source Folder field of the Copy files task, for example.
If the artifact is a zip file, you can navigate to the corresponding build pipeline run and download the artifact locally to check its contents; build artifacts can be downloaded from the build summary page.

Copy subdirectories as well using the Jenkins S3 plugin

I am using s3 plugin in Jenkins to copy my project from GIT to S3.
It works fine, except that it copies only the top-level files; it doesn't copy subdirectories or the files within them.
How can I achieve a full, recursive copy?
It depends on the OS where the Jenkins job is executed: JENKINS issue 27576 seems to indicate this was once a problem, but PR 55 also shows the right syntax to use for a recursive upload:
We had the S3 plugin configured with the source parameter as trunk/build/resources/**/* on Windows builders.
So in your case, make sure the path you upload ends with /**/* so that all files in all subdirectories are considered.
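You can check locally what such a pattern matches using Bash's globstar option (the directory layout below is illustrative, assuming the plugin's glob behaves like a shell **/* glob):

```shell
# Demonstration of the **/* pattern; trunk/build/resources is an
# example layout, not a real project.
mkdir -p trunk/build/resources/css trunk/build/resources/js
touch trunk/build/resources/index.html trunk/build/resources/css/main.css

shopt -s globstar   # make ** match across directory levels in Bash
ls -d trunk/build/resources/**/*
```

The listing should include files at every depth, which is what the plugin needs to upload the full tree.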
The answer to "Ant -- copying files and subdirectories from only one subdirectory on a tree" helped me a lot.
If you want to upload a whole folder to S3, use: foldername/**/
I used this to host a Nuxt project on S3 with the generated dist folder.

Uploading files to a bluemix app and pointing to them from configuration files

I am trying to upload files to my Bluemix app, and I am having problems using and understanding the file system. After I have successfully uploaded files, I want to reference their paths in my configuration files.
Specifically, I want to upload a jar file to the server and later use it as a javaagent.
I have tried approaching this issue from several directions.
I see that I can create a folder in the liberty_buildpack and place the files inside; I can later access it during the compile-release phases from the tmp folder:
/tmp/buildpacks/ibm-websphere-liberty-buildpack/lib/liberty_buildpack/my_folder
Also, in the file system I see when building and deploying the app, I can copy only to the folder located at:
/app
So I copied the JAR file to the /app folder and set it as a javaagent using two methods:
Manually setting the environment variable JAVA_OPTS with a javaagent option pointing to /app/myjar.jar, using cf set-env
Deploying a WAR file of the app using cf push from a wlp server, setting the javaagent inside the server.xml file via the genericJvmArguments attribute
Neither method worked: either the deploy phase of the application failed, or my features simply didn't work.
So I tried searching the application file system using cf files and came up with the app folder, but strangely it didn't have the same files as the folder I deploy, and I couldn't find any connection to the deployed folder or the buildpack.
Can someone explain how this should be done correctly? Namely, after uploading the file, how should I point to it from the environment variable/server file?
I mean, should it be /app/something, or maybe another path?
I have also seen the use of relative paths like #droplet.sandbox; maybe that's the way to address those files? And how should I access those folders from cf files?
Thanks.
EDIT:
As instructed in the comments, I have added the jar file to the system. The problem is that when I add the javaagent option to the environment variable JAVA_OPTS, the deploy stage fails with a timeout error:
payload: {... "reason"=>"CRASHED", "exit_status"=>32, "exit_description"=>"failed to accept connections within health check timeout", "crash_timestamp"=>
1433864527}
The way I am assigning the javaagent is as follows:
cf set-env myApp JAVA_OPTS "path/agent.jar"
I have tried adding it in several locations:
1. I have found that if I add the jar files to my WebContent folder I can find it in: /app/wlp/usr/servers/defaultServer/apps/myapp.war/resources/
2. I have copied the jar file from the /tmp location in the compilation phase to /home/vcap/app/agent.jar
3. I have located the jar file in /app/.java/jre/lib
None of those three paths worked.
I found that if I give a wrong path the system behaves the same, so it may be a path problem.
Any ideas?
Try this:
Put your agent jars in a folder called ".profile.d" inside your WAR package;
cf set-env your-app JAVA_OPTS "-javaagent:/home/vcap/app/.profile.d/your.jar";
Push the war to Bluemix.
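Put together, the sequence would look roughly like this (the app name, jar name, and WAR file are placeholders; the cf commands are commented out since they need a logged-in CLI):

```shell
# Sketch of the suggested steps; "myApp" and "agent.jar" are placeholders.
# The jar is packaged inside the WAR under .profile.d, which lands at
# /home/vcap/app/.profile.d at runtime.
AGENT_PATH=/home/vcap/app/.profile.d/agent.jar
JAVA_OPTS_VALUE="-javaagent:$AGENT_PATH"
echo "$JAVA_OPTS_VALUE"

# Set the variable on the app and push:
# cf set-env myApp JAVA_OPTS "$JAVA_OPTS_VALUE"
# cf push myApp -p myapp.war
```

Note the value starts with -javaagent:, not a bare path; a bare path (as in the question's cf set-env call) is not a valid JVM option.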
Not sure if this is exactly the right answer, but I am using additional jar files in my Liberty application, so maybe this will help.
I push up a myapp.war file to bluemix. Within the war file, inside the WEB-INF folder, I have a lib folder that contains a number of jar files. The classes in those jar files are then used within the java code of my application.
myapp.war/WEB-INF/lib/myPlugin.jar
You could try doing something like that with the jar file(s) you need, building them into the war file.
Other than that, you could try the section "Overlaying the JRE" from the Bluemix Liberty documentation to add jars to the JRE.

Using Bitbucket for existing project

I have an existing Django project on my local machine (in a virtualenvwrapper environment). How do I add it to Bitbucket?
Let say my django project is like this:
project
--manage.py
--apps
--db.sqlite3
...
So should I do 'git init' under 'project' directory?
Since it is developed in a virtualenvwrapper environment, I think only the project files will be pushed to Bitbucket; is that right? If I want to develop the project on a different computer and pull the project files from Bitbucket, how should I do it? I mean, should I create another virtual environment on my new machine and install Django and the necessary packages before importing the files from Bitbucket?
I am new to git, so I don't know what is the best to do it.
So should I do 'git init' under 'project' directory?
Yes, but after that, don't add everything.
Create a .gitignore file first, where you declare the files that shouldn't be versioned (the ones that are generated).
Then add and commit: that updates a local repo.
You can then easily link it to an existing empty Bitbucket repo:
git remote add origin ssh://git@bitbucket.org/username/myproject.git
git push -u origin master # to push changes for the first time
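The whole sequence might look like this end to end (the project directory, ignore patterns, and remote URL are placeholders; the push is commented out since it needs a real remote):

```shell
# Minimal end-to-end sketch; "project" stands in for your Django folder.
mkdir -p project
touch project/manage.py   # stand-in for your existing Django files

# Declare generated/local files before the first add:
cat > project/.gitignore <<'EOF'
*.pyc
__pycache__/
db.sqlite3
EOF

git -C project init -q
git -C project add .
git -C project -c user.name=me -c user.email=me@example.com \
    commit -q -m "Initial commit"

# Link to an empty Bitbucket repo and push (placeholder URL):
# git -C project remote add origin ssh://git@bitbucket.org/username/myproject.git
# git -C project push -u origin master
```

The virtualenv itself lives outside the project directory, so only the source files end up in the repo; on a new machine you recreate the virtualenv and install dependencies before cloning or after.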
Normally, you wouldn't store a binary like db.sqlite3 in a source repo.
But this blog post suggests a way to do so through a custom diff driver:
In a .gitattributes or .git/info/attributes file, give Git a filename pattern and the name of a diff driver, which we'll define next. In my case, I added:
db.sqlite3 diff=sqlite3
Then in .git/config or $HOME/.gitconfig, define the diff driver. Mine looks like:
[diff "sqlite3"]
textconv = dumpsqlite3
I chose to define an external dumpsqlite3 script, since this can be useful elsewhere.
It just dumps SQL to stdout for the filename given by its first argument:
#!/bin/sh
sqlite3 "$1" .dump
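Wired together, the setup might look like this (the repo name and script location are arbitrary; the helper only needs to be on PATH when git diff runs):

```shell
# Sketch: register a textconv diff driver for SQLite files in a fresh repo.
git init -q demo

# The dumpsqlite3 helper; install it anywhere on your PATH:
cat > demo/dumpsqlite3 <<'EOF'
#!/bin/sh
sqlite3 "$1" .dump
EOF
chmod +x demo/dumpsqlite3

# Tell Git which files use the driver, and what the driver runs:
echo 'db.sqlite3 diff=sqlite3' > demo/.gitattributes
git -C demo config diff.sqlite3.textconv dumpsqlite3

git -C demo config diff.sqlite3.textconv   # prints: dumpsqlite3
```

With this in place, git diff on db.sqlite3 shows a diff of the SQL dump instead of "Binary files differ".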

Jenkins CI: how to select the artifacts to be archived

I'm using Jenkins to build a maven 2 project. As part of the build a couple of jar files get generated in the target directory. I would like Jenkins to archive/copy a specific jar from the target location to a custom folder.
How can I achieve this ? I've tried using the 'Archive the artifacts' post build option but it does not allow me to select the file under target. I get a error message saying such a location does not exist.
I'm new to Jenkins so any help is appreciated.
Thanks.
Sirius
You may have your file specification or the base directory for the artifacts wrong. From the help text:
Can use wildcards like 'module/dist/**/*.zip'. See the includes attribute of Ant fileset for the exact format. The base directory is the workspace.
So you'll need to figure out where your target directory is relative to the workspace directory.
The archive feature copies/saves your build artifacts out of the workspace into the build's individual directory. You cannot specify where it puts them. That said, I would probably leave archiving turned on if you'll ever need to refer back to a previous version.
You can use a script build step to create the dir if it does not exist and perform the copy.
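Such a script step could be as simple as this (the jar names and destination folder are placeholders; "target" mimics Maven's output directory):

```shell
# Sketch of a post-build shell step; names are placeholders.
# Simulate Maven's output directory with two generated jars:
mkdir -p target
touch target/myapp-1.0.jar target/myapp-1.0-sources.jar

# Create the custom folder if needed and copy just the jar you want:
mkdir -p custom
cp target/myapp-1.0.jar custom/

ls custom
```

In a real job, target/ already exists under the workspace after the Maven build, so only the mkdir and cp lines are needed.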
But you have not said why you want to move the artifacts around. If it is to make them available to other projects, you should look instead at the Copy Artifact build step.