Missing folder from CloudBuild - google-cloud-platform

I am using Cloud Build for a build, but I find that it is missing a folder I have in my local directory:
/cloudbuild.yaml
/scripts
    /output
        .gitkeep
/...
One of my build steps writes to the scripts/output directory, but it fails because that directory is missing. It works when I create the directory first (mkdir scripts/output). Why do I need to do that? What's causing the folder to be missing?
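For reference, this is roughly the workaround that currently works for me in cloudbuild.yaml; the ubuntu builder image and step ordering are just an example, and the real step that writes to scripts/output runs afterwards:

steps:
  # Recreate the directory that is missing from the uploaded source
  - name: 'ubuntu'
    args: ['mkdir', '-p', 'scripts/output']
  # ... the step that writes to scripts/output follows here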

Related

Build on Vercel with Prisma is not working recently (how to include schema.prisma file?)

I developed a SvelteKit app with Prisma and am trying to deploy it on Vercel.
In package.json, the configuration below should be set so that the schema.prisma file located in the root path is available when the app is deployed.
"postbuild": "cp prisma/schema.prisma .vercel_build_output/functions/node/render/ && cp node_modules/#prisma/engines/*query* .vercel_build_output/functions/node/render/",
The problem is that an error now occurs during the build on Vercel, which did not happen before (~ May 2022).
I guess the cause of the error is related to a recent update of SvelteKit: the .vercel_build_output directory that used to be generated during the build has recently been replaced by the new .vercel directory. However, the new path structure for index.js (i.e. .vercel/output/functions/render.func/home/s/test/discord-bot-frontend/.svelte-kit/output/server/index.js) is so different from the previous one (i.e. .vercel_build_output/functions/node/render/) that I cannot find the right path for it.
Would you please let me know the right setting for package.json?
Error message:
> discord-bot-frontend@0.0.1 postbuild
> cp prisma/schema.prisma .vercel_build_output/functions/node/render/ && cp node_modules/@prisma/engines/*query* .vercel_build_output/functions/node/render/
cp: cannot create regular file ‘.vercel_build_output/functions/node/render/’: No such file or directory
Error: Command "npm run vercel-build" exited with 1
I found that there is no need to copy anything into .vercel_build_output now; in other words, the postbuild script is not needed anymore.
This is probably because the latest Vercel build handles this itself.
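In practice that means simply deleting the "postbuild" entry (and any script that chained it, such as the "vercel-build" script visible in the error output) from package.json. A minimal sketch of the remaining scripts block, assuming a typical SvelteKit setup:

"scripts": {
  "dev": "vite dev",
  "build": "vite build",
  "preview": "vite preview"
}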

Include Procfile and .ebextensions folder in Jenkins ZIP

I have a Maven project with this folder structure (after running 'mvn clean install' through Jenkins):
PS: there are other files and directories inside this 'target' folder, but the ones I need are just those.
Well, for AWS deployment, I need to create a ZIP file with this exact structure (i.e. without the 'ebs' folder):
My pipeline in Jenkins creates the ZIP with the jar and the 'ebs' folder inside, but I need Procfile and .ebextensions at root level, outside the 'ebs' folder.
Jenkins configuration:
I also tried "ebs", "ebs/", "ebs/*" and "ebs/.". None of them works. What am I doing wrong? It should be simple to include files in a ZIP package, but it isn't.
To include the Procfile file and the .ebextensions folder inside the ZIP package generated by Jenkins, the first step is to eliminate the 'ebs' folder during the Maven build.
With every file/directory at the target root, the Jenkins 'includes' configuration is: myapp.jar,Procfile,.ebextensions/**/* (see the equivalent command-line sketch below).
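As a sanity check, the ZIP that Jenkins should produce from that includes pattern is equivalent to running something like this from the target directory (the jar name is a placeholder for your actual artifact):

cd target
zip -r myapp.zip myapp.jar Procfile .ebextensions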

Uploading files to a bluemix app and pointing to them from configuration files

I am trying to upload files to my Bluemix app and I am having problems using and understanding the file system. After I have successfully uploaded files, I want to reference their paths in my configuration files.
Specifically, I want to upload a jar file to the server and later use it as javaagent.
I have tried approaching this issue from several directions.
I see that I can create a folder in the liberty_buildpack and place the files inside it; I can later access it during the compile/release phases from the tmp folder:
/tmp/buildpacks/ibm-websphere-liberty-buildpack/lib/liberty_buildpack/my_folder
Also, in the file system I see while building and deploying the app, I can only copy to the folder located in:
/app
So I copied the JAR file to the app folder and set it as a javaagent using two methods:
1. Manually setting the environment variable JAVA_OPTS with the javaagent pointing to /app/myjar.jar using cf set-env
2. Deploying a WAR file of the app using cf push from the wlp server and setting the javaagent inside the server.xml file via the genericJvmArguments attribute
Neither of those methods worked: either the deploy phase of the application failed or my features simply did not work.
So I tried searching the application file system using cf files and found the app folder, but strangely it did not have the same files as the folder I deploy, and I could not find any connection to the deployed folder or the buildpack.
Can someone explain how this should be done correctly? Namely, how do I upload the file, and how should I then point to it from the environment variable/server file?
I mean, should it be /app/something, or maybe another path?
I have also seen the use of relative paths like #droplet.sandbox; maybe that is the way to address those files? And how should I access those folders from cf files?
Thanks.
EDIT:
As instructed in the comments, I have added the jar file to the system. The problem is that when I add the javaagent to the environment variable JAVA_OPTS, the deploy stage fails with a timeout error:
payload: {... "reason"=>"CRASHED", "exit_status"=>32, "exit_description"=>"failed to accept connections within health check timeout", "crash_timestamp"=> 1433864527}
The way I am assigning the javaagent is as follows:
cf set-env myApp JAVA_OPTS "path/agent.jar"
I have tried adding it in several locations:
1. I have found that if I add the jar files to my WebContent folder, I can find them in: /app/wlp/usr/servers/defaultServer/apps/myapp.war/resources/
2. I have copied the jar file from the /tmp location in the compilation phase to /home/vcap/app/agent.jar
3. I have located the jar file in /app/.java/jre/lib
None of those 3 paths worked.
I found out that if I give a wrong path the system behaves the same way, so it may be a path problem.
Any ideas?
Try this:
1. Put your agent jars in a folder called ".profile.d" inside your WAR package;
2. cf se your-app JAVA_OPTS -javaagent:/home/vcap/app/.profile.d/your.jar ;
3. Push the war to Bluemix.
Not sure if this is exactly the right answer, but I am using additional jar files in my Liberty application, so maybe this will help.
I push up a myapp.war file to bluemix. Within the war file, inside the WEB-INF folder, I have a lib folder that contains a number of jar files. The classes in those jar files are then used within the java code of my application.
myapp.war/WEB-INF/lib/myPlugin.jar
You could try doing something like that with the jar file(s) you need, building them into the war file.
Other than that, you could try the section Overlaying the JRE from the bluemix liberty documentation to add jars to the JRE.

How do I set the beanstalk .ebextensions .config "sources" key "target directory" to the current bundle directory

I'm working in a python 2.7 elastic beanstalk environment.
I'm trying to use the sources key in an .ebextensions .config file to copy a tgz archive to a directory in my application root -- /opt/python/current/app/utility. I'm doing this because the files in this folder are too big to include in my github repository.
However, it looks like the sources key is executed before the ondeck symbolic link is created to the current bundle directory so I can't reference /opt/python/ondeck/app when using the sources command because it creates the folder and then beanstalk errors out when trying to create the ondeck symbolic link.
Here are copies of the .ebextensions/utility.config files I have tried:
sources:
  /opt/python/ondeck/app/utility: http://[bucket].s3.amazonaws.com/utility.tgz
The above successfully copies to /opt/python/ondeck/app/utility, but then Beanstalk errors out because it can't create the symbolic link from /opt/python/bundle/x --> /opt/python/ondeck.
sources:
  utility: http://[bucket].s3.amazonaws.com/utility.tgz
Above copies the folder to /utility right off the root in parallel with /etc.
You can use container_commands instead of sources as it runs after the application has been set up.
With container_commands you won't be able to use sources to automatically fetch and extract your files, so you will have to use commands such as wget or curl to get your files and untar them afterwards.
Example: curl http://[bucket].s3.amazonaws.com/utility.tgz | tar xz
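Put together, a minimal .ebextensions config along those lines could look like the sketch below; the command name and the relative utility directory are placeholders, and the working directory of container_commands is assumed to be the staging directory of the application:

container_commands:
  01_fetch_utility:
    command: "mkdir -p utility && curl -s http://[bucket].s3.amazonaws.com/utility.tgz | tar xz -C utility"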
In my environment (PHP) there is no transient ondeck directory, and the current directory where my app is eventually deployed is recreated after the commands are run.
Therefore, I needed to run a script post deploy. Searching revealed that I can put a script in /opt/elasticbeanstalk/hooks/appdeploy/post/ and it will run after deploy.
So I download/extract my files from S3 to a temporary directory in the simplest way by using sources. Then I create a file that will copy my files over after the deploy and put it in the post-deploy hook directory.
sources:
  /some/existing/directory: https://s3-us-west-2.amazonaws.com/my-bucket/vendor.zip

files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/99_move_my_files_on_deploy.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/usr/bin/env bash
      mv /some/existing/directory /var/app/current/where/the/files/belong

Jenkins CI: how to select the artifacts to be archived

I'm using Jenkins to build a maven 2 project. As part of the build a couple of jar files get generated in the target directory. I would like Jenkins to archive/copy a specific jar from the target location to a custom folder.
How can I achieve this? I've tried using the 'Archive the artifacts' post-build option, but it does not allow me to select the file under target. I get an error message saying such a location does not exist.
I'm new to Jenkins so any help is appreciated.
Thanks.
Sirius
You may have your file specification or the base directory for the artifacts wrong. From the help text:
Can use wildcards like 'module/dist/**/*.zip'. See the @includes of Ant fileset for the exact format. The base directory is the workspace.
So you'll need to figure out where your target directory is relative to the workspace directory.
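For example, if the Maven module is checked out at the workspace root, a pattern along these lines (the exact path is an assumption about your job layout) should match the jars produced under target; a nested checkout would need something like **/target/*.jar instead:

target/*.jar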
The archive feature copies/saves your build artifacts out of the workspace into the build's individual directory. You cannot specify where it puts them. That said, I would probably leave archiving turned on if you'll ever need to refer back to a previous version.
You can use a script build step to create the directory if it does not exist and perform the copy (see the sketch at the end of this answer).
But you have not said why you want to move the artifacts around. If it is to make them available to other projects, you should look instead at the Copy Artifact build step.
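A minimal sketch of such a script step, assuming a shell build step; the destination folder and jar name are placeholders for your own:

# Create the custom folder if needed, then copy the specific jar out of target
mkdir -p "$WORKSPACE/custom-artifacts"
cp target/myapp-*.jar "$WORKSPACE/custom-artifacts/"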