Can gcloud builds submit run in background? - google-cloud-platform

I'm trying to automate my builds using Google Cloud Build. For various reasons, the primary flow will be to kick off builds manually rather than using a GitHub trigger.
Right now it looks like when you run
gcloud builds submit .
it kicks off the build process, but the gcloud command must stay running for the build to continue. If I Ctrl-C it, the entire build process stops.
I could run it in the background, but presumably a network disconnection or my laptop going to sleep would also interrupt the build(?)
Clearly the build can run in the background, since that's what a GitHub trigger does; is there any way to do the same thing from the command line?

If you add --async to your gcloud builds ... command, the job will be run asynchronously, as a long-running operation.
You may query the state of this operation in order to determine the state of the build.
https://cloud.google.com/sdk/gcloud/reference/builds/submit
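A minimal sketch of that flow (BUILD_ID stands for whatever ID the submit command prints):
# Submit the build and return immediately; it keeps running server-side.
gcloud builds submit --async .
# List builds that are still in progress.
gcloud builds list --ongoing
# Poll one build's status (QUEUED, WORKING, SUCCESS, FAILURE, ...).
gcloud builds describe BUILD_ID --format='value(status)'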
Alternatively, you may use something like Linux screen to keep any job running after you detach.

Related

Programmatically "stop" SageMaker instance

I can automatically shut down a SageMaker instance by using a lifecycle configuration, as indicated here:
https://stackoverflow.com/questions/53609409/automatically-stop-sagemaker-notebook-instance-after-inactivity
Is there a way to achieve this programmatically, by means of some "image terminal" command?
By "image terminal" I mean the Linux shell that can be opened via "Launcher" in "SageMaker Studio".
My use case is large computational jobs, where the idle time allowed by the automatic solution would be quite expensive.
It would be useful to have a shutdown_instance() to add as the last command in lengthy shell scripts.
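For a classic notebook instance (not Studio) something along these lines could, in principle, serve as that last command; a rough sketch, assuming the instance's IAM role allows sagemaker:StopNotebookInstance and jq is installed:
# Read this instance's name from the local metadata file,
# then ask SageMaker to stop the instance.
NAME=$(jq -r '.ResourceName' /opt/ml/metadata/resource-metadata.json)
aws sagemaker stop-notebook-instance --notebook-instance-name "$NAME"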

AWS CodeBuild projects not running concurrently

I have multiple different, independent AWS CodeBuild projects set up. All of them are using the aws/codebuild/amazonlinux2-x86_64-standard:3.0 image.
What I expect is for them to run concurrently whenever a BitBucket webhook triggers them on PUSH. What happens instead is that one build from one project runs, while builds belonging to the other projects sit in QUEUED status, waiting for the first build to complete. Only then is their build environment provisioned.
I have also tried the Ubuntu-based standard:5.0 runtime.
Is there a way to instruct CodeBuild to run different build projects in parallel, not waiting for each other to finish?

Cloud-init script for Google Cloud Platform

I am looking for a way to execute a script at instance launch in Google Cloud Platform, similar to user data in AWS. I checked 'Startup script', but it is executed at every boot. Is there any way to achieve this?
Yes, according to the documentation the startup script runs on each boot, and there's no option available to change this behavior:
Compute Engine lets you create and run your own startup scripts on your virtual machine (VM) instances to perform automated tasks every time your instance boots up. Startup scripts can perform actions such as installing software, performing updates, turning on services, and any other tasks defined in the script.
To solve this issue you can use the following workaround (sketched below):
Set up a flag, such as a file on the disk, the first time your startup script runs.
Check for this flag at the start of your startup script and exit without any action if it exists.
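A minimal sketch of that guard (the flag path is illustrative; anything on the persistent disk works):
#!/bin/bash
FLAG=/var/lib/startup-done          # flag file marking that first boot already ran
if [ -f "$FLAG" ]; then
  exit 0                            # not the first boot; nothing to do
fi
# ... one-time setup (install software, configure services, etc.) ...
touch "$FLAG"                       # record that the first-boot work is done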

How to run dependent jobs on PCF?

I have two batch applications, e.g. batchapp1 and batchapp2. I want to run batchapp2 after batchapp1 completes. Can we achieve this with the PCF Scheduler, or with the Spring Cloud Data Flow server?
Right now we do this using Control-M, running on JVMs on VMs (not in the cloud).
What you need is Spring Cloud Data Flow's Composed Task functionality. With this, you'd be able to orchestrate a series of Tasks/Batch jobs as a Directed Acyclic Graph. A graph can include sequential steps, parallel steps, or both, where each step is a Task/Batch job.
For your example, in SCDF, the DSL representation would look like:
task create foo --definition "batchapp1 && batchapp2"
Upon launching the task definition foo in SCDF on PCF, batchapp1 runs first, and on its successful completion batchapp2 runs next. You can also add transitions to run cleanup/error-handling steps based on the exit code of each step, as in the example below.
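For instance, a transition to a hypothetical cleanup task (an assumed task name, using the composed-task DSL's exit-code transitions) could look like:
task create foo --definition "batchapp1 'FAILED' -> cleanup && batchapp2"
Here batchapp1 failing routes to cleanup, while a successful exit continues on to batchapp2.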
As an alternative, you could do all this on the interactive drag & drop visual canvas, too.
Also note that in PCF all the steps would be launched as short-lived CF Tasks, which run for a finite time: each runs only as long as the app needs to, and is then cleanly shut down to free up resources.

Jenkins - Executing a script before any build job built

I'd like to execute a script before I execute any project/build on a Jenkins server. I've checked out the BuildWrapper script and a couple of others.
These all seem to be scoped to a specific job, but I'd like this to work across every job/build on this server.
You could add an upstream job that all other jobs depend on.
Specifically, in the build steps of your job you can choose "Trigger/Call builds on other projects", add ParentJob, and select "Block until the triggered projects finish their builds" before invoking your job's own build steps.