Unable to delete Cloud Deploy Pipeline from GCP

I created a pipeline using the newly launched Cloud Deploy tool on GCP; however, it seems I can't delete the pipeline once it is created.
I used the following command:
gcloud deploy delete --file=clouddeploy.yaml --region=us-central1 --project=myproject
Is there any restriction on deletion? I am getting the following error when I try:
$ deploy-quickstart gcloud deploy delete --file=clouddeploy.yaml --region=us-central1 --project=myproject
ERROR: (gcloud.deploy.delete) FAILED_PRECONDITION: Resource '"projects/myproject/locations/us-central1/deliveryPipelines/my-demo-app-1"' has nested resources

I tried the same command on a pipeline created with the quickstart and got the same error message as you. As mentioned by Atef H. in the comments, you need to use the --force flag, since your pipeline has subresources (releases/rollouts).
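For example, re-running the same delete command with --force should also remove the nested releases and rollouts:
gcloud deploy delete --file=clouddeploy.yaml --region=us-central1 --project=myproject --force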

Related

Deploy secrets Gcloud run from pipeline

I'm trying to deploy to GCP secrets that are stored as variables in Azure DevOps.
But when I try the command
gcloud run deploy --update-secrets=myvar=$(myvar)
as stated in the official documentation, gcloud run deploy --update-secrets=[KEY=VALUE,…], it throws this error:
ERROR: (gcloud.run.deploy) No secret version specified for myvar. Use myvar:latest to reference the latest version.
##[error]Cmd.exe exited with code '1'.
The pipeline correctly recognizes $(myvar), and changing the command to gcloud run deploy --update-secrets=myvar:latest=$(myvar) has no effect.
How should I release this secret?
Thanks
The square brackets are just indications that you can provide multiple values, separated by a comma.
So this should work:
gcloud run deploy --update-secrets="myvar=$(myvar)"
What actually worked was running
gcloud run deploy --update-secrets=myvar=$(myvar):latest
Documentation is available here: https://cloud.google.com/run/docs/configuring/secrets#command-line
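Putting the pieces together, a hedged sketch of a full deploy command (the service name my-service and the image path are placeholders; the value of $(myvar) must be the name of an existing Secret Manager secret):
gcloud run deploy my-service --image gcr.io/my-project/my-image --update-secrets="myvar=$(myvar):latest"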

gcloud beta run deploy --source . throws 412

Due to corporate restrictions, I'm supposed to host everything on GCP in Europe. The organisation I work for has set a restriction policy to enforce this.
When I deploy a Cloud Run instance from source with gcloud beta run deploy --source . --region europe-west1, it seems the command tries to store the temporary files in a storage bucket in the US, which is not allowed. The command then throws a 412 error.
➜ gcloud beta run deploy cloudrun-service-name --source . --platform managed --region=europe-west1 --allow-unauthenticated
This command is equivalent to running `gcloud builds submit --tag [IMAGE] .` and `gcloud run deploy cloudrun-service-name --image [IMAGE]`
Building using Dockerfile and deploying container to Cloud Run service [cloudrun-service-name] in project [PROJECT_ID] region [europe-west1]
X Building and deploying new service... Uploading sources.
- Uploading sources...
. Building Container...
. Creating Revision...
. Routing traffic...
. Setting IAM Policy...
Deployment failed
ERROR: (gcloud.beta.run.deploy) HTTPError 412: 'us' violates constraint 'constraints/gcp.resourceLocations'
I see the Artifact Registry Repository being created in the correct region, but not the storage bucket.
To bypass this I have to create a storage bucket first in the correct region with the name PROJECT_ID_cloudbuild. Is there any other way to fix this?
The error message indicates that the bucket is forced to be created in the US regardless of the organisation policy restricting resources to Europe. As per this public issue tracker comment,
“Cloud build submit creates a [PROJECT_ID]_cloudbuild bucket in the
US. This will of course not work when resource restrictions apply.
What you can do as a workaround is to create that bucket yourself in
another location. You should do this before your first cloud build
submit.”
This is a known issue, and there are two workarounds that can help you achieve what you want.
The first workaround is to use "gcloud builds submit" with additional flags:
Create a new bucket with the name [PROJECT_ID]_cloudbuild in the preferred location.
Point the build at that bucket using --gcs-source-staging-dir and --gcs-log-dir; these flags are required because, if they are not set, gcloud creates a bucket in the US. An example is sketched right after these steps.
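A minimal sketch of the first workaround, assuming europe-west1 and a placeholder Artifact Registry repository/image path:
gsutil mb -l europe-west1 gs://[PROJECT_ID]_cloudbuild
gcloud builds submit . --tag europe-west1-docker.pkg.dev/[PROJECT_ID]/my-repo/my-image --gcs-source-staging-dir=gs://[PROJECT_ID]_cloudbuild/source --gcs-log-dir=gs://[PROJECT_ID]_cloudbuild/logs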
The second workaround is to use a cloudbuild.yaml together with the "--gcs-source-staging-dir" flag:
Create a bucket in the region, dual-region or multi-region you want.
Create a cloudbuild.yaml describing the build and its artifacts (a minimal sketch follows these steps).
You can find an example of the YAML file in external documentation; please note that I cannot vouch for its accuracy since it is not from GCP.
Run the command:
gcloud builds submit --gcs-source-staging-dir="gs://example-bucket/cloudbuild-custom" --config cloudbuild.yaml
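As an illustration only, a minimal cloudbuild.yaml along these lines (the repository my-repo and image name my-image are placeholders) builds the Dockerfile and pushes the image to an Artifact Registry repository in the chosen region:
steps:
- name: 'gcr.io/cloud-builders/docker'
  args: ['build', '-t', 'europe-west1-docker.pkg.dev/$PROJECT_ID/my-repo/my-image', '.']
images:
- 'europe-west1-docker.pkg.dev/$PROJECT_ID/my-repo/my-image'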
Please try these workarounds and let me know if they work for you.

getting get-credentials requires edit permission error on gcp

I'm trying to set up credentials for Kubernetes on my local machine.
gcloud container clusters get-credentials ***** --zone **** --project elo-project-267109
This command works fine when I try it from Cloud Shell, but I get this error when I run it from my terminal:
ERROR: (gcloud.container.clusters.get-credentials) get-credentials requires edit permission on elo-project-267109
I've tried this command from the admin account, from the default service account, and from a new service account with the Editor role assigned, and it still doesn't seem to work for me.
I am using macOS Mojave (10.14.6), and the gcloud SDK version installed on my system is 274.0.1.
I was able to resolve this issue on my local machine, but I was actually trying to build a CI/CD pipeline from GitLab and the issue persists there; I have tried using the gcloud (279.0.0) image version.
I am new to both GitLab and gcloud, and I am trying to build a CI/CD pipeline for the first time.
Run gcloud auth list to see which account you are logged into.
You need to log in with the account that has the correct permissions for the action you're trying to perform.
To set the gcloud account: gcloud config set account <ACCOUNT>
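For example, a short sketch (the account email, cluster name and zone are placeholders):
gcloud auth list
gcloud config set account admin@example.com
gcloud container clusters get-credentials CLUSTER_NAME --zone ZONE --project elo-project-267109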
It turned out to be an image version mismatch issue on GitLab.

Cloud Composer is not getting deleted

The Cloud Composer environment is not getting deleted properly; it fails with this error:
DELETE operation on this environment failed 4 days ago with the following error message:
RPC Skipped due to required preoperation not finished yet.
RPC Skipped due to required preoperation not finished yet.
Please follow the steps below to delete the environment's resources manually (a consolidated command sketch follows the list):
Delete the GKE cluster that corresponds to the environment
Delete the Cloud Storage bucket used by the environment
Delete the related deployments with:
gcloud deployment-manager deployments delete <DEPLOYMENT_NAME> --delete-policy=ABANDON
Then try again to delete the Composer environment with:
gcloud composer environments delete <ENVIRONMENT_NAME> --location <LOCATION>
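A consolidated sketch of the manual cleanup, assuming placeholder names for the cluster, zone, and environment bucket (you can read the real values from the environment's details page):
gcloud container clusters delete <GKE_CLUSTER_NAME> --zone <ZONE>
gsutil -m rm -r gs://<ENVIRONMENT_BUCKET>
gcloud deployment-manager deployments delete <DEPLOYMENT_NAME> --delete-policy=ABANDON
gcloud composer environments delete <ENVIRONMENT_NAME> --location <LOCATION>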
The problem you are facing could be also related with a misconfiguration with the Cloud Composer service account in your project. By default, Cloud Composer environments run as the Compute Engine default service account, but when you are using a custom service account, at a minimum, that service account requires the permissions that the composer.worker role provides to access resources in the Cloud Composer environment. Please refer to this documentation for further details about how to grant a role to a service account.
Please try adding the policy binding for the Cloud Composer API Service Agent role to the service account; the command would be:
gcloud projects add-iam-policy-binding <PROJECT_ID> --member=<MEMBER> --role=roles/composer.serviceAgent
The member should be of the form user|group|serviceAccount:email or domain:domain (refer to documentation).
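For example, a hedged sketch assuming the default Composer service agent account (the project ID and project number are placeholders):
gcloud projects add-iam-policy-binding my-project --member="serviceAccount:service-<PROJECT_NUMBER>@cloudcomposer-accounts.iam.gserviceaccount.com" --role=roles/composer.serviceAgent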
Then please retry removing your Composer environment. I hope you find the above information useful.

gcloud crashed (AttributeError): 'NoneType' object has no attribute 'revisionTemplate'

I'm working with Cloud Run, which still seems to be in beta, and it is preventing me from redeploying as shown below. It works if I delete the service from the GCP console and then deploy the same Docker image as a new service. I could not find a way to set revisionTemplate.
I run this command to deploy a Cloud Run service using gcloud.
gcloud beta run deploy v2-cms --image gcr.io/my-project/v2-cms --quiet
Then it fails with the following:
X Deploying...
. Creating Revision...
. Routing traffic...
Deployment failed
ERROR: gcloud crashed (AttributeError): 'NoneType' object has no attribute 'revisionTemplate'
If you would like to report this issue, please run the following command:
gcloud feedback
To check gcloud for common problems, please run the following command:
gcloud info --run-diagnostics
To fix this issue, please update gcloud to its latest version with gcloud components update.
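For example, updating the SDK and then re-running the same deploy command from the question:
gcloud components update
gcloud beta run deploy v2-cms --image gcr.io/my-project/v2-cms --quiet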
Make sure that your local TensorFlow version is still supported by gcloud: https://cloud.google.com/ai-platform/training/docs/runtime-version-list