Google Cloud Liens not protecting from Project Deletion - google-cloud-platform

I have set up gcp liens as described here.
Unfortunately when I try to delete the project using an owner account the project is deleted.
Does it take some time to take effect or is there some other kind of extra configuration?
To do so, I used the command specified in the documentation:
gcloud alpha resource-manager liens create --restrictions=resourcemanager.projects.delete --reason="Super important production system" --project projectId
Then I checked the lien:
> gcloud alpha resource-manager liens list --project projectId --format json
[
  {
    "createTime": "2020-01-23T07:53:19.938621Z",
    "name": "liens/p111111111111-420a1a11-8dee-4b07-a7fe-5112b00e898d",
    "origin": "john@doe.com",
    "parent": "projects/111111111111",
    "reason": "Super important production system",
    "restrictions": [
      "resourcemanager.projects.delete"
    ]
  }
]

You need to have the “Project Lien Modifier” role for your user at the Organization level.
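If the role is missing, it can be granted at the organization level roughly like this (a sketch; [ORG_ID] is a placeholder for your organization ID and john@doe.com a placeholder user):
# assumption: [ORG_ID] is your numeric organization ID
gcloud organizations add-iam-policy-binding [ORG_ID] \
  --member="user:john@doe.com" \
  --role="roles/resourcemanager.lienModifier"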
Then you can open Cloud Shell and run this command:
gcloud alpha resource-manager liens create --restrictions=resourcemanager.projects.delete --reason="Important PJ" --project=[YOUR-PJ-NAME] --verbosity=debug
EDIT:
I tested it in a project with no organization and the lien doesn't work. This feature is in alpha and apparently does not support standalone projects at the moment; it seems to have been designed for large organizations with hundreds of projects.
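To check whether a project actually has an organization parent, something like this should help (projectId is a placeholder); it prints organization (or folder) for projects under an organization and nothing for standalone projects:
gcloud projects describe projectId --format="value(parent.type)"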

Related

Cloud Build Failed to trigger build: generic::permission_denied: Permission denied

I'm trying to use Cloud Build for my Cloud Run project. I have this cloudbuild.json:
{
  "steps": [
    {
      "name": "gcr.io/cloud-builders/docker",
      "args": ["build", "-t", "eu.gcr.io/$PROJECT_ID/keysafe", "."]
    },
    {
      "name": "gcr.io/cloud-builders/docker",
      "args": [
        "push",
        "us-central1-docker.pkg.dev/${PROJECT_ID}/my-docker-repo/myimage"
      ]
    }
  ],
  "options": {
    "logging": "CLOUD_LOGGING_ONLY"
  }
}
And I keep getting a permission denied error. I've tried running it without a service account and using my permissions (I'm the owner), and with a service account even with the owner role.
It was originally working, but since my project transitioned from Container Registry to Artifact Registry I was getting an error:
generic::invalid_argument: generic::invalid_argument: if 'build.service_account' is specified, the build must either (a) specify 'build.logs_bucket' (b) use the CLOUD_LOGGING_ONLY logging option, or (c) use the NONE logging option
That error persisted through both my account and the service account, which is why I switched to building from a cloudbuild.json file, not just my Dockerfile alone.
All the other Stack Overflow articles I've found suggest permissions to assign, but the service account and I have owner permissions and even adding the suggested permissions on top of Owner did not help.
Here are the permissions of the service account:
Here is the trigger configuration:
If anyone ends up in my position, this is how I fixed it.
I ended up deleting the Cloud Run service and the Cloud Build trigger and then recreating them. This gave me a pre-made cloudbuild.yaml, to which I added the option logging: CLOUD_LOGGING_ONLY, still using the same service account. I'm not sure why this fixed it, but it does seem to be working.
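For reference, a minimal cloudbuild.yaml with that option could look roughly like this (a sketch only, reusing the image name from the question; adjust names and registry to your setup):
steps:
  - name: "gcr.io/cloud-builders/docker"
    args: ["build", "-t", "eu.gcr.io/$PROJECT_ID/keysafe", "."]
  - name: "gcr.io/cloud-builders/docker"
    args: ["push", "eu.gcr.io/$PROJECT_ID/keysafe"]
options:
  logging: CLOUD_LOGGING_ONLY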

gcloud builds submit of Django website results in error "does not have storage.objects.get access"

I'm trying to deploy my Django website with Cloud Run, as described in Google Cloud Platform's documentation, but I get the error Error 403: 934957811880@cloudbuild.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object., forbidden when running the command gcloud builds submit --config cloudmigrate.yaml --substitutions _INSTANCE_NAME=trouwfeestwebsite-db,_REGION=europe-west6.
The full output of the command is: (the error is at the bottom)
Creating temporary tarball archive of 119 file(s) totalling 23.2 MiB before compression.
Some files were not included in the source upload.
Check the gcloud log [C:\Users\Sander\AppData\Roaming\gcloud\logs\2021.10.23\20.53.18.638301.log] to see which files and the contents of the default gcloudignore file used (see `$ gcloud topic gcloudignore` to learn more).
Uploading tarball of [.] to [gs://trouwfeestwebsite_cloudbuild/source/1635015198.74424-eca822c138ec48878f292b9403f99e83.tgz]
ERROR: (gcloud.builds.submit) INVALID_ARGUMENT: could not resolve source: googleapi: Error 403: 934957811880@cloudbuild.gserviceaccount.com does not have storage.objects.get access to the Google Cloud Storage object., forbidden
On the level of my storage bucket, I granted 934957811880@cloudbuild.gserviceaccount.com the Storage Object Viewer role, as I see on https://cloud.google.com/storage/docs/access-control/iam-roles that this role covers storage.objects.get access.
I also tried granting Storage Object Admin and Storage Admin.
I also added the "Viewer" role on IAM level (https://console.cloud.google.com/iam-admin/iam) for 934957811880@cloudbuild.gserviceaccount.com, as suggested in https://stackoverflow.com/a/68303613/5433896 and https://github.com/google-github-actions/setup-gcloud/issues/105, but it seems fishy to me to give the account such a broad role.
I enabled Cloud Run in the Cloud Build permissions tab: https://console.cloud.google.com/cloud-build/settings/service-account?project=trouwfeestwebsite
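For reference, the bucket-level grant described above can also be made from the CLI, for example (a sketch; [BUCKET_NAME] is a placeholder for your source bucket):
gsutil iam ch serviceAccount:934957811880@cloudbuild.gserviceaccount.com:roles/storage.objectViewer gs://[BUCKET_NAME]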
With these changes, I still get the same error when running the gcloud builds submit command.
I don't understand what I could be doing wrong in terms of credentials/authentication (https://stackoverflow.com/a/68293734/5433896). I didn't change my Google account password nor revoke permissions of that account for the Google Cloud SDK since I initialized the SDK.
Do you see what I'm missing?
The content of my cloudmigrate.yaml is:
steps:
  - id: "build image"
    name: "gcr.io/cloud-builders/docker"
    args: ["build", "-t", "gcr.io/${PROJECT_ID}/${_SERVICE_NAME}", "."]
  - id: "push image"
    name: "gcr.io/cloud-builders/docker"
    args: ["push", "gcr.io/${PROJECT_ID}/${_SERVICE_NAME}"]
  - id: "apply migrations"
    name: "gcr.io/google-appengine/exec-wrapper"
    args:
      [
        "-i",
        "gcr.io/$PROJECT_ID/${_SERVICE_NAME}",
        "-s",
        "${PROJECT_ID}:${_REGION}:${_INSTANCE_NAME}",
        "-e",
        "SETTINGS_NAME=${_SECRET_SETTINGS_NAME}",
        "--",
        "python",
        "manage.py",
        "migrate",
      ]
  - id: "collect static"
    name: "gcr.io/google-appengine/exec-wrapper"
    args:
      [
        "-i",
        "gcr.io/$PROJECT_ID/${_SERVICE_NAME}",
        "-s",
        "${PROJECT_ID}:${_REGION}:${_INSTANCE_NAME}",
        "-e",
        "SETTINGS_NAME=${_SECRET_SETTINGS_NAME}",
        "--",
        "python",
        "manage.py",
        "collectstatic",
        "--verbosity",
        "2",
        "--no-input",
      ]
substitutions:
  _INSTANCE_NAME: trouwfeestwebsite-db
  _REGION: europe-west6
  _SERVICE_NAME: invites-service
  _SECRET_SETTINGS_NAME: django_settings
images:
  - "gcr.io/${PROJECT_ID}/${_SERVICE_NAME}"
Thank you very much for any help.
The following solved my problem.
DazWilkin was right in saying:
it's incorrectly|unable to reference the bucket
(comment upvote for that, thanks!!). In my secret (configured in Secret Manager; alternatively you can put this in a .env file at the project root folder level, making sure you don't exclude that file from deployment in a .gcloudignore file), I now have set:
GS_BUCKET_NAME=trouwfeestwebsite_sasa-trouw-bucket (project ID + underscore + storage bucket ID)
instead of
GS_BUCKET_NAME=sasa-trouw-bucket
The tutorial in fact stated I had to set the former, but I had set the latter because I found the underscore splitting weird; nowhere in the tutorial had I seen something similar, so I thought it was an error in the tutorial.
Adapting the GS_BUCKET_NAME changed the error of gcloud builds submit to:
Creating temporary tarball archive of 412 file(s) totalling 41.6 MiB before compression.
Uploading tarball of [.] to [gs://trouwfeestwebsite_cloudbuild/source/1635063996.982304-d33fef2af77a4744a3bb45f02da8476b.tgz]
ERROR: (gcloud.builds.submit) PERMISSION_DENIED: service account "934957811880@cloudbuild.gserviceaccount.com" has insufficient permission to execute the build on project "trouwfeestwebsite"
That would mean that at least the bucket is now found, and only a permission is missing.
Edit (a few hours later): I noticed this GS_BUCKET_NAME=trouwfeestwebsite_sasa-trouw-bucket (project ID + underscore + storage bucket ID) setting then caused trouble at a later stage of the deployment, when deploying the static files (the last step of cloudmigrate.yaml). The following seemed to work for both stages (notice that the project ID is no longer in GS_BUCKET_NAME, but in its own environment variable):
DATABASE_URL=postgres://myuser:mypassword@//cloudsql/mywebsite:europe-west6:mywebsite-db/mydb
GS_PROJECT_ID=trouwfeestwebsite
GS_BUCKET_NAME=sasa-trouw-bucket
SECRET_KEY=my123Very456Long789Secret0Key
Then, it seemed that there also really was a permissions problem:
For the sake of completeness: afterwards, I tried adding the permissions as stated in https://stackoverflow.com/a/55635575/5433896, but it didn't prevent the error I reported in my question.
This answer, however, helped me: https://stackoverflow.com/a/33923292/5433896
Setting the Editor role on the Cloud Build service account helped the gcloud builds submit command continue further without throwing the permissions error.
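For completeness, that grant can be made from the CLI along these lines (a sketch using the project and service account from this question; note that Editor is a very broad role, so a narrower one may be preferable):
gcloud projects add-iam-policy-binding trouwfeestwebsite \
  --member="serviceAccount:934957811880@cloudbuild.gserviceaccount.com" \
  --role="roles/editor"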
If you have the same problem: I think a few things mentioned in my question can also help you - for example I think doing this may also have been important:
I enabled Cloud Run in the Cloud Build permissions tab:
https://console.cloud.google.com/cloud-build/settings/service-account?project=trouwfeestwebsite

How to list all projects in GCP that belongs to a specific organization

gcloud allows you to list organizations, folders, or projects, but I didn't find an option to list the projects inside an organization.
Something like:
gcloud projects list --organization=ORG
You can use Cloud Asset Inventory. The base query is the following:
gcloud beta asset search-all-resources \
--asset-types=cloudresourcemanager.googleapis.com/Project \
--scope=organizations/920778098964
You can play with page size if you want to have a long list of results. More details here
I personally prefer to export all the assets to BigQuery and then query what I want there: projects, but also VMs, firewall rules, and so on.
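As a sketch of such an export (the organization ID is the one from the query above; the project, dataset, and table names are placeholders, and the BigQuery dataset must already exist):
gcloud asset export \
  --organization=920778098964 \
  --content-type=resource \
  --bigquery-table=projects/my-project/datasets/my_assets/tables/resource_export \
  --output-bigquery-force
Afterwards you can query the table in BigQuery, e.g. filtering on asset_type = 'cloudresourcemanager.googleapis.com/Project'.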
I think there's no quick way like the --organization argument you mentioned, but it can be accomplished with, for example, the following UNIX-like script:
for project_id in $(gcloud projects list --format='value(project_id)'); do
  org_id=$(gcloud projects describe "$project_id" --format='value(parent.id)')
  if [ "$org_id" = "$the_org_you_want_to_find_out" ]; then
    echo "$org_id > $project_id"
  fi
done
You can use gcloud:
gcloud projects list --filter 'parent.id=id-organization123456 AND parent.type=organization' | awk '{print $1 }' > projects.txt
Not what you asked, but if you want to filter to all projects assigned to a folder you can use --filter='parent.id:40123456789':
$ gcloud organizations list
DISPLAY_NAME ID DIRECTORY_CUSTOMER_ID
example.com 10123456789 C0123abc
$ gcloud resource-manager folders list --organization=10123456789
DISPLAY_NAME PARENT_NAME ID
demos organizations/10123456789 40123456789
test organizations/10123456789 50123456789
$ gcloud projects list --filter='parent.id:40123456789'
PROJECT_ID NAME PROJECT_NUMBER
demo-one demo-one 301123456789
demo-two demo-two 302123456789
demo-three demo-three 303123456789
You can list all the active projects you have access to using the following command:
gcloud projects list
The definition of the command is:
Lists all active projects, where the active account has Owner, Editor
or Viewer permissions. Projects are listed in alphabetical order by
project name. Projects that have been deleted or are pending deletion
are not included.
If you only need the project ID, the name, or the project number, you can use:
gcloud projects list --format 'value(project_id)'
gcloud projects list --format 'value(name)'
gcloud projects list --format 'value(project_number)'

gcloud "config set project" works but can't fetch project?

I'm new to gcloud. I've created a project called playground but can't seem to be able to use it. Here are my commands on the terminal
$ gcloud config set project playground
Updated property [core/project].
$ gcloud compute instances create --zone us-west1-a playground-instance
ERROR: (gcloud.compute.instances.create) Could not fetch resource:
- Failed to find project playground
I'm very confused! Could this be an issue with path?
Here is what I've tried.
First I thought maybe it's a path issue. I have /Users/macuser/google-cloud-sdk/bin in $PATH. Do I need anything else?
I've ensured that the right user is logged in via gcloud auth list and have additionally set it explicitly with gcloud config set account ...
Any advice or suggestions greatly appreciated.
Additional info
$ gcloud config list
[core]
account = babak#hemaka.com
disable_usage_reporting = True
project = playground
Your active configuration is: [playground]
GCP projects have several different identifiers:
"project number" is a unique 12 digit number assigned to each project
"project ID" is a unique alphanumeric ID you can chose when creating a project, but can't be changed. This often defaults to something like obvious-animal-1234.
"project name" which is a freeform text string you can choose and change at will.
You should use the project ID with gcloud.
It is possible "playground" is actually the project name, and not it's ID. Run gcloud projects list to see a list of projects and their ID/number/name to verify you are using the right identifier.
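For example, to look up the project ID for a project whose display name is playground (a quick sketch, using the same format keys as elsewhere in this thread):
gcloud projects list --filter='name:playground' --format='value(project_id)'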
Just make sure you honour the spaces as mentioned in the documentation.
For example, check the below command that worked for me:
gcloud compute --project \
  "YOUR_PROJECT_ID" ssh \
  --zone "YOUR_ZONE" \
  "YOUR_INSTANCE"

How to change the project in GCP using CLI commands

How can I change the current running project to another project in GCP (Google Cloud Platform) account using cli commands other than using gcloud init manually?
gcloud projects list will list the projects running on my account. I want to change the current project to any other project from the list using a cli command.
gcloud config set project $MY_PROJECT_ID
#=>
Updated property [core/project].
You may also set the environment variable $CLOUDSDK_CORE_PROJECT.
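For example, to scope a single invocation to a project without touching the stored configuration (my-project-id is a placeholder):
CLOUDSDK_CORE_PROJECT=my-project-id gcloud compute instances list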
Make sure you are authenticated with the correct account:
gcloud auth list
* account 1
account 2
Change to the project's account if not:
gcloud config set account `ACCOUNT`
Depending on the account, the project list will be different:
gcloud projects list
- project 1
- project 2...
Switch to intended project:
gcloud config set project `PROJECT ID`
You should actually use the project ID, not the project name that some other answers imply you can use.
Example:
gcloud projects list
PROJECT_ID NAME PROJECT_NUMBER
something-staging-2587 something-staging 804012817122
something-production-24 something-production 392181605736
Then:
gcloud config set project something-staging-2587
It's also the same thing when using just the --project flag with one of the commands:
gcloud --project something-staging-2587 compute ssh my_vm
If you use the name it will silently accept it but then you'll always get connection or permission issues when trying to deploy something to the project.
The selected answer doesn't help if you don't know the names of the configurations you have already added to gcloud. My flow is to list the existing configurations, then switch to the one I want.
gcloud config configurations list
gcloud config configurations activate [NAME]
where [NAME] is listed from the prior command.
It could be that I'm late to answer, but this command taught me a lot about the gcloud SDK:
gcloud alpha interactive
It makes it easier to discover by yourself that you'll need gcloud config set project my-project.
However, what I like about gcloud is tab completion, so if you set up your gcloud config with configurations (I know it sounds weird, but run gcloud config configurations list) you can easily switch between the projects that you usually work on:
The alias that I use is:
alias gcca="gcloud config configurations activate" and it works fine with zsh gcloud plugin.
EDIT:
To configure one of configurations I usually do this
gcloud config configurations create [CUSTOM_NAME]
gcloud auth login # you can also manually set but this is for lazy one
gcloud config set project [gcp-project-id]
gcloud config set compute/zone europe-west3-c
gcloud config set compute/region europe-west3
You can use env variables too to configure the zone/project, but I like having it configured this way so I can use tab completion between projects.
Do the following:
1 - gcloud auth list
2 - gcloud projects list
3 - gcloud config set project *projectid*
(replace projectid with the actual project ID)
Also, if you are using more than one project and don't want to set the global project every time, you can use the --project flag per command.
For example, to connect to a virtual machine named my_vm under a project named my_project in Google Cloud Platform:
gcloud --project my_project compute ssh my_vm
This way, you can work with multiple projects and change between them easily by just passing the project flag. You can find much more information about other GCP flags here.
For what it's worth, if you have more than a handful of projects, which I do, use:
gcloud init
This will list all your projects and give you the option to change current project settings, add a new project configuration or switch:
Pick configuration to use:
[1] Re-initialize this configuration [esqimo-preprod] with new settings
[2] Create a new configuration
[3] Switch to and re-initialize existing configuration: [default]
[4] Switch to and re-initialize existing configuration: [project 1]
[5] Switch to and re-initialize existing configuration: [project 2]
Please enter your numeric choice:
It will always ask you to log in and will display options for the different Google accounts that you may have.
Given that I manage multiple organisations and projects, this approach lets me simply switch between them.
I do prefer aliases, and for things that might need multiple commands, based on your project needs, I prefer functions...
Example
function switchGCPProject() {
  gcloud config set project [Project Name]
  # if you are using GKE use the following
  gcloud config set container/cluster [Cluster Name]
  # if you are using GCE use the following
  gcloud config set compute/zone [Zone]
  gcloud config set compute/region [region]
  # if you are using GKE use the following
  gcloud container clusters get-credentials [cluster name] --zone [Zone] --project [project name]
  export GOOGLE_APPLICATION_CREDENTIALS=path-to-credentials.json
}
To get the list of projects:
gcloud projects list
To set the default project:
gcloud config set project [Project-ID]
You can also export your project ID into a variable to use in future commands, which helps avoid typos:
MY_PROJECT_ID=[Project-ID]
echo $MY_PROJECT_ID
Check the available projects by running: gcloud projects list. This will give you a list of projects which you can access.
To switch between projects: gcloud config set project <project-id>.
Also, I recommend checking the active config before making any change to gcloud config. You can do so by running: gcloud config list
I'm posting this answer to give insights into multiple ways available for you to change the project on GCP. I will also explain when to use each of the following options.
Option 1: Cloud CLI - Set Project Property on Cloud SDK on CLI
Use this option, if you want to run all Cloud CLI commands on a specific project.
gcloud config set project <Project-ID>
With this, the selected project on Cloud CLI will change, and the currently selected project is highlighted in yellow.
Option 2: Cloud CLI - Set Project ID flag with most Commands
Use this option if you want to execute commands on multiple projects, e.g. create clusters in one project and then use the same configs to create them in another project. Use the following flag with each command.
--project <Project-ID>
Option 3: Cloud CLI - Initialize the Configurations in CLI
This option can be used if you need separate configurations for different projects/accounts. With this, you can easily switch between configurations by using the activate command, e.g. gcloud config configurations activate <config-name>.
gcloud init
Option 4: Open new Cloud Shell with your preferred project
This is preferred if you don't like working with CLI commands. Press the + button to open a new Cloud Shell tab.
Next, select your preferred project.
I add aliases to my .bash_aliases to switch to a different project.
alias switch_proj1="gcloud config set project ************"
Here is a script to generate aliases :) for all listed projects (the NR>1 skips the header row of the gcloud output).
Please change switch_proj to unique project aliases that you can remember.
gcloud projects list | awk 'NR>1 {print "alias switch_proj=\"gcloud config set project " $1 "\""}'
To rename an existing project (note: this changes the project itself, it does not switch your active project), you can use this command:
gcloud projects update PROJECT_ID --name=NAME
NAME will be the new name of your project.
Check your current project by running gcloud config list.
Then run gcloud config set project PROJECT_ID.
You can try: gcloud config set project [project_id]
Add the script below to your ~/.bashrc and replace projectname with whatever name you need:
function s() {
  array=($(gcloud projects list | awk /projectname/'{print $1}'))
  for i in "${!array[@]}"; do printf "%s=%s\n" "$i" "${array[$i]}"; done
  echo -e "\nenter the number to switch project:\c"
  read project
  [ ${array[${project}]} ] || { echo "project does not exist"; return 2; }
  printf "\n**** Running: gcloud config set project ${array[${project}]} *****\n\n"
  eval "gcloud config set project ${array[${project}]}"
}
Just use gcloud projects list to get the projects you have, and note the PROJECT_ID of the project to use.
After that, use gcloud config set project PROJECT_ID to set the project.
You can change the project using the gcloud command:
gcloud config set project <your_project_name>
gcloud config set project <PROJECT_ID>
Before setting the gcloud project, see the list of available projects:
gcloud projects list
then set the project using:
gcloud config set project $MY_PROJECT_ID
Make sure you are passing the project ID (not the project name; the two are different).
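To see both side by side before switching, a quick check like this can help (the format keys match those used earlier in this thread):
gcloud projects list --format='table(project_id, name)'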