GCP project creation via Deployment Manager - google-cloud-platform

So I'm trying to create a project with Google Cloud Deployment Manager.
I've structured the setup roughly as below:
# Structure
Org -> Folder1 -> Seed-Project (where I am running Deployment Manager from)
Organization:
IAM:
-> {Seed-Project-Number}@cloudservices.gserviceaccount.com:
- Compute Network Admin
- Compute Shared VPC Admin
- Organization Viewer
- Project Creator
# DeploymentManager Resource:
type: cloudresourcemanager.v1.project
name: MyNewProject
parent:
  id: '{folder1-id}'
  type: folder
projectId: MyNewProject
The desired result is that MyNewProject should be created under Folder1.
However, it appears as if the Deployment Manager service account does not have sufficient permissions:
$ CLOUDSDK_CORE_PROJECT=Seed-Project gcloud deployment-manager deployments \
create MyNewDeployment \
--config config.yaml \
--verbosity=debug
Error message:
- code: RESOURCE_ERROR
location: /deployments/MyNewDeployment/resources/MyNewProject
message: '{"ResourceType":"cloudresourcemanager.v1.project",
"ResourceErrorCode":"403","ResourceErrorMessage":{"code":403,"message":"The
caller does not have permission","status":"PERMISSION_DENIED","statusMessage":"Forbidden","requestPath":"https://cloudresourcemanager.googleapis.com/v1/projects/MyNewProject","httpMethod":"GET"}}'
I've done some digging, and it appears to be calling the resourcemanager.projects.get method. The Compute Shared VPC Admin (roles/compute.xpnAdmin) role should provide this permission, as documented here: https://cloud.google.com/iam/docs/understanding-roles
Except that doesn't seem to be the case. What's going on?
Edit
I'd like to add some additional information gathered from debugging efforts:
These are the API requests from the Deployment Manager (from the seed project).
You can see that the caller is an anonymous service account; this isn't what I'd expect to see. (I'd expect {Seed-Project-Number}@cloudservices.gserviceaccount.com as the calling account here.)
Edit-2
config.yaml
imports:
  - path: composite_types/project/project.py
    name: project.py

resources:
  - name: MyNewProject
    type: project.py
    properties:
      parent:
        type: folder
        id: "{folder1-id}"
      billingAccountId: billingAccounts/REDACTED
      activateApis:
        - compute.googleapis.com
        - deploymentmanager.googleapis.com
        - pubsub.googleapis.com
      serviceAccounts: []
composite_types/project/* is an exact copy of the templates found here:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/tree/master/community/cloud-foundation/templates/project

The key thing is that this is a GET operation, not an attempt to create the project. This is to verify global uniqueness of the project-id requested, and if not unique, PERMISSION_DENIED is thrown.
A lousy error message, and lots of wasted developer hours!
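For anyone hitting this, a hedged quick check (assumes an authenticated gcloud session; MyNewProject is the ID from the question): trying to read the project directly reproduces the same symptom, and a 403 on an ID you never created usually means it is already taken globally.

```shell
# Probe the requested project ID directly. If this returns PERMISSION_DENIED
# for an ID you never created, the ID is most likely owned by someone else,
# and a different, more unique projectId should be chosen.
gcloud projects describe MyNewProject
```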

Probably late, but just to share that I ran into a similar issue today. I double-checked every permission mentioned in the README for the service account under which the Deployment Manager job runs ({Seed-Project-Number}@cloudservices.gserviceaccount.com in the question); it turned out that the Billing Account User role was not assigned, contrary to what I thought earlier. Granting that and running it again worked.

Related

Why doesn't my Cloud Function (2nd gen) trigger?

After getting some help deploying the sample code for a Cloud Function with a storage trigger, the function now deploys just fine, but it won't trigger :-(
The function is deployed:
The deployed code (per the tutorial):
'use strict';

// [START functions_cloudevent_storage]
const functions = require('@google-cloud/functions-framework');

// Register a CloudEvent callback with the Functions Framework that will
// be triggered by Cloud Storage.
functions.cloudEvent('imageAdded', cloudEvent => {
  console.log(`Event ID: ${cloudEvent.id}`);
  console.log(`Event Type: ${cloudEvent.type}`);

  const file = cloudEvent.data;
  console.log(`Bucket: ${file.bucket}`);
  console.log(`File: ${file.name}`);
  console.log(`Metageneration: ${file.metageneration}`);
  console.log(`Created: ${file.timeCreated}`);
  console.log(`Updated: ${file.updated}`);
});
// [END functions_cloudevent_storage]
The deploy command I used:
gcloud functions deploy xxx-image-handler --gen2 --runtime=nodejs16 --project myproject --region=europe-west3 --source=. --entry-point=imageAdded --trigger-event-filters='type=google.cloud.storage.object.v1.finalized' --trigger-event-filters='bucket=xxx_report_images'
To test I just uploaded a text file from command line:
gsutil cp test-finalize.txt gs://xxx_report_images/test-finalize.txt
This worked fine and the file was uploaded to the xxx_report_images bucket (which is also in the europe-west3 region), but the function is never triggered, which I can see from the 0 items in the log:
gcloud beta functions logs read xxx-image-handler --gen2 --limit=100 --region europe-west3
Listed 0 items.
What is going on here? This seems very straight-forward and I fail to see what I'm missing, so hopefully I can get some guidance.
**** EDIT 1 ****
Regarding comment 1 below: I can see (in the Eventarc trigger list, picture below) that the service account used by Eventarc is xxx-compute@developer.gserviceaccount.com
And I can see in the IAM Principals list that this indeed is the Default compute service account, which has the Editor role. I also explicitly added the Eventarc Connection Publisher role to this service account, but without success (I assume the Editor role already contains this role, but just to be sure...). So I guess the issue is not related to your suggestion (?).
BTW, I tried the suggested gcloud compute project-info describe --format="value(defaultServiceAccount)" but just got Could not fetch resource: - Required 'compute.projects.get' permission for 'projects/myproject', and I couldn't figure out which role to add to which service account to enable this. However, as seen above, I found the info in the Eventarc section in GCP instead.
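As a hedged aside, the trigger and its service account from the screenshot can also be inspected from the CLI (project ID and region below are taken from the question):

```shell
# List Eventarc triggers in the region; gen2 Cloud Functions create one
# trigger per deployed function, shown with its service account.
gcloud eventarc triggers list --location=europe-west3 --project=myproject
```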
**** EDIT 2 ****
Indeed after more testing I think I have nailed it down to being a permission issue on Eventarc, just like suggested in the comment. I've posted a new question which is more specific.
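For readers landing here: one commonly required grant for gen2 Cloud Storage triggers is letting the Cloud Storage service agent publish to Pub/Sub, which Eventarc consumes. A sketch, assuming the project ID myproject from the question; the gsutil lookup is the documented way to find the service agent:

```shell
# Find the Cloud Storage service agent for the project...
GCS_SA="$(gsutil kms serviceaccount -p myproject)"

# ...and allow it to publish object events to Pub/Sub so that
# google.cloud.storage.object.v1.finalized events reach the trigger.
gcloud projects add-iam-policy-binding myproject \
  --member="serviceAccount:${GCS_SA}" \
  --role="roles/pubsub.publisher"
```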

Where in GCP can I see rejected API requests due to lacking permissions?

My question is technically independent of the context but I introduce it for the sake of clarity.
# export GOOGLE_APPLICATION_CREDENTIALS="/home/raffael/repos/dask/playground-310111-1d035231463d.json"
from dask_cloudprovider.gcp import GCPCluster

cluster = GCPCluster(
    projectid="project_id",
    n_workers=1,
    source_image="projects/ubuntu-os-cloud/global/images/ubuntu-minimal-1804-bionic-v20210325",
    zone="europe-west1-b",
)
I want to create a Dask cluster using the above service account and code.
If I attach the Project Owner role to that SA, it works.
If I attach "only" the Compute Admin role to that SA, it fails.
For completeness' sake, I list the error message, but it doesn't say much more than that the supposedly created scheduler instance doesn't exist.
Launching cluster with the following configuration:
Source Image: projects/ubuntu-os-cloud/global/images/ubuntu-minimal-1804-bionic-v20210325
Docker Image: daskdev/dask:latest
Machine Type: n1-standard-1
Filesytsem Size: 50
Disk Type: pd-standard
N-GPU Type:
Zone: europe-west1-b
Creating scheduler instance
Failed to find running VMI...
{'id': 'projects/project_id/zones/europe-west1-b/instances', 'selfLink': 'https://www.googleapis.com/compute/v1/projects/project_id/zones/europe-west1-b/instances', 'kind': 'compute#instanceList'}
Traceback (most recent call last):
  File "gcp_cluster.py", line 9, in <module>
    cluster = GCPCluster(
  File "/home/raffael/miniconda3/envs/dask/lib/python3.8/site-packages/dask_cloudprovider/gcp/instances.py", line 603, in __init__
    super().__init__(debug=debug, **kwargs)
  [...]
  File "/home/raffael/miniconda3/envs/dask/lib/python3.8/site-packages/dask_cloudprovider/gcp/instances.py", line 209, in create_vm
    while await self.update_status() != "RUNNING":
  File "/home/raffael/miniconda3/envs/dask/lib/python3.8/site-packages/dask_cloudprovider/gcp/instances.py", line 255, in update_status
    raise Exception(f"Missing Instance {self.name}")
Exception: Missing Instance dask-9645b903-scheduler
So, the big question is: what permission is missing?
In Operations Logging I found the following error, which correlates with the failure, but I'm not sure whether it tells me what service was used and what permission is required.
@type and logName say something about "audit", so I Ctrl-F-ed for "audit" in all of the Project Owner permissions and found container.auditSinks.* as a "promising" candidate, but adding those doesn't solve the problem.
I guess it's a long shot anyway, but I'm not exactly sure what @type and logName refer to, so I just gave it a try.
Back to my question: does GCP tell me somewhere which API calls were rejected due to missing permissions?
(and of course, if you happen to know which permissions/roles I need to add, that would be splendid)
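One place that does record this is Cloud Audit Logs: denied calls appear as audit entries whose protoPayload.status.code is 7, which is PERMISSION_DENIED in google.rpc.Code. A hedged sketch of querying them (note that for some services this requires Data Access audit logs to be enabled):

```shell
# Show recent API calls rejected with PERMISSION_DENIED across the project.
# google.rpc.Code 7 == PERMISSION_DENIED.
gcloud logging read 'protoPayload.status.code=7' \
  --limit=10 --freshness=1d \
  --format='value(protoPayload.serviceName, protoPayload.methodName)'
```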
Try granting the roles/iam.serviceAccountUser role to your SA at the project level; it allows the SA to impersonate other SAs in the project, including any SA used for Compute Engine. As the documentation mentions, when you grant the Compute Admin role you still need to assign the Service Account User role at the project or SA level, as described here.
If this doesn't work, please share whether you are following some tutorial or manual for this Dask cluster, or the steps to reproduce it, so we can find the solution for this scenario.
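A minimal sketch of the grant described above (PROJECT_ID and SA_EMAIL are placeholders for the asker's values):

```shell
# Allow the SA to act as (impersonate) service accounts in the project,
# including the Compute Engine default SA attached to the Dask VMs.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_EMAIL" \
  --role="roles/iam.serviceAccountUser"
```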

How to remove a service account key on GCP using Ansible Playbook?

I am using an Ansible playbook to run certain modules that create service accounts and their respective keys. The code used to generate this is as found on the Ansible documentation:
- name: create a service account key
  gcp_iam_service_account_key:
    service_account: "{{ serviceaccount }}"
    private_key_type: TYPE_GOOGLE_CREDENTIALS_FILE
    path: "~/test_account.json"
    project: test_project
    auth_kind: serviceaccount
    service_account_file: "/tmp/auth.pem"
    state: present
Now I am trying to remove the service account key, so I changed the state value from present to absent, but that doesn't seem to do much. Am I missing something, or is there anything else I could try?
I'm not sure if it's possible, since I couldn't find the module in the Ansible documentation, but in the deletion examples for instances I see that after the absent state they use a tag for the deletion; it could be a way to do it for the SA, e.g.
state: absent
tags:
  - delete
Another way that could be useful is to make the request directly against the REST API, e.g.
DELETE https://iam.googleapis.com/v1/projects/[PROJECT-ID]/serviceAccounts/[SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com/keys/[KEY-ID]
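The same DELETE can be issued with curl, authenticating with the caller's gcloud access token (all bracketed values remain placeholders):

```shell
# Delete a single key on a service account via the IAM REST API.
curl -X DELETE \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://iam.googleapis.com/v1/projects/[PROJECT-ID]/serviceAccounts/[SA-NAME]@[PROJECT-ID].iam.gserviceaccount.com/keys/[KEY-ID]"
```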
I can confirm that it works when changing state from present to absent in version 1.0.2 of the google.cloud collection.
I believe you expect the file in path: "~/test_account.json" to be deleted, but in fact the key is deleted on the service account in GCP. You will have to delete the file yourself after the task has completed successfully.

“Create new version” ignores custom service account

I'm trying to deploy a new version of a model to AI Platform, it's a custom prediction routine. I've managed to deploy just fine when I have all the resources in the same GCP project, but when I try to deploy and I point the GCS files to a bucket in a different project, it fails to deploy. So I'm trying to pass which service account to use when creating the version, but it keeps ignoring it.
That's the message I get:
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://ml.googleapis.com/v1/projects/[gcp-project-1]/models/[model_name]/versions?alt=json returned "Field: version.deployment_uri Error: The provided GCS prefix [gs://[bucket-gcp-project-2]/] cannot be read by service account service-*****@cloud-ml.google.com.iam.gserviceaccount.com.". Details: "[{'@type': 'type.googleapis.com/google.rpc.BadRequest', 'fieldViolations': [{'field': 'version.deployment_uri', 'description': 'The provided GCS prefix [gs://[bucket-gcp-project-2]/] cannot be read by service account service-******@cloud-ml.google.com.iam.gserviceaccount.com.'}]}]
My request looks like
POST https://ml.googleapis.com/v1/projects/[gcp-project-1]/models/[model_name]/versions?alt=json
{
  "name": "v1",
  "deploymentUri": "gs://[bucket-gcp-project-2]",
  "pythonVersion": "3.5",
  "runtimeVersion": "1.13",
  "package_uris": "gs://[bucket-gcp-project-2]/model.tar.gz",
  "predictionClass": "predictor.Predictor",
  "serviceAccount": "my-service-account@[gcp-project-1].iam.gserviceaccount.com"
}
The service account has access in both projects.
Specifying a service account is documented as a beta feature. Try using the gcloud SDK, e.g.:
gcloud components install beta
gcloud beta ai-platform versions create v1 \
  --service-account my-service-account@[gcp-project-1].iam.gserviceaccount.com ...

Google Deployment Manager - Project creation permission denied

I am getting a 403 PERMISSION_DENIED response from GCP when running the deployment manager to create a deployment that creates a project, two service accounts and sets IAM policy for it using the cloud resource manager API.
- code: RESOURCE_ERROR
location: /deployments/test-deployment/resources/dm-test-project
message: '{"ResourceType":"cloudresourcemanager.v1.project","ResourceErrorCode":"403","ResourceErrorMessage":{"code":403,"message":"The
caller does not have permission","status":"PERMISSION_DENIED","statusMessage":"Forbidden","requestPath":"https://cloudresourcemanager.googleapis.com/v1/projects/dm-test-project","httpMethod":"GET"}}'
Beforehand, I created a project 'DM Project Creation', enabled some APIs, assigned the Billing Account to it, and then created a Service Account.
I already had an Organization node created, so I added the Service Account to the org node and gave it the following IAM roles:
- Project Creator
- Billing Account User
I was actually following these examples from Google Cloud Platform:
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/tree/master/examples/v2/project_creation
https://github.com/GoogleCloudPlatform/deploymentmanager-samples/blob/master/community/cloud-foundation/templates/project/README.md
I ran the following command to authenticate with the Service Account:
gcloud auth activate-service-account dm-project-creation@dm-creation-project-0.iam.gserviceaccount.com --key-file=/Users/famedina/Downloads/dm-creation-project-0-f1f92dd070ce.json
Then I ran Deployment Manager, passing the configuration file (config.yaml below):
gcloud deployment-manager deployments create test-deployment --config config.yaml
imports:
  - path: project.py

resources:
  # The "name" property below will be the ID of the new project.
  # If you want your project to have a different name, use the "project-name"
  # property.
  - name: dm-test-project
    type: project.py
    properties:
      # Change this to your organization ID.
      organization-id: "<MY_ORG_ID>"
      # You can also create the project in a folder.
      # If both organization-id and parent-folder-id are provided,
      # the project will be created in parent-folder-id.
      #parent-folder-id: "FOLDER_ID"
      # Change the following to your organization's billing account.
      billing-account-name: billingAccounts/<MY_BILLING_ACC_ID>
      # The APIs to enable in the new project.
      # To see the possible APIs, use: gcloud services list --available
      apis:
        - compute.googleapis.com
        - deploymentmanager.googleapis.com
        - pubsub.googleapis.com
        - storage-component.googleapis.com
        - monitoring.googleapis.com
        - logging.googleapis.com
      # The service accounts you want to create in the project.
      service-accounts:
        - my-service-account-1
        - my-service-account-2
      bucket-export-settings:
        create-bucket: true
        # If using an already existing bucket, specify this:
        # bucket: <my bucket name>
      # Makes the service account that Deployment Manager would use in the
      # generated project when making deployments in this new project a
      # project owner.
      set-dm-service-account-as-owner: true
      # The patches to apply to the project's IAM policy. Note that these are
      # always applied as a patch to the project's current IAM policy, not as a
      # diff with the existing properties stored in DM. This means that removing
      # a binding from the 'add' section will not remove the binding on the
      # project during the next update. Instead it must be added to the 'remove'
      # section.
      iam-policy-patch:
        # These are the bindings to add.
        add:
          - role: roles/owner
            members:
              # NOTE: The DM service account that is creating this project will
              # automatically be added as an owner.
              - serviceAccount:98765432100@cloudservices.gserviceaccount.com
          - role: roles/viewer
            members:
              - user:iamtester@deployment-manager.net
        # The bindings to remove. Note that these are idempotent, in the sense
        # that any binding here that is not actually on the project is considered
        # to have been removed successfully.
        remove:
          - role: roles/owner
            members:
              # This is already not on the project, but in case it shows up,
              # let's remove it.
              - serviceAccount:1234567890@cloudservices.gserviceaccount.com
I ran into this as well, and the error message does not actually explain the underlying problem.
The key thing is that this is a GET operation, not an attempt to create the project. This is to verify global uniqueness of the project-id requested, and if not unique, PERMISSION_DENIED is thrown.
- code: RESOURCE_ERROR
location: /deployments/test-deployment/resources/dm-test-project
message: '{"ResourceType":"cloudresourcemanager.v1.project","ResourceErrorCode":"403","ResourceErrorMessage":{"code":403,"message":"The
caller does not have permission","status":"PERMISSION_DENIED","statusMessage":"Forbidden","requestPath":"https://cloudresourcemanager.googleapis.com/v1/projects/dm-test-project","httpMethod":"**GET**"}}'
A lot of room for improvement in the resulting error for the end user.
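The tell is buried in the error payload itself. A minimal sketch (the inner JSON below is copied from the message above into a scratch file; the path is arbitrary):

```shell
# Write the inner error payload from the DM message to a scratch file.
cat > /tmp/dm-error.json <<'EOF'
{"ResourceType":"cloudresourcemanager.v1.project","ResourceErrorCode":"403","ResourceErrorMessage":{"code":403,"message":"The caller does not have permission","status":"PERMISSION_DENIED","statusMessage":"Forbidden","requestPath":"https://cloudresourcemanager.googleapis.com/v1/projects/dm-test-project","httpMethod":"GET"}}
EOF

# Pull out the HTTP method of the failing call: GET means the global
# uniqueness probe failed, not the project-creation call itself.
grep -o '"httpMethod":"[A-Z]*"' /tmp/dm-error.json
# → "httpMethod":"GET"
```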