Create a managed zone using the Google Cloud console? - google-cloud-platform

Problem: I can't create managed zones using the Google Cloud console.
What did I do?
Created a service account
Added the DNS Administrator role
Created a JSON key
Executed the command
gcloud auth activate-service-account test235643@developer-dns-test.iam.gserviceaccount.com --key-file=/home/d.reznikov/Downloads/developer-dns-test-5a2088479459.json --project=developer-dns-testing
Executed the command
gcloud dns managed-zones create my_zone --dns-name my.zone.com. --description "My zone!"
I get this error:
ERROR: (gcloud.dns.managed-zones.create) User [test235643@developer-dns-test.iam.gserviceaccount.com] does not have permission to access project [developer-dns-test] (or it may not exist): Forbidden
Please help, maybe something else needs to be configured in the Google Cloud settings?

It looks like the project name is slightly different between the command used to activate the service account and the error message.
Command:
gcloud auth activate-service-account test235643@developer-dns-test.iam.gserviceaccount.com --key-file=/home/d.reznikov/Downloads/developer-dns-test-5a2088479459.json --project=developer-dns-testing
Error msg:
ERROR: (gcloud.dns.managed-zones.create) User [test235643@developer-dns-test.iam.gserviceaccount.com] does not have permission to access project [developer-dns-test] (or it may not exist): Forbidden
I would double-check the project name and re-authorize the service account using the correct one, then retry creating the zone.
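For example, once you have confirmed in the Cloud console which project actually exists (the activation command says developer-dns-testing, the error says developer-dns-test; only you can tell which one is real), a rough sketch of pointing gcloud at it and retrying, with CORRECT_PROJECT_ID as a placeholder:
gcloud config set project CORRECT_PROJECT_ID
gcloud dns managed-zones create my-zone --dns-name=my.zone.com. --description="My zone"
One more thing to watch: as far as I know, managed zone names only allow lowercase letters, digits and hyphens, so my-zone rather than my_zone.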

Related

Batch cannot pull docker image from Artifact Registry

I use a workflow to create a Batch job using a Docker image hosted in a Docker registry.
All of this happens within the same Google Cloud project.
My batch job fails with this error:
"docker: Error response from daemon: Head "https://us-west1-docker.pkg.dev/v2/entity/docker-registry/image-name/manifests/latest": denied: Permission "artifactregistry.repositories.downloadArtifacts" denied on resource "projects/project-id/locations/us-west1/repositories/docker-registry" (or it may not exist).
See 'docker run --help'.
From the Google documentation I understand that the Compute Engine service account doesn't have roles/artifactregistry.admin: "Jobs default to using the Compute Engine default service account".
I get the same error after giving the role to the service account:
gcloud projects add-iam-policy-binding project-id \
--member=serviceAccount:compute@developer.gserviceaccount.com \
--role=roles/artifactregistry.admin
While digging through service accounts I found another service account and also gave it the role: service-xxxx@gcp-sa-cloudbatch.iam.gserviceaccount.com.
It does not solve the problem.
How can I see which service account is used?
Can I see logs about denied permissions?
The error occurs when you try to pull or push an image from a repository whose location-specific hostname has not yet been authenticated and added to the Docker credential helper. You may refer to Setting up authentication for Docker. You may also check and confirm the service account, to make sure you are still impersonating the correct one; run the command below, as mentioned in the documentation:
gcloud auth list
This command will show the active account, along with the other accounts that are authorized to access your Google Cloud project. The active account will be marked with an asterisk (*).
Try running the authentication with a command that specifies the location of your repository. You may try running the configure-docker command against the auth group and see:
gcloud auth configure-docker <location>-docker.pkg.dev
And then try pulling the Docker image again.
Refer to Authenticating to a repository for more information, and you can see the permission-denied logs in Cloud Logging for more details.
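On your two questions (which service account is used, and how to grant it pull access more narrowly), a rough sketch assuming the job really does run as the Compute Engine default service account; project-id and PROJECT_NUMBER are placeholders for your own values:
gcloud iam service-accounts list --project=project-id
gcloud projects add-iam-policy-binding project-id \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/artifactregistry.reader"
roles/artifactregistry.reader covers artifactregistry.repositories.downloadArtifacts, so pulls should not need the full admin role.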

Google Cloud Invalid value for [ACCOUNT]

I am trying to upload a Dockerfile to Google Cloud's container registry.
Turns out, this is more difficult than actually developing the app.
I did everything the authentication page mentioned (https://cloud.google.com/container-registry/docs/advanced-authentication#gcloud-helper), but then I had to set up a key.
I followed this guide (https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys) and downloaded my JSON file.
I could not use Cloud Shell, so I installed gcloud on my local machine from Snap.
Finally issued this command:
gcloud auth activate-service-account ACCOUNT --key-file=KEY-FILE
Where
ACCOUNT is the service account name in the format [USERNAME]@[PROJECT-ID].iam.gserviceaccount.com. You can view existing service accounts on the Service Accounts page of the Cloud Console or with the command gcloud iam service-accounts list.
KEY-FILE is the service account key file. See the Identity and Access Management (IAM) documentation for information about creating a key.
However, i get this error:
ERROR: (gcloud.auth.activate-service-account) Invalid value for [ACCOUNT]: The given account name does not match the account name in the key file. This argument can be omitted when using .json keys.
I don't know what is going on or why I am getting this error, since I am doing everything by the book.
Some help would be much appreciated.
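(One thing worth noting: the error text itself says the ACCOUNT argument can be omitted when using .json keys, because the email is read from the key file. A minimal sketch, with KEY-FILE standing in for the downloaded key path:)
gcloud auth activate-service-account --key-file=KEY-FILE
If ACCOUNT is passed explicitly, it has to match the client_email value inside the JSON key exactly.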

Google BigQuery P4 service account needs iam.serviceAccounts.getAccessToken Error

I am trying to configure and run a BigQuery Transfer Service from Google Cloud Build, but I got the following error message.
BigQuery error in mk operation: P4 service account needs iam.serviceAccounts.getAccessToken permission. Running the following command may resolve this error:
gcloud projects add-iam-policy-binding --member='serviceAccount:service-@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com' --role='roles/iam.serviceAccountShortTermTokenMinter'
I have tried it using the Python library as well as the CLI, but no luck. I know it can be done via the console, but I have to do it programmatically.
Also, I have granted roles/iam.serviceAccountShortTermTokenMinter to the BigQuery Data Transfer service account.
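In case it helps, here is the grant the error message is pointing at, written out with the service agent's usual naming pattern; PROJECT_ID and PROJECT_NUMBER are placeholders I am assuming, not values from your error:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member='serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com' \
    --role='roles/iam.serviceAccountShortTermTokenMinter'
IAM changes can take a few minutes to propagate, so it may be worth retrying the transfer creation a little while after the grant.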

What predefined IAM roles does a service account need to complete the Google Cloud Run Quickstart: Build and Deploy?

I want to compare Google Cloud Run to both Google App Engine and Google Cloud Functions. The Cloud Run Quickstart: Build and Deploy seems like a good starting point.
My Application Default Credentials are too broad to use during development. I'd like to use a service account, but I struggle to configure one that can complete the quickstart without error.
The question:
What is the least privileged set of predefined roles I can assign to a service account that must execute these commands without errors:
gcloud builds submit --tag gcr.io/{PROJECT-ID}/helloworld
gcloud beta run deploy --image gcr.io/{PROJECT-ID}/helloworld
The first command fails with a (seemingly spurious) error when run via a service account with two roles: Cloud Build Service Account and Cloud Run Admin. I haven't run the second command.
Edit: the error is not spurious. The command builds the image and copies it to the project's container registry, then fails to print the build log to the console (insufficient permissions).
Edit: I ran the second command. It fails with Permission 'iam.serviceaccounts.actAs' denied on {service-account}. I could resolve this by assigning the Service Account User role. But that allows the deploy command to act as the project's runtime service account, which has the Editor role by default. Creating a service account with (effectively) both Viewer and Editor roles isn't much better than using my Application Default Credentials.
So I should change the runtime service account permissions. The Cloud Run Service Identity docs have this to say about least privileged access configuration:
This changes the permissions for all services in a project, as well as Compute Engine and Google Kubernetes Engine instances. Therefore, the minimum set of permissions must contain the permissions required for Cloud Run, Compute Engine, and Google Kubernetes Engine in a project.
Unfortunately, the docs don't say what those permissions are or which set of predefined roles covers them.
What I've done so far:
Use the dev console to create a new GCP project
Use the dev console to create a new service account with the Cloud Run Admin role
Use the dev console to create (and download) a key for the service account
Create (and activate) a gcloud configuration for the project
$ gcloud config list
[core]
account = {service-account-name}@{project-id}.iam.gserviceaccount.com
disable_usage_reporting = True
project = {project-id}
[run]
region = us-central1
Activate the service account using the downloaded key
Use the dev console to enable the Cloud Run API
Use the dev console to enable Container Registry→Settings→Container Analysis API
Create a sample application and Dockerfile as instructed by the quickstart documentation
Run gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld
...fails due to missing cloud build permissions
Add the Cloud Build Editor role to service account and resubmit build
...fails due to missing storage permissions. I didn't pay careful attention to what was missing.
Add the Storage Object Admin role to service account and resubmit build
...fails due to missing storage bucket permissions
Replace service account's Storage Object Admin role with the Storage Admin role and resubmit build
...fails with
Error: (gcloud.builds.submit) HTTPError 403:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
{service-account-name} does not have storage.objects.get access to
{number}.cloudbuild-logs.googleusercontent.com/log-{uuid}.txt.</Details>
</Error>
Examine the set of available roles and the project's automatically created service accounts. Realize that the Cloud Build Service Account role has many more permissions than the Cloud Build Editor role. This surprised me; the legacy Editor role has "Edit access to all resources".
Remove the Cloud Build Editor and Storage Admin roles from service account
Add the Cloud Build Service Account role to service account and resubmit build
...fails with the same HTTP 403 error (missing get access for a log file)
Check Cloud Build→History in the dev console; find successful builds!
Check Container Registry→Images in the dev console; find images!
At this point I think I could finish Google Cloud Run Quickstart: Build and Deploy. But I don't want to proceed with (seemingly spurious) error messages in my build process.
Cloud Run PM here:
We can break this down into the two sets of permissions needed:
# build a container image
gcloud builds submit --tag gcr.io/{PROJECT_ID}/helloworld
You'll need:
Cloud Build Editor and Cloud Build Viewer (as per @wlhee)
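(As an aside, a sketch of what those two grants look like on the command line; roles/cloudbuild.builds.editor and roles/cloudbuild.builds.viewer are my reading of those display names, and the member is a placeholder:)
gcloud projects add-iam-policy-binding {project-id} \
    --member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
    --role="roles/cloudbuild.builds.editor"
gcloud projects add-iam-policy-binding {project-id} \
    --member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
    --role="roles/cloudbuild.builds.viewer"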
# deploy a container image
gcloud beta run deploy --image gcr.io/{PROJECT_ID}/helloworld
You need to do two things:
Grant your service account the Cloud Run Deployer role (if you want to change the IAM policy, say to deploy the service publicly, you'll need Cloud Run Admin).
Follow the Additional Deployment Instructions to grant that service account the ability to deploy your service
#1
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}#{project-id}.iam.gserviceaccount.com" \
--role="roles/run.developer"
#2
gcloud iam service-accounts add-iam-policy-binding \
PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
EDIT: As noted, the latter grants your service account the ability to actAs the runtime service account. What role this service account has depends on what it needs to access: if the only thing Run/GKE/GCE accesses is GCS, then give it something like Storage Object Viewer instead of Editor. We are also working on per-service identities, so you can create a service account and "override" the default with something that has least privilege.
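(A rough sketch of that swap on the default runtime account, purely as an illustration; PROJECT_ID and PROJECT_NUMBER are placeholders, and whether you can safely drop Editor depends on what else in the project relies on it:)
gcloud projects remove-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/editor"
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/storage.objectViewer"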
According to https://cloud.google.com/cloud-build/docs/securing-builds/set-service-account-permissions
"Cloud Build Service Account" - Cloud Build executes your builds using a service account, a special Google account that executes builds on your behalf.
In order to call
gcloud builds submit --tag gcr.io/path
Edit:
Please "Cloud Build Editor" and "Viewer" your service account that starts the build, it's due to the current Cloud Build authorization model.
Sorry for the inconvenience.

How do I use gcloud with a service account?

I'm having trouble getting gcloud to access my project as a service account. Here's what I did:
Installed the gcloud sdk for Windows on my local machine
Created a new service account on Google Cloud Platform console
Gave the service account the Compute Admin role
Authorized gcloud as the service account:
gcloud auth activate-service-account --key-file=keyfile.json
Issued the command
gcloud compute zones list
I get the following error:
ERROR: (gcloud.compute.zones.list) Some requests did not succeed:
- Required 'compute.zones.list' permission for '<project id>'
I verified the Compute Admin role has the proper compute.zones.list permission.
What am I missing?
I fixed the issue by recreating the service account.
It seems there's a screen that asks about the roles you want the service account to have as you create it. I originally assigned the roles after the fact.
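(If recreating the account is not an option, a rough sketch of how to check whether the role binding actually landed and that gcloud is pointed at the right project; PROJECT_ID and SA_EMAIL are placeholders:)
gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:SA_EMAIL" \
    --format="table(bindings.role)"
gcloud config set project PROJECT_ID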