docker pull: Permission "artifactregistry.repositories.downloadArtifacts" denied on resource - google-cloud-platform

How do I give a new service account this permission?
I have a VM with "Compute Engine default service account" and it works.
I changed the service account to one with just:
Artifact Registry Administrator
Artifact Registry Reader
and this results in the above error on docker pull.
Thanks

Check that you have correctly configured Docker to pull and push images to Artifact Registry: https://cloud.google.com/artifact-registry/docs/docker/pushing-and-pulling
You also have to be sure you are using the expected service account in the place where you execute your command.
If you execute from your local machine in bash, check that you are authenticated as the expected service account with:
gcloud auth activate-service-account --key-file=your_key_file_path.json
export GOOGLE_APPLICATION_CREDENTIALS=your_key_file_path.json
The permissions you have given to your service account seem correct for the action you need to perform.

This happens when you try to push or pull an image from a repository whose hostname (derived from the repository location) has not yet been added to the credential helper configuration for authentication.
For the gcloud credential helper or the standalone credential helper, the Artifact Registry hosts you use must be listed in your Docker configuration file.
Artifact Registry does not automatically add all registry hosts to the Docker configuration file, because Docker response time is significantly slower when there is a large number of configured registries. To minimize the number of registries in the configuration file, you add only the hosts that you need.
You need to run configure-docker while impersonating your service account ($SERVICE_ACCOUNT_EMAIL):
1. Run the following command to make sure you are still impersonating $SERVICE_ACCOUNT_EMAIL:
$ gcloud auth list
If the service account is not impersonated then run the following command:
$ gcloud auth activate-service-account "$SERVICE_ACCOUNT_EMAIL" --key-file=$SERVICE_ACCOUNT_JSON_FILE_PATH
2. Run the configure-docker command against the auth group:
$ gcloud auth configure-docker <location>-docker.pkg.dev
3. Finally, try pulling the Docker image again.
Refer to Authenticating to a repository and the related Stack Overflow post for more information.
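Putting the steps above together, a minimal end-to-end sequence might look like this (the repository location `us-central1` and the project/repo/image names are placeholders; substitute your own):

```shell
# Authenticate gcloud as the service account (key file path is a placeholder)
gcloud auth activate-service-account "$SERVICE_ACCOUNT_EMAIL" \
  --key-file="$SERVICE_ACCOUNT_JSON_FILE_PATH"

# Register the Artifact Registry host for your repository's location
# with Docker's credential helper configuration (~/.docker/config.json)
gcloud auth configure-docker us-central1-docker.pkg.dev

# The pull should now authenticate as the service account
docker pull us-central1-docker.pkg.dev/my-project/my-repo/my-image:latest
```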

Related

GCP - Impersonate Service Account from Local Machine

I have used AWS in the past, so I might be comparing its behavior with GCP. Bear with me and provide the steps to do things the correct way.
What I did:
I created a service account in GCP with the Storage Object Viewer role.
I also created a key pair and downloaded the JSON file to my local machine.
If I have gcloud/gsutil installed on my local machine, how can I assume/impersonate the service account and work on GCP resources?
Where should I keep the downloaded JSON file? I already referred to this - https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating
I downloaded file as key.json and kept in my home directory.
Then I did this.
export GOOGLE_APPLICATION_CREDENTIALS="~/key.json"
Then I executed this command:
gsutil cp dummy.txt gs://my-bucket/
Ideally, it should NOT work, but I am able to upload files.
You can set the path to your service account in env (when using Google Cloud SDKs):
#linux
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/path/to/serviceAccountFile.json"
#windows
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\username\Downloads\path\to\serviceAccountFile.json"
If you don't have the key file, run the following command to download one:
gcloud iam service-accounts keys create ./serviceAccount.json --iam-account=svc_name@project.iam.gserviceaccount.com
You can then use activate-service-account to use the given service account, as shown below:
gcloud auth activate-service-account --key-file=serviceAccount.json
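To confirm which identity each tool will actually use, you can check both the active gcloud account and the ADC variable (a quick sanity check; the bucket name is a placeholder):

```shell
# Show which account gcloud commands run as (the active one is starred)
gcloud auth list

# Show which key file client libraries pick up via Application Default Credentials
echo "$GOOGLE_APPLICATION_CREDENTIALS"

# Exercise the service account's permissions against a bucket
gsutil ls gs://my-bucket/
```

If `gsutil` succeeds when you expect it to fail, check `gcloud auth list` first: gsutil can fall back to the active gcloud credential rather than the key file in the environment variable.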

Google cloud credentials for deploying cloud run service in CI pipeline

I am attempting to deploy a Cloud Run service from source from a GitLab CI pipeline. I can deploy it manually, with my own credentials, but am struggling to give the right credentials to the CI pipeline to get it to deploy.
These are the commands in my pipeline:
gcloud auth activate-service-account --key-file $CLOUD_RUN_CREDENTIALS
gcloud run deploy api --source=./api/ --region=us-cental1
CLOUD_RUN_CREDENTIALS is a pipeline file variable that contains the key for a service account I have created for this purpose. The service account has the following roles:
Cloud Build Editor role
Artifact Registry Admin role
Storage Admin role
Cloud Run Admin role
Service Account User role
In the Cloud Build settings I have enabled Cloud Run Admin and Service Account User.
When I run this GitLab job, I get the following error:
$ gcloud auth activate-service-account --key-file $CLOUD_RUN_CREDENTIALS
Activated service account credentials for: [XXXXXXXXXXX@XXXXXXXXXX.iam.gserviceaccount.com]
$ gcloud run deploy api --source=./api/ --region=us-cental1
ERROR: Error in retrieving repository from Artifact Registry.
ERROR: (gcloud.run.deploy) INVALID_ARGUMENT: Request contains an invalid argument.
Is this a permissions issue? What permissions do I need to give this service account to allow the deployment to succeed? If not, what am I doing wrong?
If you use the Build from Source feature, the Cloud Build service account is the one used to perform certain actions, like pulling and pushing images from Artifact Registry.
You have to grant the Cloud Build service account (the one called PROJECT_NUMBER@cloudbuild.gserviceaccount.com) the required permissions.
Go to IAM, look for the Cloud Build service account, and grant it the permissions you listed above. That should solve the issue.
The service account used from GitLab doesn't need all of these permissions; Cloud Run Admin should be enough.
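If you prefer the CLI to the console, granting the Cloud Build service account the roles listed above might look like this (the project ID is a placeholder, and the role list is an assumption based on the roles named in the question):

```shell
PROJECT_ID=my-project   # placeholder: your project ID
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" \
  --format="value(projectNumber)")

# Grant the Cloud Build service account the roles it needs to
# build from source and deploy to Cloud Run
for role in roles/run.admin roles/artifactregistry.admin \
            roles/storage.admin roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
    --role="$role"
done
```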

gcloud \ kubectl authentication problem: forget service account

I'm using gcloud and kubectl to handle my resources (Kubernetes, VMs, and so on). Everything worked fine until I followed some article that created a new service account and activated it via gcloud, something like this:
gcloud auth activate-service-account --key-file=path/to/key
The created service account has limited permissions to few resources. When I run commands, like:
kubectl --namespace production get pods
I'm getting back response like:
Error from server (Forbidden): pods is forbidden: User
"SA-USER@PROJECTNAME.iam.gserviceaccount.com"
cannot list resource "pods" in API group "" in the namespace
"production": requires one of ["container.pods.list"] permission(s).
The SA SA-USER@PROJECTNAME.iam.gserviceaccount.com is the service account that I created yesterday. For some reason, it took over my default credentials and I'm locked out, because this user has almost no permissions.
I tried to make the gcloud forget this service account without success. Things I tried:
Uninstall & Install of gcloud and kubectl
Remove the config directory ("~/.config/gcloud/")
gcloud auth login
All of those attempts failed. I'm still getting the same message as above.
How I can make gcloud and kubectl forget this service account?
Thanks!
UPDATE for the new auth plugin:
Some time ago, GKE adopted the new auth plugin architecture for Kubernetes.
With the new plugin, gke-gcloud-auth-plugin, the authentication cache is at:
macOS: ~/.kube/gke_gcloud_auth_plugin_cache
(Please edit this answer to include the locations on other operating systems.)
You can just delete that file.
There is a problem with the expiration of the authentication token used by kubectl.
When you choose a new user via gcloud and make a kubectl request, an authentication token is cached locally.
A workaround is to edit your ~/.kube/config and set a date in the past for the expiry field on the relevant user section.
You can perform a gcloud auth application-default login
You can also see the current configuration of your gcloud CLI with gcloud config list. You can change a default parameter with gcloud config set param_name param_value. For example (you will use this often if you have several projects):
gcloud config set project MyProjectId
With these, you should be able to solve your issue.
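A minimal sketch of the recovery steps described above (the account email, cluster name, and zone are placeholders):

```shell
# Re-authenticate as your own user account instead of the service account
gcloud auth login
gcloud config set account you@example.com   # placeholder: your user email

# Drop the cached kubectl auth token
# (path for the new GKE auth plugin on macOS)
rm -f ~/.kube/gke_gcloud_auth_plugin_cache

# Regenerate kubeconfig credentials for the cluster
gcloud container clusters get-credentials my-cluster --zone us-central1-a

# Verify that kubectl now acts as your user
kubectl --namespace production get pods
```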

gcloud - ERROR: (gcloud.app.deploy) Permissions error fetching application

I am trying to deploy a Node.js app on Google Cloud but am getting the following error -
Step #1: ERROR: (gcloud.app.deploy) Permissions error fetching application [apps/mytest-240512]. Please make sure you are using the correct project ID and that you have permission to view applications on the project.
I am running the following command -
gcloud builds submit . --config cloudbuild.yaml
My cloudbuild.yaml file looks like -
steps:
#install
- name: 'gcr.io/cloud-builders/npm'
args: ['install']
#deploy
- name: 'gcr.io/cloud-builders/gcloud'
args: ['app', 'deploy']
The default Cloud Build service account does not have permission to deploy to App Engine. You need to grant the Cloud Build service account permission to perform actions such as deploy.
The Cloud Build service account is formatted like this:
[PROJECT_NUMBER]@cloudbuild.gserviceaccount.com
Go to the Google Cloud Console -> IAM & admin -> IAM.
Locate the service account and click the pencil icon.
Add the role "App Engine Deployer" to the service account.
Wait a couple of minutes for the service account to update globally and then try again.
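The same grant can be done from the command line; a sketch, assuming the project ID from the error message:

```shell
PROJECT_ID=mytest-240512   # placeholder: your project ID
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" \
  --format="value(projectNumber)")

# Give the Cloud Build service account the App Engine Deployer role
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" \
  --role="roles/appengine.deployer"
```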
I had this same error today, and the way I resolved it was by running $ gcloud auth login in the console.
This will open a new browser tab for you to log in with the credentials that have access to the project you're trying to deploy.
I was able to deploy to gcloud after that.
P.S.: I'm not sure this is the best approach, but I'm leaving it as a possible solution since this is how I usually get around this problem. Worst case, I'll stand corrected and learn something new.
The most common way to deploy an app to App Engine is to use gcloud app deploy ....
When you use gcloud app deploy against App Engine Flex, the service uses Cloud Build.
It's entirely possible (and reasonable) to use Cloud Build to do your deployments too; it's just more involved.
I've not tried this but I think that, if you wish to use Cloud Build to perform the deployment, you will need to ensure that the Cloud Build service account has permissions to deploy to App Engine.
Here's an example of what you would need to do, specifically granting Cloud Build's service account the correct role.
Two commands can handle the permissions needed (run them in your terminal if you have the gcloud SDK installed and authenticated, or run them in Cloud Shell for your project):
export PROJECT_ID=[[put your project id here]]
export PROJECT_NUMBER=$(gcloud projects describe $PROJECT_ID --format="value(projectNumber)")
gcloud iam service-accounts add-iam-policy-binding ${PROJECT_ID}@appspot.gserviceaccount.com \
--member=serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com \
--role=roles/iam.serviceAccountUser \
--project=${PROJECT_ID}
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
--member=serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com \
--role=roles/appengine.appAdmin

Using Google Cloud Source Repositories with service account

Is it possible to access a Google Cloud Source Repository in an automated way, i.e. from a GCE instance using a service account?
The only authentication method I am seeing in the docs is to use the gcloud auth login command, which will authenticate my personal user to access the repo, not the machine I am running commands from.
If you want to clone with git rather than running through gcloud, you can run:
git config --global credential.helper gcloud.sh
...and then this will work:
git clone https://source.developers.google.com/p/$PROJECT/r/$REPO
On GCE VMs, running
gcloud source repos clone default ~/my_repo
should work automatically without an extra authentication step, as it will use the VM's service account.
If you are running on some other machine, you can download a service account .json key file from https://console.cloud.google.com and activate it with
gcloud auth activate-service-account --key-file KEY_FILE
and then run the above clone command.
In case somebody like me was trying to do this as part of a Dockerfile: after struggling for a while, I only managed to get it to work like this:
RUN gcloud auth activate-service-account --key-file KEY_FILE ; \
gcloud source repos clone default ~/my_repo
As you can see, having it as part of the same RUN command was the key; otherwise it kept failing with
ERROR: (gcloud.source.repos.clone) You do not currently have an active account selected.
Enable access to the "Cloud Source Repositories" Cloud API for the instance. You should do this while creating or editing the instance in the Admin console
From a shell inside the instance, execute gcloud source repos clone <repo_name_in_cloud_source> <target_path_to_clone_into>
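The two steps above might be combined like this (the instance name and zone are placeholders, and the scope URL is an assumption based on read-only access to Cloud Source Repositories):

```shell
# Create a VM whose default service account can read Cloud Source Repositories
gcloud compute instances create my-instance \
  --zone=us-central1-a \
  --scopes=https://www.googleapis.com/auth/source.read_only

# Then, from a shell inside the instance:
gcloud source repos clone my-repo ~/my_repo
```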
If you are running on GCE, take advantage of the new authentication method that needs fewer lines of code.
When creating your VM instance, under "Access & Security," set "Cloud Platform" to "Enabled."
Then the authentication code is this simple:
import httplib2
from oauth2client.client import GoogleCredentials

# Load Application Default Credentials (the VM's service account on GCE)
credentials = GoogleCredentials.get_application_default()
http = credentials.authorize(httplib2.Http())
See
https://developers.google.com/identity/protocols/application-default-credentials