GCP yum artifact registry 403 when imported from packer instance - google-cloud-platform

I am trying to install a package from a yum repository created with GCP Artifact Registry from within a Packer instance. I can install the package if the repository grants public access to allUsers; however, it fails if the principal is limited to a service account, even though that service account has the roles/artifactregistry.admin or roles/artifactregistry.reader role. The Packer build uses the default network with the "https://www.googleapis.com/auth/cloud-platform" scope and the appropriate service_account_email and account json options.
Errors during downloading metadata for repository 'MyRepository':
- Status code: 403 for https://us-central1-yum.pkg.dev/projects/project-xyz/repo-rhel8/repodata/repomd.xml (IP: 142.250.125.82)
Error: Failed to download metadata for repo 'MyRepository': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried
I kindly request your help with this problem.

There are several possible reasons for the above error:
You need to verify that the VM has an associated service account.
Go to the VM instances page.
In the list of VMs, find your VM; on its details tab, the service account and access scopes appear under API and identity management. By default, VMs use the Compute Engine default service account; change that to your own service account as needed. Please check this document.
You also need to check that the VM's service account has read permissions on the repository, as well as the cloud-platform API access scope (see the sketch below).
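As a hedged sketch (the service account email is a placeholder; repository, location and project are taken from the question), granting read access on the repository to the VM's service account could look like:
# SA_EMAIL is a placeholder for the VM's service account
gcloud artifacts repositories add-iam-policy-binding repo-rhel8 \
    --project=project-xyz \
    --location=us-central1 \
    --member=serviceAccount:SA_EMAIL \
    --role=roles/artifactregistry.reader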

The problem was solved by installing the Artifact Registry yum plugin. I was using RHEL 8 and the yum-plugin-artifact-registry package was not found. After looking into the PR (https://github.com/GoogleCloudPlatform/artifact-registry-yum-plugin/pull/14), I found that I had to install dnf-plugin-artifact-registry instead, which is available in the default repositories, and then I was able to access my custom repo.
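For anyone else on RHEL 8, a hedged sketch of the steps (package and repository names are taken from the question and the linked PR; paths may differ on your image):
# install the dnf plugin (the yum plugin is not available on RHEL 8)
sudo dnf install dnf-plugin-artifact-registry
# generate the repo definition for the Artifact Registry repository and append it
gcloud artifacts print-settings yum \
    --project=project-xyz \
    --repository=repo-rhel8 \
    --location=us-central1 | sudo tee -a /etc/yum.repos.d/artifact-registry.repo
# then install your package as usual
sudo dnf install <your-package>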

Related

Batch cannot pull docker image from Artifact Registry

I use a workflow to create a batch job using a docker image hosted in a docker registry.
All of this happens within the same google cloud project.
My batch job fails with this error:
"docker: Error response from daemon: Head "https://us-west1-docker.pkg.dev/v2/entity/docker-registry/image-name/manifests/latest": denied: Permission "artifactregistry.repositories.downloadArtifacts" denied on resource "projects/project-id/locations/us-west1/repositories/docker-registry" (or it may not exist).
See 'docker run --help'.
From the Google documentation I understand that the Compute Engine default service account doesn't have roles/artifactregistry.admin: "Jobs default to using the Compute Engine default service account".
I get the same error after granting the role to the service account:
gcloud projects add-iam-policy-binding project-id \
--member=serviceAccount:compute@developer.gserviceaccount.com \
--role=roles/artifactregistry.admin
While digging into service accounts, I found another service account and also gave it the role: service-xxxx@gcp-sa-cloudbatch.iam.gserviceaccount.com.
It does not solve the problem.
How can I see which service account is used?
Can I see logs about denied permissions?
The error occurs when you are trying to pull or push an image on a repository for which the hostname associated with its repository location has not yet been authenticated and specified in the credential helper. You may refer to Setting up authentication for Docker. To check and confirm the service account, and make sure you are still impersonating the correct one, run the command below as mentioned in the documentation:
gcloud auth list
This command will show the active account, along with the other accounts that are authorized to access your Google Cloud project. The active account will be marked with an asterisk (*).
Try running the authentication with a command that specifies the location of your repository. You may try running the configure-docker command against the auth group:
gcloud auth configure-docker <location>-docker.pkg.dev
And then try pulling the Docker image again.
Refer to Authenticating to a repository for more information; you can also see the permission-denied entries in Cloud Logging for more details.
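As a hedged additional check (project, repository and location are taken from the error in the question; SA_EMAIL is a placeholder), you can inspect the repository's IAM policy and, if needed, grant the reader role, which includes artifactregistry.repositories.downloadArtifacts:
gcloud artifacts repositories get-iam-policy docker-registry \
    --project=project-id --location=us-west1
# SA_EMAIL is a placeholder for the service account the Batch job runs as
gcloud artifacts repositories add-iam-policy-binding docker-registry \
    --project=project-id --location=us-west1 \
    --member=serviceAccount:SA_EMAIL \
    --role=roles/artifactregistry.reader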

Not able to delete cloud composer environment

When trying to delete my Cloud Composer environment, it gets stuck complaining about insufficient permissions. I have already deleted the storage bucket, the GKE cluster, and the deployment according to this post:
Cannot delete Cloud Composer environment
And the service account is the standard compute SA.
DELETE operation on this environment failed 33 minutes ago with the following error message:
Could not configure workload identity: Permission iam.serviceAccounts.getIamPolicy is required to perform this operation on service account projects/-/serviceAccounts/"project-id"-compute@developer.gserviceaccount.com.
Even though I made the compute account a Project Owner and IAM Security Admin temporarily, it does not work.
And I've tried to delete it through the GUI, gcloud CLI and terraform without success. Any advice or things to try out will be appreciated :)
I got help from Google support. Instead of addressing the SA projects/-/serviceAccounts/"project-id"-compute@developer.gserviceaccount.com, it was apparently the default service agent, which has the format service-"project-nr"@cloudcomposer-accounts.iam.gserviceaccount.com, that needed the Cloud Composer v2 API Service Agent Extension role.
Thank you for the kind replies!
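For anyone hitting the same error, a hedged sketch of granting that role to the Composer service agent (PROJECT_ID and PROJECT_NUMBER are placeholders):
# roles/composer.ServiceAgentV2Ext is the "Cloud Composer v2 API Service Agent Extension" role
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:service-PROJECT_NUMBER@cloudcomposer-accounts.iam.gserviceaccount.com \
    --role=roles/composer.ServiceAgentV2Ext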
The iam.serviceAccounts.getIamPolicy issue seems to be more related to the credentials; your environment may be having trouble retrieving credential data.
You should set up your credentials path variable again:
export GOOGLE_APPLICATION_CREDENTIALS=fullpath.json
There is also another option you can try:
gcloud auth activate-service-account
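For reference, a hedged example of the full form of that command (the service account email and key file path are placeholders):
gcloud auth activate-service-account SA_NAME@PROJECT_ID.iam.gserviceaccount.com \
    --key-file=/path/to/key.json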
You can also add the credentials to your Terraform provider configuration:
provider "google" {
credentials = file(var.service_account_file_path)
project = var.project_id
}
Don't forget that you need the correct roles to delete the Composer environment.
For more details about it you can check:
https://cloud.google.com/composer/docs/delete-environments#gcloud
https://registry.terraform.io/providers/hashicorp/google/latest/docs/resources/composer_environment
https://cloud.google.com/composer/docs/how-to/access-control?hl=es_419

Error on trying to add Private key with SHA-2 to AWS Opsworks

I am using OpsWorks Chef 11; it was working fine until 15 March 2022.
Now I am getting:
ERROR: You’re using an RSA key with SHA-1, which is no longer allowed. Please use a newer client or a different key type.
Please see Improving Git protocol security on GitHub | The GitHub Blog 2 for more information.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
I recreated the key with SHA-2 and updated it in GitHub, but I am unable to update it in OpsWorks.
Is there any way to pass the new SHA-2 key to OpsWorks?
If you mean accessing a GitHub repository using an SSH URL from OpsWorks, then the relevant documentation would be "AWS OpsWorks / Using Git Repository SSH Keys".
Reminder: AWS OpsWorks Stacks does not support SSH key passphrases.
Enter the private key in the Repository SSH Key box when you add an app or specify a cookbook repository, and select Git under Source Control.
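As a hedged sketch (the key type and file name are assumptions; use whatever your Git host and the OpsWorks agent both accept), you could generate a replacement key without a passphrase and register both halves:
# -N "" creates the key without a passphrase (OpsWorks Stacks does not support passphrases)
ssh-keygen -t ecdsa -b 521 -N "" -f ~/.ssh/opsworks_deploy_key
cat ~/.ssh/opsworks_deploy_key.pub   # add this as a deploy key on the GitHub repository
cat ~/.ssh/opsworks_deploy_key       # paste this into the Repository SSH Key box in OpsWorks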

Unable to create Composer environment [GCP]

I am trying to create basic Composer environment:
image version: 1.17.8/2.1.4
using a service account with the Composer Worker (composer.worker) role
my own user has the Project Owner role
public ip
All my attempts failed with the following error:
Http error status code: 400
Http error message: BAD REQUEST
Errors in: [Web server]; Error messages:
The caller does not have permission
Required 'deploymentmanager.typeProviders.create' permission for 'projects/<my-project>/global/typeProviders/europe-west2-<name-id>-addons-gke-typer'
deploymentmanager.typeProviders.create is covered by the Deployment Manager Type Editor role, so I added this role to both my account and the service account, but the error remains the same.
Cloud Composer Service Agent account is present in the project without any modifications to its permissions.
Is there anything else I can check or something that I missed during the set up?
For an account (whether a user account or a service account) to be able to create a Composer environment, the account must have the composer.environments.create permission.
And according to Google Cloud's documentation on Cloud Composer Access Control,
The Composer Worker role provides the permissions necessary to run a Cloud Composer environment VM and is intended for service accounts.
The Composer Worker role is not intended for the creation of environments; thus, it does not have the composer.environments.create permission.
If you want your service account to be able to create a Composer environment, you will need to assign it the Composer Administrator role, which has the composer.environments.create permission needed.
You may refer to Access Control for Cloud Composer for the complete list of permission for Composer Worker, Composer Administrator and other Composer related roles.
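A hedged sketch of granting that role to the service account (PROJECT_ID and SA_NAME are placeholders):
# roles/composer.admin is the Composer Administrator role
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com \
    --role=roles/composer.admin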

Error in creating google datalab instance

I'm trying to follow the Datalab: Notebook in the Cloud video, and when executing the datalab create ai-adventures command, I encountered this error.
ERROR: (gcloud.compute.instances.create) Could not fetch resource:
- The user does not have access to service account '*@compute-system.iam.gserviceaccount.com'. User: '*@gmail.com'. Ask a project owner to grant you the iam.serviceAccountUser role on the service account
A few things to note:
I'm the project owner.
I've tried to create a new service account tutorial@*.iam.gserviceaccount.com and that didn't work.
Any advice would be greatly appreciated!
Edit 1
Below is the information; as you can see from gcloud auth list, I'm the owner, yet it tells me that I don't have access.
~ ⌚ 18:40:34
$ datalab create ai-adventures-3 --machine-type=n1-standard-4 --zone=us-central1-b
Creating the disk ai-adventures-3-pd
Creating the instance ai-adventures-3
ERROR: (gcloud.compute.instances.create) Could not fetch resource:
- The user does not have access to service account 'service-510602609611@compute-system.iam.gserviceaccount.com'. User: 'XXX@gmail.com'. Ask a project owner to grant you the iam.serviceAccountUser role on the service account
A nested call to gcloud failed, use --verbosity=debug for more info.
~ ⌚ 15:02:59
$ gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* XXX@gmail.com
AAA@gmail.com
To set the active account, run:
$ gcloud config set account `ACCOUNT`
~ ⌚ 15:03:45
$
I tried to reproduce your error in my own project using different ways to connect to it, with different users and service accounts.
And I found something that may help you.
First, I created a new VM instance with OS debian-9-drawfork-v20200207.
I logged in to this new instance through SSH
But when I attempted to create this datalab instance with the command
datalab create --verbosity=debug example-datalab-2 --machine-type n1-standard-1
I received an error.
Then I ran the following command:
gcloud auth list
And I received something like:
Credentialed Accounts
ACTIVE ACCOUNT
* XXXXX@developer.gserviceaccount.com
To set the active account, run:
$ gcloud config set account `ACCOUNT`
This means I was trying to create the new Datalab instance with that service account, but the account doesn't have the roles needed to create the notebook instance.
Then I changed the user, via the gcloud auth login command, to an account with the Owner role, re-ran the datalab create command, and it worked.
To corroborate that it was working well, I used the command
datalab connect example-datalab-2
And I received something like
Connecting to example-datalab-2.
This will create an SSH tunnel and may prompt you to create an rsa key pair. To manage these keys, see https://cloud.google.com/compute/docs/instances/adding-removing-ssh-keys
Waiting for Datalab to be reachable at http://localhost:8081/
This tool needs to create the directory [/home/directory/.ssh] before
being able to generate SSH keys.
Do you want to continue (Y/n)? y
Generating public/private rsa key pair.
.
.
.
Updating project ssh metadata...done.
Waiting for SSH key to propagate.
The connection to Datalab is now open and will remain until this command is killed.
Click on the *Web Preview* (square button at top-right), select *Change port > Port 8081*, and start using Datalab.
Then, if I access http://localhost:8081/, I can see the Datalab interface.
It is worth mentioning that I received another error message on my first attempt:
ERROR: (gcloud.source.repos.list) User [user#example.com] does not have permission to access project [myproject] (or it may not exist): Cloud Source Repositories API has not been used in project xxxxxxxx before or it is disabled.
I fixed this issue by enabling the service with the command
gcloud beta services enable sourcerepo.googleapis.com
On the other hand, so that you can troubleshoot this issue more accurately, I recommend re-running the command with the following debug flag to help diagnose the problem:
datalab create --verbosity=debug datalab-instance-name
Also, I have found 2 guides that can help you with your task:
Quickstart guide that shows you how to use the datalab command line tool to set up and open Google Cloud Datalab.
Create a new notebook instance guide.
I hope you find this information useful.
Edit 1
Regarding the service account you mentioned, tutorial@*.iam.gserviceaccount.com, I've found the following document which says that you can use a service account instead of the default service account to create your Datalab instance, but you need to add the following roles:
roles/compute.instanceAdmin.v1
roles/iam.serviceAccountUser
So, please check that your SA has these roles, and then you can run the same command with --service-account, something like:
datalab create ai-adventures --service-account=tutorial@*.iam.gserviceaccount.com
You could see more information in the following link.
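A hedged sketch of adding those roles (PROJECT_ID and YOUR_USER are placeholders; the service account name follows the one from your question); note that the original error also asks for iam.serviceAccountUser on the service account itself:
# project-level role for the service account that will run the Datalab VM
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=serviceAccount:tutorial@PROJECT_ID.iam.gserviceaccount.com \
    --role=roles/compute.instanceAdmin.v1
# let your user act as that service account
gcloud iam service-accounts add-iam-policy-binding tutorial@PROJECT_ID.iam.gserviceaccount.com \
    --member=user:YOUR_USER@gmail.com \
    --role=roles/iam.serviceAccountUser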
This question is more interesting than I thought, and I've encountered several issues using Datalab. We should take into consideration that the video you mentioned was posted on Jan 11, 2018, and as Google Cloud Platform is evolving very fast, something that worked on that date could have completely changed two years later.
Given this, I recommend using the AI Platform Notebooks page, since it has similar features to Datalab, and according to the following documentation it was released as Generally Available on March 31st, 2020.
Upon trying it out, it seems that it could fit your use case, and it also has the capability to connect to BigQuery through the use of the R notebook.