I have .crt, .csr and .key files in the ssl_cert directory of my gcloud VM. While creating an SSL certificate I am using the following command -
gcloud compute ssl-certificates create SSL_CERTIFICATE \
  --certificate /home/USER/ssl_cert/ssl.crt \
  --private-key /home/USER/ssl_cert/ssl.key
and after executing the above command I get the error below -
ERROR: (gcloud.compute.ssl-certificates.create) Some requests did not
succeed:
- Insufficient Permission
Can someone please help in resolving this basic error?
Run this gcloud command in your SSH terminal:
gcloud auth login
A login link will be generated for you in the SSH session. Click on it; it will ask you to log in with the Google account that owns the project. After you log in, a code will be generated. Copy it and paste it back into your SSH terminal on the line where you have
Enter verification code :
Then hit the enter button, permission will be granted.
Try to rerun your initial command that returned insufficient permission.
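Before rerunning, it can help to confirm which account gcloud will actually use. A minimal sketch; the account names and output below are illustrative, not from the original post, and the gcloud line is left commented since it needs a real session:

```shell
# List credentialed accounts; the active one is marked with '*':
# gcloud auth list
# Illustrative output captured into a variable for parsing:
sample='  user-a@example.com
* user-b@example.com'
# Extract the active account (the line starting with '*'):
active=$(printf '%s\n' "$sample" | sed -n 's/^\* *//p')
echo "$active"
```

If the active account is not the one that owns the project, repeat gcloud auth login with the right account.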
Your issue seems to be related to this post:
gcloud compute list networks error: "Some requests did not succeed: - Insufficient Permission"
The solution suggested in the above post is to run the command gcloud auth login. This will generate an authentication URL. Copy and paste this URL into your browser. It should return an authentication code. Enter this code in your command shell.
"Insufficient permission" is returned by the web interface if you have not yet validated the domain for which you are uploading the certificate. I.e. if you are uploading a certificate for example.com you must have demonstrated that you own example.com.
To do this, using the GCloud web interface navigate to APIs & Services -> Credentials, and then click on Domain Verification. Then add a domain. This will take you to a set of tools that let you validate that you own the domain. (In my case I added a TXT record to the DNS configuration.)
With this done, you can upload a certificate for that domain.
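Before retrying the upload, you can check that the verification TXT record is actually visible. A small sketch; the domain and token value are hypothetical, and the dig line is commented since it needs real DNS:

```shell
# Query the TXT records for the domain (hypothetical domain):
# dig +short TXT example.com
# Illustrative output captured for checking:
txt='"google-site-verification=abc123"'
case "$txt" in
  *google-site-verification*) result="verification record found" ;;
  *) result="not found" ;;
esac
echo "$result"
```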
As your question suggests, you don't have full access to your server. Please ask your hosting provider to grant you full access.
I use a workflow to create a batch job using a docker image hosted in a docker registry.
All of this happens within the same google cloud project.
My batch job fails with this error :
"docker: Error response from daemon: Head "https://us-west1-docker.pkg.dev/v2/entity/docker-registry/image-name/manifests/latest": denied: Permission "artifactregistry.repositories.downloadArtifacts" denied on resource "projects/project-id/locations/us-west1/repositories/docker-registry" (or it may not exist).
See 'docker run --help'.
From the Google documentation I understand that the Compute Engine default service account doesn't have roles/artifactregistry.admin: jobs default to using the Compute Engine default service account.
I get the same error after giving the role to the service account :
gcloud projects add-iam-policy-binding project-id \
--member=serviceAccount:compute@developer.gserviceaccount.com \
--role=roles/artifactregistry.admin
While digging into service accounts I found another service account and also gave it the role: service-xxxx@gcp-sa-cloudbatch.iam.gserviceaccount.com.
It does not solve the problem.
How can I see which service account is used ?
Can I see logs about denied permissions?
The error occurs when you try to pull or push an image in a repository whose location-specific hostname has not yet been authenticated and added to the credential helper. You may refer to Setting up authentication for Docker. You should also check and confirm the service account, to make sure you are still impersonating the correct one; run the command below, as mentioned in the document:
gcloud auth list
This command will show the active account, along with the other
accounts that are authorized to access your Google Cloud project. The
active account will be marked with an asterisk (*).
Try running the authentication with a command that specifies the location of your repository, using the configure-docker command from the auth group:
gcloud auth configure-docker <location>-docker.pkg.dev
And then try pulling the Docker image again.
Refer to Authenticating to a repository for more information; you can also see the permission-denied logs in Cloud Logging for more details.
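Putting the steps together, a minimal sketch; the location is taken from the error message in the question, and the gcloud/docker lines are left commented since they need a real project:

```shell
# Build the Artifact Registry hostname from the repository location:
location="us-west1"
host="${location}-docker.pkg.dev"
# Authenticate Docker against that hostname, then retry the pull:
# gcloud auth configure-docker "$host"
# docker pull "$host/project-id/docker-registry/image-name:latest"
echo "$host"
```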
When submitting a Cloud Build run via gcloud builds submit ... I'm getting a forbidden error saying I don't have access to the bucket(s). There are two places where buckets are normally involved in submitting a Cloud Build: the staging bucket and the logs bucket. I specified both (the same bucket, just different folders) as buckets I do have access to, so the command looks like this:
gcloud builds submit \
  --gcs-log-dir $my_bucket/logs \
  --gcs-source-staging-dir $my_bucket/source
The error I get is:
ERROR: (gcloud.builds.submit) 403: The user is forbidden from accessing the bucket [$my_bucket]: Please check your organization's policy.
I re-ran with --log-http and --verbosity debug and the expanded error shows the real reason:
DEBUG: https://storage.googleapis.com "GET /storage/v1/b/$my_bucket?alt=json"
...
{
"error": {
"code": 403,
"message": "$user does not have serviceusage.services.use access to the Google Cloud Project."
}
}
I did some digging and see that this error shows up when supplying a quota/billing project with the request (in addition to not having the Service Usage Consumer role). I confirmed this by inspecting the request's HTTP headers, which included X-Goog-User-Project: $my_project.
What's weird is that I have access to objects in this bucket and can run gsutil/HTTP commands just fine, which use the same API endpoints; the difference is that gsutil doesn't include that user project in the request.
Is there a way to submit a build that doesn't include the project so that I don't need serviceusage.services.use permission? I tried unsetting the project in my gcloud config but it prompted me that I needed to either set it or pass it with --project flag.
edit: the bucket isn't "requester pays" enabled either which is why gsutil and client libraries work fine
The only reason you may be having this error is that billing is not enabled for the project.
I enabled it while trying the tutorial, by clicking "Create a Cloud Storage Bucket" under Getting started on the left side of the Dashboard. Just follow the instructions and you will see "Enable Billing". Once you have enabled billing, you don't need to finish the tutorial. Go back to your work and run
$ gcloud builds submit
and it's done!
I'm not sure you can run a cloud build without specifying a project. As far as I know, gcloud commands run within a project so it's needed.
If you want to use a different service account you can use service account impersonation by adding the --impersonate-service-account flag.
For this gcloud invocation, all API requests will be made as the given service account instead of the currently selected account.
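As a sketch of what that invocation might look like; the service account name here is hypothetical, and the gcloud line is commented since it needs a real project and bucket:

```shell
# Hypothetical service account to impersonate for the build:
project="project-id"
sa="builder@${project}.iam.gserviceaccount.com"
# gcloud builds submit \
#   --impersonate-service-account="$sa" \
#   --gcs-log-dir "$my_bucket/logs" \
#   --gcs-source-staging-dir "$my_bucket/source"
echo "$sa"
```

The impersonated service account still needs the relevant storage permissions on the bucket, but your own user then no longer needs serviceusage.services.use.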
According to the GCP documentation:
To run gcloud builds commands, users with only cloudbuild.builds.viewer or cloudbuild.builds.editor roles also require the serviceusage.services.use permission. To give this permission to the user, grant them the serviceusage.serviceUsageConsumer role.
Edit your user on IAM & Admin choosing your user and type "Service Usage Consumer".
However, review your policies and roles, because I believe this option is for clean users created from scratch without any permissions other than storage object roles.
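The grant described above can be done from the CLI as well. A minimal sketch; the project id and user email are hypothetical placeholders, and the gcloud line is commented since it needs a real project:

```shell
# Grant the Service Usage Consumer role to a user (hypothetical values):
project="project-id"
member="user:someone@example.com"
role="roles/serviceusage.serviceUsageConsumer"
# gcloud projects add-iam-policy-binding "$project" \
#   --member="$member" --role="$role"
echo "$role"
```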
Problem: I can't create managed-zones using the google cloud console.
What did I do?
Created a service account
Add role DNS Administrator
Created a json key
Executed the command
gcloud auth activate-service-account test235643@developer-dns-test.iam.gserviceaccount.com --key-file=/home/d.reznikov/Downloads/developer-dns-test-5a2088479459.json --project=developer-dns-testing
Executed the command
gcloud dns managed-zones create my_zone --dns-name my.zone.com. --description "My zone!"
I get error
ERROR: (gcloud.dns.managed-zones.create) User [test235643@developer-dns-test.iam.gserviceaccount.com] does not have permission to access project [developer-dns-test] (or it may not exist): Forbidden
Please help, maybe something else needs to be installed in the google cloud settings?
It looks like the project name is slightly different between the command used to activate the service account and the error message.
Command:
gcloud auth activate-service-account test235643@developer-dns-test.iam.gserviceaccount.com --key-file=/home/d.reznikov/Downloads/developer-dns-test-5a2088479459.json --project=developer-dns-testing
Error msg:
ERROR: (gcloud.dns.managed-zones.create) User [test235643@developer-dns-test.iam.gserviceaccount.com] does not have permission to access project [developer-dns-test] (or it may not exist): Forbidden
I would double check the project name and re authorize the service account using the correct one. Then retry to create the zone.
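The mismatch is easy to miss by eye; comparing the two ids from the post directly makes it obvious:

```shell
# Project id passed to activate-service-account:
auth_project="developer-dns-testing"
# Project id shown in the error message:
error_project="developer-dns-test"
if [ "$auth_project" = "$error_project" ]; then
  result="match"
else
  result="mismatch"
fi
echo "$result"
```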
I am learning GCP and have the following question in regarding copying object into a bucket.
In Cloud Shell I ran gsutil cp earthquakes.* gs://welynx-test1 and it errored out:
Copying file://earthquakes.csv [Content-Type=text/csv]...
AccessDeniedException: 403 Insufficient Permission
I am logged in via SSH already so I checked the identity with whoami
xenonxie#instance-1:~/training-data-analyst/CPB100/lab2b$ whoami
xenonxie
And then I proceeded with editing the bucket's permission:
I added an email address as a member with the Storage Admin role.
I then ran gcloud auth login and was prompted with a link. Clicking on the link takes me to the browser, where I log in with an email address and get a verification code to paste back into the SSH window; I then became logged in as that email user:
You are now logged in as [xenonxie@gmail.com]. Your current project
is [rock-perception-263016]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
Question1:
What I don't understand is: I still see the same whoami as below:
xenonxie#instance-1:~/training-data-analyst/CPB100/lab2b$ whoami
xenonxie
Because I added the email to the bucket's permissions as Storage Admin, I am able to save objects into that bucket.
Question2:
I believe Storage Admin is not needed; all I want is to write an object into that bucket. What is the best practice for doing that?
Thank you very much.
Question 1:
You see the same whoami because that command is run in the instance's shell, so it answers for the session on instance-1.
Question 2:
Yes, you are right. Under the principle of least privilege, Storage Admin is a role with more privileges than needed, so it is not the best option.
As the use case you describe is only writing to the bucket, you can use roles/storage.objectCreator or roles/storage.legacyBucketWriter, depending on whether or not you need to list objects in the bucket.
To get more details about the roles available you can check this page
EDIT
To see the Google Cloud Platform account being used, you can run gcloud auth list; you will get a list of accounts, and the one marked as active is the one being used.
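For the write-only setup, a minimal sketch of granting the narrower role; the bucket and account names are taken from the question, and the gsutil invocation is commented and may need adjusting for your setup:

```shell
# Bucket and member from the question; role limited to object creation:
bucket="gs://welynx-test1"
member="user:xenonxie@gmail.com"
role="roles/storage.objectCreator"
# gsutil iam ch "${member}:objectCreator" "$bucket"
echo "$role"
```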
Question 1: whoami outputs the OS username you used to log in to the machine. Use the gcloud auth list command to check the currently authorized user/service account on the machine.
Question 2: the Storage Admin role is not needed. The storage.legacyBucketWriter role would be sufficient.
Hope this helps.
How can one find out which email address was used to open a GCP account, based on the server IP? I registered a GCP account months ago but can't remember which email was used.
If you can log in to the server and that server has permission to query IAM, then you can list all the IAM members of the project. From the list you can infer which email you used.
If you can log in to the server in a GCP project, you can run the following to get the project id:
curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/project/project-id"
Then, you can run
gcloud projects get-iam-policy YOUR_GCP_PROJECT_ID
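The policy output lists members of all kinds; the user: entries are the ones that correspond to email accounts. A small sketch of filtering them out; the member list below is illustrative data, not real output:

```shell
# Illustrative member list as it might appear in the IAM policy:
policy='serviceAccount:svc@project-id.iam.gserviceaccount.com
user:owner@example.com'
# Keep only the user members, stripping the "user:" prefix:
users=$(printf '%s\n' "$policy" | sed -n 's/^user://p')
echo "$users"
```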
Also, if you remember the credentials of the other Gmail account which you mentioned in the comment, you can log in to the GCP Console with those credentials and then look at all the members in the IAM section.
Hope this helps.