how to set credentials to use GCP API from Dataproc instance - google-cloud-platform

I am trying to access some credentials stored in Google Secret Manager. To access them, it's required to have credentials set up on the cluster machine where the jar is running.
I have SSHed into the master instance and seen that nothing is configured for GOOGLE_APPLICATION_CREDENTIALS.
I am curious to know how to assign GOOGLE_APPLICATION_CREDENTIALS, or any alternative that allows the use of GCP APIs that require credentials.

If you are running on Dataproc clusters, the default GCE service account should already be configured for you. If your clusters are running outside the GCP environment, you will instead want to manually set up a service account that has an editor/owner role for Google Secret Manager, download its credential key file, and point GOOGLE_APPLICATION_CREDENTIALS to it (sketched below).
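A minimal sketch of that manual setup with the gcloud CLI; the project ID and service-account name below are placeholders:

    # Placeholders: replace PROJECT_ID and the account name with your own.
    PROJECT_ID="my-project"
    gcloud iam service-accounts create secret-reader \
        --project="$PROJECT_ID" \
        --display-name="Secret Manager reader"

    # Grant access to Secret Manager (secretAccessor is narrower than editor/owner).
    gcloud projects add-iam-policy-binding "$PROJECT_ID" \
        --member="serviceAccount:secret-reader@${PROJECT_ID}.iam.gserviceaccount.com" \
        --role="roles/secretmanager.secretAccessor"

    # Download a key file and point GOOGLE_APPLICATION_CREDENTIALS at it.
    gcloud iam service-accounts keys create key.json \
        --iam-account="secret-reader@${PROJECT_ID}.iam.gserviceaccount.com"
    export GOOGLE_APPLICATION_CREDENTIALS="$PWD/key.json"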

Related

Gitlab CI/CD deploy to aws via aws-azure-cli authentication

When deploying to AWS from a gitlab-ci.yml file, you usually use aws-cli commands as scripts. At my current workplace, before I can use the aws-cli normally, I have to log in via aws-azure-cli and authenticate via 2FA; my workstation is then given a secret key that expires after 8 hours.
GitLab has CI/CD variables where I would usually put the AWS_ACCESS_KEY and AWS_SECRET_KEY, but I can't create an IAM role to get these. So I can't use aws-cli commands in the script, which means I can't deploy.
Is there any way to authenticate GitLab other than this? I can reach out to our cloud services team, but that will take a week.
You can configure OpenID to retrieve temporary credentials from AWS without needing to store secrets.
In my view it's actually a best practice, too, to use OpenID roles instead of storing actual credentials.
Add the identity provider for GitLab in AWS
Configure the role and trust
Retrieve a temporary credential (sketched below)
Follow this guide: https://docs.gitlab.com/ee/ci/cloud_services/aws/ or, for a more detailed version: https://oblcc.com/blog/configure-openid-connect-for-gitlab-and-aws/
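A minimal sketch of the last step, assuming the role ARN below is a placeholder and that GITLAB_OIDC_TOKEN is exposed to the job via the id_tokens keyword in .gitlab-ci.yml:

    # Placeholder: the role you configured to trust the GitLab identity provider.
    ROLE_ARN="arn:aws:iam::123456789012:role/gitlab-ci-deploy"

    # Exchange the GitLab-issued OIDC token for temporary AWS credentials.
    CREDS=$(aws sts assume-role-with-web-identity \
        --role-arn "$ROLE_ARN" \
        --role-session-name "gitlab-${CI_JOB_ID}" \
        --web-identity-token "$GITLAB_OIDC_TOKEN" \
        --duration-seconds 3600 \
        --query 'Credentials.[AccessKeyId,SecretAccessKey,SessionToken]' \
        --output text)

    # Export them so subsequent aws-cli commands in the job can deploy.
    export AWS_ACCESS_KEY_ID=$(echo "$CREDS" | cut -f1)
    export AWS_SECRET_ACCESS_KEY=$(echo "$CREDS" | cut -f2)
    export AWS_SESSION_TOKEN=$(echo "$CREDS" | cut -f3)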

How to auto login to GCP using gcloud cli?

We have GCP account credentials (username/password). We have installed the gcloud CLI on an Amazon Linux EC2 machine. We would like to create a script that would auto-login to the GCP account and do the below things sequentially using the gcloud CLI.
Log in to the GCP account.
Create Project and specify a meaningful project-id.
Create a service account with a meaningful ID.
Assign the owner role to the service account.
Create and download a new JSON key.
Please help us achieve this.
You should use a Service Account, not a User (username/password), for automation. The Service Account should be suitably permissioned so that it can create Projects and Service Accounts; a sketch of the flow follows.
I was unable to find a source for this (but it used to be the case that) Google monitors User Accounts for apparent use of automation (e.g. bots), and these accounts may be disabled.
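A sketch of the requested steps, assuming admin-sa.json is a placeholder key file for an admin service account allowed to create projects and service accounts:

    # Log in as the admin service account instead of a username/password.
    gcloud auth activate-service-account --key-file=admin-sa.json

    # 1. Create a project with a meaningful project ID (placeholder below).
    PROJECT_ID="my-meaningful-project-id"
    gcloud projects create "$PROJECT_ID"

    # 2. Create a service account with a meaningful ID.
    gcloud iam service-accounts create my-meaningful-sa \
        --project="$PROJECT_ID" \
        --display-name="Automation service account"

    # 3. Assign the owner role to the new service account.
    SA_EMAIL="my-meaningful-sa@${PROJECT_ID}.iam.gserviceaccount.com"
    gcloud projects add-iam-policy-binding "$PROJECT_ID" \
        --member="serviceAccount:${SA_EMAIL}" \
        --role="roles/owner"

    # 4. Create and download a new JSON key.
    gcloud iam service-accounts keys create key.json --iam-account="$SA_EMAIL"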

VM Instance Service Account can't be recognized

Although I've given the Owner role to that specific service account, I can't use its permissions from the instances that I connect to via SSH from my local machine.
I also can't upload my files to the Storage bucket which I've created in the cloud platform.
The problem might be caused by the access token not having the appropriate permission scopes to conduct the required activity. To make sure you're using the auth scope of this service account appropriately, I recommend doing the following:
Run the command in the Google documentation inside the VM to create a new key for the service account. This will create a .json file inside the current directory containing the private authentication key for the service account.
Run the command in the Google documentation to activate the service account.
Run gcloud auth list to check if this worked. In the output you should see an asterisk before the service account's name, indicating that this is the service account you are currently using.
Now refer to the Google documentation and point GOOGLE_APPLICATION_CREDENTIALS at the key file, e.g. $env:GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH" in PowerShell or export GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH" in a Linux shell (see the sketch below).
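A minimal sketch of those steps, assuming SA_EMAIL is a placeholder for your service account's email address:

    # Placeholder: the service account you granted the role to.
    SA_EMAIL="my-sa@my-project.iam.gserviceaccount.com"

    # Create a new private key for the service account (writes key.json here).
    gcloud iam service-accounts keys create key.json --iam-account="$SA_EMAIL"

    # Activate the service account, then verify it is the active account.
    gcloud auth activate-service-account "$SA_EMAIL" --key-file=key.json
    gcloud auth list   # the active account is marked with an asterisk

    # Point client libraries at the key.
    export GOOGLE_APPLICATION_CREDENTIALS="$PWD/key.json"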
Google Cloud Compute VMs have a setting for Access Scopes. This feature can limit the permissions that a service account has when attached to a virtual machine.
Go to the Google Cloud Console GUI, select your VM, stop the VM, and then edit Access Scopes to grant the permissions you require; the gcloud equivalent is sketched below.
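The same change from the command line, assuming a placeholder instance name and zone (access scopes can only be changed while the VM is stopped):

    gcloud compute instances stop my-vm --zone=us-central1-a
    gcloud compute instances set-service-account my-vm \
        --zone=us-central1-a \
        --scopes=https://www.googleapis.com/auth/cloud-platform
    gcloud compute instances start my-vm --zone=us-central1-a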

GKE Secrets OR Google Secret manager

Does anyone know in which cases to choose Kubernetes secrets instead of Google Secret Manager, and the reverse? What are the differences between the two?
With Kubernetes secrets (K8S Secrets), you use a built-in feature of K8S. You load your secrets into Secret objects (similar to ConfigMaps), and you mount them on the pods that require them.
PRO
If one day you want to deploy on AWS, Azure, or on-prem, still on K8S, the behavior will be the same; there is no update to perform in your code.
CONS
The secrets are only accessible from the K8S cluster; it's impossible to reuse them with other GCP services.
Note: with GKE this is not a problem, because the etcd component is automatically encrypted with a key from the KMS service to keep the secrets encrypted at rest. But it's not the same for every K8S installation, especially on-premises, where the secrets may be kept in plain text. Be aware of this aspect of security.
Secret Manager is a vault managed by Google. You have an API to read and write secrets, and the IAM service checks authorization.
PRO
It's a Google Cloud service and you can access it from any GCP service (Compute Engine, Cloud Run, App Engine, Cloud Functions, GKE, ...) as long as you are authorized to.
CONS
It's a Google Cloud-specific product; you are locked in.
You can use them together via this sync service: https://external-secrets.io/
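For a quick feel of the difference, a minimal sketch of each option (secret names and values are placeholders):

    # Kubernetes secret: stored in the cluster and mounted into pods.
    kubectl create secret generic api-keys --from-literal=API_KEY=abc123

    # Secret Manager: stored in GCP, readable from any authorized GCP service.
    echo -n "abc123" | gcloud secrets create api-key --data-file=-
    gcloud secrets versions access latest --secret=api-key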

Accessing Google Secrets from an application running on a Google Cloud VM instance - Assigning Cloud APIs to VM

I'm using Google Secrets to store API keys and other application-specific "secrets". When I deploy my application to a VM instance using a Docker container, I want it to access the Google Secrets using the VM's associated service account.
I have gotten this working using the following:
Assigning the "Secret Manager Secret Accessor" permission to the Service Account.
Giving my VM access to all APIs.
From a security perspective and recommended best practice, I don't want to give access to all APIs. The default access option doesn't work and I can't figure out from the list which options to enable to allow access to Google Secrets from my VM.
TLDR - Which Cloud API(s) do I need to give my Google Compute VM instance access to so that it can access Google Secrets without giving it access to all of the Cloud APIs?
According to the Secret Manager documentation, you need the cloud-platform OAuth scope.
Note: To access the Secret Manager API from within a Compute Engine instance or a Google Kubernetes Engine node (which is also a Compute Engine instance), the instance must have the cloud-platform OAuth scope. For more information about access scopes in Compute Engine, see Service account permissions in the Compute Engine documentation.
I'm not sure you can set this scope in the web UI, though you can set it through the command line.
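For example, the scope can be set when creating the VM from the command line (instance name, zone, and service account below are placeholders):

    gcloud compute instances create my-vm \
        --zone=us-central1-a \
        --service-account="my-sa@my-project.iam.gserviceaccount.com" \
        --scopes=https://www.googleapis.com/auth/cloud-platform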
Possible Alternative
What I do, rather than setting scopes on VMs, is create a service account specifically for the VMs (instead of using the default service account) and then give this service account access to specific resources (like the specific Secrets it should have access to, rather than all of them). When you do this in the web UI, the access scopes disappear and you are instructed to use IAM roles to control VM access. A sketch of this setup follows.
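A minimal sketch of this alternative, assuming placeholder names for the project, service account, and secret:

    # Dedicated service account for the VMs (instead of the default one).
    gcloud iam service-accounts create vm-runtime-sa \
        --project=my-project \
        --display-name="VM runtime service account"

    # Grant it access to one specific secret rather than all of them.
    gcloud secrets add-iam-policy-binding api-key \
        --project=my-project \
        --member="serviceAccount:vm-runtime-sa@my-project.iam.gserviceaccount.com" \
        --role="roles/secretmanager.secretAccessor"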