Authentication to GCP in terraform - google-cloud-platform

We need to create GCP resources with Terraform, but we are stuck at the terraform init stage while Terraform tries to authenticate to GCP. We have already configured our backend and obtained our service account key, but minifying the credentials JSON (removing the extra lines in the file) and exporting it to GOOGLE_CREDENTIALS doesn't work. How are you setting this value?

If you are in a local, controlled environment, you can set GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON key file. But, as discussed, key files are a bad practice security-wise. An alternative is to authenticate with gcloud auth application-default login, so you don't have to deal with key files at all.
Another alternative is to use Google Cloud Shell, which is already set up with the credentials of the authorised user opening the session.
Finally, for automated pipelines you can use Google Cloud Build, where processes run with the authentication and authorisation of the service account used by Cloud Build.
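For the question as asked, the key-file and ADC-login options look roughly like this; the paths are placeholders. Note that Terraform's google provider accepts either a path or the raw JSON contents in GOOGLE_CREDENTIALS, so minifying the file should not be necessary:

```shell
# Option 1: point Application Default Credentials at the key file
# (local, controlled environments only; placeholder path).
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/terraform-sa.json"

# Option 2: pass the raw JSON contents to the provider directly;
# the contents do not need to be minified.
export GOOGLE_CREDENTIALS="$(cat "$HOME/keys/terraform-sa.json")"

# Option 3: no key file at all; log in interactively and let ADC pick it up.
gcloud auth application-default login

terraform init
```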

Related

Gitlab CI/CD deploy to aws via aws-azure-cli authentication

When deploying to AWS from a gitlab-ci.yml file, you usually use aws-cli commands as scripts. At my current workplace, before I can use the aws-cli normally, I have to log in via aws-azure-cli and authenticate via 2FA; then my workstation is given a secret key that expires after 8 hours.
Gitlab has CI/CD variables where I would usually put the AWS_ACCESS_KEY and AWS_SECRET_KEY, but I can't create IAM role to get these. So I can't use aws-cli commands in the script, which means I can't deploy.
Is there any way to authenticate GitLab other than this? I can reach out to our cloud services team, but that will take a week.
You can configure OpenID to retrieve temporary credentials from AWS without needing to store secrets.
In my view it's actually a best practice, too, to use OpenID roles instead of storing actual credentials.
Add the identity provider for GitLab in AWS
Configure the role and its trust policy
Retrieve temporary credentials
Follow this guide: https://docs.gitlab.com/ee/ci/cloud_services/aws/ or this more detailed version: https://oblcc.com/blog/configure-openid-connect-for-gitlab-and-aws/
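In outline, the job exchanges the OIDC ID token GitLab issues to it for short-lived AWS credentials via STS. The role ARN and token variable below are placeholders; the linked GitLab guide covers the exact id_tokens configuration:

```shell
# Run inside a GitLab CI job whose id_tokens section exposes the OIDC
# token as GITLAB_OIDC_TOKEN (placeholder names and account/role ARN).
aws sts assume-role-with-web-identity \
    --role-arn "arn:aws:iam::123456789012:role/gitlab-deploy" \
    --role-session-name "gitlab-ci" \
    --web-identity-token "$GITLAB_OIDC_TOKEN" \
    --duration-seconds 3600
# The call returns temporary AccessKeyId/SecretAccessKey/SessionToken
# values to export before running further aws-cli commands.
```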

Authenticating to Pub/Sub in a Colab notebook via the gcloud auth command

I would like to authenticate to Pub/Sub from a Colab notebook using the !gcloud auth command. However, when I run that command, I am able to authenticate to Cloud SQL as well as to GCP buckets, but I get the following error message when I run the publisher = pubsub.PublisherClient() command:
DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
I want to avoid uploading my cloud credentials to a Google Drive folder to authenticate with Pub/Sub, since I want to be able to share this notebook across my organization so that other users with the correct access rights can run the notebook directly without needing to upload their own service account credentials. Is there a way I can do this? Thanks in advance.
Use:
from google.colab import auth
auth.authenticate_user()
As in this example showing Google Cloud Storage access:
https://colab.research.google.com/notebooks/io.ipynb#scrollTo=NQX0hbfYaEKc

How to set up Google Cloud Shell properly?

I know this question is probably a bit vague. I was trying to run one of the examples of the Google NLP library in Google Cloud Shell.
I have zero experience with APIs, JSON, Node.js... I don't understand what they are or how to use them.
Please help
Here is the snapshot of the error:
The error message means that you are using user credentials instead of service account credentials.
When you connect to Google Cloud Shell, you are using your Google Accounts User Credentials. Those credentials are the ones that you used to log in to the Google Cloud Console. When you run an application in Google Cloud Shell, your application is using those credentials unless you explicitly specify different credentials.
The solution is to create a service account in the Google Cloud Console, and then use that service account's credentials in your application.
Google Cloud Service Accounts
When you do not specify the application credentials, the Google Client libraries use a method to locate credentials called ADC (Application Default Credentials). I wrote an article that might help you understand ADC:
Google Cloud Application Default Credentials
The simplest method for you is to set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path of the service account key file before running your application. Change the path below to point to where the key is stored on Cloud Shell. You will first need to create the service account, download its key, and then upload the key to Cloud Shell.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/service-account.json"
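Creating and downloading the key can itself be done from Cloud Shell; the account and project names below are placeholders:

```shell
# Placeholder names: replace my-sa and my-project with your own.
gcloud iam service-accounts create my-sa \
    --project=my-project \
    --display-name="NLP example"

# Create a JSON key for it directly in Cloud Shell's home directory.
gcloud iam service-accounts keys create "$HOME/service-account.json" \
    --iam-account="my-sa@my-project.iam.gserviceaccount.com"

# Point Application Default Credentials at the key.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/service-account.json"
```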
Managing files with Cloud Shell
This link will provide more information on how to write applications that use service accounts.
Setting Up Authentication for Server to Server Production Applications

I am using a GCP service account, but when calling Dialog Flow API I get an error

Here is the error:
Your application has authenticated using end user credentials from the
Google Cloud SDK or Google Cloud Shell which are not supported by the
dialogflow.googleapis.com. We recommend that most server applications
use service accounts instead. For more information about service
accounts and how to use them in your application, see
https://cloud.google.com/docs/authentication/.
Many of the client libraries pull from the Application Default Credentials (ADC); a summary of how they're checked is provided at that link. Essentially, the libraries check environment variables for a path and pull credentials from that location. This error message means you're using a user account, not a service account.
Most commonly, you logged in once using gcloud auth login, and even though you provided your service account, the libraries are still pulling from the Application Default location.
As you did, the way to associate a specific service account with gcloud is gcloud auth activate-service-account --key-file <path>
Alternatively, to use the true application default, you can run gcloud auth application-default login
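The distinction matters because gcloud auth activate-service-account changes the account that gcloud commands themselves use, not the Application Default Credentials the client libraries read. A sketch, with a placeholder key path:

```shell
# Makes gcloud commands run as the service account...
gcloud auth activate-service-account --key-file="$HOME/dialogflow-sa.json"

# ...but client libraries still resolve ADC separately, so also
# point ADC at the same key before starting your application:
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/dialogflow-sa.json"
```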

Google Cloud KMS server install

I want to deploy a node application on a google cloud compute engine micro instance from a source control repo.
As part of this deployment I want to use KMS to store database credentials rather than having them in my source control. To get the credentials from KMS I need to authenticate on the instance with GCLOUD in the first place.
Is it safe to just install the GCloud CLI as part of a startup script and let the default service account handle the authentication? Then use this to pull in the decrypted details and save them to a file?
The docs walk through development examples, but I've not found anything about how this should work in production, especially as I obviously don't want to store the gcloud credentials in source control either.
Yes, this is exactly what we recommend: use the default service account to authenticate to KMS and decrypt a file with the credentials in it. You can store the resulting data in a file, but I usually either pipe it directly to the service that needs it or put it in tmpfs so it's only stored in RAM.
You can check the encrypted credentials file into your source repository, store it in Google Cloud Storage, or elsewhere. (You create the encrypted file by using a different account, such as your personal account or another service account, which has wrap but not unwrap access on the KMS key, to encrypt the credentials file.)
If you use this method, you have a clean line of control:
Your administrative user authentication gates the ability to run code as the trusted service account.
Only that service account can decrypt the credentials.
There is no need to store a secret in cleartext anywhere.
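A minimal sketch of that round trip, assuming a key ring and key named app-keyring and db-creds in the global location (all names are placeholders):

```shell
# On a trusted workstation (account with encrypt-only access): wrap the file.
gcloud kms encrypt \
    --location=global --keyring=app-keyring --key=db-creds \
    --plaintext-file=db-credentials.json \
    --ciphertext-file=db-credentials.json.enc   # safe to check in

# On the instance (default service account with decrypt access):
# decrypt straight into tmpfs so the plaintext lives only in RAM.
gcloud kms decrypt \
    --location=global --keyring=app-keyring --key=db-creds \
    --ciphertext-file=db-credentials.json.enc \
    --plaintext-file=/dev/shm/db-credentials.json
```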
Thank you for using Google Cloud KMS!