Is it possible to share Google OAuth credentials between different services?

Let's say you have a web application that accesses (via API) a user's Google Drive files at a set time each week and performs some kind of task on them.
The user grants authorization to the web application via its website (which is hosted on App Engine). However, the weekly, scheduled queries to Google Drive will be carried out by Cloud Scheduler.
Is it possible, then, for Cloud Scheduler to use the same credentials (access and refresh tokens) obtained by the web application in the first instance?
Can the credentials, for example, be stored in a Cloud Storage bucket that is accessible to both the application and Cloud Scheduler?
Or is there another means of accomplishing this?
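One way to picture the bucket idea is sketched below. Note that Cloud Scheduler itself only fires an HTTP or Pub/Sub target; the handler behind that target is what would run this code. This is a minimal sketch under assumptions: the bucket and object names are hypothetical, and it assumes the web app saved the authorized-user JSON after the OAuth flow.

```python
# Hypothetical sketch: the scheduled job loads the user's stored OAuth
# credentials (saved earlier by the web app) and refreshes the access token.
import json

from google.auth.transport.requests import Request
from google.cloud import storage
from google.oauth2.credentials import Credentials

BUCKET = "my-shared-token-bucket"      # assumption: bucket both services can read
BLOB = "users/alice/drive-token.json"  # assumption: where the web app saved the token

def load_user_credentials() -> Credentials:
    blob = storage.Client().bucket(BUCKET).blob(BLOB)
    info = json.loads(blob.download_as_text())
    creds = Credentials.from_authorized_user_info(info)
    if creds.expired and creds.refresh_token:
        creds.refresh(Request())  # exchange the refresh token for a fresh access token
    return creds
```

Storing the token in Secret Manager rather than a plain bucket is worth considering, since it gives audited, fine-grained access to the secret.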

Related

Secure way to upload files to GCP Cloud Storage

We're building machines that include a component which uploads images captured by the camera to Google Cloud Storage. For this purpose, what I've done is:
Create a service account for each machine.
Create a custom role with these permissions:
storage.objects.create
storage.buckets.get
storage.objects.get
Apply this role to that service account.
Download the JSON credentials key file and use this file with a Python script (in which I specify the bucket name) to upload images to Cloud Storage.
Is this way of doing things efficient and secure given that we only ship 2-3 machines each month?
Also, I will have to ship a JSON key file with each machine. If the above method is valid, is this fine, or is there a way to hide this key file?
Your case isn't so simple!
Firstly, if you want to put a service account in each machine, you will hit a limit one day (you are limited to 100 service accounts per project). And using the same service account, or the same key, for every machine is too dangerous.
Secondly, your use case sounds like an IoT use case, where you have a lot of devices on the edge communicating with the cloud. But Pub/Sub messages are limited to 10 MB max, and the IoT Core solution doesn't fit your case.
The two remaining solutions are based on the same principle:
Make an endpoint public (Cloud Run, Cloud Functions, App Engine, or whatever you want).
Call this endpoint from each machine with its own token (i.e., a string, encrypted or not).
Check the token; if it is valid, you can (here are the two alternatives):
Create an access token (a short-lived token) on a service account with the minimal permissions for the machine's usage, and send it back to the machine. The machine will use it to call the Google Cloud APIs, such as the Cloud Storage API. The advantage of this solution is that you will be able to use the access token to reach other GCP APIs in the future, if your use case and your machine updates require them.
Create a signed URL and send it back to the machine. The machine then uploads the file to this URL. The advantage is the strict limitation to Cloud Storage; no other GCP service can be reached. A sketch of this alternative follows below.
The main issue with these two solutions is that they require a public endpoint, so you are exposed to attacks on it. You can protect it behind a load balancer and mitigate attacks with Cloud Armor. Think also about limiting the scalability of your public endpoint, to prevent useless expenses in case of attack.
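As an illustration of the signed-URL alternative, here is a minimal server-side sketch. The bucket and object names are the caller's to choose, and it assumes the endpoint runs as a service account able to sign (e.g., with a key file, or the IAM signBlob permission on Cloud Run/Cloud Functions):

```python
# Hypothetical sketch: after validating the machine's token, return a
# short-lived V4 signed URL the machine can PUT its image to.
import datetime

from google.cloud import storage

def make_upload_url(bucket_name: str, object_name: str) -> str:
    blob = storage.Client().bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=datetime.timedelta(minutes=15),  # short-lived on purpose
        method="PUT",
        content_type="image/jpeg",
    )
```

The machine then issues a plain HTTP PUT of the image bytes to that URL, with the matching Content-Type header; no Google credentials ever leave the server.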

How to properly set up Google Cloud Shell?

I know this question is probably a bit vague. I was trying to run one of the examples of the Google NLP library in Google Cloud Shell.
I have zero experience with using APIs, JSON, Node.js... I don't understand what they are or how to use them.
Please help.
Here is a snapshot of the error: [screenshot not included]
The error message means that you are using user credentials instead of service account credentials.
When you connect to Google Cloud Shell, you are using your Google Accounts User Credentials. Those credentials are the ones that you used to log in to the Google Cloud Console. When you run an application in Google Cloud Shell, your application is using those credentials unless you explicitly specify different credentials.
The solution is to create a service account in the Google Cloud Console. Then, in your program, use the service account credentials for your application.
Google Cloud Service Accounts
When you do not specify the application credentials, the Google Client libraries use a method to locate credentials called ADC (Application Default Credentials). I wrote an article that might help you understand ADC:
Google Cloud Application Default Credentials
The simplest method for you is to set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path of the service account key file before running your application. Change the path below to point to where the service account key is stored on Cloud Shell. You will need to first create the service account, download its key file, and then upload it to Cloud Shell.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/service-account.json"
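With that variable set, the client library should pick up the key automatically. A minimal sketch of an NLP call (assuming the google-cloud-language package is installed; the sample text is made up):

```python
# Minimal sketch: the client finds the key file via Application Default
# Credentials (ADC) once GOOGLE_APPLICATION_CREDENTIALS is exported.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()  # picks up ADC automatically
doc = language_v1.Document(
    content="Google Cloud Shell is handy.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)
print(client.analyze_sentiment(document=doc).document_sentiment.score)
```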
Managing files with Cloud Shell
This link will provide more information on how to write applications that use service accounts.
Setting Up Authentication for Server to Server Production Applications

Authenticate Google Storage object with access token in python

I am new to Google Cloud. I am trying to access Google buckets to upload files. I use a Google Storage client object to access the bucket programmatically in Python. I am able to authenticate the storage object with 'key.json'. But I am unsure, when the application runs in the cloud, how it will access the 'key.json' file securely. Also, is there a way to authenticate the storage object using an access token in Python?
Thanks in advance!
But I am unsure, when the application runs in the cloud, how it will access the 'key.json' file securely.
Review the details that I wrote below. Once you have selected your environment, you might not need to use a service account JSON file at all, because the metadata server is available to provide your code with credentials. This is the best case and the most secure. On my personal website, I have written many articles that show how to create, manage, and store Google credentials and secrets.
Also, is there a way to authenticate the storage object using an access token in Python?
All access is via an OAuth Access Token. The following link shows details using the metadata server which I cover in more detail below.
Authenticating applications directly with access tokens
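For the second question specifically: yes, the Python client will accept a bare access token wrapped in a Credentials object. A minimal sketch (the project, bucket, file names, and the token value are placeholders; the token must come from somewhere, such as the metadata server):

```python
# Sketch: building a Cloud Storage client from a raw OAuth access token.
from google.cloud import storage
from google.oauth2.credentials import Credentials

access_token = "ya29...."  # placeholder: obtained elsewhere (metadata server, etc.)
creds = Credentials(token=access_token)
client = storage.Client(project="my-project", credentials=creds)
client.bucket("my-bucket").blob("photo.jpg").upload_from_filename("photo.jpg")
```

Keep in mind that access tokens typically expire after about an hour, so this pattern suits short-lived jobs.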
There are three items to consider:
My code is not running in Google Cloud
My code is running in Google Cloud on a "compute" type of service with access to the metadata server
My code is running in Google Cloud without access to the metadata server.
1) My code is not running in Google Cloud
This means your code is running on your desktop or even in another cloud such as AWS. You are responsible for providing the method of authorization. There are two primary methods: 1) Service Account JSON key file; 2) Google OAuth User Authorization.
Service Account JSON key file
This is what you are using now with key.json. The credentials are stored in the file and are used to generate an OAuth Access Token. You must protect that file, as it contains your Google Cloud secrets. You can specify key.json directly in your code or via the environment variable GOOGLE_APPLICATION_CREDENTIALS, as sketched below.
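Both options, in a minimal Python sketch (file name as in the question):

```python
# Sketch: two equivalent ways to point the client at a key file.
from google.cloud import storage

# Option 1: name the key file explicitly in code.
client = storage.Client.from_service_account_json("key.json")

# Option 2: export GOOGLE_APPLICATION_CREDENTIALS=key.json beforehand
# and let Application Default Credentials find it.
client = storage.Client()
```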
Google OAuth User Authorization
This method requires the user to log in to Google Accounts requesting an OAuth scope for Cloud Storage. The end result is an OAuth Access Token (just like a Service Account) that authorizes access to Cloud Storage.
Getting Started with Authentication
2) My code is running in Google Cloud on a "compute" type of service with access to the metadata server
Notice the words "metadata server". For Google Cloud compute services, Google provides a metadata server that supplies applications running on that compute service (Compute Engine, Cloud Functions, Cloud Run, etc.) with credentials. If you use the Google SDK client libraries in your code, the libraries will automatically select the credentials for you. The metadata server can be disabled (denied access through role/scope removal), so you need to evaluate what you are running on.
Storing and retrieving instance metadata
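For reference, the credentials come from a well-known HTTP endpoint; here is a sketch of fetching a token by hand (normally the client libraries do this for you):

```python
# Sketch: asking the metadata server for an access token. This only works
# on GCP compute services (GCE, Cloud Run, Cloud Functions, ...).
import requests

resp = requests.get(
    "http://metadata.google.internal/computeMetadata/v1/instance/"
    "service-accounts/default/token",
    headers={"Metadata-Flavor": "Google"},  # required, or the request is refused
)
print(resp.json()["access_token"])
```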
3) My code is running in Google Cloud without access to the metadata server.
This is a similar scenario to #1. However, now you are limited to using only a service account, unless this is a web-server type of service that can present the Google Accounts authorization flow to the user.

Using OAuth2 tokens for interactive usage of GCP services instead of service account (keys)

In order to limit the number of service accounts to manage, as well as the handling of their keys, I'm exploring other ways of accessing GCP resources from a developer laptop or desktop, so I can run ad-hoc scripts or interactive programs (e.g., Jupyter notebooks) that access GCP services.
Running gcloud auth application-default login generates, after authenticating via a web browser, a refresh token that can be used to obtain and renew the access tokens used to interact with GCP services.
The workflow I'm following is this:
Run gcloud auth application-default login. This generates a JSON file on my disk that contains the refresh token.
Export the JSON file location as the GOOGLE_APPLICATION_CREDENTIALS env variable:
GOOGLE_APPLICATION_CREDENTIALS=/Users/my.username/.config/gcloud/application_default_credentials.json
Use that file to authenticate via the Google auth library and interact with different GCP services, as sketched below.
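A minimal sketch of that last step (the cloud-platform scope is an assumption; most client libraries do the equivalent implicitly):

```python
# Sketch: letting google-auth pick up the ADC file and exchange the
# refresh token for a short-lived access token.
import google.auth
from google.auth.transport.requests import Request

creds, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
creds.refresh(Request())  # uses the stored refresh token
print(project, creds.token[:16], "...")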
This is convenient, as it reduces the need to circulate, secure, and, if needed, share service account key files among team members. However, I have noticed that the refresh token provided does not expire and remains valid indefinitely.
Unless I'm missing something here, this makes the application_default_credentials.json file as sensitive as a service account key. If it gets lost or compromised, it can be used to get access tokens without the need to re-authenticate, which is fairly insecure, IMO.
We're aware that GCP security best practices recommend using service accounts (and their keys) for service-to-service workloads. The scenario I'm describing is for ad-hoc development/testing of code from a developer's or engineer's laptop. We think that forcing users to interactively authenticate via the web to get new tokens every few hours would be more secure and convenient than using long-lived service account keys stored on the hard drive.
I have read through [1] but I could not find a definitive answer.
Does anyone know if there is an expiration for these refresh tokens?
Is there a way of controlling and limiting their lifetimes (ideally to hours or minutes)?
What is the best/common practice for this scenario? Using a single service account (and key) per individual user?
[1] https://developers.google.com/identity/protocols/OAuth2#expiration
Note: User Credentials have Refresh Tokens too.
Does anyone know if there is an expiration for these refresh tokens?
Google OAuth Refresh Tokens do not expire. They can be revoked.
Is there a way of controlling and limiting their lifetimes (ideally to hours or minutes)?
You could periodically revoke the refresh token, which will invalidate the associated access and ID tokens. This means that you are now handling the refresh tokens yourself, which adds another security issue to manage. A sketch of the revocation call follows.
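A minimal sketch of revoking a token against Google's OAuth 2.0 revocation endpoint (the token value is whatever refresh or access token you hold):

```python
# Sketch: revoking a Google OAuth refresh (or access) token.
import requests

def revoke(token: str) -> bool:
    resp = requests.post(
        "https://oauth2.googleapis.com/revoke",
        params={"token": token},
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    return resp.status_code == 200  # 200 means the token is now invalid
```

Revoking the token behind gcloud auth application-default login should force the next run to re-authenticate in the browser.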
What is the best/common practice for this scenario? Using a single service account (and key) per individual user?
If you use user credentials (the method where you log in to Google), you will receive SDK warnings, and if you make a lot of API calls, you may be blocked. Google does not want you to use user credentials in place of service account credentials. The verification process for user credentials requires more effort on Google's backend systems. User credentials are assumed to be created in an insecure environment (web browsers), whereas service account credentials are assumed to be used in a secure environment.
Best practice is to issue a service account JSON key file to each individual application, with only the permissions required for that application to operate. For example, if you create a tool that only needs to read Cloud Storage objects, create a service account with only read permissions. The service account keys should be rotated periodically: new keys downloaded and old keys deleted. Each application should have its own service account JSON key file. I wrote an article on how to securely store JSON key files on Cloud Storage, which helps with rotating keys, as your application just downloads the latest key when needed. (link) My article discusses Google Cloud Run, but the same principles apply.

How does one pass the credentials from the Google Cloud Identity and Access Management system to a Compute Engine VM?

I wish to use the Google Cloud IAM (Identity and Access Management) system for a new Google App Engine project. (Although it's not necessary to know, the front end will be AngularJS and the backend Java.) However, once a user logs into my app using his or her browser and is authenticated via Google Cloud IAM, I need to know whether it's possible to pass this "authenticated credential" to a Google Compute Engine VM. If so, how? The reason I need to pass this "authenticated credential" is that I wish to use gsutil (or similar) functionality on a Compute Engine VM, and I want to use the same username to ensure that the security profile carries through properly. (Specifically, I intend to use gsutil to communicate with Cloud Storage, but from a Windows Server Compute Engine VM.)
I've been reading about Compute Engine VMs and Google Cloud IAM, and they all talk about being able to pass a "service account" token, but there is no reference to how to pass an "authenticated user" credential so that the gsutil command accessing Cloud Storage on the Windows VM could use this authenticated user. (I want to avoid making the user authenticate both for my application and for the gsutil program running within the Compute Engine Windows VM.)
Is this possible? If not, any suggestions/workarounds?
One idea I had, though ugly, is as follows: every time a Windows Compute Engine VM is requested, we would dynamically create a new Google service account with the same permissions as the logged-in, IAM-authenticated user. We would then use this service account within the Windows VM to contact Cloud Storage. This solves the problem of ensuring that the same privileges are carried through, though it creates a slightly different problem: all the logs generated for access to the files will show this dummy service account instead of the real user's name.