I have a Cloud Function that interacts with Cloud Storage and BigQuery, and they all belong to the same project. The usual way I have deployed the Cloud Function from the command line is this:
$ gcloud functions deploy my_function ... --set-env-vars GOOGLE_APPLICATION_CREDENTIALS=my_project_credentials.json
Where my_project_credentials.json is a JSON key file containing the service account and key that allow access to Cloud Storage and BigQuery.
As this is the way I have always done it, what I need is another way to avoid this JSON credentials file altogether (since these interacting services belong to the same Google Cloud project anyway). Is there such a way? I am a bit new to Google Cloud, so I am not familiar with the ins and outs of IAM.
(An additional reason I need this is that I have a client who is not comfortable with me as a developer having access to that JSON key, and who also doesn't want that JSON key deployed alongside the Function code. Kindly provide some details on how to do this in IAM, particularly for BigQuery and Cloud Storage, as I don't have control over IAM either.)
When you can, and at least when your application runs on GCP, you shouldn't use a service account key file, for two reasons:
It's a simple file for authentication: you can easily copy it, send it by email, and even commit it to your code repository, maybe a public one!!
It's a secret, so you have to store it securely and rotate it frequently (Google recommends at least every 90 days). That's hard to manage: do you really want to redeploy your function every 90 days with a new key file?
So, my peers Gabe and Kolban are right. Use the function's identity:
Either you specify the service account email when deploying the function (as shown below),
Or the default service account is used (the Compute Engine default service account, which has the Editor role by default; not really safe, so prefer the first solution).
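For example, a deploy command along these lines (the service account and project names here are illustrative, not from the original post):
$ gcloud functions deploy my_function ... --service-account my-function-sa@my-project.iam.gserviceaccount.com
The service account then needs the appropriate roles granted in IAM, for example roles/storage.objectViewer and roles/bigquery.user.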
In your code, use the default credentials mechanism (the exact name changes slightly from language to language, but the meaning is the same). If you look into the source code, you will see that the library performs this lookup:
Check whether the GOOGLE_APPLICATION_CREDENTIALS env var exists. If so, use the key file it points to.
Check whether the "well-known file" exists. When you perform a gcloud auth application-default login, the credentials are stored locally in a place that depends on the OS, and the library looks for them there.
Check whether the metadata server exists. The documentation describes Compute Engine, but other environments follow the same principle.
There is no "magic" here. The metadata server knows the identity of the function and can generate access and identity tokens on demand. The libraries call it when your code runs on GCP. That's why you never need a service account key file: the metadata server is there to serve you this information!
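As a minimal Node.js sketch of what this means in practice (the bucket name is illustrative), no key file is referenced anywhere; the clients resolve the function's identity through the lookup above:

const {Storage} = require('@google-cloud/storage');
const {BigQuery} = require('@google-cloud/bigquery');

// No credentials passed: both clients use Application Default Credentials,
// which resolve to the function's identity when running on GCP.
const storage = new Storage();
const bigquery = new BigQuery();

exports.myFunction = async (req, res) => {
  const [files] = await storage.bucket('my-bucket').getFiles();
  const [datasets] = await bigquery.getDatasets();
  res.send(`${files.length} files, ${datasets.length} datasets`);
};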
What Kolban said. When you deploy your Cloud Function you can define a service account for it to use, and then any API calls that use Application Default Credentials will automatically use that service account, without the need for the service account key (the JSON file). Check out the docs here:
https://cloud.google.com/docs/authentication/production#auth-cloud-implicit-nodejs
I'm fairly new to GCP Cloud Functions.
I'm developing a cloud function within a GCP project which needs to access some other resources from the project (such as GCS, for instance). When I set up a cloud function, it gets a service account associated with it, so I'm able to give this service account the required permissions in IAM, and it works just fine in production.
I'm handling the required integrations using the GCP SDKs and addressing the resources relative to the GCP project. For instance, if I need to access a GCS bucket within that project, it looks something like this:
const storage = new (require('@google-cloud/storage').Storage)(); // uses Application Default Credentials
const bucket = storage.bucket("bucket-name"); // bucket() is synchronous, so no await is needed
The problem is that I'm not able to access these resources when running the cloud function locally during development, so I have to deploy it every time I want to test it, which takes some time and makes development fairly unproductive.
So, is there any way I can run this cloud function locally whilst keeping access to the necessary project resources, so that I'm able to test it while developing? I figured that running this function as its service account could work, but I don't know how to do it, and I'm also open to different approaches.
Yes, there is!
The only thing you need to do is set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of a service account JSON key file, and the googleapis libraries handle the rest automatically, most of the time.
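For example, assuming you have downloaded a key for the function's service account (the path and target name here are illustrative):
$ export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
$ npx @google-cloud/functions-framework --target=myFunction
The Functions Framework runs the function locally, and the client libraries pick up the key file from the environment variable.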
I have googled quite heavily over the last couple of hours to see if I could use Google Secret Manager from an external service like AWS Lambda, or from my local PC. I could not find anything helpful, or anything that properly describes the steps to do so.
I do not want to play with the APIs and end up doing the authentication via OAuth myself; I wish to use the client library. How would I go about doing so?
I have so far referred to the following links:
https://cloud.google.com/secret-manager/docs/configuring-secret-manager - Describes setting up Secret Manager, and prompts you to set up the Google Cloud SDK.
https://cloud.google.com/sdk/docs/initializing - Describes setting up the Cloud SDK (it doesn't seem like I get some kind of config file that helps me point my client library to the correct GCP project).
The issue I have is that I don't seem to get access to any form of credential that I can use with the client library that consumes the Secret Manager service of a particular GCP project: something like a service account token, or a means of authenticating and consuming the service from an external environment.
Any help is appreciated; it just feels like I'm missing something. Or is it simply impossible to do?
PS: Why am I using GCP Secret Manager when AWS offers a similar service? The latter is too expensive.
I think that your question applies to all GCP services; there isn't anything specific to Secret Manager here.
As you mentioned, https://cloud.google.com/docs/authentication/getting-started documents how to create and use a service account. But this approach has the downside that you now need to figure out where to store the service account key (yet another secret!).
If you're planning to access GCP Secret Manager from AWS, you can consider using https://cloud.google.com/iam/docs/configuring-workload-identity-federation#aws, which uses identity federation to map an AWS IAM role to a GCP service account, without the need to store an extra secret somewhere.
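Once credentials are in place (a key file locally, or workload identity federation from AWS), the client library call itself is the same everywhere. A minimal Node.js sketch, with the project and secret names being illustrative:

const {SecretManagerServiceClient} = require('@google-cloud/secret-manager');

async function getSecret() {
  // The client resolves credentials via Application Default Credentials.
  const client = new SecretManagerServiceClient();
  const [version] = await client.accessSecretVersion({
    name: 'projects/my-project/secrets/my-secret/versions/latest',
  });
  return version.payload.data.toString('utf8');
}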
Is it possible to do local development without using a Google service account key in Minikube?
Currently I have a service account key which I use to authenticate to Google services inside the pod in Minikube.
I want to avoid using a service account key and instead use an IAM role.
IAM user - XXX#XX.com
This IAM user has been given the required roles/permissions.
After using gcloud auth login, I can access the Google services. Is it possible to do this in a similar way in k8s/Minikube pods?
I think that you can cheat. Use this only for a development environment, never in production.
First, locate your own user credentials, created with gcloud auth application-default login. The created file is:
on Linux: ~/.config/gcloud/application_default_credentials.json
on Windows: %appdata%\gcloud\application_default_credentials.json
Then mount this file (or copy it) into Minikube.
Define your GOOGLE_APPLICATION_CREDENTIALS env var to reference this file.
That's all; your credentials will be used! Be careful, there are some limitations; I wrote an article on this.
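A minimal sketch of the mount via a Kubernetes secret (the secret name, container name, image, and mount path are all illustrative):

$ kubectl create secret generic gcloud-adc \
    --from-file=application_default_credentials.json=$HOME/.config/gcloud/application_default_credentials.json

and in the pod spec:

  containers:
  - name: my-app
    image: my-image
    env:
    - name: GOOGLE_APPLICATION_CREDENTIALS
      value: /var/secrets/google/application_default_credentials.json
    volumeMounts:
    - name: gcloud-adc
      mountPath: /var/secrets/google
      readOnly: true
  volumes:
  - name: gcloud-adc
    secret:
      secretName: gcloud-adc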
I think you are mixing things up. You can never use a key instead of a role. In most cases, you need both of them. You need a key to authenticate to the Google Cloud Platform. And you need a certain IAM role to access services within GCP. Authentication means confirming your own identity, whereas authorization means being allowed access to the system.
In your specific case, I think you are referring to the process of letting your application/code use your own identity to authenticate to the Google Cloud Platform. There are two possibilities here:
Download a service account key file, which is prone to security leaks, because those key files are not rotated automatically.
As @guillaume blaquiere explains below, you could also generate a credentials file using your own identity. The specifics are well explained here and here. For local development, this is preferred over the other option.
If you want to know how your SDK works with key files, I would recommend you take a look inside the SDK for the programming language you are using. There will always be a reference to GOOGLE_APPLICATION_CREDENTIALS: this is the location of the key file you are using.
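To see that lookup in action, a minimal Node.js sketch using google-auth-library (the library underneath the Google Cloud clients):

const {GoogleAuth} = require('google-auth-library');

async function whoAmI() {
  // GoogleAuth checks GOOGLE_APPLICATION_CREDENTIALS first, then the
  // well-known gcloud file, then the metadata server.
  const auth = new GoogleAuth({
    scopes: 'https://www.googleapis.com/auth/cloud-platform',
  });
  const credentials = await auth.getCredentials();
  console.log(credentials.client_email); // service account email, when one is in use
}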
I have a CLI tool that interacts with Google KMS. In order for it to work, I fetch the user credentials as a JSON file which is stored on disk. Now a new requirement came along. I need to make a web app out of this CLI tool. The web app will be protected via Google Cloud IAP. Question is, how do I run the CLI tool on behalf of the authenticated user?
You don't. Better to use a service account and assign the required role. That service account could still have domain-wide delegation of rights (being able to impersonate any known user).
Running CLI tools from a web application could, and probably should, also be avoided. It might be better to convert the CLI tool into a Cloud Function and then call it via an HTTP trigger from within the web application (so that access to the service account is limited as far as possible).
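For instance, a minimal sketch of such a function in Node.js (the KMS operation and request parameters are illustrative; the client authenticates as the function's own service account, so no key file is involved):

const {KeyManagementServiceClient} = require('@google-cloud/kms');

// Authenticates as the function's service account via ADC.
const client = new KeyManagementServiceClient();

exports.kmsTool = async (req, res) => {
  // Hypothetical operation: list the key rings in a given project/location.
  const parent = client.locationPath(req.query.project, req.query.location);
  const [keyRings] = await client.listKeyRings({parent});
  res.json(keyRings.map((kr) => kr.name));
};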
This might also be something to reconsider, security-wise:
I fetch the user credentials as a JSON file which is stored on disk.
Even if it might have been required before, with a service account it wouldn't be.
I'm deploying a Cloud Function which interacts with Google Drive. As far as I can tell, there's no way to use the implicit Cloud Function credentials or related environment variables to authorize with Drive, so I've had to resort either to including the credentials.json service account key file in the Cloud Function bundle, or to setting an environment variable from that same key file during deployment of the Cloud Function.
It'd seem to me that you would be able to use the Cloud Function's implicit service account and credentials in order to achieve this.
You can specify an identity service account on a Cloud Function. That way, instead of using the default Compute Engine service account, you can use the one you want.
Then you can share a document, or a shared drive, with the service account's email address. That way, the function's service account will have access to this Drive element.
However, that's not enough. When you perform the request to the Google Drive API, you have to provide an authentication token in the header of the request. You can query the Cloud Function's metadata server to get this token and then reuse it in your Google Drive API call.
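A minimal sketch of that token exchange in Node.js (assumes Node 18+ for the built-in fetch; the Drive scope and file ID are illustrative, and the scopes query parameter follows the Compute Engine metadata-server conventions):

async function getDriveFile(fileId) {
  // Ask the metadata server for an access token with a Drive scope.
  const tokenRes = await fetch(
    'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/token' +
      '?scopes=https://www.googleapis.com/auth/drive.readonly',
    {headers: {'Metadata-Flavor': 'Google'}},
  );
  const {access_token} = await tokenRes.json();

  // Reuse the token in the header of the Drive API call.
  const driveRes = await fetch(
    `https://www.googleapis.com/drive/v3/files/${fileId}`,
    {headers: {Authorization: `Bearer ${access_token}`}},
  );
  return driveRes.json();
}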