I am trying to use Google Storage for my Django application deployed in Compute Engine. To enable this I have to download a credentials.json in my console. I have made a new service account. However I do not see any download json for this credentials. How do you generate the json file?
You can click on the relevant service account in the Cloud Console and create a new key from the KEYS tab by clicking Add Key. Once you create the key, the JSON file will be downloaded.
Check out the documentation for more information if you want to create keys programmatically.
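For example, here is a rough sketch of programmatic key creation through the IAM API with google-api-python-client (the service account email is a placeholder, and the caller needs permission to create keys for that service account):

# Assumes google-api-python-client and Application Default Credentials are available.
import base64
from googleapiclient import discovery

service = discovery.build("iam", "v1")
key = service.projects().serviceAccounts().keys().create(
    name="projects/-/serviceAccounts/my-sa@my-project.iam.gserviceaccount.com",
    body={},
).execute()

# privateKeyData is base64-encoded JSON; decode it to get the usual key file.
with open("credentials.json", "wb") as f:
    f.write(base64.b64decode(key["privateKeyData"]))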
As @Guillaume commented, if you are running your application in a Google Cloud environment, you can rely on Application Default Credentials instead of a key file.
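For illustration, a minimal sketch of what that looks like with the google-cloud-storage client on Compute Engine (bucket and object names are placeholders); no credentials.json is needed because the client falls back to the instance's service account:

from google.cloud import storage

client = storage.Client()  # picks up Application Default Credentials automatically
bucket = client.bucket("my-example-bucket")
blob = bucket.blob("uploads/example.txt")
blob.upload_from_string("hello from Django on Compute Engine")

Just make sure the instance's service account has the necessary Cloud Storage permissions and access scopes.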
I created a new project in the Google Cloud Console (e.g. my-project-abc12345, just an example name).
Then I tried to create an agent in Google Dialogflow Console, under the created project.
But the project my-project-abc12345 that I created is not listed there. Does anybody know a possible reason for this?
Actually, I need to create the project first, since I need proper naming for my project_id, and then use it at agent creation time in Dialogflow.
The same scenario works fine with my personal Google account. For this one, I am using my company's Google account.
Your ideas/comments/answers on this are really appreciated.
You can try the steps below to resolve your issue:
First, make sure that you are using the same account in GCP and in the Dialogflow ES console, and that the account is listed as the owner of that project.
Try refreshing the browser page with the Dialogflow ES console if the project was created after you opened the Create a New Agent option in Dialogflow ES.
You can also create an agent via the API (see the sketch below) and then open it in the console by pasting the project ID into the URL of an existing agent in the browser address bar. This step is not required if you can see the new agent in the agent selector after refreshing the Dialogflow ES console browser page.
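If you go the API route, a rough sketch with the google-cloud-dialogflow Python client could look like this (project ID, display name and time zone are placeholders, and the exact fields may vary by client library version):

from google.cloud import dialogflow_v2 as dialogflow

client = dialogflow.AgentsClient()
agent = dialogflow.Agent(
    parent="projects/my-project-abc12345",
    display_name="my-agent",
    default_language_code="en",
    time_zone="America/New_York",
)
client.set_agent(request={"agent": agent})

Then paste the project ID into the URL of an existing agent as described above, or refresh the agent selector.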
We need to create GCP resources with Terraform, but we are stuck at the terraform init stage while Terraform tries to authenticate to GCP. We have already configured our backend and obtained our service account key, but minifying the credentials JSON (removing the line breaks) and exporting it to GOOGLE_CREDENTIALS doesn't work. How are you setting this value?
If you are in a local, controlled environment you can set GOOGLE_APPLICATION_CREDENTIALS to the path of the JSON key file. But as discussed, key files are a bad practice security-wise. An alternative is to authenticate using gcloud auth application-default login, so you don't have to deal with key files at all.
Another alternative is to use Google Cloud Shell which is already setup with the credentials of the authorised user opening the session.
Finally, for automated pipelines you can use Google Cloud Build, where processes run with the authentication and authorisation of the service account used by Cloud Build.
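If you do end up sticking with a key file (the GOOGLE_CREDENTIALS route from the question), note that the Terraform Google provider accepts either a path to the key file or its full JSON contents. A small sketch for producing the single-line JSON value, e.g. for a CI variable (the file name is a placeholder):

import json

# Re-serialize the key file as single-line JSON suitable for GOOGLE_CREDENTIALS.
with open("terraform-sa-key.json") as f:
    print(json.dumps(json.load(f)))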
I know this question is probably a bit vague. I was trying to run one of the examples from the Google NLP library in Google Cloud Shell.
I have zero experience with APIs, JSON, or Node.js... I don't understand what they are or how to use them.
Please help
Here is the snapshot of the error:
The error message means that you are using user credentials instead of service account credentials.
When you connect to Google Cloud Shell, you are using your Google Accounts User Credentials. Those credentials are the ones that you used to log in to the Google Cloud Console. When you run an application in Google Cloud Shell, your application is using those credentials unless you explicitly specify different credentials.
The solution is to create a service account in the Google Cloud Console. Then, in your program, use that service account as the credentials for your application.
Google Cloud Service Accounts
When you do not specify the application credentials, the Google Client libraries use a method to locate credentials called ADC (Application Default Credentials). I wrote an article that might help you understand ADC:
Google Cloud Application Default Credentials
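If you want to see which credentials ADC is currently resolving to, a quick diagnostic sketch with the google-auth library (not something your application itself needs):

import google.auth

# Prints the credential class ADC found and the detected project, if any.
credentials, project = google.auth.default()
print(type(credentials).__name__, project)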
The simplest method for you is to set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the full path of the service account key before running your application. Change the path below to point to where the service account key is stored on Cloud Shell. You will need to first create the service account, download its key, and then upload the key to Cloud Shell.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/service-account.json"
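Once that variable is set, the client libraries pick the key up automatically through ADC. For example, a minimal sketch with the google-cloud-language Python client (the sample text is arbitrary; the example you ran may be the Node.js equivalent of this):

from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="Google Cloud Natural Language is easy once credentials are set up.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)
sentiment = client.analyze_sentiment(request={"document": document}).document_sentiment
print("score:", sentiment.score, "magnitude:", sentiment.magnitude)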
Managing files with Cloud Shell
This link will provide more information on how to write applications that use service accounts.
Setting Up Authentication for Server to Server Production Applications
I'm building an App Engine app that requires access to the Google Play Developer API. I've seen in the sample code that it's possible to authenticate using a service account in addition to OAuth.
Is there any chance this could work with the default service account, without having to generate a JSON key? That would make the setup a bit easier.
Edit: to be more explicit, I am asking about not using a JSON key and really using the application default credentials instead.
For App Engine Standard environment:
You can generate a Service Account key file from the default service account. Follow the sample code link you provided, then click on the link shown in the "Getting Started" section; you'll land in the Google Developer Console. If you are logged in with the correct account (you should see your project name at the top), go to Credentials -> Create credentials -> Service Account key. In the service account dropdown list, choose "App Engine Default Service Account", choose JSON as the key type, and you should be good to follow the last instructions on the GitHub page.
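For reference, once the JSON key is downloaded, a rough sketch of using it with the Android Publisher API from Python (file name and package name are placeholders; the GitHub sample may use a different helper or language):

from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "app-engine-default-sa.json",
    scopes=["https://www.googleapis.com/auth/androidpublisher"],
)
service = build("androidpublisher", "v3", credentials=credentials)
reviews = service.reviews().list(packageName="com.example.app").execute()

Note that the service account also has to be granted access in your Google Play Console before the API calls will succeed.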
For App Engine Flexible environment:
The default service account isn't listed in the Service Account page, as explained here. You can't generate a service account key with it. You'd need to use a custom service account.
I want to deploy a node application on a google cloud compute engine micro instance from a source control repo.
As part of this deployment I want to use KMS to store database credentials rather than having them in my source control. To get the credentials from KMS I need to authenticate on the instance with GCLOUD in the first place.
Is it safe to just install the GCloud CLI as part of a startup script and let the default service account handle the authentication? Then use this to pull in the decrypted details and save them to a file?
The docs walk through development examples, but I've not found anything about how this should work in production, especially as I obviously don't want to store the gcloud credentials in source control either.
Yes, this is exactly what we recommend: use the default service account to authenticate to KMS and decrypt a file with the credentials in it. You can store the resulting data in a file, but I usually either pipe it directly to the service that needs it or put it in tmpfs so it's only stored in RAM.
You can check the encrypted credentials file into your source repository, store it in Google Cloud Storage, or keep it elsewhere. (You create the encrypted file using a different account, such as your personal account or another service account, which has encrypt but not decrypt permission on the KMS key.)
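As an illustration, a minimal sketch of the decrypt step with the google-cloud-kms Python client (project, key ring, key and file names are placeholders; in a startup script you could equally use the gcloud kms decrypt command). The instance's default service account needs decrypt permission on that key:

from google.cloud import kms

client = kms.KeyManagementServiceClient()
key_name = client.crypto_key_path("my-project", "global", "app-secrets", "db-credentials")

with open("db-credentials.json.enc", "rb") as f:
    ciphertext = f.read()

plaintext = client.decrypt(request={"name": key_name, "ciphertext": ciphertext}).plaintext
# Hand plaintext directly to the service that needs it, or write it to tmpfs.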
If you use this method, you have a clean line of control:
Your administrative user authentication gates the ability to run code as the trusted service account.
Only that service account can decrypt the credentials.
There is no need to store a secret in cleartext anywhere.
Thank you for using Google Cloud KMS!