gcloud ml language Request had insufficient authentication scopes - google-cloud-platform

For a relatively small academic research project, I am trying to use Google Cloud Natural Language API.
From what I understood on the Authentication Overview, it looks like an API key would be the best and simplest approach to authentication, rather than a service account or user account.
Creating the key was easy enough. But now I am stuck on how to actually use it in conjunction with gcloud commands on an Ubuntu VM instance on Google Cloud Compute Engine.
When I try to run the simple example on the Natural Language Quickstart Guide, I get this error:
gcloud ml language analyze-entities --content="Michelangelo Caravaggio, Italian painter, is known for 'The Calling of Saint Matthew'."
ERROR: (gcloud.ml.language.analyze-entities) PERMISSION_DENIED:
Request had insufficient authentication scopes.
The documentation and Q&A I've seen about this error deal with service accounts or user accounts, but I am trying to use just the "simple" API key.
The documentation for Using an API key shows how to do so via REST. But for now, as a "quick" test to see whether I have the Natural Language API working, I just want to run a simple test with gcloud on the command line. I looked through the gcloud documentation, but could not find anything about specifying an API key string.
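For reference, the REST call I'm trying to avoid for now looks roughly like this, based on the Natural Language docs (API_KEY stands in for my key):
curl -s -X POST "https://language.googleapis.com/v1/documents:analyzeEntities?key=API_KEY" \
  -H "Content-Type: application/json" \
  --data '{"document": {"type": "PLAIN_TEXT", "content": "Michelangelo Caravaggio, Italian painter, is known for The Calling of Saint Matthew."}}'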
How can I run the above command with gcloud and authenticate with my API key?
If this API key turns out to be more of a hassle, I may consider switching to a service account.
Any help would be greatly appreciated...

Got this to work by:
From Google Cloud console:
Compute Engine -> VM instances
Click the name of the existing VM, which brings up the VM instance details page. Click the "Edit" link near the top of the page.
Then modify Cloud API access scopes to allow full access to all Cloud APIs.
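If you prefer the command line, roughly the same change can be made with gcloud (a sketch; the instance and zone names are placeholders, and access scopes can only be changed while the VM is stopped):
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-service-account my-vm --zone=us-central1-a --scopes=cloud-platform
gcloud compute instances start my-vm --zone=us-central1-a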

If you are using a GCE VM, the easiest way to authenticate to the Cloud APIs is to use the VM's service account. When you create the VM, you can specify which scopes to authorize for the service account. The simplest solution is to provision a VM with the cloud-platform scope. Using gcloud:
gcloud --project=$PROJECT compute instances create $VM --zone=$ZONE --machine-type=$MACHINE --scopes=cloud-platform
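Once the VM is up with that scope, a quick sanity check from inside it of which account and scopes the instance actually has (a sketch; the URL is the standard Compute Engine metadata server endpoint):
# show the service account gcloud is using on the VM
gcloud auth list
# list the access scopes granted to the instance's default service account
curl -s -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"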

Related

Authenticating gcloud sdk with workload identity federation

I am trying to authenticate a service account with the gcloud auth login command using workload identity federation, based on what's mentioned in this official tutorial. The tutorial says both service account keys and workload identity federation work for my use case, with WIF being the preferred route forward, using a credential configuration file. But I am quite confused trying to generate that file for my use case, as doing so requires me to create a workload identity provider, which is categorized into one of the following types:
AWS, Azure, OIDC, SAML. I just want to use WIF to authenticate the gcloud SDK from my terminal, so I am not sure which category I should use.
Is this a possible use case, or should I resort to using service account keys?
Workload Identity Federation (WIF) is used in multi-cloud and hybrid cloud environments where you need access to one cloud platform from another cloud platform or from a data center, because the services are spread across multiple platforms and need to coordinate to run your application.
There are multiple ways to connect other cloud providers with GCP: you can use WIF to connect with Amazon Web Services (AWS), or you can use OpenID Connect (OIDC) or SAML 2.0 to connect with other cloud providers, such as Microsoft Azure. Refer to the source for more information. (Source: GCP docs)
Since you are trying to connect the gcloud SDK from your terminal, you can simply use your credential file or the gcloud auth or gcloud init commands to set up the gcloud CLI, and make sure the necessary roles and permissions are enabled for the service or user account you are using for authentication. This is the simplest way to access your GCP environment. JFYI, the Authorize the gcloud CLI documentation (the doc you were referring to) uses a credential file, which is different from WIF, so if you want to authenticate without using an SA (service account) you can simply follow credential-file-based authentication.
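As a rough sketch of that simpler route (the project ID and key-file path are placeholders):
# authenticate gcloud with your own user account
gcloud auth login
gcloud config set project my-project
# or, if you do end up using a service account key file
gcloud auth activate-service-account --key-file=/path/to/key.json
# if client libraries also need credentials, set up Application Default Credentials
gcloud auth application-default login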

In a containerized application that runs in AWS/Azure but needs to access gcloud commands, what is the best way to set up gcloud authentication?

I am very new to GCP and I would greatly appreciate some help here ...
I have a Docker containerized application that runs in AWS/Azure but needs access to GCP both through the gcloud SDK and through the Google Cloud client libraries.
What is the best way to set up gcloud authentication from an application that runs outside of GCP?
In my Dockerfile, I have this (cut short for brevity):
# install the Google Cloud SDK into the image and add it to PATH
ENV CLOUDSDK_INSTALL_DIR /usr/local/gcloud/
RUN curl -sSL https://sdk.cloud.google.com | bash
ENV PATH $PATH:$CLOUDSDK_INSTALL_DIR/google-cloud-sdk/bin
# extra components our deployments need
RUN gcloud components install app-engine-java kubectl
This container is currently provisioned from an Azure App Service and AWS Fargate. When a new container instance is spawned, we would like it to be gcloud-enabled, with a service account already attached, so our application can deploy resources on GCP using Deployment Manager.
I understand gcloud requires us to run gcloud auth login to authenticate to an account. How can we automate the provisioning of our container if this step has to be manual?
Also, from what I understand, for the Cloud client libraries we can store the path to a service account key JSON file in an environment variable (GOOGLE_APPLICATION_CREDENTIALS). So this file either has to be stored inside the Docker image itself or, at the very least, has to be mounted from external storage?
How safe is it to store this service account key file in external storage? What are the best practices around this?
There are two main means of authentication in Google Cloud Platform:
User Accounts: Belong to people; they represent the people involved in your project and are associated with a Google Account
Service Accounts: Used by an application or an instance.
Learn more about their differences here.
Therefore, you are not required to use the command gcloud auth login to perform gcloud commands.
You should be using gcloud auth activate-service-account instead, along with the --key-file=<path-to-key-file> flag, which will allow you to authenticate without needing to sign into a Google Account with access to your project every time you need to call an API.
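A minimal sketch of what that could look like in the container's entrypoint, assuming the key file is mounted into the container at /secrets/key.json (the path and project ID are placeholders):
# authenticate gcloud with the mounted service account key
gcloud auth activate-service-account --key-file=/secrets/key.json
gcloud config set project my-gcp-project
# let the Google Cloud client libraries pick up the same key
export GOOGLE_APPLICATION_CREDENTIALS=/secrets/key.json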
This key should be stored securely, preferably encrypted in the platform of your choice. Learn how to do it in GCP by following these steps as an example.
Take a look at these useful links for storing secrets in Microsoft Azure and AWS.
On the other hand, you can deploy services to GCP programmatically either by using the Cloud Client Libraries with your programming language of choice, or by using Terraform, which is very intuitive if you prefer it over using the Google Cloud SDK through the CLI.
Hope this helped.

How to create API keys in GCP using a service account

I have a service account with Owner permissions on the project (just for testing this out).
Still, I am not able to create API keys using that service account via gcloud. It says "Permission Denied".
I am using the following commands.
1.
gcloud auth activate-service-account <service-account>@<project-id>.iam.gserviceaccount.com --key-file=<key-file>.json
2.
gcloud auth list //Gives the service account name
3.
gcloud alpha services api-keys create --display-name=dummy
The above command works if I authenticate as a normal user with Owner permission, but with the service account it doesn't seem to work. Am I missing something? Please help.
The API Keys API has a strange history. It was released in beta about a year ago and has now gone back to alpha. There is no public documentation (in reality it has been removed), and if you know about this API, you found it on SO or in an old tutorial.
Anyway, just to say that it's not a reliable API, and automating things against it (with calls from a service account) is not a good idea. In addition, some APIs don't allow service account calls and require user credentials instead. That was previously the case with the quota APIs, but they were updated recently (summer 2020).
Finally, Google Cloud doesn't recommend using API keys, for security reasons (we can discuss this more if you want), and thus I don't think it is in its (security and best practice) strategy to promote an API that allows API key automation.

Logging into google compute engine with a service account

I have spent the entire day today reading documentation and questions on Stack Exchange about using a service account to log on to a Compute Engine instance, but have gotten nowhere.
I am new to google cloud, so pardon my knowledge.
We are trying to set up a long-running service on a Google Compute Engine instance. We want the service to run as a system account rather than an individual account, so that troubleshooting privileges are shared across the team rather than tied to specific users. We thought a GCP service account should be able to accomplish this, but we haven't been able to log on to a Compute Engine instance as a service account. We took the following steps to try this out:
Create a service account and give the team serviceAccountUser permissions. Also create an RSA key for the service account, which was distributed to the team.
Use gcloud auth activate-service-account to switch to the service account.
Run gcloud init as the service account and set up the configuration.
Use gcloud compute ssh to connect to the instance.
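Roughly, the commands we ran were (the account, key file, and instance names here are placeholders):
gcloud auth activate-service-account my-svc@my-project.iam.gserviceaccount.com --key-file=/path/to/key.json
gcloud init
gcloud compute ssh my-instance --zone=us-central1-a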
We hoped to be able to log on to the instance as the service account since we switched identities before logging on, but we are not getting the desired effect.
Questions:
Can service accounts actually be used to log on to Compute Engine?
If not, what is the purpose of configuring a service account to run as when creating a VM on GCP?
If not, what is the right way to run a service on a Compute Engine instance using a system account that everybody can have access to?
If yes, what are we missing?
Thanks a lot in advance for clearing up the confusion.
The service account allows the Compute Engine instance to access other Google APIs. For example, the instance might need to access private content from Storage buckets or connect to a Datastore. See https://cloud.google.com/iam/docs/service-accounts
In order to give your team members (SSH) access to a Compute Engine instance, you add them as members to the project by adding their Google accounts. Specify their level of access so they can only list instances and SSH in, but not create or delete. I think you want a new role with the "Compute OS Login" permission. They don't need billing set up either. See https://cloud.google.com/iam/docs/granting-changing-revoking-access
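As a rough example of granting that kind of access (the project ID and user email are placeholders; roles/compute.osLogin is the standard OS Login role):
gcloud projects add-iam-policy-binding my-project --member="user:teammate@example.com" --role="roles/compute.osLogin"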

How does one pass the credentials from Google Cloud Identity and Access Management to a Compute Engine VM?

I wish to use the Google Cloud IAM (Identity and Access Management) system for a new Google App Engine project. (Although it's not necessary to know, the front end will be AngularJS and the backend Java.) However, once the user logs into my app using his or her browser and is then authenticated via Google Cloud IAM, I need to know whether it's possible to pass this "authenticated credential" to a Google Compute Engine VM. If so, how? The reason I need to pass this "authenticated credential" is that I wish to use gsutil (or similar) functionality on a Google Compute Engine VM, and I want to use the same username to ensure that the security profile carries through properly. (Specifically, I intend to use gsutil to communicate with Google Cloud Storage, but I intend to do this from a Windows Server Compute Engine VM.)
I've been reading up on Google Compute Engine VMs and Google Cloud IAM, and the docs all talk about being able to pass the "service account" token, but there is no reference to how to pass an "authenticated user" credential so that the gsutil command accessing Google Cloud Storage on the Windows VM could use this authenticated user. (I want to avoid making the user authenticate both for my application and for the gsutil program running within the Compute Engine Windows VM.)
Is this possible? If not, any suggestions/workarounds?
One idea I had, though ugly, is as follows: every time a Windows Compute Engine VM is requested, we would dynamically create a new Google service account with the same permissions as the logged-in, IAM-authenticated user. Then, we would use this Google service account within the Windows Compute Engine VM to contact Google Cloud Storage. This solves the problem of ensuring that the same privileges are communicated, though it creates a slightly different problem in that all the logs generated for access to the file will use this dummy service account instead of the real user's name.