GCP Python SDK - Enable APIs without gcloud

I have used the documentation here https://cloud.google.com/endpoints/docs/frameworks/enable-api#gcloud to enable new APIs via the gcloud CLI tool. I have a service account with the Owner role on my project. Is it possible, from within the Python SDK, to enable APIs without using gcloud? Or is it possible to call the gcloud CLI tool from the Python SDK?

You can, by using the discovery API client library to call the Service Usage API.
You can also perform an authenticated call directly against the Service Usage API services.enable endpoint.
Let me know if you have issues; a rough example is sketched below.
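A hedged sketch of the discovery-client approach in Python (this assumes the google-api-python-client package is installed and Application Default Credentials are available; "my-project" and the service name are placeholders):
from googleapiclient import discovery

# Build a client for the Service Usage API using Application Default Credentials.
service_usage = discovery.build("serviceusage", "v1")

# services.enable returns a long-running operation; the body is an empty EnableServiceRequest.
operation = service_usage.services().enable(
    name="projects/my-project/services/analytics.googleapis.com",
    body={},
).execute()
print(operation)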

According to the official documentation:
Enabling APIs
You can enable Cloud APIs for a project using the console.
You can also enable and disable Cloud APIs using the Cloud SDK and the Service Usage API.
Therefore I would recommend making an HTTP request from Python to this URL: POST https://serviceusage.googleapis.com/v1/{name=*/*/services/*}:enable
This is the curl command I used to enable the analytics.googleapis.com API:
OAUTH_TOKEN=`gcloud auth print-access-token`
curl -X POST -H "Authorization: Bearer $OAUTH_TOKEN" -H "Content-Type: application/json" https://serviceusage.googleapis.com/v1/projects/your-project/services/analytics.googleapis.com:enable
Response body (truncated):
...
"state": "ENABLED",
"parent": "projects/xxxxxxx"
...
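If you want to make the same call from Python rather than curl, here is a rough sketch using the requests and google-auth libraries (it assumes Application Default Credentials are available; "your-project" is a placeholder):
import google.auth
import requests
from google.auth.transport.requests import Request

# Obtain an OAuth 2.0 access token from Application Default Credentials.
credentials, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(Request())

url = "https://serviceusage.googleapis.com/v1/projects/your-project/services/analytics.googleapis.com:enable"
resp = requests.post(
    url,
    headers={
        "Authorization": f"Bearer {credentials.token}",
        "Content-Type": "application/json",
    },
)
print(resp.status_code, resp.json())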

Related

List objects from publicly accessible Amazon S3 bucket

I have one Amazon S3 bucket which is public, with list and get permissions. I want to list its objects in Ruby. We can use the AWS SDK to list objects, but it requires credentials. I want to list objects in Ruby without using credentials. How can I achieve this?
I think you could use the HTTP method directly. Amazon S3 supports making requests to its endpoints using the REST API.
I tried PutObject over plain HTTP and it works; I used the curl command.
But the object owner is anonymous, so I can't remove it.
I am not familiar with Ruby, but I think ListObjects also works without using the SDK.
This is my curl command:
curl --request PUT --upload-file "./myobject" "https://${mybkt}......../myobject"
ListObjectsV2 HTTP API documentation:
https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html
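Not Ruby, but here is a minimal sketch of the same unsigned-request idea in Python (the bucket name and region are placeholders); the equivalent plain HTTP GET can be issued from Ruby's Net::HTTP without the AWS SDK:
import requests
import xml.etree.ElementTree as ET

bucket = "mybkt"       # placeholder: your public bucket
region = "us-east-1"   # placeholder: the bucket's region
url = f"https://{bucket}.s3.{region}.amazonaws.com/?list-type=2"  # ListObjectsV2

resp = requests.get(url)  # no credentials and no request signing
resp.raise_for_status()

# The response is XML; extract the object keys.
ns = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}
root = ET.fromstring(resp.text)
for contents in root.findall("s3:Contents", ns):
    print(contents.find("s3:Key", ns).text)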
To use the AWS SDK in any language, you need to create a service client in that language; Ruby is no different from .NET, Python, Java, etc.
To make an AWS service call from a service client (as opposed to the CLI, the REST API, etc.), you must specify credentials. More information can be found in the AWS SDK for Ruby Developer Guide:
Configuring the AWS SDK for Ruby
Not all AWS SDKs expose options to make unsigned API requests. There is no option in the Ruby SDK.
You might want to comment on, or re-open, this previously-closed feature request in the Ruby SDK.

Access GCP cloud Storage bucket from docker container without using gcloud utility

I have a Dockerfile that needs to access a GCP bucket. I do not want to authenticate to GCP using a service account and the gcloud utility (gcloud auth activate-service-account <<gcp account>> --key-file=<<serviceaccount>>.json) because of security policy violations.
I want to use a different, security-compliant authentication approach.
Could anyone please help with this?
You can try direct API authentication to access the GCP bucket instead of using gsutil or client library authentication.
To make requests using OAuth 2.0 to either the Cloud Storage XML API or JSON API, include your application's access token in the Authorization header in every request that requires authentication. You can generate an access token from the OAuth 2.0 Playground.
Authorization: Bearer OAUTH2_TOKEN
Use the list method of the Objects resource.
GET /storage/v1/b/example-bucket/o HTTP/1.1
Host: www.googleapis.com
Authorization: Bearer ya29.AHES…….gSwg
To authorize requests from the command line or for testing, you can use the curl command with the following syntax:
curl -H "Authorization: Bearer OAUTH2_TOKEN "https://storage.googleapis.com/storage/v1/b/BUCKET_NAME/o"
For local testing, you can use the ‘gcloud auth application-default print-access-token’ command to generate a token.
Due to the complexity of managing and refreshing access tokens and the security risk when dealing directly with cryptographic applications, we strongly encourage you to use a verified client library.
See Cloud Storage authentication for additional information.
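If you would rather make the same JSON API call from Python inside the container, a minimal sketch using the google-auth library looks like this (it assumes Application Default Credentials are available in the container, e.g. via Workload Identity, and "example-bucket" is a placeholder):
import google.auth
from google.auth.transport.requests import AuthorizedSession

# AuthorizedSession attaches and refreshes the OAuth 2.0 token for you.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/devstorage.read_only"]
)
session = AuthorizedSession(credentials)

resp = session.get("https://storage.googleapis.com/storage/v1/b/example-bucket/o")
resp.raise_for_status()
for obj in resp.json().get("items", []):
    print(obj["name"])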

How do I allow a team member to use my Google Cloud Speech-to-Text API account?

I enabled the Google Cloud Speech-to-Text API, but I would like to allow a team member to use it on my account.
I went into IAM to add a new user, but I don't see any roles related to Cloud Speech-to-Text API. What IAM role(s) do I need to select to allow the new team member access to the API?
The Speech-to-Text API is a special (old?) API at Google and doesn't require a specific IAM role. The API URL also doesn't include a project definition. So, you need an account linked to the project to be able to reach the Speech-to-Text API.
For this, a service account is the account to use. So, the users need to use a service account to reach the API. To avoid generating a service account key file (a potential source of security issues), prefer impersonation.
With the gcloud CLI you can do this to generate a valid access token on behalf of the service account:
gcloud auth print-access-token --impersonate-service-account=<the service account to impersonate>
So, in your API call from your computer (I mean with your own user credentials) you can do something like this:
curl -d @inputdata -H "content-type: application/json" \
-H "Authorization: Bearer $(gcloud auth print-access-token --impersonate-service-account=<the service account to impersonate>)" \
https://speech.googleapis.com/v1/speech:recognize
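The same impersonation flow can be done from Python with the google-auth library; here is a rough sketch (the service account email, the audio file, and the language code are placeholders/assumptions):
import base64
import google.auth
from google.auth import impersonated_credentials
from google.auth.transport.requests import AuthorizedSession

# Your own user credentials (Application Default Credentials) are the source.
source_credentials, _ = google.auth.default()

# Impersonate the service account instead of downloading a key file.
target_credentials = impersonated_credentials.Credentials(
    source_credentials=source_credentials,
    target_principal="speech-caller@my-project.iam.gserviceaccount.com",  # placeholder
    target_scopes=["https://www.googleapis.com/auth/cloud-platform"],
)

session = AuthorizedSession(target_credentials)
with open("audio.wav", "rb") as f:  # placeholder input file
    audio_b64 = base64.b64encode(f.read()).decode("utf-8")

body = {
    "config": {"languageCode": "en-US"},
    "audio": {"content": audio_b64},
}
resp = session.post("https://speech.googleapis.com/v1/speech:recognize", json=body)
print(resp.json())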

Enable Dialogflow API on Google Cloud Project using Service Accounts

I'm trying to create new Agents in different projects using Service Accounts (I don't know any alternative to Service Accounts), but, as I suspected, because Service Accounts are specific to a project, I cannot enable the corresponding Dialogflow API using any kind of REST API or anything similar. Any help is kindly appreciated.
Besides using the Cloud Console, you can enable Cloud APIs using the Cloud SDK or the Service Usage API.
Using the gcloud command:
gcloud services enable dialogflow.googleapis.com
Using the REST API services.enable method:
POST https://serviceusage.googleapis.com/v1/projects/12345678901/services/dialogflow.googleapis.com:enable
The service account used requires the serviceusage.services.enable permission, which you can include in a custom role and assign to that service account.
The following is the link for the console method:
https://console.cloud.google.com/flows/enableapi?project=[YOUR_PROJECT]&apiid=dialogflow.googleapis.com
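If you need to do this programmatically with a service account (no gcloud), a hedged Python sketch using the google-auth library looks like this (the key file path and project number are placeholders, and the service account must already hold the serviceusage.services.enable permission):
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

# Load the service account credentials (placeholder key file path).
credentials = service_account.Credentials.from_service_account_file(
    "sa-key.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"],
)
session = AuthorizedSession(credentials)

resp = session.post(
    "https://serviceusage.googleapis.com/v1/projects/12345678901/services/dialogflow.googleapis.com:enable"
)
resp.raise_for_status()
print(resp.json())  # a long-running operation describing the enable request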

Dataproc : Submit a Spark Job through REST API

We are using Google Cloud Platform for big-data analytics. For processing, we are currently using Google Cloud Dataproc and Spark Streaming.
I want to submit a Spark job using the REST API, but when I call the URI with the API key, I get the error below:
{
"error": {
"code": 403,
"message": "The caller does not have permission",
"status": "PERMISSION_DENIED"
}
}
URI: https://dataproc.googleapis.com/v1/projects/orion-0010/regions/us-central1-f/clusters/spark-recon-1?key=AIzaSyA8C2lF9kT*************SGxAipL0
I created the API key from the Google console > API Manager.
While API keys can be used for associating calls with a developer project, it's not actually used for authorization. Dataproc's REST API, like most other billable REST APIs within Google Cloud Platform, uses oauth2 for authentication and authorization. If you want to call the API programmatically, you'll likely want to use one of the client libraries such as the Java SDK for Dataproc which provides convenience wrappers around the low-level JSON protocols, as well as giving you handy thick libraries for using oauth2 credentials.
You can also experiment with the direct REST API using Google's API explorer where you'll need to click the button on the top right that says "Authorize requests using OAuth 2.0".
I also noticed you used us-central1-f under the regions/ path for the Dataproc URI; note that Dataproc's regions don't map one-to-one with Compute Engine zones or regions; rather, Dataproc's regions will each contain multiple Compute Engine zones or regions. Currently there is only one Dataproc region available publicly, which is called global and is capable of deploying clusters into all Compute Engine zones.
For an easy illustration of using an oauth2 access token, you can simply use curl along with gcloud if you have the gcloud CLI installed:
PROJECT='<YOUR PROJECT HERE>'
ACCESS_TOKEN=$(gcloud beta auth application-default print-access-token)
curl \
--header "Authorization: Bearer ${ACCESS_TOKEN}" \
--header "Content-Type: application/json" \
https://dataproc.googleapis.com/v1/projects/${PROJECT}/regions/global/clusters
Keep in mind that the ACCESS_TOKEN printed by gcloud here by nature expires (typically after about an hour); the key concept is that the token you pass along in HTTP headers for each request will generally be a "short-lived" token, and by design you'll have code which separately fetches new tokens whenever the access tokens expire using a "refresh token"; this helps protect against accidentally compromising long-lived credentials. This "refresh" flow is part of what the thick auth libraries handle under the hood.
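For completeness, here is a small Python sketch of the same authenticated call using the google-auth library, which handles the short-lived access token and the refresh flow described above ("my-project" is a placeholder used if no default project is configured):
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Application Default Credentials; AuthorizedSession refreshes tokens as needed.
credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

resp = session.get(
    f"https://dataproc.googleapis.com/v1/projects/{project or 'my-project'}/regions/global/clusters"
)
resp.raise_for_status()
print(resp.json())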