Google Cloud ML Engine: Unable to access Cloud KMS - google-cloud-platform

I have text encrypted with Cloud KMS, and I need to decrypt it from code running in Cloud ML Engine. However, I'm running into the following error:
shaded.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Request had insufficient authentication scopes.",
"reason" : "forbidden"
} ],
"message" : "Request had insufficient authentication scopes.",
"status" : "PERMISSION_DENIED"
}
at shaded.com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at shaded.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at shaded.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321)
at shaded.com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1049)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
I tried creating the GoogleCredential object with the "https://www.googleapis.com/auth/cloudkms" and "https://www.googleapis.com/auth/cloud-all" scopes, but ended up with the same error.
Please let me know if I'm missing something here.
P.S: I do have a valid GoogleCredential object as I'm able to print the access token.

For now this doesn't work, as we restrict the API scopes on the VMs. We are working on a feature to allow KMS. Stay tuned!
Update: we have pushed the change, so you should be able to access KMS now. Please give it a try.
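Since the asker already holds a working access token, here is a minimal sketch of calling the KMS REST decrypt endpoint directly with that token. It uses only the JDK's HTTP client; the project, location, key-ring, and key names in main are placeholders to replace with your own:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class KmsDecryptSketch {

    // Builds the cloudkms.googleapis.com :decrypt URL for a key.
    static String decryptUrl(String project, String location, String keyRing, String key) {
        return String.format(
            "https://cloudkms.googleapis.com/v1/projects/%s/locations/%s/keyRings/%s/cryptoKeys/%s:decrypt",
            project, location, keyRing, key);
    }

    // The request body is JSON carrying the base64-encoded ciphertext.
    static String requestBody(String base64Ciphertext) {
        return String.format("{\"ciphertext\":\"%s\"}", base64Ciphertext);
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 2) {
            System.err.println("usage: KmsDecryptSketch <access-token> <base64-ciphertext>");
            return;
        }
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(decryptUrl("my-project", "global", "my-ring", "my-key")))
            .header("Authorization", "Bearer " + args[0])
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(requestBody(args[1])))
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        // On success the JSON response carries a base64-encoded "plaintext" field.
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

If the 403 persists after the rollout, resubmitting the ML Engine job may be worth trying so the workers pick up the new scopes.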

Related

How to resolve "Access to requested resource is denied"?

I am getting the following when sending a marketplaceParticipations request to sellers/v1/marketplaceParticipations via Postman, after following the instructions and examples provided at https://developer-docs.amazon.com/sp-api/docs/connecting-to-the-selling-partner-api:
{
"errors": [
{
"message": "Access to requested resource is denied.",
"code": "Unauthorized",
"details": ""
}
]
}
We have registered a self-authorized app client in Draft status, which has an IAM user ARN attached as described at https://developer-docs.amazon.com/sp-api/docs/registering-your-application.
I've checked the inline and role policies for the IAM ARN. They are exactly as described at https://developer-docs.amazon.com/sp-api/docs/creating-and-configuring-iam-policies-and-entities#step-4-create-an-iam-role.
We are able to successfully request an LWA access token following the docs at https://developer-docs.amazon.com/sp-api/docs/connecting-to-the-selling-partner-api#step-1-request-a-login-with-amazon-access-token.
Please check that the roles of the user you are using allow making requests to that endpoint, in your developer profile at https://sellercentral.amazon.com/
As far as I know, getMarketplaceParticipations doesn't need a Restricted Data Token (RDT), so you should be able to solve it by granting the user the correct roles.
I was able to get them using Postman. It is a good way to check that the request is correctly built and that it is not a programming issue.
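For completeness, the request Postman sends boils down to a GET with the LWA token in the x-amz-access-token header. A rough Java sketch follows; the NA endpoint host is an assumption (use your region's endpoint), and depending on the SP-API version you may additionally need to sign the request with AWS SigV4 using the IAM role credentials, which is omitted here:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MarketplaceParticipationsSketch {

    // The SP-API endpoint host differs per region; NA is assumed below.
    static String participationsUrl(String endpointHost) {
        return "https://" + endpointHost + "/sellers/v1/marketplaceParticipations";
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 1) {
            System.err.println("usage: MarketplaceParticipationsSketch <lwa-access-token>");
            return;
        }
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(participationsUrl("sellingpartnerapi-na.amazon.com")))
            .header("x-amz-access-token", args[0]) // the LWA token from step 1
            .GET()
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        // "Access to requested resource is denied" here usually points at the
        // app's roles rather than at the token itself.
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```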

GCP permissions: access scopes and custom IAM service account roles

I have a Kotlin app that uses a custom service account and needs to query a BigQuery table backed by a Google Spreadsheet. Querying the table requires the "https://www.googleapis.com/auth/drive" access scope, but as I understand it, custom service accounts can't have access scopes attached to them. What's the best way forward? Can I add some set of permissions to the custom SA that would simulate the access scope?
The error I'm getting is:
INFO - com.google.cloud.bigquery.BigQueryException: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.queryRpc(HttpBigQueryRpc.java:652)
at com.google.cloud.bigquery.BigQueryImpl$35.call(BigQueryImpl.java:1282)
at com.google.cloud.bigquery.BigQueryImpl$35.call(BigQueryImpl.java:1279)
at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:64)
at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:38)
at com.google.cloud.bigquery.BigQueryImpl.queryRpc(BigQueryImpl.java:1278)
at com.google.cloud.bigquery.BigQueryImpl.query(BigQueryImpl.java:1266)
at com..extractor.BigQueryDataExtractor.extractData(BigQueryDataExtractor.kt:90)
at com.extractor.commands..call(.kt:62)
at com..extractor.commands..call(.kt:9)
at picocli.CommandLine.executeUserObject(CommandLine.java:1953)
at picocli.CommandLine.access$1300(CommandLine.java:145)
at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2358)
at picocli.CommandLine$RunLast.handle(CommandLine.java:2352)
at picocli.CommandLine$RunLast.handle(CommandLine.java:2314)
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2179)
at picocli.CommandLine$RunLast.execute(CommandLine.java:2316)
at picocli.CommandLine.execute(CommandLine.java:2078)
at com...extractor..run(.kt:25)
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:791)
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:775)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:345)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1343)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1332)
at com...extractor..main(.kt:38)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
POST https://www.googleapis.com/bigquery/v2/projects//queries
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.",
"reason" : "accessDenied"
} ],
"message" : "Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.",
"status" : "PERMISSION_DENIED"
}
EDIT: I've managed to pass extra scopes to the credentials created by Spring Boot, which solved my problem, but I would still be interested in hearing whether there is a way to solve this without using scopes.
You can assign access scopes to the client when creating it. In my case it was a matter of setting a property in the Spring Boot configuration file, as described here:
https://docs.spring.io/spring-cloud-gcp/docs/current/reference/html/#scopes
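Concretely, with Spring Cloud GCP the linked reference boils down to one property in the Spring Boot configuration file. The exact scope list below is an assumption for this case: cloud-platform keeps the regular APIs working, while drive adds access to the Spreadsheet-backed table:

```properties
spring.cloud.gcp.credentials.scopes=https://www.googleapis.com/auth/cloud-platform,https://www.googleapis.com/auth/drive
```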

Google recommendations AI deleting catalog items error 403

I am trying to integrate my catalog into Google Recommendations AI, and for debugging purposes I want to be able to delete items from the catalog once imported. The documentation suggests running the following code:
curl -X DELETE \
-H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
"https://retail.googleapis.com/v2/projects/[PROJECT_NUMBER]/locations/global/catalogs/default_catalog/branches/0/products/[PRODUCT_ID]"
which returns the following error:
{
"error": {
"code": 403,
"message": "Your application has authenticated using end user credentials from the Google Cloud SDK or Google Cloud Shell which are not supported by the retail.googleapis.com. We recommend configuring the billing/quota_project setting in gcloud or using a service account through the auth/impersonate_service_account setting. For more information about service accounts and how to use them in your application, see https://cloud.google.com/docs/authentication/.",
"status": "PERMISSION_DENIED",
"details": [
{
"#type": "type.googleapis.com/google.rpc.ErrorInfo",
"reason": "SERVICE_DISABLED",
"domain": "googleapis.com",
"metadata": {
"service": "retail.googleapis.com",
"consumer": "[redacted for privacy]"
}
}
]
}
Running the suggested code with the --impersonate-service-account flag results in the same error as above, but preceded with:
WARNING: Impersonate service account '[name redacted for privacy]' is detected. This command cannot be used to print the access token for an impersonate account. The token below is still the application default credentials' access token.
If I attempt to log in for authorization instead of printing an access token, I get the following error:
{
"error": {
"code": 401,
"message": "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status": "UNAUTHENTICATED"
}
}
The link provided is no longer working. I would like to know how to provide proper authorization/authentication for deleting an item from the recommendation ai catalog.
P.S. The account I am using to do this is the owner of the project, and as such should have all the appropriate permissions.
To set up authentication for your local cURL request to Recommendations AI, you have to create a service account, download its key file, and set the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of that key, so that the local cURL request noted in the original description is authorized.
If you are unable to download a SA key file, I kindly suggest trying a ! before the curl, like this: !curl -X POST. Also, if you are familiar with Python, it's recommended to use the Python client library for Recommendations AI.
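Once GOOGLE_APPLICATION_CREDENTIALS points at the service-account key and you have printed an access token, the same DELETE can also be issued from plain Java if cURL is inconvenient. The token, project number, and product ID are passed in as placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RetailDeleteSketch {

    // Resource path for a product in the default catalog and branch,
    // matching the URL from the question's cURL command.
    static String productUrl(String projectNumber, String productId) {
        return String.format(
            "https://retail.googleapis.com/v2/projects/%s/locations/global/catalogs/default_catalog/branches/0/products/%s",
            projectNumber, productId);
    }

    public static void main(String[] args) throws Exception {
        if (args.length < 3) {
            System.err.println("usage: RetailDeleteSketch <access-token> <project-number> <product-id>");
            return;
        }
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create(productUrl(args[1], args[2])))
            .header("Authorization", "Bearer " + args[0])
            .DELETE()
            .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```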

GCloud auth using a service account to access BigQuery from a Java app not working

I want to access BigQuery to select data from a table in my Java application. First, I created a service account and granted it the BigQuery Admin role. The service account's JSON key was passed as an environment variable, and I used the code below (from https://cloud.google.com/docs/authentication/production):
static void authImplicit() {
// If you don't specify credentials when constructing the client, the client library will
// look for credentials via the environment variable GOOGLE_APPLICATION_CREDENTIALS.
Storage storage = StorageOptions.getDefaultInstance().getService();
System.out.println("Buckets:");
Page<Bucket> buckets = storage.list();
for (Bucket bucket : buckets.iterateAll()) {
System.out.println(bucket.toString());
}
}
The method returns 401 with this error response:
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 401 Unauthorized
{
"code" : 401,
"errors" : [ {
"domain" : "global",
"location" : "Authorization",
"locationType" : "header",
"message" : "Login Required.",
"reason" : "required"
} ],
"message" : "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status" : "UNAUTHENTICATED"
}
From your description it looks like you are not placing your JSON file at the path described in your environment variable. This assumes that you are correctly setting your project ID in the properties file and that you have enabled the required APIs in Google Cloud Platform. To determine this with more precision, I would need you to share your folder structure and your properties files.
I would suggest using the CredentialsProvider interface to solve this issue:
public interface CredentialsProvider {
Credentials getCredentials() throws IOException;
}
You can find more information on this interface here.
You can also take a look at this quickstart for Spring + BigQuery from the Spring team.
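A quick way to rule out the "key file is not where the variable points" case is a small self-check before constructing the client. The field names tested here (client_email, private_key) are the standard service-account key fields:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class CredentialsCheck {

    // Returns a human-readable verdict on whether the given path looks like
    // a usable service-account key file.
    static String diagnose(Path keyPath) throws IOException {
        if (keyPath == null || !Files.isReadable(keyPath)) {
            return "GOOGLE_APPLICATION_CREDENTIALS does not point to a readable file";
        }
        String json = Files.readString(keyPath);
        if (!json.contains("\"client_email\"") || !json.contains("\"private_key\"")) {
            return "file exists but does not look like a service-account key";
        }
        return "key file looks OK";
    }

    public static void main(String[] args) throws IOException {
        String env = System.getenv("GOOGLE_APPLICATION_CREDENTIALS");
        System.out.println(diagnose(env == null ? null : Path.of(env)));
    }
}
```

If this prints the OK message but the 401 remains, the problem is more likely the project ID or API enablement mentioned above.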

AWS API gateway Stage name error handling

Suppose I have created an API with AWS API Gateway at the following URL:
https://0abcgdefg1.execute-api.ap-northeast-1.amazonaws.com/Employee/.
Is it possible to return an error message if the end user tries a nonexistent stage name? For example, for
https://0abcgdefg1.execute-api.ap-northeast-1.amazonaws.com/Employee1/
is it possible to give some error information like the below?
{
"errors": [
{
"message": "Stage name Employee1doesn't exist",
"type": "InvalidStageError"
}
]
}
No, API Gateway does not support returning custom error information when a stage does not exist.
I am also wondering why you need to return this error message to the API user. Can you provide details about your use case?