I have a Kotlin app that uses a custom service account and needs to query a BigQuery table backed by a Google Spreadsheet. Querying the table requires the "https://www.googleapis.com/auth/drive" access scope, but as I understand it, custom service accounts can't have access scopes attached to them. What's the best way forward? Can I add some set of permissions to the custom SA that would simulate the access scope?
The error I'm getting is:
INFO - com.google.cloud.bigquery.BigQueryException: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)
at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.queryRpc(HttpBigQueryRpc.java:652)
at com.google.cloud.bigquery.BigQueryImpl$35.call(BigQueryImpl.java:1282)
at com.google.cloud.bigquery.BigQueryImpl$35.call(BigQueryImpl.java:1279)
at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:64)
at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:38)
at com.google.cloud.bigquery.BigQueryImpl.queryRpc(BigQueryImpl.java:1278)
at com.google.cloud.bigquery.BigQueryImpl.query(BigQueryImpl.java:1266)
at com..extractor.BigQueryDataExtractor.extractData(BigQueryDataExtractor.kt:90)
at com.extractor.commands..call(.kt:62)
at com..extractor.commands..call(.kt:9)
at picocli.CommandLine.executeUserObject(CommandLine.java:1953)
at picocli.CommandLine.access$1300(CommandLine.java:145)
at picocli.CommandLine$RunLast.executeUserObjectOfLastSubcommandWithSameParent(CommandLine.java:2358)
at picocli.CommandLine$RunLast.handle(CommandLine.java:2352)
at picocli.CommandLine$RunLast.handle(CommandLine.java:2314)
at picocli.CommandLine$AbstractParseResultHandler.execute(CommandLine.java:2179)
at picocli.CommandLine$RunLast.execute(CommandLine.java:2316)
at picocli.CommandLine.execute(CommandLine.java:2078)
at com...extractor..run(.kt:25)
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:791)
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:775)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:345)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1343)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1332)
at com...extractor..main(.kt:38)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:49)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:108)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:58)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:88)
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
POST https://www.googleapis.com/bigquery/v2/projects//queries
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.",
"reason" : "accessDenied"
} ],
"message" : "Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.",
"status" : "PERMISSION_DENIED"
}
EDIT: I've managed to pass extra scopes to the credentials created by Spring Boot, so that solved my problem, but I would still be interested in hearing whether there is a way to solve the problem without using scopes.
You can assign access scopes to the client when creating it. In my case it was a matter of setting a property in the Spring Boot configuration file, as described here:
https://docs.spring.io/spring-cloud-gcp/docs/current/reference/html/#scopes
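For reference, the same thing can be done outside Spring by scoping the credentials yourself when building the client. Here is a minimal sketch, assuming the standard google-cloud-bigquery client and application-default credentials (not necessarily how the Spring property wires it up internally):

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import java.io.IOException;
import java.util.Arrays;

public class ScopedBigQueryClient {
    public static BigQuery create() throws IOException {
        // Load the service account credentials (e.g. via GOOGLE_APPLICATION_CREDENTIALS)
        // and attach the Drive scope on top of the usual BigQuery scope, so queries
        // against Sheets-backed tables can fetch Drive credentials.
        GoogleCredentials credentials = GoogleCredentials.getApplicationDefault()
                .createScoped(Arrays.asList(
                        "https://www.googleapis.com/auth/bigquery",
                        "https://www.googleapis.com/auth/drive"));

        return BigQueryOptions.newBuilder()
                .setCredentials(credentials)
                .build()
                .getService();
    }
}
```

The Spring Boot property described in the linked docs achieves the same result declaratively; the sketch above is just the programmatic equivalent.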
Related
I am getting the following error when sending a marketplaceParticipations request to sellers/v1/marketplaceParticipations via Postman, after following the instructions and examples provided at https://developer-docs.amazon.com/sp-api/docs/connecting-to-the-selling-partner-api:
{
"errors": [
{
"message": "Access to requested resource is denied.",
"code": "Unauthorized",
"details": ""
}
]
}
We have registered a self-authorized app client in Draft status which has an IAM user ARN attached, as described at https://developer-docs.amazon.com/sp-api/docs/registering-your-application.
I've checked the inline and role policies for the IAM ARN. They are exactly as described at https://developer-docs.amazon.com/sp-api/docs/creating-and-configuring-iam-policies-and-entities#step-4-create-an-iam-role.
We are able to successfully request an LWA access token following the docs at https://developer-docs.amazon.com/sp-api/docs/connecting-to-the-selling-partner-api#step-1-request-a-login-with-amazon-access-token.
Please check, in your dev profile at https://sellercentral.amazon.com/, that the roles of the user you are using allow making requests to that endpoint.
As far as I know, getMarketplaceParticipations doesn't need a Restricted Data Token (RDT), so you should be able to solve it by giving the user the correct roles.
I was able to call it using Postman, which is a good way to check that the request is built correctly and that this is not a programming issue.
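To illustrate what a minimally built request looks like outside Postman, here is a sketch using Java's built-in HttpClient. The NA endpoint host and the x-amz-access-token header come from the SP-API docs; depending on how your application is configured you may additionally need AWS Signature Version 4 signing of the request, which this sketch deliberately leaves out:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class MarketplaceParticipationsCheck {
    public static void main(String[] args) throws Exception {
        // LWA access token obtained in step 1 of the SP-API connection guide,
        // passed in here as the first command-line argument.
        String lwaAccessToken = args[0];

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://sellingpartnerapi-na.amazon.com/sellers/v1/marketplaceParticipations"))
                .header("x-amz-access-token", lwaAccessToken)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```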
I have a problem running a job in Google Dataprep.
I set up a connection to an external database on Google Cloud SQL. In BigQuery I imported the database connection, and in Google Dataprep I selected the table to do some operations on it. I created a very simple flow by joining two tables and then ran the job. During the first "tour" I ran a test job and it worked properly, but now I can't get my job working, although the account is an owner. I've also tried with another account to which I gave owner permissions.
The error code I see in the job run logs is:
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Access Denied: Project nameproject: User does not have bigquery.jobs.create permission in project nameprogect.",
"reason" : "accessDenied"
} ],
"message" : "Access Denied: Project nameproject: User does not have bigquery.jobs.create permission in project nameproject.",
"status" : "PERMISSION_DENIED"
}
PS: I'm not working with the API but directly in the Google Dataprep panel.
I checked the Google documentation but didn't find much, besides the fact that to run jobs you have to be the owner of the project.
My user has the BigQuery User role ("Access to run jobs").
Thank you for the help.
Matteo
bigquery.jobs.create is a permission included in the following roles:
BigQuery Admin
BigQuery Job User
BigQuery User
You'll need to assign one of these roles to the account running the tasks to proceed past this error. You can go to IAM & Admin > Roles in the GCP console to see which permissions each role includes, a handy way to work out which level you need to grant.
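If you prefer to grant the role programmatically rather than through the console, a minimal sketch follows, assuming the older google-cloud-resourcemanager Java client; the project ID and account email are placeholders, and roles/bigquery.user or roles/bigquery.admin would work equally well here:

```java
import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.resourcemanager.ResourceManager;
import com.google.cloud.resourcemanager.ResourceManagerOptions;

public class GrantBigQueryJobUser {
    public static void main(String[] args) {
        String projectId = "my-project-id"; // placeholder
        ResourceManager resourceManager = ResourceManagerOptions.getDefaultInstance().getService();

        // Read the current project IAM policy, add the BigQuery Job User role
        // for the account that runs the jobs, and write the policy back.
        Policy policy = resourceManager.getPolicy(projectId);
        Policy updated = policy.toBuilder()
                .addIdentity(Role.of("roles/bigquery.jobUser"), Identity.user("someone@example.com"))
                .build();
        resourceManager.replacePolicy(projectId, updated);
    }
}
```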
The bigquery.jobs.create permission has to be assigned to the compute service account PROJECT_NUMBER-compute@developer.gserviceaccount.com.
Try adding the following roles to that service account:
Storage/Storage Object Viewer
BigQuery/BigQuery User
You can do it by going to the hamburger menu in the Console -> IAM & Admin -> IAM and clicking on the pencil icon at the right side of the service account.
I want to access BigQuery to select data from a table in my Java application. First, I created a service account and granted it the BigQuery Admin role. The service account JSON key was passed as an environment variable, and I used the code below (taken from https://cloud.google.com/docs/authentication/production):
static void authImplicit() {
    // If you don't specify credentials when constructing the client, the client library will
    // look for credentials via the environment variable GOOGLE_APPLICATION_CREDENTIALS.
    Storage storage = StorageOptions.getDefaultInstance().getService();
    System.out.println("Buckets:");
    Page<Bucket> buckets = storage.list();
    for (Bucket bucket : buckets.iterateAll()) {
        System.out.println(bucket.toString());
    }
}
The method returns 401 with this error response:
Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 401 Unauthorized
{
"code" : 401,
"errors" : [ {
"domain" : "global",
"location" : "Authorization",
"locationType" : "header",
"message" : "Login Required.",
"reason" : "required"
} ],
"message" : "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
"status" : "UNAUTHENTICATED"
}
From your description it looks like you are not placing your JSON file at the path specified in your environment variable. This assumes that you have described your project ID correctly in the properties file and have enabled the required APIs in Google Cloud Platform. In order to determine this with more precision, I would need you to share your folder structure and your properties files with us.
I would suggest using the CredentialsProvider interface to solve this issue:
public interface CredentialsProvider {
    Credentials getCredentials() throws IOException;
}
You can find more information on this interface here.
You can also take a look at this quickstart for Spring + BigQuery from the Spring team.
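If relying on GOOGLE_APPLICATION_CREDENTIALS keeps failing, one way to take the environment variable out of the equation is to load the service account key explicitly. A minimal sketch, assuming the standard google-cloud libraries and a local key file (the path is a placeholder):

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import java.io.FileInputStream;
import java.io.IOException;

public class ExplicitCredentialsExample {
    public static BigQuery bigQuery() throws IOException {
        // Read the service account key directly instead of relying on
        // the GOOGLE_APPLICATION_CREDENTIALS environment variable.
        try (FileInputStream keyFile = new FileInputStream("/path/to/service-account.json")) {
            GoogleCredentials credentials = GoogleCredentials.fromStream(keyFile);
            return BigQueryOptions.newBuilder()
                    .setCredentials(credentials)
                    .build()
                    .getService();
        }
    }
}
```

For gax-based clients that expect a CredentialsProvider, the same GoogleCredentials object can be wrapped with FixedCredentialsProvider.create(credentials).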
I am extending the ActiveDirectoryUserStoreManager userstore manager for WSO2 AM 2.1.0, overriding the protected String[] doGetExternalRoleListOfUser method to add roles from an external authorization service (the roles are used for scope authorization).
Everything appears to work locally, but in other environments (deployed on Kubernetes), when requesting a token (authorization code grant) I get the following exception: Error occurred while accessing Java Security Manager Privilege Block
(other grant types are working with no issue)
Error occurred while issuing the access token for Client ID : ddSiloINsMx5fwp08FqqF62hcaaa, User ID null, Scope : [] and Grant Type : authorization_code
ERROR {org.wso2.carbon.identity.oauth2.OAuth2Service} - Error occurred while issuing the access token for Client ID : ddSiloINsMx5fwp08FqqF62hcaaa, User ID null, Scope : [] and Grant Type : authorization_code
java.util.AbstractCollection.addAll(AbstractCollection.java:343)
org.wso2.carbon.apimgt.keymgt.ScopesIssuer.setScopes(ScopesIssuer.java:110)
org.wso2.carbon.apimgt.keymgt.handlers.ExtendedAuthorizationCodeGrantHandler.validateScope(ExtendedAuthorizationCodeGrantHandler.java:48)
org.wso2.carbon.identity.oauth2.token.AccessTokenIssuer.issue(AccessTokenIssuer.java:242)
...
ERROR {org.wso2.carbon.apimgt.keymgt.issuers.RoleBasedScopesIssuer} - Error when getting the tenant's UserStoreManager or when getting roles of user
org.wso2.carbon.user.core.common.AbstractUserStoreManager.callSecure(AbstractUserStoreManager.java:177)
org.wso2.carbon.user.core.common.AbstractUserStoreManager.getRoleListOfUser(AbstractUserStoreManager.java:2586)
org.wso2.carbon.apimgt.keymgt.issuers.RoleBasedScopesIssuer.getScopes(RoleBasedScopesIssuer.java:118)
org.wso2.carbon.apimgt.keymgt.ScopesIssuer.setScopes(ScopesIssuer.java:109)
...
ERROR {org.wso2.carbon.user.core.common.AbstractUserStoreManager} - Error occurred while accessing Java Security Manager Privilege Block
Checking the source code, I see there are secure calls made (callSecure), for which I don't see an immediate reason (though I assume there must be a security reason if someone made so much effort).
The same issue pops up when validating the token (invoking an API requiring a scope).
As it is working locally, at the moment I am unable to provide a working, testable (repeatable) case; as soon as I have one I will update the question.
Using the default AD userstore manager there's no issue whatsoever, but then we don't have the external roles available for authorization.
There was another log entry in wso2carbon.log (though not in the console; logs are available through the Carbon console):
Caused by: java.lang.NullPointerException
at org.wso2.carbon.user.core.ldap.ReadOnlyLDAPUserStoreManager.getLDAPRoleListOfUser(ReadOnlyLDAPUserStoreManager.java:1928)
at org.wso2.carbon.user.core.ldap.ReadOnlyLDAPUserStoreManager.doGetExternalRoleListOfUser(ReadOnlyLDAPUserStoreManager.java:2041)
at com.rd.poa.auth.roleuserstore.ExtRoleUserstore.doGetExternalRoleListOfUser(ExtRoleUserstore.java:162)
at org.wso2.carbon.user.core.common.AbstractUserStoreManager.doGetRoleListOfUser(AbstractUserStoreManager.java:3730)
at org.wso2.carbon.user.core.common.AbstractUserStoreManager.getRoleListOfUser(AbstractUserStoreManager.java:2615)
It seems users were members of groups outside the "GroupSearch" filter. Making the group search base contain all LDAP groups seems to help (so far).
Another needed action was stripping the FEDERATED realm from the username, see: WSO2AM2.1.0-update12 scope roles for federated users.
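For illustration, stripping the federated realm can be as simple as removing the user-store domain prefix before the external role lookup. This is a hypothetical sketch: the helper name and the exact "FEDERATED/" prefix format are assumptions, not taken from the actual userstore code:

```java
// Hypothetical helper: strip the FEDERATED user-store domain prefix so the
// external role lookup sees the plain username. Assumes usernames arrive
// in the form "FEDERATED/jane.doe".
public final class FederatedUsernames {
    private static final String FEDERATED_PREFIX = "FEDERATED/";

    private FederatedUsernames() {
    }

    public static String stripFederatedRealm(String userName) {
        if (userName != null
                && userName.regionMatches(true, 0, FEDERATED_PREFIX, 0, FEDERATED_PREFIX.length())) {
            return userName.substring(FEDERATED_PREFIX.length());
        }
        return userName;
    }
}
```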
I have encrypted text, encrypted using Cloud KMS, and I need to decrypt it from code running in Cloud ML Engine. However, I'm running into the following error:
shaded.com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Request had insufficient authentication scopes.",
"reason" : "forbidden"
} ],
"message" : "Request had insufficient authentication scopes.",
"status" : "PERMISSION_DENIED"
}
at shaded.com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
at shaded.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
at shaded.com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:321)
at shaded.com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1049)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:419)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at shaded.com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
I tried creating the GoogleCredential object with "https://www.googleapis.com/auth/cloudkms" and "https://www.googleapis.com/auth/cloud-all" scopes, but ended up with the same error.
Please let me know if I'm missing something here.
P.S.: I do have a valid GoogleCredential object, as I'm able to print the access token.
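For context, a minimal decrypt call looks roughly like the sketch below, written against the google-cloud-kms client library rather than the shaded api-client visible in the stack trace, so it is illustrative only; the project, location, key ring and key names are placeholders:

```java
import com.google.cloud.kms.v1.CryptoKeyName;
import com.google.cloud.kms.v1.DecryptResponse;
import com.google.cloud.kms.v1.KeyManagementServiceClient;
import com.google.protobuf.ByteString;

public class KmsDecryptExample {
    public static byte[] decrypt(byte[] ciphertext) throws Exception {
        // The client picks up application-default credentials; the call only
        // succeeds if the runtime's credentials carry a scope that covers Cloud KMS.
        try (KeyManagementServiceClient client = KeyManagementServiceClient.create()) {
            CryptoKeyName keyName = CryptoKeyName.of(
                    "my-project", "global", "my-key-ring", "my-key");
            DecryptResponse response = client.decrypt(keyName, ByteString.copyFrom(ciphertext));
            return response.getPlaintext().toByteArray();
        }
    }
}
```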
For now it's not working because we restrict the API scopes on the VMs. We are working on a feature to allow KMS. Stay tuned!
Update: we have pushed the change, so you should be able to access KMS now. Please give it a try.