Cannot update DataStudio resource connection to use service account - google-cloud-platform

I have a DataStudio dashboard which contains a dataset resource connection to a BigQuery table which is currently authenticated using an individual user's account.
I want to change that to use a service account.
In order to do that, I followed this guide, which means I have:
Created a service account
Added the BigQuery Job User role to the service account
Added the Service Account Token Creator role to the service account
Added the BigQuery Data Viewer role to the service account (associated to the correct BigQuery table)
Ensured that my account is added as a Service Account User on the service account
On the DataStudio dashboard, I then navigate to:
Resource > Manage Added Data Sources > Select the specific data source > Click on the blade with the current user account being used to authenticate.
When I try to insert the service account details in the window, I see the error message below.
I've followed the link (which leads back to the original guide). The link says I need to add the Service Account Token Creator role to the service account, which I've confirmed is already in place.
Now I'm stumped! What could be the reason why I cannot authenticate the connection using the service account?

I figured it out.
The problem was that the Service Account Token Creator role was assigned to the service account and not the service agent.
This link describes the process of setting up a service account for authenticating Data Studio data sources. Although it's got all the info there, the devil is in the detail in Step 2: Allow the Looker Studio service agent to access your service account.
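In IAM terms, Step 2 amounts to adding a binding for roles/iam.serviceAccountTokenCreator on the service account itself, with the Looker Studio service agent as the member. A minimal sketch of that policy edit as pure dict manipulation (the agent email shown is an assumption; copy the real one from the guide, and write the result back with serviceAccounts.setIamPolicy):

```python
def add_token_creator_binding(policy: dict, agent_email: str) -> dict:
    """Add a Token Creator binding for the given member to an IAM policy
    dict (the shape returned by serviceAccounts.getIamPolicy)."""
    role = "roles/iam.serviceAccountTokenCreator"
    member = f"serviceAccount:{agent_email}"
    bindings = policy.setdefault("bindings", [])
    for b in bindings:
        if b["role"] == role:
            if member not in b["members"]:
                b["members"].append(member)
            return policy
    bindings.append({"role": role, "members": [member]})
    return policy

# Hypothetical Looker Studio service agent email (check yours in the guide):
agent = "service-org-123456789@gcp-sa-datastudio.iam.gserviceaccount.com"
policy = add_token_creator_binding({}, agent)
```

The key point from the answer: the member in the binding is the service agent, and the resource whose policy is edited is the service account.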

Related

Error when creating a Job in Dataflow (Current user cannot act as service account)

I'm trying to create a job in Dataflow to export the data published to a Pub/Sub topic to a text file. When saving the configuration I get a 'Job creation failed' message specifying 'Current user cannot act as service account ...-compute@developer.gserviceaccount.com', as shown in the attached picture.
Following Google's documentation I added the following roles to my user for this project (in addition to the owner role I already have):
Compute Viewer
Dataflow Admin
Dataflow Developer
Storage Object Admin
Service Account User
However, the Controller Service Account mentioned in the message doesn't seem to exist in this project's list of service accounts (IAM & Admin > Service Accounts). Is there anything I'm missing here?
Other requirements already checked:
I have the Compute Engine API already enabled
As owner I have the iam.serviceAccounts.actAs permission
Your best option is to create a custom service account in IAM and use it to build/run your job. If you're using Cloud Build to deploy and run your template, you'll also need to set your logging location.
More details at the below links:
Using custom service accounts in cloud build
Setting logging location in cloud build YAML
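For reference, the service account in the error message is the Compute Engine default service account, whose email follows a fixed pattern built from the project number (not the project ID), which is why it may not be obvious in the console. A small sketch of the names involved and the binding that lets a user act as a service account (the project number and user email are hypothetical):

```python
def default_compute_sa(project_number: int) -> str:
    """Email of the Compute Engine default service account for a project."""
    return f"{project_number}-compute@developer.gserviceaccount.com"

def act_as_binding(user_email: str) -> dict:
    """IAM binding granting a user the ability to act as (run jobs as)
    a service account; it carries the iam.serviceAccounts.actAs permission."""
    return {"role": "roles/iam.serviceAccountUser",
            "members": [f"user:{user_email}"]}

sa = default_compute_sa(123456789012)  # hypothetical project number
binding = act_as_binding("me@example.com")
```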

In line 14 of my code publish.py, what would be the service_account_path, and how can I download this file?

In the code below, I am getting the error attached. There is also a service_account_path I have to enter: where do I find this path, and how can I download this file?
import os
from google.cloud import pubsub_v1

# Replace with your service account path
path_service_account = 'service1'
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path_service_account
# Replace with your input file path (raw string so '\t' in the
# Windows path is not interpreted as a tab escape)
input_file = r'C:\tumbling window\store_sales.csv'
# create publisher
publisher = pubsub_v1.PublisherClient()
Client libraries make use of service account credentials to authenticate into GCP services and APIs, such as Pub/Sub.
To do this, an environment variable named GOOGLE_APPLICATION_CREDENTIALS needs to be set, and its value is a string with the path of the JSON key file of that service account.
First, you need to make sure that you've created a service account with enough permissions to do the required Pub/Sub operations:
In the Cloud Console, go to the Service accounts page.
Select your project.
Click Create service account.
Enter a service account name to display in the Cloud Console.
Choose one or more IAM roles to grant to the service account on the project. This is an important step: if you're only planning to use this account to consume the Pub/Sub service, you could grant it the Pub/Sub Admin role (or, more narrowly, Pub/Sub Publisher if it only needs to publish). If your code also makes use of another service (such as BigQuery, for example), you need to grant the required roles for that service as well. An easy option would be to grant the Project Editor role, which gives broad access across GCP services, but it is always a recommended security practice to grant only the minimal permissions necessary.
Once you've granted the required role(s), click Done to finish creating the service account.
Once you've created the service account, then you need to generate a JSON key file:
In the Cloud Console, go to the Service accounts page.
Select your project.
Find the row of the service account that you want to create a key for. In that row, click the More (3-dot) button, and then click Create key.
Select the JSON key type and click Create.
Clicking Create downloads a service account key file.
That is the file whose path needs to be referenced in path_service_account. Let's say your JSON key file was downloaded to C:\Downloads\YOUR_JSON_KEY_FILENAME.json; then your code would be something like:
path_service_account = r'C:\Downloads\YOUR_JSON_KEY_FILENAME.json'
(using a raw string so backslashes in the Windows path aren't treated as escape sequences).
This should solve the authentication errors that you're getting.
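Putting the answer together, a minimal publish.py sketch might look like the following (the key path is the placeholder from above; project ID, topic ID, and the helper name are assumptions, and google-cloud-pubsub must be installed to actually publish):

```python
import os

# Assumption: path where you saved the downloaded JSON key (raw string so
# backslashes in the Windows path are not treated as escape sequences).
path_service_account = r"C:\Downloads\YOUR_JSON_KEY_FILENAME.json"
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path_service_account

def publish_lines(project_id: str, topic_id: str, lines):
    """Publish each line of text to a Pub/Sub topic using the
    credentials referenced by GOOGLE_APPLICATION_CREDENTIALS."""
    # Imported inside the function so the sketch loads without the library.
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)
    for line in lines:
        future = publisher.publish(topic_path, line.encode("utf-8"))
        future.result()  # block until Pub/Sub accepts the message
```

For example, you could read store_sales.csv and call publish_lines("my-project", "store-sales", rows).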

Google cloud project to service account to user roles mapping creation

I have a Google cloud project created.
I created a service account with project editor role.
Now, if I give a user the serviceAccountUser role on the service account, it doesn't automatically give the user permissions on the project.
If I login using that user to Google cloud console, I'm unable to see the project itself.
It says in the documentation - "Users granted the Service Account User role on a service account can use it to indirectly access all the resources to which the service account has access." So, I thought I would be able to access all the projects as well. Does the above statement mean something else?
On GCP you have user accounts and service accounts. The main purpose of service accounts is to consume GCP services via API calls; the documentation says: "A service account is a special type of Google account intended to represent a non-human user that needs to authenticate and be authorized to access data in Google APIs." To log in and interact with the GUI and resources, you need a user account. For that account to see resources it needs the Viewer role, and to perform operations on resources it needs an Admin role; review the documentation for more information (How IAM works).
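The distinction shows up directly in IAM policies: user accounts and service accounts are both just members with different prefixes, and visibility comes from the roles bound to each member, not from one account's relationship to another. A small illustrative sketch (emails are hypothetical):

```python
def member_id(email: str) -> str:
    """Build the IAM member string for a user or service account email."""
    if email.endswith(".gserviceaccount.com"):
        return f"serviceAccount:{email}"
    return f"user:{email}"

# To let the user actually see the project, bind a role to the user itself:
viewer_binding = {"role": "roles/viewer",
                  "members": [member_id("alice@example.com"),
                              member_id("app@my-project.iam.gserviceaccount.com")]}
```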

How to give service account access to two projects?

Using Google Cloud, there exists a BigQuery View table that queries two projects.
However, on the project where the view is located, we wish to run a query against it from Airflow/Composer. Currently it fails with a 403.
AFAIK it will use the default composer service account - however it doesn't have access to the 2nd project used in the sql of the view.
How do I give composer's service account access to the second project?
Think about a service account like a user account: you have a user email that you authorize on different projects and components. It's exactly the same with the service account email.
A service account belongs to a project; a user account belongs to a domain name/organisation. No real difference in the end.
So, you can use a service account email like any user account:
Grant authorization in any project
Add it in Google Groups
Even grant it the Viewer or Editor role on Google Workspace documents (Sheets, Docs, Slides, ...) to allow it to read/update those documents, just like any user!
EDIT
With Airflow, you can define connections and a default connection. You can use such a connection in your DAG and thus use the service account that you want.
I think you have to add the service account to the second project's IAM.
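Concretely, the fix is to add a binding for the Composer service account to the second project's IAM policy, for example BigQuery Data Viewer so the view's underlying tables can be read. A sketch of the binding to append (the service account email and role choice are assumptions):

```python
def grant_project_role(policy: dict, role: str, sa_email: str) -> dict:
    """Append a role binding for a service account to a project-level
    IAM policy dict (the shape used by projects.getIamPolicy)."""
    policy.setdefault("bindings", []).append(
        {"role": role, "members": [f"serviceAccount:{sa_email}"]})
    return policy

# Hypothetical Composer environment service account:
composer_sa = "composer-env@my-project.iam.gserviceaccount.com"
second_project_policy = grant_project_role(
    {}, "roles/bigquery.dataViewer", composer_sa)
```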

Share Google group permissions with GCP service account

A Google group of which I'm a Manager has been granted certain permissions to access certain BigQuery tables. Effectively, all users in the group can access those tables using their personal credentials.
I would like to share those permissions with a service account and access the tables using service account credentials.
Is this possible? How to configure it?
A service account is generally used for server-to-server communication (between applications). With that in mind, a service account has an associated email address, just like the ones associated with your personnel. So, you can assign roles/permissions to the service account using its email, just like you assigned them to your group.
I hope that the following steps help you in some manner:
Create a service account.
Assign predefined BigQuery roles (Admin, DataEditor, User, etc).
Download its JSON key file, which contains the credentials.
Use those credentials to authenticate and authorize your application.
To add a specific access level (owner, editor, or viewer) on a specific dataset, you can use the service account's email.
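That dataset-level grant is an entry in the dataset's access list. A sketch of the entry in the shape used by the BigQuery datasets API (the service account email is a placeholder):

```python
def dataset_access_entry(role: str, sa_email: str) -> dict:
    """Access entry granting a dataset role (READER/WRITER/OWNER) to a
    service account, in the shape of the dataset resource's 'access' list."""
    return {"role": role, "userByEmail": sa_email}

entry = dataset_access_entry(
    "READER", "reporting@my-project.iam.gserviceaccount.com")
# With the google-cloud-bigquery client, the equivalent would be appending
# bigquery.AccessEntry("READER", "userByEmail", sa_email) to
# dataset.access_entries and calling client.update_dataset(dataset, ["access_entries"]).
```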