Permissions For Google Cloud SQL Import Using Service Accounts

I've exported a MySQL database following the MySQL Export Guide successfully.
Now I'm trying to import it following the MySQL Import Guide.
I've checked the permissions for the service_account_email I'm using, and I have granted it both the Cloud SQL Admin and Storage Admin roles.
I was able to successfully activate my service account using this command locally:
gcloud auth activate-service-account <service_account_email> --key-file=<service_account_json_file>
After I ran the command:
gcloud sql import sql <instance> <gstorage_file> --database=<db_name> --async
I got this information:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "required",
        "message": "Login Required",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Login Required"
  }
}
Other Things I've Tried
I also tried using the service_account_email of my SQL instance, which came from:
gcloud sql instances describe <instance_name>
But, it seems to have the same error.
Question
Based on the REST API JSON error I'm given, how do I "log in" using the service_account_email so that I don't get the 401 error?

The problem is that the database instance's service account lacks permission on the Cloud Storage bucket. Steps to solve this issue:
1) Go to your Cloud SQL instance and copy the instance's service account (Cloud SQL -> {instance name} -> OVERVIEW -> Service account).
2) After copying the service account, go to the Cloud Storage bucket you want to import from (or dump to) and grant that account the desired permission (Storage -> {bucket name} -> Permissions -> Add member).

The Cloud SQL instance is running under a Google-managed service account that is not a part of your project. You will need to grant this account permissions on the file in Cloud Storage that you want to import. Here is a handy dandy bash snippet that will do that.
SA_NAME=$(gcloud sql instances describe YOUR_DB_INSTANCE_NAME --project=YOUR_PROJECT_ID --format="value(serviceAccountEmailAddress)")
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql
The first line gets the service account email address.
The next line gives this service account read permissions on the bucket.
The last line gives the service account read permissions on the file.

Google also has some of the worst error reporting around. If you get this error message, it might also be that you entered a path incorrectly. In my case it was the path to my bucket directory. Go figure: I don't have permission to access a bucket that doesn't exist. Technically correct, but hardly useful.
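So before chasing permissions, it may be worth a quick sanity check that the object path actually exists (placeholders as in the snippet above):
# Hypothetical sanity check: list the exact object before blaming IAM.
gsutil ls gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql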

After performing some research, and based on the permission error, these are the steps that I find most useful for you to troubleshoot the issue.
In order to more easily test ACLs and permissions, you can (see the sketch below):
Create and download a key for the service account in question
Use 'gcloud auth activate-service-account' to obtain the service account's credentials
Use gsutil as usual to see if you can access the object in question
You might need to grant an additional IAM role such as 'roles/storage.admin' to the service account in question; see the Cloud Storage IAM documentation for more information.
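For reference, a minimal command-line sketch of those steps. Every name below is a placeholder, and roles/storage.admin is the broad role suggested above (a narrower role may suffice):
# 1) Create and download a key for the service account in question.
gcloud iam service-accounts keys create key.json \
  --iam-account=SA_NAME@PROJECT_ID.iam.gserviceaccount.com
# 2) Obtain the service account's credentials locally.
gcloud auth activate-service-account SA_NAME@PROJECT_ID.iam.gserviceaccount.com \
  --key-file=key.json
# 3) Check whether the object is readable as that identity.
gsutil ls gs://YOUR_BUCKET_NAME/fileToImport.sql
# If not, grant a role such as roles/storage.admin and retry.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.admin"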

According to the Google docs:
Describe the instance you are importing to:
gcloud sql instances describe INSTANCE_NAME
Copy the serviceAccountEmailAddress field.
Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket:
gsutil iam ch serviceAccount:SERVICE-ACCOUNT:objectAdmin gs://BUCKET-NAME
Then import the database (see the combined sketch below).
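Putting those steps together, a hedged sketch (placeholders throughout; the last command is the import command from the question):
# Look up the instance's service account, grant it access, then import.
SA_EMAIL=$(gcloud sql instances describe INSTANCE_NAME \
  --format="value(serviceAccountEmailAddress)")
gsutil iam ch serviceAccount:${SA_EMAIL}:objectAdmin gs://BUCKET-NAME
gcloud sql import sql INSTANCE_NAME gs://BUCKET-NAME/FILE_NAME \
  --database=DATABASE_NAME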

Related

Permission denied when creating GCP Service Account Key

I've created Service Account A and granted roles Service Account Admin and Service Account Key Admin. I did this work in the GCP Console.
Service Account A's function is to create other service accounts programmatically, using the GCP Java SDK. It successfully creates new service accounts, but when it goes to create a key for the newly created service account, I get the following response:
{
  "code": 403,
  "errors": [
    {
      "domain": "global",
      "message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/-/serviceAccounts/<new_service_account_name>@<project_id>.iam.gserviceaccount.com.",
      "reason": "forbidden"
    }
  ],
  "message": "Permission iam.serviceAccountKeys.create is required to perform this operation on service account projects/-/serviceAccounts/<new_service_account_name>@<project_id>.iam.gserviceaccount.com.",
  "status": "PERMISSION_DENIED"
}
I've tried waiting to see if perhaps I tried to create the key too soon after creating the service account, but waiting hours resulted in no change.
Service Account A can successfully create a key for itself, just not for other service accounts it creates.
How do I resolve this?
You have one of three problems:
1) Service Account A actually does not have the IAM role Service Account Key Admin on the project. Use the CLI command gcloud projects get-iam-policy and double-check.
2) Your code is using the wrong identity. You believe that you are using the service account, but instead another identity is being loaded by ADC (Application Default Credentials), or you made a mistake in your code.
3) You assigned the correct role, but on the service account instead of the project. Use the CLI command gcloud iam service-accounts get-iam-policy. If you find the role listed in the output, you assigned the role in the wrong place. Use the CLI command gcloud projects add-iam-policy-binding instead (see the sketch after this list).
Note: there is also a fourth possibility that prevents you from creating service account keys: organization policy constraints might be enabled. See:
Restricting service account usage
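A sketch of the checks and the fix, assuming Service Account A is SA_A@PROJECT_ID.iam.gserviceaccount.com (a placeholder):
# 1) Does the project policy grant the role to Service Account A?
gcloud projects get-iam-policy PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:SA_A@PROJECT_ID.iam.gserviceaccount.com" \
  --format="value(bindings.role)"
# 3) Was the role mistakenly placed on the service account resource itself?
gcloud iam service-accounts get-iam-policy SA_A@PROJECT_ID.iam.gserviceaccount.com
# Fix: bind the role at the project level instead.
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_A@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountKeyAdmin"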

Google Cloud Invalid value for [ACCOUNT]

I am trying to upload a Dockerfile to Google Cloud containers.
Turns out, this is more difficult than actually developing the app.
I did everything the authentication page mentioned (https://cloud.google.com/container-registry/docs/advanced-authentication#gcloud-helper), but then I had to set up a key.
I followed this guide (https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys) and downloaded my JSON key file.
I could not use Cloud Shell, so I downloaded gcloud onto my local machine from snap.
Finally issued this command:
gcloud auth activate-service-account ACCOUNT --key-file=KEY-FILE
Where
ACCOUNT is the service account name in the format [USERNAME]@[PROJECT-ID].iam.gserviceaccount.com. You can view existing service accounts on the Service Accounts page of the Cloud Console or with the command gcloud iam service-accounts list
KEY-FILE is the service account key file. See the Identity and Access Management (IAM) documentation for information about creating a key.
However, I get this error:
ERROR: (gcloud.auth.activate-service-account) Invalid value for [ACCOUNT]: The given account name does not match the account name in the key file. This argument can be omitted when using .json keys.
I don't know what is going on or why I am getting this error, since I am doing everything by the book.
Some help would be much appreciated.
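The error text itself points at a workaround: with a .json key, the [ACCOUNT] argument can be omitted entirely, or read from the key file so it is guaranteed to match. A sketch (the file name is a placeholder):
# Option 1: omit [ACCOUNT]; gcloud reads it from the .json key.
gcloud auth activate-service-account --key-file=KEY-FILE.json
# Option 2: pull the account from the key's client_email field first.
ACCOUNT=$(python3 -c "import json; print(json.load(open('KEY-FILE.json'))['client_email'])")
gcloud auth activate-service-account "$ACCOUNT" --key-file=KEY-FILE.json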

Error emitted when creating a connection between GCP Bigquery and Cloud SQL (Mysql 5.7) using bq command

I want to create a connection between BigQuery and Cloud SQL (MySQL 5.7) in a newly created project (not an existing project; I can create the connection in an existing project with no problem).
This is the command I used
bq mk --connection --connection_type='CLOUD_SQL' --properties='{"instanceId":"<PROJECT ID>:<REGION>:<MYSQL INSTANCE>","database":"<MY DATABASE>","type":"MYSQL"}' --connection_credential='{"username":"root", "password":"<PASSWORD>"}' --project_id=<PROJECT ID> --location=<REGION> <MYSQL INSTANCE>
but I got the error like below
BigQuery error in mk operation: Access Denied: URI: services/bigqueryconnection.googleapis.com/projects/<PROJECT ID>:
APPLICATION_ERROR;google.api.serviceconsumermanagement.v1beta1/ServiceConsumerManagerV1Beta1.GenerateServiceIdentity;Permission denied to generate service identity for service
[bigqueryconnection.googleapis.com]
Details: [{
IAM{policy: 'serviceconsumermanagement_consumers-/000000555846b828/bigqueryconnection.googleapis.com/000000b742218216' resource: 'services/bigqueryconnection.googleapis.com/consumers/<PROJECT ID>'
permission: 'serviceconsumermanagement.consumers.generateServiceAccount'}
allowed: false auditlog: false cloudaudit: false
}]
...
...
After a lot of research, I think the problem may be that I don't have a service account like the one below with the role BigQuery Connection Service Agent, because I do have this kind of service account in another project:
service-<project id>@gcp-sa-bigqueryconnection.iam.gserviceaccount.com
I don't really know how this service account was created; maybe by enabling the BigQuery Connection API service, a service account like the one above gets created.
Although the BigQuery Connection API has already been enabled, I disabled it and then enabled it again, but the service account was still not created.
Does anyone have any idea about this problem? Thank you so much.
Update:
Sorry, I forgot to say that I already have the BigQuery Admin, BigQuery Connection Admin, and Cloud SQL Admin roles.
You need to grant bigquery.admin access:
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:EMAIL_ADDRESS" \
  --role="roles/bigquery.admin"
(Use group:ADDRESS instead of user:ADDRESS for a group.)
Example:
$ bq mk --connection --connection_type='CLOUD_SQL' --properties='{"instanceId":"data-lab:us-east1:instance-mysql","database":"information_schema","type":"MYSQL"}' --connection_credential='{"username":"root", "password":"*****"}' --project_id=data-lab --location=us MyExternalSQL
Connection 906721254566.us.MyExternalSQL successfully created
When you enable the BigQuery Connection API, a service account should be automatically created on your behalf with the following roles:
cloudsql.client
logging.logWriter
metrics.metricWriter
and to create and maintain a connection resource, the user must have the bigquery.admin predefined role, as @Soumendra mentioned.
Check the document Cloud SQL federated queries.
I enabled the API, but a service account wasn't created.
Check the document BigQuery Connection API Client Libraries for creating that service account, or see the CLI sketch below.
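If the client-library route is inconvenient, one command-line sketch that may force-create the service agent (this assumes gcloud's beta commands are available):
# Ask the service to (re)generate its per-project service identity.
gcloud beta services identity create \
  --service=bigqueryconnection.googleapis.com \
  --project=PROJECT_ID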
Once that is done, try connecting.

Google Cloud credentials with Terraform

This is a bit of a newbie question, but I've just gotten started with GCP provisioning using Terraform / Terragrunt, and I find the workflow for obtaining GCP credentials quite confusing. I've come from using AWS exclusively, where obtaining credentials and configuring them in the AWS CLI was quite straightforward.
Basically, the Google Cloud Provider documentation states that you should define a provider block like so:
provider "google" {
credentials = "${file("account.json")}"
project = "my-project-id"
region = "us-central1"
zone = "us-central1-c"
}
This credentials field shows I (apparently) must generate a service account key and keep the JSON file somewhere on my filesystem.
However, if I run the command gcloud auth application-default login, this generates a token located at ~/.config/gcloud/application_default_credentials.json; alternatively I can also use gcloud auth login <my-username>. From there I can access the Google API (which is what Terraform is doing under the hood as well) from the command line using a gcloud command.
So why does the Terraform provider require a JSON file of a service account? Why can't it just use the credentials that the gcloud CLI tool is already using?
By the way, if I configure Terraform to point to the application_default_credentials.json file, I get the following errors:
Initializing modules...
Initializing the backend...
Error: Failed to get existing workspaces: querying Cloud Storage failed: Get
https://www.googleapis.com/storage/v1/b/terraform-state-bucket/o?alt=json&delimiter=%2F&pageToken=&prefix=projects%2Fsomeproject%2F&prettyPrint=false&projection=full&versions=false:
private key should be a PEM or plain PKCS1 or PKCS8; parse error: asn1: syntax error: sequence truncated
The credentials field in the provider config expects a path to a service account key file, not a user account credentials file. If you want to authenticate with your user account, try omitting credentials and then running gcloud auth application-default login; if Terraform doesn't find your credentials file, you can set the GOOGLE_APPLICATION_CREDENTIALS environment variable to point to ~/.config/gcloud/application_default_credentials.json.
Read here for more on the topic of service accounts vs user accounts. For what it's worth, the Terraform docs explicitly advise against using application-default login:
This approach isn't recommended- some APIs are not compatible with credentials obtained through gcloud
Similarly GCP docs state the following:
Important: For almost all cases, whether you are developing locally or in a production application, you should use service accounts, rather than user accounts or API keys.
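So a minimal sketch of the recommended flow, using a service account key via the environment variable mentioned above (the path is a placeholder):
# Point ADC at a service account key instead of user credentials...
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/gcp/terraform-sa-key.json"
# ...and omit the credentials argument from the provider block entirely.
terraform init
terraform plan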
Change the credentials to point directly to the file location. Everything else looks good.
Example: credentials = "/home/scott/gcp/FILE_NAME"
Still, it is not recommended to use gcloud auth application-default login; the best approaches are described here:
https://www.terraform.io/docs/providers/google/guides/provider_reference.html#credentials-1

Can I use the gcloud command to adjust permissions for a service account and enable write access to a storage bucket inside firebase functions?

I have a firebase function which I want to permit write access to cloud storage. I believe I need to set up a service account with those permissions and then grant them programmatically inside my function, but I'm confused about how to do this.
The firebase function writes a file to a bucket on a trigger. The storage settings for the firebase storage are set to the default, which means they require the client to be authenticated:
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read, write: if request.auth != null;
    }
  }
}
In this document (https://cloud.google.com/functions/docs/concepts/iam), under "Runtime service account", I see this:
At runtime, Cloud Functions uses the service account
PROJECT_ID@appspot.gserviceaccount.com, which has the Editor role on
the project. You can change the roles of this service account to limit
or extend the permissions for your running functions.
When it says "runtime," I'm assuming this means the firebase function runs within the context of that service account and the permissions granted to it. As such, I'm assuming I need to make sure the permissions of that service account include write access, as I see from this link (https://console.cloud.google.com/iam-admin/roles?authuser=0&consoleUI=FIREBASE&project=blahblah-2312312).
I see the permission named storage.objects.create and would assume I need to add this to the service account.
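For instance, I'd guess the command-line form of such a grant looks something like this (roles/storage.objectCreator, which contains storage.objects.create, is my assumption):
# Hypothetical grant of object-create rights to the runtime service account.
gcloud projects add-iam-policy-binding blahblah-2312312 \
  --member="serviceAccount:blahblah-2312312@appspot.gserviceaccount.com" \
  --role="roles/storage.objectCreator"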
To investigate the service account current settings, I ran these commands:
$ gcloud iam service-accounts describe blahblah-2312312@appspot.gserviceaccount.com
displayName: App Engine default service account
email: blahblah-2312312@appspot.gserviceaccount.com
etag: BwVwvSpcGy0=
name: projects/blahblah-2312312/serviceAccounts/blahblah-2312312@appspot.gserviceaccount.com
oauth2ClientId: '98989898989898'
projectId: blahblah-2312312
uniqueId: '12312312312312'
$ gcloud iam service-accounts get-iam-policy blahblah-2312312@appspot.gserviceaccount.com
etag: ACAB
I'm not sure if there is a way to get more details from this, and unsure what etag ACAB indicates.
After reviewing this document (https://cloud.google.com/iam/docs/granting-roles-to-service-accounts) I believe that I need to grant the permissions. But, I'm not entirely sure how to go from the JSON example and what structure it should be and then associate the policy, or if that is even the correct path.
{
  "bindings": [
    {
      "role": "roles/iam.serviceAccountUser",
      "members": [
        "user:alice@gmail.com"
      ]
    },
    {
      "role": "roles/owner",
      "members": [
        "user:bob@gmail.com"
      ]
    }
  ],
  "etag": "BwUqLaVeua8="
}
For example, my questions would be:
Do I need to make up my own etag?
What email address do I use inside the members array?
I see this command listed as an example
gcloud iam service-accounts add-iam-policy-binding \
  my-sa-123@my-project-123.iam.gserviceaccount.com \
  --member='user:jane@gmail.com' --role='roles/editor'
What I don't understand is why I have to specify two quasi-email addresses. One is the service account, and one is the user associated with the service account. Does this mean that user jane@gmail.com can operate under the credentials of the service account? Can I just have the service account, on its own, hold the permissions I use in my cloud function?
Is there a simpler way to do this using only the command line, without manually editing JSON?
And, then once I have my credentials properly established, do I need to use a JSON service account file as many examples show:
var admin = require('firebase-admin');
var serviceAccount = require('path/to/serviceAccountKey.json');
admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://<DATABASE_NAME>.firebaseio.com'
});
Or, can I just make a call to admin.initializeApp() and, since "... at runtime, Cloud Functions uses the service account PROJECT_ID@appspot.gserviceaccount.com...", the function will automatically get those permissions?
The issue was (as documented here: How to write to a cloud storage bucket with a firebase cloud function triggered from firestore?) that I had incorrectly specified the first parameter of the bucket call as a subdirectory inside the bucket rather than just the bucket name. This meant Cloud Storage thought I was trying to access a bucket that did not exist, and I got the permissions error.
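In gsutil terms, the distinction that tripped me up (the bucket name below is an assumed placeholder based on the project ID above):
# The bucket is only the first path component; "some/subdir" is part of
# the object name, not a bucket of its own.
gsutil ls gs://blahblah-2312312.appspot.com                         # a bucket
gsutil ls gs://blahblah-2312312.appspot.com/some/subdir/output.txt  # an object path
# gs://blahblah-2312312.appspot.com/some/subdir is NOT itself a bucket name.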