I'm trying to set up a sink that will export a certain set of Google Cloud Platform logs to a Google Cloud Storage bucket, but I can't get it to work, and the documentation doesn't seem to match what's happening in the GCP console.
Steps (all using the GCP console):
1) I set a filter in the log viewer, which shows me the expected logs
2) I choose "Create Export" and fill in the fields:
Sink Name = defaultServiceToGCSSink
Sink Service = Google Cloud Storage
Sink Destination = mylogsBucket
After hitting OK, I get a message:
Unknown user email address: defaultServiceToGCSSink@logging-somedigits.iam.gserviceaccount.com
Apparently the sink is trying to use the name I gave it as the user that will be writing to the storage bucket.
When I check the bucket I can see that a user with that email was added as an owner of mylogsBucket, but there are still no logs in the bucket.
I also added the group cloud-logs@google.com as an owner of the bucket (as the documentation states), but nothing works and no logs are exported to the bucket (and I've waited for more than a couple of hours).
Should I be adding that new user to IAM? I tried to, but it wouldn't accept the email address as a valid user name.
Remove the gserviceaccount.com user from the bucket ACLs and then try creating the sink.
Is there any chance you successfully created the sink at some point in the past and later deleted it? My guess is the service account was put on the bucket earlier, and now the sink creation is failing because it's trying to add the account again.
In the usual scenario, it might take some time before the first entries begin to appear in the Cloud Storage bucket, because log entries are saved to Cloud Storage buckets in hourly batches.
When you export logs to a Cloud Storage bucket, Logging writes a set of files to the bucket that are organized in directory hierarchies by log type and date.
There is a detailed explanation of what happens to the exported logs here: https://cloud.google.com/logging/docs/export/using_exported_logs
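If you end up recreating the sink programmatically rather than through the console, a minimal sketch with the google-cloud-logging Python client library would look roughly like this. The filter is a hypothetical placeholder, the sink and bucket names are taken from the question, the destination must use the storage.googleapis.com/ prefix, and the exact Client.sink signature can vary slightly between library versions.

from google.cloud import logging

client = logging.Client()
sink = client.sink(
    "defaultServiceToGCSSink",
    filter_='resource.type="gae_app"',  # hypothetical filter; use the filter from the log viewer
    destination="storage.googleapis.com/mylogsBucket",
)
sink.create(unique_writer_identity=True)

# The sink writes as a dedicated service account; that identity needs
# permission to create objects in mylogsBucket.
sink.reload()
print(sink.writer_identity)

After that, grant the printed writer identity permission to create objects in the bucket (for example, the Storage Object Creator role), and the logs should start appearing in hourly batches.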
We are working on a requirement where we want to check which service account has what type of access on a particular GCS bucket, from Cloud Composer.
For a BigQuery dataset we can use the code below:
from google.cloud import bigquery

client = bigquery.Client()
dataset = client.get_dataset(dataset_id)  # Make an API request.
entries = list(dataset.access_entries)
We are looking for something similar to this for a GCS bucket.
You can use the Policy Analyzer service, which you can find in the Asset Inventory section (sure, it's not obvious).
You can try this query, for instance:
gcloud asset search-all-iam-policies --scope=projects/<ProjectID> --asset-types="storage.googleapis.com/Bucket"
Then filter on just the bucket that you target (with jq, for instance). You can also search at the folder or organization scope to pick up roles inherited from higher levels as well.
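If you would rather stay in Python (closer to the BigQuery snippet in the question), you can also read the bucket's own IAM policy with the Cloud Storage client library. This only shows bindings set directly on the bucket, not roles inherited from the project or folder; the bucket name below is a hypothetical placeholder.

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-gcs-bucket")  # hypothetical bucket name

# Version 3 is needed to also see conditional role bindings.
policy = bucket.get_iam_policy(requested_policy_version=3)
for binding in policy.bindings:
    print(binding["role"], sorted(binding["members"]))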
I have some objects in a Google Cloud Storage bucket that are publicly downloadable on URLs like https://storage.googleapis.com/blahblahblah. I want to set up a monitoring rule that lets me see how often one of these objects is being downloaded. I have turned on the Data Read audit log as mentioned here, but I don't see any logs when I download the object from the storage.googleapis.com link. I have another bucket where downloads are performed through the Node Google Cloud Storage client library, and I can see download logs from that bucket, so it seems like downloads from the public URL don't get logged.
I also don't see a way to specify the object in a particular bucket when setting up an alert in Google Cloud. Is creating a new bucket solely for this object the best way to try to set up monitoring for the number of downloads, or is there something I'm missing here?
Google Cloud Audit logs do not track objects that are public (allUsers or allAuthenticatedUsers).
Enable usage logs to track access to public objects.
Should you use usage logs or Cloud Audit Logs?
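As a rough sketch, if you want to turn usage logs on programmatically rather than with gsutil, the Cloud Storage Python client can do it roughly like this. The bucket names are hypothetical placeholders, and the log bucket has to let the cloud-storage-analytics@google.com group write objects before any logs are delivered.

from google.cloud import storage

client = storage.Client()

# Hypothetical names: the bucket holding the public object, and the bucket
# that will receive the hourly usage log objects.
public_bucket = client.get_bucket("my-public-bucket")
LOG_BUCKET = "my-usage-logs-bucket"

# Write access logs (one CSV per hour per bucket) under the given prefix.
public_bucket.enable_logging(LOG_BUCKET, object_prefix="downloads")
public_bucket.patch()

The usage log entries include the object name, so you can count downloads of a single object without creating a separate bucket just for it.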
I keep getting an error when I use the GCP Storage Transfer function.
The URL I am using (my TSV is located there) is here:
https://drive.google.com/file/d/1ZDkQgCIrzGU-emfHiHYwsvj9i108iJVO/view?usp=sharing
I tried placing the TSV directly on Cloud Storage but received the same error. Then I used the gs:// address instead of the URL and got another error.
So it seems to have picked up the URL, judging by the error message, but what does it mean that the process was denied? Denied permission to read my file, or to perform the download? It also looks like it is reading the link as a web page rather than a text file.
I think you have a problem with permissions on the sink bucket.
The Storage Transfer Service uses a service account to move data into a Cloud Storage sink bucket. The service account must have certain permissions for the sink bucket:
storage.buckets.get: allows the service account to get the location of the bucket. Always required.
storage.objects.create: allows the service account to add objects to the bucket. Always required.
storage.objects.delete: allows the service account to delete objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink or deleteObjectsUniqueInSink to true.
storage.objects.list: allows the service account to list objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink to false or deleteObjectsUniqueInSink to true.
All of these permissions are contained in the roles/storage.legacyBucketWriter role, which you can assign to the service account. For a complete list of Cloud Storage roles and the permissions they contain, see IAM roles.
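As a rough sketch of granting that role from Python with the Cloud Storage client library (the bucket name and service account email below are hypothetical placeholders; use your sink bucket and the service account shown for your project):

from google.cloud import storage

BUCKET_NAME = "my-sink-bucket"  # hypothetical sink bucket
TRANSFER_SA = "storage-transfer-1234567890@partnercontent.gserviceaccount.com"  # hypothetical; use your project's account

client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Add roles/storage.legacyBucketWriter for the transfer service account.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {
        "role": "roles/storage.legacyBucketWriter",
        "members": {f"serviceAccount:{TRANSFER_SA}"},
    }
)
bucket.set_iam_policy(policy)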
We wanted to copy a file from one project's storage to another.
I have credentials for project A and project B in separate service accounts.
The only way we knew how to copy files was to add service key credential permissions to the bucket's access control list.
Is there some other way to run commands across accounts using multiple service keys?
You can use Cloud Storage Transfer Service to accomplish this.
The docs should guide you to setup the permissions for buckets in both projects and do the transfers programmatically or on the console.
You need to get the service account email associated with the Storage Transfer Service by entering your project ID on the Try this API page. You then need to give this service account email the required roles to access the data at the source; the Storage Object Viewer role should be enough.
At the data destination, you need to get the service account email for the second project ID, then give it the Storage Legacy Bucket Writer role.
You can then do the transfer using the snippets in the docs.
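As a rough sketch of what the programmatic transfer can look like with the Google API Python client (the project and bucket names are hypothetical placeholders, and this assumes the service account permissions above are already in place):

import datetime

from googleapiclient import discovery

PROJECT_ID = "project-b-id"          # hypothetical: project that owns the transfer job
SOURCE_BUCKET = "project-a-bucket"   # hypothetical: bucket in the source project
SINK_BUCKET = "project-b-bucket"     # hypothetical: bucket in the destination project

storagetransfer = discovery.build("storagetransfer", "v1")

today = datetime.date.today()
transfer_job = {
    "description": "Copy bucket from project A to project B",
    "status": "ENABLED",
    "projectId": PROJECT_ID,
    "schedule": {
        # Same start and end date means the job runs once.
        "scheduleStartDate": {"year": today.year, "month": today.month, "day": today.day},
        "scheduleEndDate": {"year": today.year, "month": today.month, "day": today.day},
    },
    "transferSpec": {
        "gcsDataSource": {"bucketName": SOURCE_BUCKET},
        "gcsDataSink": {"bucketName": SINK_BUCKET},
    },
}

result = storagetransfer.transferJobs().create(body=transfer_job).execute()
print("Created transfer job:", result["name"])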
I'm attempting to shuttle data between buckets in different projects using Google Cloud's Storage Transfer Service based on an event that's being emitted. This event points to a subdirectory of a bucket belonging to project A that I want to copy to a destination bucket in project B, and so naturally STS looked like a perfect fit.
I've successfully crafted the request, including the prefix expressions, and I believe that I'm just dealing with a permissions error now, because when I use my code to copy between buckets in the same project it creates and starts the transfer as expected. When I use my service account to begin the transfer across buckets, however, I get the following error:
Error creating the transfer job:
Failed to obtain the location of the destination Google Cloud Storage (GCS) bucket due to insufficient permissions.
Please verify that the necessary permissions have been granted.
So from here I've looked into the permissions of my service account. I've manually added that service account as a project editor (and then owner) of the second project, and I've added the account to the ACLs for the destination bucket I'm trying to transfer into.
Are there any other avenues that I'm missing? I haven't had much success googling around for others who have hit my situation.
One problem you may be running into is that you may be granting access to the wrong service account. One service account is created for each project. The service account can be found using the googleServiceAccounts get method documented here. It should look like:
storage-transfer-{UID}@partnercontent.gserviceaccount.com
The UID will be different for each project. Try verifying that the service account with the UID specific to the projectId you specify in the transferJobs create request has READ/LIST access on the source bucket and READ/WRITE/LIST access on the sink bucket.
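If it helps, here is a rough sketch of looking that account up programmatically with the Google API Python client (the project ID is a hypothetical placeholder):

from googleapiclient import discovery

storagetransfer = discovery.build("storagetransfer", "v1")

# Returns the service account Storage Transfer Service uses for this project.
response = storagetransfer.googleServiceAccounts().get(projectId="my-project-id").execute()
print(response["accountEmail"])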
If the number of transfers you need to set up is relatively small, you might also try using the console, which is a GUI designed to simplify the process of setting up a transfer. Instructions for using the console can be found here.