Fetch all accounts/service accounts having access to a particular bucket - google-cloud-platform

We are working on a requirement where we want to check, from Cloud Composer, which service accounts have what type of access to a particular GCS bucket.
For a dataset we can use the code below:
dataset = client.get_dataset(dataset_id)  # Make an API request.
entries = list(dataset.access_entries)
We are looking for something similar to this for a GCS bucket.

You can use the Policy Analyzer service, which you can find in the Asset Inventory section (admittedly, it's not obvious).
You can try this query, for instance:
gcloud asset search-all-iam-policies --scope=projects/<ProjectID> --asset-types="storage.googleapis.com/Bucket"
Then filter on the bucket that you target (with jq, for instance). You can also search at the folder or organization scope to pick up the roles inherited from higher levels.
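If you prefer to stay in Python (as in the BigQuery snippet above), here is a minimal sketch using the Cloud Asset Inventory client library (google-cloud-asset); the project ID, bucket name and exact query string below are assumptions to adapt to your setup:
# Sketch: list IAM bindings that apply to one GCS bucket via Asset Inventory.
from google.cloud import asset_v1

client = asset_v1.AssetServiceClient()
results = client.search_all_iam_policies(
    request={
        "scope": "projects/my-project-id",  # placeholder project
        "query": 'resource:"//storage.googleapis.com/projects/_/buckets/my-bucket"',
        "asset_types": ["storage.googleapis.com/Bucket"],
    }
)
for result in results:
    for binding in result.policy.bindings:
        print(binding.role, list(binding.members))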

Related

Find Resources a GCP service account is tied to within a project

I am doing a quick inventory of our service accounts within a particular GCP project and I want to find all the resources a specific service account has access to. This seems like it'd be a simple lookup, since a GCP policy is simply an identity given a role on a particular resource; however, it doesn't seem like gcloud has this specific lookup... unless I'm missing something. I can find the service account/role combination via IAM or gcloud beta asset search-all-iam-policies, but the final portion of the query seems to be missing.
To find all the resources a specific account is authorized on, the Cloud Asset Inventory is the right tool.
You can perform this kind of request:
gcloud beta asset search-all-iam-policies \
  --scope=<Where to search> \
  --query="policy:<who to search>"
The scope is the perimeter in which you are searching. It can be:
organizations/<OrganisationNumber>
folders/<folderNumber>
projects/<ProjectNumber or ProjectID>
The query is what you search for. Here, a policy containing a specific service account email. So, set it and launch the request.
Is that what you are looking for?
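As a sketch of the same search from Python, assuming the google-cloud-asset client library (the scope and service account email below are placeholders):
# Sketch: find every resource on which one service account holds a role.
from google.cloud import asset_v1

client = asset_v1.AssetServiceClient()
results = client.search_all_iam_policies(
    request={
        "scope": "projects/my-project-id",  # or folders/... / organizations/...
        "query": 'policy:"my-sa@my-project-id.iam.gserviceaccount.com"',
    }
)
for result in results:
    print(result.resource)  # full resource name the account has a role on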

How to do SSO from your corporate AD or LDAP directory and restrict access for each user to a designated user folder in a bucket?

I came across this question in my AWS study and am wondering if anyone can enlighten me on it:
Your Fortune 500 company has undertaken a TCO analysis evaluating the use of Amazon S3 versus acquiring more hardware. The outcome was that all employees would be granted access to use Amazon S3 for storage of their personal documents. Which of the following will you need to consider so you can set up a solution that incorporates single sign-on from your corporate AD or LDAP directory and restricts access for each user to a designated user folder in a bucket? (Choose 3 answers)
A. Setting up a federation proxy or identity provider
B. Using AWS Security Token Service to generate temporary tokens
C. Tagging each folder in the bucket
D. Configuring IAM role
E. Setting up a matching IAM user for every user in your corporate directory that needs access to a folder in the bucket
The answers from the source say ABD. What I wonder is: would it be necessary to tag each folder in the bucket (i.e., via use of the 'username' policy variable) to associate the folder with its user?
There is no need to tag each folder. You can simply base your matching logic on folder/path prefixes.
Suppose that you have two "folders" (there are no real folders in S3, since it is flat storage) in the my-bucket S3 bucket:
s3://my-bucket/user1
s3://my-bucket/user2
You can match on user1 and user2 in your IAM roles if you wish, e.g.:
Role1 - resource: s3://my-bucket/user1/*
and
Role2 - resource: s3://my-bucket/user2/*
That being said, you could implement a tagging structure for your buckets/folders and do the matching based on tags, but that is simply unnecessary in this scenario.
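To make the prefix-based approach concrete, here is a rough sketch of a single IAM policy statement built around the 'username' policy variable the question mentions, instead of one role per user; the bucket name and action list are placeholders, so treat this as illustrative rather than a drop-in policy:
# Hypothetical policy document (shown as a Python dict) limiting each IAM user
# to s3://my-bucket/<their username>/* -- no per-folder tagging needed.
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::my-bucket/${aws:username}/*",
        }
    ],
}
print(json.dumps(policy, indent=2))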

GCS: how to upload to bucket as Storage Object Admin

If I grant someone the Storage Object Admin role so that they can upload data into a bucket, how can they do so?
I have created two GCS accounts and ran an experiment: I created a project in account A and granted Storage Object Admin to account B. Nonetheless, when I check the list of available buckets in account B, nothing shows up.
I am new to GCS. I know the question may seem basic, but I did not find anything in the documentation that helps me.
Thank you in advance for your help.
As specified in the Cloud IAM Roles for Cloud Storage documentation, the roles/storage.admin role "Grants full control of buckets and objects" for the user account. This means that the account can list, upload and delete objects in the bucket and modify the bucket's settings.
I assume that you open Cloud Shell in Project B and try to run the gsutil ls command as specified in the Cloud Storage > Documentation > Listing Buckets documentation. However, this command lists the buckets only from Project B, which is why you can't see the buckets of Project A there. To see them, set the project to Project A in Cloud Shell. You can do this by running gcloud config set project [PROJECT_A_ID], as specified in the gcloud config set documentation. Then list the buckets again and you will see the buckets from Project A listed there.
Also, to upload to the bucket, you can refer to the Uploading Objects documentation. Run the command from Project B's Cloud Shell and replace [DESTINATION_BUCKET_NAME] with Project A's bucket name. Since the role is granted properly, you will be able to upload the file successfully.
I have tested this myself and it worked for me as expected.
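For completeness, a minimal sketch of the same upload from Python, assuming the google-cloud-storage client library and that account B's credentials are active; the bucket and file names are placeholders:
# Sketch: upload a local file to a bucket owned by Project A while running
# as account B. Bucket names are global, so only the bucket name is needed.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("project-a-bucket")   # placeholder bucket name
blob = bucket.blob("reports/report.csv")     # destination object name
blob.upload_from_filename("report.csv")      # local file to upload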

Google Cloud Logs not exporting to storage

I'm trying to set up a sink that will export a certain set of Google Cloud Platform logs to a Google Cloud Storage bucket, but I can't get it to work, and the documentation doesn't seem to match what's happening in the GCP console.
Steps (all using the GCP console):
1) I set a filter on the log viewer which is showing me the expected logs
2) I choose "Create Export" and fill in the fields:
Sink Name = defaultServiceToGCSSink
Sink Service = Google Cloud Storage
Sink Destination = mylogsBucket
After hitting OK, I get a message:
Unknown user email address: defaultServiceToGCSSink@logging-somedigits.iam.gserviceaccount.com
Apparently the sink is trying to use the name I gave it as the user that will be writing to the storage bucket.
When I check the bucket, I can see that a user with that email was added as an owner of mylogsBucket, but still no logs appear in the bucket.
I also added the group cloud-logs@google.com as an owner of the bucket (as the documentation states), but nothing works and no logs are exported to the bucket (and I've waited for more than a couple of hours).
Should I be adding that new user to IAM? I tried to, but it wouldn't accept the email address as a valid user name.
Remove the gserviceaccount.com user from the bucket ACLs and then try creating the sink again.
Is there any chance you successfully created the sink at some point in the past and later deleted it? My guess is the service account was put on the bucket earlier, and now the sink creation is failing because it's trying to add the account again.
In the usual scenario, it might take some time before the first entries begin to appear in the Cloud Storage bucket, because log entries are saved to Cloud Storage buckets in hourly batches.
When you export logs to a Cloud Storage bucket, Logging writes a set of files to the bucket that are organized in directory hierarchies by log type and date.
A detailed explanation of what happens to exported logs: https://cloud.google.com/logging/docs/export/using_exported_logs
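If the console keeps misbehaving, the sink can also be created programmatically. Below is a minimal sketch assuming the google-cloud-logging client library; the sink name and bucket are taken from the question, while the filter string is a placeholder:
# Sketch: create a log sink that exports matching entries to a GCS bucket.
import google.cloud.logging

client = google.cloud.logging.Client()
sink = client.sink(
    "defaultServiceToGCSSink",
    filter_='resource.type="gae_app"',                  # placeholder filter
    destination="storage.googleapis.com/mylogsBucket",  # target bucket
)
sink.create()
# The sink's writer identity still needs write access on mylogsBucket
# before entries start to arrive in hourly batches.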

Access google cloud storage bucket from other project using python

Let's suppose I have a Google Cloud Storage bucket in project X and want to upload an object to that bucket from code (Python) which is deployed in project Y.
Both projects X and Y are under the same credentials (login ID).
Is this achievable using OAuth 2.0, or is there any other suggestion?
I have tried using a service account, AppAssertionCredentials and OAuth2DecoratorFromClientSecrets, but failed.
from googleapiclient import discovery
from googleapiclient.http import MediaFileUpload
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
service = discovery.build('storage', 'v1', credentials=credentials)
media = MediaFileUpload(fileName)  # local file to upload
req = service.objects().insert(
    bucket=bucket_name,  # bucket that lives in project X
    name=fileName,
    media_body=media)
resp = req.execute()
This is a very common use case. You don't need to do anything special in your code to access buckets in other projects. Bucket names are globally unique, so your app will refer to an existing bucket in another project in the same way that it refers to buckets in its own project.
In order for that insert call to succeed, though, you'll need to make the account that is running that code an OWNER of the bucket that you're writing to.
Is that App Engine code? App Engine code runs as a particular service account, and you'll need to grant permission to that service account. Head over to https://console.developers.google.com/permissions/serviceaccounts?project=_ to find out the name of that service account. It's probably something like my-project-name@appspot.gserviceaccount.com.
Now, using the GCS UI, or via gsutil, give that account full control over the bucket:
gsutil acl ch -u my-project-name@appspot.gserviceaccount.com:FC gs://myBucketName
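The same grant can be done from Python with the google-cloud-storage client library; a minimal sketch, assuming the bucket uses fine-grained (ACL-based) access control and reusing the placeholder names from above:
# Sketch: give the App Engine service account OWNER (full control) on the bucket.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("myBucketName")
acl = bucket.acl
acl.reload()  # load the current bucket ACL
acl.user("my-project-name@appspot.gserviceaccount.com").grant_owner()
acl.save()    # persist the updated ACL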