I plan to run hundreds of websites within one Google Cloud Platform project (using GKE). Each of them will use two Google Cloud Storage buckets for storing its assets.
I planned to create one service account for every website in order to grant access only to its own respective buckets. However, there's a limit of 100 service accounts per project, which apparently can't be raised.
How can I make sure that each website only has access to the buckets (or sub-paths in a bucket) it is allowed to see?
We have a similar use-case and I believe I've found a solution for this problem. The key is that service accounts from other projects can be given access to buckets of your GCS-enabled project.
Basically you'll use two kinds of GCP projects:
One main project that holds all the data (GCS buckets) and whatever shared resources you have, like Compute Engine VMs or App Engine services
Multiple other projects that are only holding 100 service accounts each
The service accounts from the second type of "user pool" project can be given access to the buckets of your data project with fine granularity (1 service account -> 1 bucket). When the last user-pool project is close to the 100-account limit, just create a new project and start adding new service accounts there.
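As a rough illustration of that step, granting a service account from one of the user-pool projects access to a single bucket in the data project can be done on the bucket's IAM policy. This is only a sketch with the google-cloud-storage Python client; the project, bucket and account names are made up:

from google.cloud import storage

# Client for the data project that owns all the buckets (names are placeholders).
client = storage.Client(project="data-project")
bucket = client.bucket("website-42-assets")  # one website's bucket

# Append a binding for a service account that lives in a *different* project;
# bucket IAM accepts members from other projects.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectAdmin",
    "members": {"serviceAccount:website-42@user-pool-3.iam.gserviceaccount.com"},
})
bucket.set_iam_policy(policy)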
Dear AWS and Cloud Gurus,
I am tasked with the design of a new AWS account structure that is to serve as a development and runtime environment for multiple development teams. The goal here is that multiple individual teams can work on different projects, each in their own AWS account but make use of some shared services and inherit certain elements from a top-level account.
An example for a shared service inherited from the top-level account would be billing. An example for a shared service on the same hierarchy would be authentication and authorisation services.
The projects the different teams work on are all accessible in the end through a unified platform / portal. The user logs onto that portal and can then execute different functions or even start complete applications which are actually developed by the different teams in their respective account.
Now my question is this: What would be a best-practice approach here? Should all teams work in a shared repository, or should each have their own code repository in their own account? Should the deployment be done into a central account, or should the teams deploy within their own accounts, with the portal then calling each app from its account?
I will try to picture this:
As you can see, there are essentially 5 different AWS Accounts:
The root/master Account holds the billing information and also allows for central administration of quotas
Then there are 4 AWS Accounts, each with its own Code Repository and independent access points. 3 of these hold different applications (developed as SPAs) while the fourth provides a central navigation page (aka Dashboard) and also provides SSO to the other applications.
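To make the cross-account part more concrete, here is roughly how I imagine the Dashboard account reaching into one of the application accounts, by assuming a role that the application account exposes. The account ID and role name below are pure placeholders, nothing is set up yet:

import boto3

# Credentials here belong to the Dashboard account.
sts = boto3.client("sts")

# Assume a role that application account 1 would expose to the Dashboard account.
assumed = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/PortalAccessRole",  # placeholder
    RoleSessionName="portal-session",
)
creds = assumed["Credentials"]

# With the temporary credentials, the portal can reach that team's resources,
# e.g. the S3 bucket hosting their SPA.
app_s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)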
Is this setup feasible, given that there are different, independent application development teams involved, but the applications they provide are ultimately used through a central platform by the end user? How would you go about this?
With the kindest regards,
Chris
I want to build a service where a customer / company can provide a Google Cloud Storage bucket + Firestore DB, and I want to perform some operations on the bucket files and Firestore (read/write), but I'm not sure what's the best way to get access to their resources.
[my gc project] -> [customer 1 gc project: bucket + firestore]
-> [customer 2 gc project: bucket + firestore]
-> [customer n gc project: bucket + firestore]
Solutions I can imagine:
Request access with OAuth, but then it's more like the user gives me the permissions and not the company
The customer creates a service account and gives me the "json"
I create a service account for each customer and they have to add it to their project; I don't know if that's possible, and I think there is a limit of about 100 service accounts per project
I create one service account and each customer has to add it to their projects
Some other requirements:
I need access to the customer project in a way that lets me run scheduled jobs in the background
I have to access the customer project from Google Cloud Functions
What would be the best fit for me, or am I missing something?
If the projects will be created by you on their behalf, I would suggest creating an organization. In an organization, projects are organized into folders, similar to a file system. Then you can apply access control on a folder and have it inherited by all the projects inside: https://cloud.google.com/iam/docs/resource-hierarchy-access-control
Otherwise, you will have to ask each customer for a service account, either manually or with a script (your second point), or create one unique service account and add that single account to each customer project (your fourth point).
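For the "customer hands over a service account key" route, a minimal sketch with the Python clients could look like this (the key file, bucket and collection names are invented for the example):

from google.cloud import firestore, storage
from google.oauth2 import service_account

# Key created by the customer in *their* project (placeholder file name).
creds = service_account.Credentials.from_service_account_file("customer-1-key.json")

# Both clients point at the customer's project through the key's credentials.
storage_client = storage.Client(project=creds.project_id, credentials=creds)
db = firestore.Client(project=creds.project_id, credentials=creds)

for blob in storage_client.list_blobs("customer-1-bucket"):
    print(blob.name)

db.collection("jobs").document("last-run").set({"status": "ok"})

Nothing in this depends on where the code runs, so the same credentials object can be used from a Cloud Function or a scheduled background job.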
Our company is creating multi-tenant products and services under our own Google Cloud Platform Account/Organization. Close to 90% of data will be managed and stored within this one account. But each of our customers has their own GCP Account/Organization, and roughly 10% of the total data will come from their end of things (via GCP Storage, databases, etc). Customers will also have their own unrelated data and services, hence the need to use separate accounts.
The data quantity could be as low as 1GB per day or high as 100GB per day, depending on the size of the customer. The data will generally be numerous large files between 100 and 500MB (CSV/row-based data).
What are strategies to safely and efficiently share data between two or more GCP Accounts? Is there something native within GCP that allows for this and helps manage users/permissions, or do we need to build our own APIs/services as if we were communicating with someone else external to GCP?
GCP has a shared VPC concept (https://cloud.google.com/vpc/docs/shared-vpc) that allows you to create a shared network between projects, so you can share resources using internal IPs across projects. This isn't useful for sharing data between accounts, though; it is for sharing within one organization that has multiple projects for different departments.
AFAIK, for sharing data between accounts you have to use VPC Peering (https://cloud.google.com/vpc/docs/vpc-peering) or go over the internet. With peering, your data doesn't leave Google's network; this is how third parties like MongoDB operate, selling their own cloud platform that actually runs on GCP (and other cloud vendors).
If your actual data is just files, though, I don't think there is much risk in going over the internet and using Cloud Storage. There are many strategies for securing this type of data transfer.
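One such strategy, assuming the files sit in a bucket in your project, is to hand the other party short-lived signed URLs instead of broad bucket access. A sketch (bucket and object names are invented, and it assumes service-account credentials that can sign):

from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket("shared-exports").blob("2024-06-01/export.csv")  # placeholders

# Time-limited download link; it stops working after an hour.
url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(hours=1),
    method="GET",
)
print(url)  # hand this URL to the other account / external party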
Google Cloud resources and IAM roles are enough for segregating the data.
For Cloud Storage, create a bucket per customer. Grant the right accounts (user or service account) on the bucket to allow a customer to see one bucket, or several (in case of a merger, for example); a sketch of these grants follows below.
For BigQuery, create a dataset per customer and apply the same IAM policy as before.
For Cloud SQL, it's trickier, because it's not bound to IAM roles. Create a database per customer and use database user rights to grant access.
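Here is a rough sketch of those per-customer grants for Cloud Storage and BigQuery; the customer's account, bucket and dataset names are invented:

from google.cloud import bigquery, storage

member_email = "customer-a@their-project.iam.gserviceaccount.com"  # placeholder

# Cloud Storage: one bucket per customer, grant only on that bucket.
storage_client = storage.Client()
bucket = storage_client.bucket("customer-a-data")
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectViewer",
    "members": {f"serviceAccount:{member_email}"},
})
bucket.set_iam_policy(policy)

# BigQuery: one dataset per customer, grant only on that dataset.
bq = bigquery.Client()
dataset = bq.get_dataset("my-project.customer_a")  # placeholder
entries = list(dataset.access_entries)
entries.append(bigquery.AccessEntry("READER", "userByEmail", member_email))
dataset.access_entries = entries
bq.update_dataset(dataset, ["access_entries"])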
Remember that IAM performs authentication and authorization only on GCP resources. You can't have custom authorization with IAM; if that's a requirement, you have to implement those checks yourself.
In my company, we use Firestore for storing the authorizations and the user profiles. Authentication is ensured by GCP (IAP, for example), and we use the user's email as the key for the authorizations.
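A hypothetical shape of such an application-level check (the collection and field names are only examples of the pattern, not an actual schema):

from google.cloud import firestore

db = firestore.Client()

def is_allowed(user_email: str, action: str) -> bool:
    # The email verified by IAP is the document key for the user's authorizations.
    doc = db.collection("authorizations").document(user_email).get()
    if not doc.exists:
        return False
    return action in doc.to_dict().get("allowed_actions", [])

# e.g. is_allowed("alice@example.com", "export-report")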
In our team, we are using AWS as our main cloud provider and currently, we have 3 projects hosted on their platform.
We are about to have 2 more projects in the next weeks, but first, we want to organize our projects, because our current organization is a little bit disordered.
We want our projects to be organized following these rules:
Each project must have a staging and production environment.
Each project is independent of the others, so that it is not possible to see one project's resources (e.g., VPCs and S3 buckets) from within another project.
The client is responsible for paying the bills of the project (staging and production environment).
Even though the client is responsible for paying the bills, we must have access to the environments to deploy our code and to do other tasks related to development, testing, and operations.
We can assign a team of developers to each project. It should be possible for a developer to be in one or more projects at the same time. Plus, it should be possible to move our developers between projects and to remove their access from a project.
So, is it possible to organize projects in AWS under the rules previously mentioned?
If so, what are good resources to learn how to do this?
If not, which cloud providers allow organizing projects the way we want?
Thanks for your attention and time. I'm looking forward to your replies.
The fact that you want project-specific charges to go to customers and you want each project to be independent indicates that your best choice would be to use a separate AWS Account for each project (or each client).
By keeping projects in separate AWS accounts:
Each account will only have costs associated with a particular project
Resources in each account will be kept separate
User permissions in each account will be kept separate
You can create staging and production environments within the same account (see below)
You can have multiple accounts joined together by using AWS Organizations:
AWS Organizations is an account management service that enables you to consolidate multiple AWS accounts into an organization that you create and centrally manage. AWS Organizations includes account management and consolidated billing capabilities that enable you to better meet the budgetary, security, and compliance needs of your business. As an administrator of an organization, you can create accounts in your organization and invite existing accounts to join the organization.
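As a sketch of what per-project account creation from the management account could look like with boto3 (the e-mail address and account name are placeholders):

import boto3

org = boto3.client("organizations")  # run with management-account credentials

response = org.create_account(
    Email="aws+project-alpha@example.com",  # must be unique per account
    AccountName="project-alpha",
)

# Account creation is asynchronous; poll the request status if needed.
status = org.describe_create_account_status(
    CreateAccountRequestId=response["CreateAccountStatus"]["Id"]
)
print(status["CreateAccountStatus"]["State"])  # IN_PROGRESS / SUCCEEDED / FAILED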
Some companies go one step further and also keep staging and production in separate AWS accounts. They do this because they wish to keep production resources and users away from non-production resources and users. This reduces the chance of somebody accidentally changing Production when they meant to update Staging. While you can use IAM permissions to reduce the chance of such a thing happening, keeping staging and production in separate accounts guarantees that people with only staging permissions will not be able to impact production.
Your company should maintain ownership of all of the accounts so that you can manage and control them. Each month, you will receive a consolidated bill, but it will show costs broken down by account. Thus, you will know how much to charge your clients.
The developers will need separate logins to each AWS account. So, if they wish to work on Project 1, they will need to login to the AWS account for Project 1. They then have access to the resources in Project 1, but not any of the other projects. When they wish to work on another project, they will need to re-login with credentials for the other project's AWS account. You might think that this adds extra work, but it also adds extra security and ensures that each client's resources are kept totally separate.
A final benefit of using separate accounts is that, in future, if a client wishes to take control of their systems, you can assign the AWS account to them without having to do any work to separate their resources from other clients. It is like handing over the keys of a house — they can move in without anyone having to move out.
I have multiple projects in GCP and I am trying to read all my projects' logs in one place.
Any suggestions?
Unlike monitoring, the Stackdriver Logging UI does not provide a multi-project view of logs.
It is possible to query the logs from multiple projects using the API. See the resourceNames (or projectIds) field https://cloud.google.com/logging/docs/reference/v2/rest/v2/entries/list
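For example, with a recent version of the google-cloud-logging Python client, a cross-project query is a single call (the project IDs and the filter are placeholders):

from google.cloud import logging

client = logging.Client()
entries = client.list_entries(
    resource_names=["projects/project-a", "projects/project-b"],
    filter_="severity>=ERROR",
    order_by=logging.DESCENDING,
)
for entry in entries:
    print(entry.timestamp, entry.log_name, entry.payload)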
It's also possible to export logs from multiple projects to one place: https://cloud.google.com/logging/docs/export/
For more details, you can check our documentation about monitoring multiple projects using a single Stackdriver account, which can contain up to 100 GCP projects.
A project organizes all your Google Cloud Platform resources. A project consists of a set of users; a set of APIs; and billing, authentication, and monitoring settings for those APIs. So, for example, all of your Cloud Storage buckets and objects, along with user permissions for accessing them, reside in a project. You can have one project, or you can create multiple projects and use them to organize your Google Cloud Platform resources, including your Cloud Storage data, into logical groups.
Users can only view and list projects they have access to via IAM roles. The Organization Admin can view and list all projects in the organization.
For logging you have to pass the project id:
projects/[PROJECT_ID]/logs/
Reference: https://cloud.google.com/logging/docs/