Sitecore - copy Role and users to new environment

We are using Sitecore 8 Update 3 with Active Directory integration. I am trying to copy a Role and the respective users tied to it from our Dev environment over to Prod.
Example:
role: Sitecore/IHaveAccess
users: ad/dk123, ad/dk234, ad/dk345...
I tried two different methods:
Method 1: Generate package:
By creating a package as described on page 19 of https://sdn.sitecore.net/upload/sitecore6/65/package_designer_admin_guide-a4.pdf
When I installed the package on the new environment, the role was added but none of the users were under the Role.
Method 2: Serialization:
I serialized the role item, but when viewing it in Notepad++ it does not contain any users. When I serialize a user who is in the group, I do see the group.
Any thoughts on why we have this issue?

Unfortunately, the membership information is stored against the user and not the role (the same for roles within roles). In this instance, the membership information is stored against the AD user. You are storing that a user is a member of role x and not that role x contains member y.
This means that you would need to package up both the role and the corresponding users. I'm not sure how this would work using AD though, since you are essentially trying to sync user-related data back via Sitecore. I would make sure your AD provider is not set as read-only in the connection string or its setup; if you only have a one-way sync, there's no way to store that information back in AD and have it persist.
Personally, I would set up my roles differently to make management easier, but it obviously depends on your exact requirements:
Create a Sitecore role and assign all your permissions and security against this role (sitecore\IHaveAccess)
Create a matching AD role (ad\IHaveAccess) and add this as a member of your Sitecore role
Add your AD users to your AD Group. They will gain the correct permissions through Role-in-Role. If you already have AD Groups set up, you can even simply add the existing Groups to the new Group.
Using this, at most you have to add your AD roles back into your Sitecore roles (and this shouldn't be necessary, since you added the AD role as a member of the Sitecore role, so the membership is stored in Sitecore). It also has the advantage that your users/roles/membership are centrally located within one system.

Related

How to structure s3 bucket for access control

I know that this could be a trivial problem but I think is important to do things in the right way.
We have an internal application that is used by 80 users now and we want to migrate our storage to s3.
We have 3 environments: dev, test, prod and I was thinking of a structure like this:
dev
  user-1
  ...
  user-n
    assets (profile picture, other public data)
    generated documents (private)
test
prod
Here we have 3 roles (ROLE_USER, ROLE_TEAMLEAD, ROLE_ADMIN). Whoever has the user role should be able to access only his/her own objects, whoever has the teamlead role can also access all the documents of his/her team, and whoever has the admin role can access all the documents.
What is the safest way to design this, so that when I make a call with an object and a userId/username I get back all the objects that belong to that person?
Would it be a good idea to create groups (which should also be easy to update if a teamlead leaves, or if a user changes his/her teamlead), and also to have AWS accounts for all our users?
Any idea/good material will help, thanks.
If your users are IAM (or Cognito) users, the structure you have can't accomplish the access control goals with static policies. If you're able to update the IAM policies when membership changes, then the structure can work.
Your IAM policy conditions for regular users and admins would be pretty simple to meet the objectives. Each user's access to their own prefix can be allowed by a policy that permits the S3 actions only on keys prefixed with their username (via the ${aws:username} policy variable). Granting access for admins can be done through a group policy on the admin group.
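For illustration, here is a minimal sketch of that per-user policy, attached to an IAM group with boto3; the bucket name, group name, and path layout are assumptions, not values from the question:

import json

import boto3

BUCKET = "my-app-storage"  # hypothetical bucket name

# Every member of the group may only read/write keys under prod/<their-username>/.
per_user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": f"arn:aws:s3:::{BUCKET}/prod/${{aws:username}}/*",
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": f"arn:aws:s3:::{BUCKET}",
            "Condition": {"StringLike": {"s3:prefix": "prod/${aws:username}/*"}},
        },
    ],
}

iam = boto3.client("iam")
iam.put_group_policy(
    GroupName="app-users",  # hypothetical group holding all regular users
    PolicyName="own-prefix-only",
    PolicyDocument=json.dumps(per_user_policy),
)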
The problem you have is with the team lead roles. Here, you have two dimensions of access: user and role, but the file structure contains just one of those pieces of information -- you can't determine which objects should belong to a particular teamlead role from the object structure alone. That is, you can't construct a group/bucket policy that grants access according to the requirements without knowing all the usernames in that group (since directories are organized by user only).
This could be fixed if you organized your structure by nesting users within team directories:
team1
  user1
  user2
team2
  user3
  user-N
Then you could apply a group policy for each teamlead group to allow access to objects under the team directory for the respective team. The IAM policy would not have to change when teamleads or team members change. This is also consistent with the "Controlling access to a bucket with user policies" guide.
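As a rough sketch (bucket name, team prefix, and group name are placeholders), the policy attached to each teamlead group would simply allow reads under that team's prefix:

import json

import boto3

# Hypothetical names; one such inline policy per teamlead group.
team_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-app-storage/prod/team1/*",
        }
    ],
}

boto3.client("iam").put_group_policy(
    GroupName="team1-leads",
    PolicyName="team1-read-access",
    PolicyDocument=json.dumps(team_policy),
)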
However, this implies a strictly one-to-one relationship between users and teams, which may not be the case for you. And, if users change teams, they'll need their directory in S3 moved.
Alternatively, using the structure you propose, you could generate IAM policies based on group membership at a moment in time, specifying all the user directories belonging to a particular team in the policy. However, whenever the group membership changes, the policy will have to change too.
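A hedged sketch of that generation step, using boto3 to read the current members of a hypothetical per-team IAM group and rebuild the teamlead policy from their per-user directories:

import json

import boto3

iam = boto3.client("iam")

# Hypothetical group that holds the members of team1.
members = iam.get_group(GroupName="team1-members")["Users"]

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            # One resource per member's directory, following the structure in the question.
            "Resource": [
                f"arn:aws:s3:::my-app-storage/prod/{u['UserName']}/*" for u in members
            ],
        }
    ],
}

# Re-attach to the teamlead group; this must be re-run whenever membership changes.
iam.put_group_policy(
    GroupName="team1-leads",
    PolicyName="team1-member-dirs",
    PolicyDocument=json.dumps(policy),
)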
As an aside, you may also want to consider using separate buckets for your different environments instead of top level directories. That way, you can effectively test changes that affect the entire bucket (like applying bucket policies) independently for each environment.

Assign different role to a group member

I am looking for advice on a not so particular situation.
I currently have roughly 20000 stores.
All stores have admin, manager and user roles.
An admin can create/manage any roles
A manager can create/manage only user role
A user can login and access custom functionality.
Any person can be assigned to one or multiple stores and can have one or multiple roles for each particular store.
Ie:
StoreA has userA as Admin and userB as Manager
StoreB has userA as User and userB as Admin
At first, I converted my stores to be groups. But since roles are bound to the group, I would still have 3 roles for each group (20000 groups and 60000 roles - Group StoreA, Roles: StoreA_Admin, StoreA_Manager, StoreA_User, etc...). I am not sure if it is the right decision, and I am not sure about the performance.
Then, I kept the stores as groups, but instead of creating roles, I created custom multivalued attributes that save the group UID. That worked in the Carbon console, as well as via the API, but the Console application doesn't like the multivalued fields. And if another role is introduced, I would have to create another field.
Any thought on how to approach this situation ?
We can map your story to IS groups and roles as follows.
Please note that groups and roles are treated as two separate resources since IS-5.11.0.
Refer to:
https://is.docs.wso2.com/en/5.11.0/setup/migrating-what-has-changed/#group-and-role-separation
https://medium.com/p/93d42fe2f135
That separation is not clearly visible in the management console, so use the Console application to create groups and roles instead.
A group represents a collection of users in the user store. One user can belong to zero or more groups.
A role is a collection of permissions. A role can have zero or more permissions.
A role can be assigned either to a group or to a user.
Due to this statement:
A user can log in and access custom functionality.
We don't need to assign any role to normal business users specifically. No specific role is required to log in to the business application via Identity Server basic authentication. If your business application has role-based access control, you need to assign a role to business users as well. Otherwise, every user gets login permission upon successful authentication, which should be enough to perform business operations in the application.
In your case, since every store's admin has the same set of permissions and every store's manager has the same set of permissions, you can't just evaluate the permissions to authorize a request; you also have to consider the store context.
For example: if user B is the manager of store A and the admin of store B, he has inherited the permissions of both the manager and admin roles. But when user B performs a request on store B, you have to authorize the request based only on the roles related to store B.
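To make that concrete, here is a plain-Python sketch of authorizing a request using only the roles the user holds for the store being acted on; the data structures, role names, and permission names are hypothetical, not an Identity Server API:

# Hypothetical store-scoped role assignments, e.g. loaded from user attributes or a DB.
STORE_ROLES = {
    "userA": {"StoreA": {"Admin"}, "StoreB": {"User"}},
    "userB": {"StoreA": {"Manager"}, "StoreB": {"Admin"}},
}

# What each role may do; identical across all stores.
ROLE_PERMISSIONS = {
    "Admin": {"create_role", "manage_role", "use_app"},
    "Manager": {"create_user_role", "use_app"},
    "User": {"use_app"},
}


def is_authorized(user: str, store: str, permission: str) -> bool:
    """Authorize using only the roles the user holds for this particular store."""
    roles_for_store = STORE_ROLES.get(user, {}).get(store, set())
    granted = set()
    for role in roles_for_store:
        granted |= ROLE_PERMISSIONS[role]
    return permission in granted


# userB is admin of StoreB but only manager of StoreA:
assert is_authorized("userB", "StoreB", "manage_role")
assert not is_authorized("userB", "StoreA", "manage_role")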

Limit Google account to use ONLY BigQuery

My organization wants to limit the GCP services any user can use. For example, we only want to allow the usage of BigQuery.
Is there a way to set up GCP so that even the top account (or tenant, organization or whatever) can't instantiate anything besides BigQuery?
Thanks
I would recommend trying the following. Basically, you want to create a group and apply the proper BigQuery roles to it, which will then be inherited by all members of the group. Google allows you to create a "company" group that is set up to automatically add all current and new users of your organization.
Take the following steps.
Create a "Company" group by following this article. Make sure to set it up so that current and new users of organization will be added. (if you don't want this then just create a group and add in the users necessary)
In GCP, add the corresponding Big Query roles you want to have applied to all your organization's members to the single group.
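A minimal sketch of that grant using the Python client for the Cloud Resource Manager API; the project ID, group address, and the choice of roles/bigquery.user are assumptions:

from googleapiclient import discovery

PROJECT_ID = "my-project"            # hypothetical project
GROUP = "group:company@example.com"  # hypothetical "company" group

crm = discovery.build("cloudresourcemanager", "v1")

# Read-modify-write the project IAM policy to add a BigQuery role for the group.
policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
policy.setdefault("bindings", []).append(
    {"role": "roles/bigquery.user", "members": [GROUP]}
)
crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()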
As suggested by Jack, you can create a group that grants access only to BigQuery, place all users in that group and grant them no extra permissions.
But you must have at least one project owner account that can do anything in that project.
If you want to secure your organization even further, you can do the following:
Limit the number of services that can be activated using quotas
Monitor actions performed by users with Audit Logs
Set up alerts that will notify you when certain services are activated

Google BigQuery: grant service account permissions to create jobs in only some specific datasets

Problem: I have a project in BigQuery where all my data is stored. Within this project I created multiple datasets containing different views. Now I want to use different service accounts to query the different datasets via Grafana (if that matters). These users should only be able to query the views (and therefore a specific dataset) meant for them.
What I tried: I granted the BigQuery User, Viewer or Editor roles (I tried all of them) at the dataset level (and also BigQuery Metadata Viewer at the project level). When I query a view, I receive the error:
User does not have bigquery.jobs.create permission in project xy.
Questions: It is not clear to me if granting the bigquery.jobs.create permission at the project level will allow the user to query all datasets instead of only the one I want him to have access to.
Is there any way to allow the user to create jobs only on a single dataset?
Update October 2021
I've just seen that this question went unanswered back then but still gets a lot of views. I believe the possibilities have changed a bit since I asked the question, so here is how I'm handling it now:
I give the respective service account the role roles/bigquery.jobUser at the project level. This allows it to create jobs in general; however, since I don't grant any other permissions, it cannot query data yet.
Then I grant the role roles/bigquery.dataViewer at the dataset level (sketched after the links below). That makes it possible for the service account to query only the dataset I granted the permission on.
It is also possible to grant roles/bigquery.dataViewer at the table level, which will restrict access to only that specific table.
In case you want the service account not only to query (view) the data, but also to insert or change it, for example, replace roles/bigquery.dataViewer with a role having the necessary permissions (or assign that role in addition).
How to grant the permissions:
On dataset level
On table or view level
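A sketch of the dataset-level grant with the google-cloud-bigquery client; the project, dataset, and service account names are placeholders:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")        # hypothetical project
dataset = client.get_dataset("my_reporting_dataset")  # hypothetical dataset

# Add a dataset-level access entry; roles/bigquery.dataViewer on a dataset
# corresponds to read access on that dataset only.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="roles/bigquery.dataViewer",
        entity_type="userByEmail",
        entity_id="grafana-reader@my-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])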
We had the same problem; we solved it by creating a custom role and assigning that custom role on the particular dataset.
You can grant bigquery.user role to a specific dataset as indicated in this guide. The bigquery.user role contains the bigquery.jobs.create permission as well as other basic permissions related to querying datasets. You can check the full list of permissions for this role in this list.
As suggested above, you can also create custom roles having only the exact permissions you want by following this piece of documentation.
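For the custom-role route, here is a hedged sketch using the IAM API Python client; the project ID, role ID, and permission list are assumptions about a minimal query-only role:

from googleapiclient import discovery

iam = discovery.build("iam", "v1")

# Create a project-level custom role holding only the permissions needed to run queries.
iam.projects().roles().create(
    parent="projects/my-project",            # hypothetical project
    body={
        "roleId": "bigqueryQuerierMinimal",  # hypothetical role ID
        "role": {
            "title": "BigQuery querier (minimal)",
            "includedPermissions": [
                "bigquery.jobs.create",
                "bigquery.datasets.get",
                "bigquery.tables.get",
                "bigquery.tables.getData",
            ],
        },
    },
).execute()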

Can I have dynamic User specific permissions using AWS IAM / Cognito?

I'm attempting to develop an application architecture almost exclusively on top of AWS services.
This application has both User and Organization "entities". As one might expect, a User may be an admin, role-x or role-y of one or more organizations. (role-x and role-y are just placeholders for some role with some set of specific permissions.) A User may also be standalone (that is, not have a role on any Organization).
Our current thinking is to use DynamoDB to store organization and user specific data. For users this may include some basic information (address, phone number, whatever), and for organizations it may include fields like "mission statement", "business address" and so on.
An admin of an organization would be able to edit all organization fields, whereas a role-x might only be able to update "mission statement" while reading all other fields.
Since I mentioned that a single user may have roles on many different organizations, that might look something like:
user1:
  organizations:
    123: 'admin'
    456: 'role-x'
    789: 'admin'
It's also worth noting that these role assignments are modifiable. New or existing users may be invited to take on a specific role for an organization, and an organization may remove a user from a role.
This is a fairly straightforward type of layout, but I wanted to be very clear about the many-to-many nature of the user, org and roles.
I've been reading IAM and Cognito documentation, as well as how it relates to fine-grained control over DynamoDB items or S3 buckets - but many of the examples focus on a single user accessing their own data rather than a many-to-many role style layout.
How might one go about implementing this type of permission system on AWS?
(If policy definitions need to be updated with specific Identities (say, for an Organization), can that reliably be done in a programmatic way - or is it ill-advised to modify policies on the fly like that?)
The other answer here is outdated.
AWS has since added Cognito Groups, which provide more flexibility.
You can use the technique described in this article to achieve that:
https://aws.amazon.com/blogs/aws/new-amazon-cognito-groups-and-fine-grained-role-based-access-control-2/
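As a rough illustration (the pool ID, group name, and role ARN are placeholders), groups are created in a user pool and can carry an IAM role that is picked up through the identity pool's role mapping:

import boto3

cognito = boto3.client("cognito-idp")

# Create a group in the user pool and associate an IAM role with it.
cognito.create_group(
    UserPoolId="us-east-1_EXAMPLE",                        # hypothetical user pool
    GroupName="org-123-admins",                            # hypothetical per-org admin group
    RoleArn="arn:aws:iam::123456789012:role/Org123Admin",  # hypothetical role
    Precedence=1,
)

# Put a user into that group; tokens issued afterwards include the group claim.
cognito.admin_add_user_to_group(
    UserPoolId="us-east-1_EXAMPLE",
    Username="user1",
    GroupName="org-123-admins",
)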
Unfortunately, the kind of permission system you are trying to implement is not possible with Cognito at the moment. With Cognito you can currently create unique identities for your users in an identity pool. Users can authenticate using an external provider such as Facebook, Amazon, Google, Twitter/Digits or any OpenID Connect provider, or through your own backend authentication process. After the user authenticates, Cognito creates a unique identity for that user.
There's a concept of an identity, but there's no concept of groups. All users/identities within one identity pool get credentials from the roles associated with that identity pool. Currently you can specify two roles: one role for authenticated identities and one role for unauthenticated identities. There's no feature at the moment that lets you assign multiple groups to each identity and attach a role to each group.
For more information on Cognito, you can refer to
https://aws.amazon.com/cognito/faqs/
http://docs.aws.amazon.com/cognito/devguide/getting-started/