I have a user account, but I need a script to assign a role or group to my user. Is this possible from Enterprise Guide?
P.S.: The role and the group were created previously.
It is possible, but would involve using the User Import Macros. These require quite a bit of reading to get your head around. It isn't as simple as having a nice procedure to do the job.
SAS User Import Macros
Is it possible to create a bigquery service account to limit access to only 1 dataset? When I go through the service account generation process it appears to give access to an entire project and does not show options to limit to a specific data set.
Short answer: yes. But you do not assign the privileges at the project level; you need to go and modify the dataset itself.
Check the documentation here:
https://cloud.google.com/bigquery/docs/dataset-access-controls
It outlines the process with a few different methods.
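If you would rather script it, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and service-account names are placeholders you would substitute:

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder names -- substitute your own project/dataset/service account.
dataset = client.get_dataset("my-project.my_dataset")

# Append a dataset-level ACL entry granting the service account READER.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",
        entity_type="serviceAccount",
        entity_id="limited-sa@my-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries

# Patch only the access list, leaving the rest of the dataset untouched.
client.update_dataset(dataset, ["access_entries"])

The service account then gets READER on that one dataset without holding any project-level role.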
I have the following issue in AWS QuickSight: a user created a dataset through Athena. Everything worked fine. The user shared the dataset with another user, granting him OWNER rights. Then the first user was deleted. Now the second user can't edit the dataset anymore. He can share it, but the person it is shared with can't edit it either. The error message:
Hopefully this can be solved by the QuickSight account admin using the QuickSight UI to add dataset editing permission for this user, as shown here.
Or it may well be that the new owner does not have the required IAM permissions, such as quicksight:UpdateDataSet; see the docs.
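If the admin prefers the API to the UI, a boto3 call along these lines should re-grant the dataset permissions; the account ID, dataset ID, and user ARN below are made up:

import boto3

quicksight = boto3.client("quicksight", region_name="us-east-1")

quicksight.update_data_set_permissions(
    AwsAccountId="123456789012",   # placeholder account ID
    DataSetId="my-dataset-id",     # placeholder dataset ID
    GrantPermissions=[
        {
            "Principal": "arn:aws:quicksight:us-east-1:123456789012:user/default/second-user",
            "Actions": [
                # Owner-level dataset actions
                "quicksight:DescribeDataSet",
                "quicksight:DescribeDataSetPermissions",
                "quicksight:PassDataSet",
                "quicksight:DescribeIngestion",
                "quicksight:ListIngestions",
                "quicksight:UpdateDataSet",
                "quicksight:DeleteDataSet",
                "quicksight:CreateIngestion",
                "quicksight:CancelIngestion",
                "quicksight:UpdateDataSetPermissions",
            ],
        }
    ],
)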
What does it say when you click the "Show details" link in the screenshot above?
This is quite a mess, to be honest. Data sources in QuickSight are tied to the user who created them: they inherit their access roles from whoever created them. This is not accessible through the API (though I think it is mentioned in the documentation somewhere), so it can't be changed.
So when we deleted the users who originally created the data sources, those data sources ceased working, along with the data sets based on them.
Our solution was to create "standard" data sources with a technical user (not such a big deal for us, because we exclusively use Athena) and then recreate all the data sets and switch them over to the new standard data sources (a big deal, because analysts had to switch data sets in their analyses and dashboards).
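For reference, the standard data source itself can be created under the technical user via the API; here is a rough boto3 sketch (every ID and ARN is a placeholder):

import boto3

qs = boto3.client("quicksight", region_name="us-east-1")

qs.create_data_source(
    AwsAccountId="123456789012",      # placeholder
    DataSourceId="standard-athena",   # placeholder
    Name="Standard Athena (technical user)",
    Type="ATHENA",
    DataSourceParameters={"AthenaParameters": {"WorkGroup": "primary"}},
    Permissions=[
        {
            "Principal": "arn:aws:quicksight:us-east-1:123456789012:user/default/technical-user",
            "Actions": [
                "quicksight:DescribeDataSource",
                "quicksight:DescribeDataSourcePermissions",
                "quicksight:PassDataSource",
                "quicksight:UpdateDataSource",
                "quicksight:DeleteDataSource",
                "quicksight:UpdateDataSourcePermissions",
            ],
        }
    ],
)

Because the data source is owned by a long-lived technical user, deleting an analyst's account no longer breaks it.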
To me this shows that QuickSight is not quite complete as an analytics platform for large companies. The API is not quite there.
I'm looking for a built-in SAS macro that lists each user group along with the LASR reports that the group is authorized for.
Try %mduextr. It will not show their LASR authorization, but it will give a ton of information about which groups they're in and the permissions they have.
%mduextr(libref=work)
Another option is %mdsecds, which is a security report macro. This gives a lot of information about exactly what folders they have access to. Documentation can be found here:
https://go.documentation.sas.com/doc/en/bicdc/9.4/bisecag/n0l1mpdt430djgn1bl1c3euei85w.htm#p1erksk56y6n0tn1ar0twogmbgz8
I have a Power BI report with a direct connection to the server to obtain the data (Analysis Services). To access the data from my account I use the on-premises data gateway, which works correctly, and I can view the data in the web app. The problem appears when I share the report with another user (both of us have Pro accounts). From the other user's account you can see that a report was shared, but when you open it the following error appears: "Error executing the query because the cube or some internal structures have not been processed (or do not exist)". I also granted the user in question owner permissions on the cube. Any clue where it might be failing?
I think you should use Map user names for this connection.
Go to settings -> Manage gateways
Under your gateway cluster you should have your data source (if not, you can add a new one; it's quite straightforward to set up: just choose Analysis Services and enter the server name, database name, and credentials). Then go to the Users tab.
There you can see Map user names, where you replace the account you want to share with by an account that has permissions in SSMS.
For example, you want to share with example@elpmaxe.com and you have granted permissions in SSMS to a user named example.elpmaxe, so in Map user names you would replace example@elpmaxe.com with example.elpmaxe.
The answer was easy, but finding it was difficult. The issue was that even though the role in the cube had been assigned to the user who wanted to view the shared report, the role had not been given read permission. It is a basic problem, but if you are a beginner with Analysis Services it can get complicated.
I am trying to do something that would be relatively simple for a relational database but I don't know how to do it for a nonrelational one.
I am trying to make a simple task web app on AWS where people can post their tasks.
I have a table called tasks which uses the userid from the auth token provisioned by AWS Cognito. I am wondering how I can return the user information. I do not want to rely on Cognito by calling it every time a user sends a request, so my thought was to create another table to store all of the user information. That, however, is not a very nonrelational way of doing things, since JOINs are best avoided.
So, I was wondering if I should do any of the following
a) Using RDS instead
b) Not using Cognito and setting up my own auth system
c) Just doing the JOIN with a table containing all of the user info
d) Doing the request to Cognito each time
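For context, option d) amounts to one boto3 call per request, something like this (the pool ID and username are placeholders):

import boto3

cognito = boto3.client("cognito-idp")

# Placeholder pool ID and username -- substitute your own.
resp = cognito.admin_get_user(
    UserPoolId="us-east-1_EXAMPLE",
    Username="some-user",
)

# UserAttributes comes back as a list of {"Name": ..., "Value": ...} pairs.
attributes = {a["Name"]: a["Value"] for a in resp["UserAttributes"]}
print(attributes.get("email"))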
Although I personally like the idea of cognito, at this time it has some major drawbacks...
You cannot back up / restore a user pool without losing the users' passwords; you have to implement your own backup/restore.
A way around this is to save the user's password in a Cognito custom attribute.
I expected that by using an API Gateway Lambda authorizer I would have all the user data in the Lambda context, but it's not there. Or am I doing something wrong with the API Gateway template mapping 😬
On the plus side, the API Gateway Lambda authorizer result can be cached for up to an hour, during which the authorizer function won't be called again, which seems like a top feature.
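For what it's worth, a Lambda authorizer can pass user data to the backend explicitly through its context map, which API Gateway exposes to mapping templates as $context.authorizer.<key> and caches along with the policy. A rough sketch, where validate_token stands in for your own token check:

# Hypothetical REQUEST authorizer sketch -- validate_token is your own code.
def handler(event, context):
    user = validate_token(event["headers"]["Authorization"])

    return {
        "principalId": user["sub"],
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": "Allow",
                "Resource": event["methodArn"],
            }],
        },
        # Exposed as $context.authorizer.userId etc.; values must be
        # strings, numbers, or booleans.
        "context": {"userId": user["sub"], "email": user["email"]},
    }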
It does not work well with CloudFormation: with every attribute update it recreates the user pool without restoring the users, thus losing them.
I used it only in one implementation and ended up duplicating the users in DynamoDB as well.
I've been avoiding it ever since. I wish they would solve these issues, as it looks like a service that belongs in every project and could save a lot of time.
Reading your post, I asked myself the same questions and am not sure of the answers either 😄
Pricing seems fair.
The default 5 requests/second to get user info seems strange, as it would be consumed by a single page load making multiple AJAX API requests.
For this in DynamoDB, there is no need for another table. If the access patterns dictate you store the information in another object, then so be it, but more than likely it should be in the same table. Sounds like you need two different item types in the same table.
For the tasks, use a PK of userid and an SK of task::your-task-id. This would allow you to get all of a user's tasks easily, or even a specific task if you knew the task ID. You might also add an attribute that is a timestamp and then a GSI with the userID as the PK and the timestamp as the SK. Then you could use the begins_with operator on the SK and "paginate through all of the user's tasks that are in the month of 2019-04", as sketched below.
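As a sketch of that GSI query in boto3 (the table, index, and attribute names are assumptions):

import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource("dynamodb").Table("tasks")  # hypothetical table name

# Assumes a GSI with userId as partition key and createdAt as sort key,
# where createdAt is an ISO-8601 timestamp like "2019-04-23T10:00:00Z".
resp = table.query(
    IndexName="userId-createdAt-index",
    KeyConditionExpression=Key("userId").eq("user-123")
    & Key("createdAt").begins_with("2019-04"),
)
tasks = resp["Items"]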
For the user information, have the userID be the PK, the SK be user_info, and the item's attributes hold the user's information.
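The profile is then just a second item type written to the same table, e.g. (names are illustrative):

import boto3

table = boto3.resource("dynamodb").Table("tasks")  # hypothetical table name

table.put_item(
    Item={
        "userId": "user-123",   # PK, shared with that user's task items
        "sk": "user_info",      # fixed SK marks this as the profile item
        "email": "user@example.com",
        "displayName": "Example User",
    }
)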
The one challenge with this is if you go to extremes and a single user is doing thousands of ops per second, e.g. "all tweets by a very popular celebrity". If you have such a use case, there are ways around that as well, e.g. write sharding. These are just examples for you to play with; without knowing all your access patterns, I cannot model everything you might want to do. I highly recommend you go watch this presentation from re:Invent 2018.