I need to synchronize the users of a domain with a third party database.
Using the method users.list I can query for the complete list of users.
With the complete list I can identify created users and deleted users, but I don't see any way to identify updated users.
Is there a way to identify updated users?
Use Users.watch. It lets you know of any changes made to users. When changes are made you can update your database.
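For reference, a minimal sketch of registering a watch channel, assuming the Admin SDK Directory API via google-api-python-client, a service account with domain-wide delegation, and placeholder values for the key file, admin account, domain and webhook address:

import uuid

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/admin.directory.user"]

# Placeholder credentials: a service account key delegated to a domain admin.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
).with_subject("admin@example.com")

service = build("admin", "directory_v1", credentials=creds)

# Subscribe to user-update events; notifications are POSTed to your HTTPS endpoint.
channel = service.users().watch(
    domain="example.com",
    event="update",  # other events: add, delete, makeAdmin, undelete
    body={
        "id": str(uuid.uuid4()),          # a channel ID you choose
        "type": "web_hook",
        "address": "https://example.com/directory-notifications",
    },
).execute()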
I am trying to make my first web app with AWS Lambda, Cognito, Amplify, API Gateway, and DynamoDB. I followed the tutorial AWS made, and it went well. Now I need advice on how to store user information for my custom app (which will be written in Python). I need to store a large amount of information for each user (the app is very similar to a todo list app), and I was planning on making a DynamoDB table for each user, with the table name being the user's username. The problem with this is that I have read that it is not good practice for NoSQL databases to have a lot of tables, and I don't know how to get the user's unique ID from Cognito. Is there a correct way to do this? I want to ensure that each user can only access their own information.
I have a list of domains for all my Google Apps customers. I want to get only the users who have super admin privileges for all those domains. I can fetch all users for each domain, iterate over them, and filter out only the super admins, but that would be a lot of API calls.
There has to be another way....
Thanks.
Getting both the list of all domains and the list of users under each domain is not possible in one API call. You have to make a domains list call and then a users list call for each domain, as you mentioned you have been doing. However, you can avoid retrieving all users and get only the admin users instead: the users.list call has a query parameter where you can specify isAdmin=true to return only super admins.
Google also provides batch requests to combine multiple API calls into one request, which you can readily use for this purpose.
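A rough sketch of both ideas together, assuming google-api-python-client, an already authorized Directory API client in `service`, and a placeholder list of domains:

admins_by_domain = {}

def collect(request_id, response, exception):
    # request_id is the domain passed to batch.add() below.
    if exception is None:
        admins_by_domain[request_id] = response.get("users", [])

# One batched HTTP request instead of one round trip per domain.
batch = service.new_batch_http_request(callback=collect)
for domain in ["example.com", "example.org"]:   # placeholder domain list
    batch.add(
        service.users().list(domain=domain, query="isAdmin=true"),
        request_id=domain,
    )
batch.execute()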
I am building a web app where different companies will upload their own audio files along with some additional information. I am building it with Django and Postgres and hosting it on AWS. Users belonging to different companies will only be able to access their own data when they log into the website.
The website allows those users to upload content, search content and access content.
My question is: what is the best practice for handling that uploaded content? Is it better to create a different schema for each company, or to put all the content together and allow users to access different content based on the company ID associated with each entry?
Put all the content together and allow users to access different content based on the company ID associated with each entry?
Personally, I would do this, for several reasons:
It's easier to maintain. Adding new companies probably just means a new ID, rather than a new schema and some tables.
You can add security with application code or with database views.
You can have other company-specific functionality that uses the same design.
I would also suggest enforcing data security on the database side, by only allowing the application to query certain views that are limited by company ID. That way you can't accidentally SELECT from a base table, forget the company filter, and show a user data that isn't theirs.
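As a rough illustration of the single-schema-plus-company-ID approach in Django (the model, field and helper names here are made up, not from the question):

from django.conf import settings
from django.db import models

class Company(models.Model):
    name = models.CharField(max_length=255)

class AudioFile(models.Model):
    # Every row carries the company it belongs to.
    company = models.ForeignKey(Company, on_delete=models.CASCADE)
    uploaded_by = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    file = models.FileField(upload_to="audio/")
    title = models.CharField(max_length=255)

def audio_for(user):
    # Central helper: views call this instead of AudioFile.objects directly,
    # so the company filter cannot be forgotten. Assumes a profile object
    # linking each user to a Company.
    return AudioFile.objects.filter(company=user.profile.company)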
This is just my opinion - happy to be proven otherwise.
I manage a domain of users and would like to be able to transfer all the documents of one user to another user. As far as I understand, the best way to achieve that is to find the file IDs of all files belonging to one user and transfer them to the other user. However, I am having trouble constructing the query.
UPDATE:
So the correct query to retrieve the list of files would be:
response = drive_service.files().list(q="'user@company.com' in owners").execute()
However, it only works for me as an admin. If I try to retrieve the list of files for any other user in my domain it returns an empty list.
Files.list retrieves the authenticated user's files, so in this case it gets your own files. That query only returns results if the other user happens to own one (or more) of the files you have access to.
Even as an admin you cannot access users' files directly.
To access another user's files, as an admin you need to impersonate that user and then perform actions on their behalf.
This is achieved by using a service account with domain wide delegation of authority.
Here you can find more information on that as well as a Python example.
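A minimal sketch of the impersonation step, assuming a service account key file with domain-wide delegation, the Drive scope, and a placeholder user address:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive"]

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
# Impersonate the domain user whose files you want to list.
delegated = creds.with_subject("user@company.com")
drive_service = build("drive", "v3", credentials=delegated)

response = drive_service.files().list(
    q="'user@company.com' in owners"
).execute()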
Hope it helps.
If you want to transfer all the files of one user into another user's Drive, the easiest way would be to use the Data Transfer API provided by Google. That way you don't have to list the files and transfer them one by one. You also only need an admin access token, and you wouldn't need domain-wide delegation either. You can find the official documentation here.
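A rough sketch of that approach, assuming admin credentials in `creds` with the Data Transfer scope; the owner IDs are placeholders (they are Directory API user IDs, not email addresses):

from googleapiclient.discovery import build

transfer_service = build("admin", "datatransfer_v1", credentials=creds)

# Look up the application ID for Drive among the transferable applications.
apps = transfer_service.applications().list(
    customerId="my_customer"
).execute().get("applications", [])
drive_app = next(app for app in apps if "Drive" in app["name"])

transfer_service.transfers().insert(body={
    "oldOwnerUserId": "1234567890",   # placeholder: ID of the user transferring files
    "newOwnerUserId": "0987654321",   # placeholder: ID of the user receiving them
    "applicationDataTransfers": [{
        "applicationId": drive_app["id"],
        "applicationTransferParams": [
            {"key": "PRIVACY_LEVEL", "value": ["PRIVATE", "SHARED"]}
        ],
    }],
}).execute()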
Can multiple users get files from a database using only one login?
Let me explain: suppose I have one database with only one login ID and password.
I want to use this database for multiple users across the globe. Each user asks for different files at the same time, and all requests go through this single login ID and password; the users themselves never see the database credentials.
I want to create a new layer between the database and the users to serve these files.
Is this possible, or rather feasible, and what are the pros and cons?
Normally, database or data source access is encapsulated in a Data Access Object (DAO) that resides in a Data Access Layer. In Java this can be done using the JDBC API or an object-relational mapping framework such as Hibernate or iBatis. Since this question is tagged C++, ODB (an ORM for C++) is an option.
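As an illustration of the pattern only, sketched in Python to match the earlier snippet in this thread, with made-up table and column names; with a client/server database such as PostgreSQL, the DAO is where the single shared login/password would live:

import sqlite3

class FileDAO:
    """The only component that knows the database connection details."""

    def __init__(self, db_path="files.db"):
        # With a server database this is where the single shared
        # username/password would be used; end users never receive it.
        self._conn = sqlite3.connect(db_path)

    def get_file(self, username, filename):
        # Serve only files that belong to the requesting user.
        cur = self._conn.execute(
            "SELECT content FROM files WHERE owner = ? AND name = ?",
            (username, filename),
        )
        row = cur.fetchone()
        return row[0] if row else None

dao = FileDAO()
data = dao.get_file("alice", "report.pdf")   # placeholder user and file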