GSUITE users.delete and transferring data - google-admin-sdk

When I delete a user using the admin console, I have the option to transfer the user's data before deleting.
Using the Users.Delete API (in a C# WinForms application) I don't have this option.
Is there perhaps a way to perform a data transfer using APIs (before deleting the account)?
Thanks for the help.

There is a specific API for this, called the Data Transfer API.
It features the method Transfers:insert.
Use it with the parameters of the Transfers resource; the important ones are newOwnerUserId and oldOwnerUserId.
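For illustration, here is a minimal sketch of that call against the REST endpoint, written in Java (the same request shape applies from C#). It assumes Java 11+, a pre-obtained OAuth 2.0 access token with the Data Transfer admin scope, placeholder user IDs, and that you have first looked up Drive's applicationId via the API's applications:list method:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DriveDataTransfer {
    public static void main(String[] args) throws Exception {
        String accessToken = "ACCESS_TOKEN"; // placeholder: obtain via your OAuth flow

        // oldOwnerUserId / newOwnerUserId are the users' immutable IDs, not emails.
        // The applicationId below is a placeholder: fetch Drive's real ID
        // from the Data Transfer API's applications:list method.
        String body = "{"
            + "\"oldOwnerUserId\": \"OLD_USER_ID\","
            + "\"newOwnerUserId\": \"NEW_USER_ID\","
            + "\"applicationDataTransfers\": [{\"applicationId\": 1234567890}]"
            + "}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://admin.googleapis.com/admin/datatransfer/v1/transfers"))
            .header("Authorization", "Bearer " + accessToken)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        // The response contains the transfer's id; poll transfers.get for its status.
        System.out.println(response.body());
    }
}
```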

Related

Is pubsub suitable to be used by client desktop applications?

For a client desktop application, I'm trying to find a reliable way to notify clients of new data that needs to be queried from the server. Would Pub/Sub be a good fit for this? Most of the documentation I see for it seems to be focused on server-to-server communication, and it's a bit ambiguous whether this would work well for server-to-client notifications.
If it should work, would I be able to properly authenticate subscribers to limit the topics they could subscribe to? This application would be potentially downloadable by anyone, and I would need to ensure that information intended for one client couldn't end up in the hands of another client.
Cloud Pub/Sub is not going to be a good choice for this use case. First of all, note that each topic and project is limited to 10,000 subscriptions. Therefore, if you intend to have more than that, you will run out of subscriptions. Secondly, note that a subscription only receives messages published after the subscription is created. If you only need messages to be delivered that were published after the user came to the website, this may be okay. However, with these two issues combined, you'll need to consider lifetime of your subscriptions. Do they get deleted when a user logs out? If not, when a user comes back, do you expect them to get all of the messages published since the last time they visited?
Additionally, as discussed in the comments, there is the issue of authentication. Your client-side app would have to have the credentials to subscribe. This would require you to essentially leak those credentials into your client-side code, which could be a vulnerability in your application.
The service designed to deliver notifications of this nature is Firebase Cloud Messaging.
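For illustration, a server-side send through FCM could look like the following sketch, assuming the firebase-admin Java SDK and application-default credentials (the device token is a placeholder that each client obtains when it registers with FCM and reports back to your server):

```java
import com.google.auth.oauth2.GoogleCredentials;
import com.google.firebase.FirebaseApp;
import com.google.firebase.FirebaseOptions;
import com.google.firebase.messaging.FirebaseMessaging;
import com.google.firebase.messaging.Message;

public class PushNotifier {
    public static void main(String[] args) throws Exception {
        // Initialize the Admin SDK with application-default credentials.
        FirebaseApp.initializeApp(FirebaseOptions.builder()
            .setCredentials(GoogleCredentials.getApplicationDefault())
            .build());

        // Placeholder: each client registers with FCM and sends its token
        // to your server, which targets exactly that client here.
        String deviceToken = "DEVICE_REGISTRATION_TOKEN";

        Message message = Message.builder()
            .setToken(deviceToken)
            .putData("event", "new-data") // client queries your server on receipt
            .build();

        String messageId = FirebaseMessaging.getInstance().send(message);
        System.out.println("Sent: " + messageId);
    }
}
```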
If you want to open the application to anyone on the internet, you can't rely on the IAM service, which only works with Google identities: you can't require every user to have a Google account, as the user experience would be bad.
Thus you can't use IAM to secure Pub/Sub access, and therefore you can't use Pub/Sub, because anyone could access it.
In your use case, the first step is to ask the user to register (create an account, validate their email, maybe add a payment method, ...). Then you have an identity, but one managed by you, not by IAM. You know which messages are for this user and which aren't.
If you want to be notified "in real time", I propose using long polling or streaming to push data to the user. Cloud Run is now capable of this and I recommend you have a look at it.
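To make the streaming idea concrete, here is a minimal server-sent-events endpoint using only the JDK's built-in HTTP server. This is a sketch: a real service on Cloud Run would authenticate the caller and emit only that user's events instead of a counter:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class SseServer {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        server.createContext("/events", exchange -> {
            // SSE: keep the connection open and stream "data:" frames.
            exchange.getResponseHeaders().add("Content-Type", "text/event-stream");
            exchange.sendResponseHeaders(200, 0); // length 0 = streamed body
            try (OutputStream out = exchange.getResponseBody()) {
                for (int i = 0; i < 5; i++) {
                    // Placeholder payload; a real app would push per-user events.
                    out.write(("data: update " + i + "\n\n").getBytes(StandardCharsets.UTF_8));
                    out.flush();
                    Thread.sleep(1000);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        server.start();
        System.out.println("Streaming on http://localhost:8080/events");
    }
}
```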

Transfer file from AWS S3 to OneDrive with AWS Lambda

A client of ours requested that we have copies of their files on both AWS S3 and OneDrive.
The usual MO: a file is sent from an iOS application to an AWS S3 bucket. This triggers an AWS Lambda function, which attaches the file to an email and sends a copy to the client, who then stores it on OneDrive. Now we want to skip the email part and transfer the file directly to OneDrive.
All my research so far points to Zapier, CloudRail, or the MS Graph REST API. The problem I'm having is that we want to transfer the file with an AWS Lambda function (Java 8), automagically. Almost all the tutorials and examples for MS Graph need a client to log in manually; it's mostly client-side logic. The other methods have more overhead, and we don't want to (unnecessarily) make our stack more complicated than it already is.
I realize this is a very specific case. We are systematically replacing the client's file management system, without disrupting their day-to-day operations too much.
Any conclusive pointers/examples/tutorials to get this done server side would be greatly appreciated.
I'm not sure how well S3 aligns with OneDrive; they are quite different models. OneDrive is provisioned per user, which begs the question: which user's OneDrive would you copy the file to? I would think Azure Storage would be a far better fit, as it uses a similar model to S3.
You can use the Microsoft Graph API to upload the file to a user's OneDrive. You would need to authenticate the user once in order to obtain an access token and a refresh token. Once this is done, you can store the refresh token and retrieve an updated access token as needed.
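A sketch of what that could look like inside the Lambda, assuming an app registered in Azure AD with delegated Files.ReadWrite permission, a refresh token you have already stored, and a Java 11+ runtime (on Java 8 you'd swap in an HTTP client library). All IDs and secrets are placeholders, and this simple upload endpoint only covers files under 4 MB; larger files need an upload session:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class OneDriveUploader {
    private static final HttpClient HTTP = HttpClient.newHttpClient();

    // Exchange the stored refresh token for a fresh access token.
    static String refreshAccessToken(String tenant, String clientId,
                                     String clientSecret, String refreshToken) throws Exception {
        String form = "grant_type=refresh_token"
            + "&client_id=" + URLEncoder.encode(clientId, StandardCharsets.UTF_8)
            + "&client_secret=" + URLEncoder.encode(clientSecret, StandardCharsets.UTF_8)
            + "&refresh_token=" + URLEncoder.encode(refreshToken, StandardCharsets.UTF_8)
            + "&scope=" + URLEncoder.encode("https://graph.microsoft.com/.default", StandardCharsets.UTF_8);
        HttpRequest req = HttpRequest.newBuilder()
            .uri(URI.create("https://login.microsoftonline.com/" + tenant + "/oauth2/v2.0/token"))
            .header("Content-Type", "application/x-www-form-urlencoded")
            .POST(HttpRequest.BodyPublishers.ofString(form))
            .build();
        String json = HTTP.send(req, HttpResponse.BodyHandlers.ofString()).body();
        // Use a real JSON parser in production; this keeps the sketch dependency-free.
        return json.replaceAll(".*\"access_token\"\\s*:\\s*\"([^\"]+)\".*", "$1");
    }

    // Simple upload (< 4 MB) into the signed-in user's OneDrive root.
    static void upload(String accessToken, String fileName, byte[] content) throws Exception {
        HttpRequest req = HttpRequest.newBuilder()
            .uri(URI.create("https://graph.microsoft.com/v1.0/me/drive/root:/" + fileName + ":/content"))
            .header("Authorization", "Bearer " + accessToken)
            .PUT(HttpRequest.BodyPublishers.ofByteArray(content))
            .build();
        System.out.println(HTTP.send(req, HttpResponse.BodyHandlers.ofString()).statusCode());
    }
}
```

Your S3-triggered Lambda handler would fetch the object's bytes with the AWS SDK and pass them to upload.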
Also with CloudRail it's necessary to authenticate the user, but there are methods to store and use an access token.
The services have two methods, loadAsString and saveAsString, which are used to store and load credentials. You could call loadAsString with your access token; the string can be different from service to service, but will look something like this: [{"access_token": "YOUR ACCESS TOKEN"}]
To add to this, Microsoft now has a cloud migration tool, www.mover.io, that allows you to sync files and folders from most clouds into Azure Blob Storage, SharePoint, or OneDrive directly, without downloading/uploading to a client machine.
Personally used it only for a one-time sync, but leaving it here for posterity.
The client only has to log in once, so if you already have the client ID and secret keys, you can do the manual flow once, then save the generated token file together with your code files in AWS. The next time the code runs, it uses the refresh token. Last time I did this I was able to set the refresh token to never expire, but I think Microsoft has since removed that option and now the token can only last something like 2 or 3 years max.

Google Apps - Data Transfer API - transfer only some resources

I'm trying to use the new Data Transfer API for Google Apps domains, and I would like to transfer some specific Google Drive files from one user to another. It seems this API can only transfer a full service (e.g., all files from Google Drive), not specific files.
Is my understanding of this API correct, or is it possible to limit the transfer to specific resources?
Thank you.
You're correct. The API enables you to transfer ownership of application data (currently Drive documents and Google+ pages) in bulk. It essentially allows you to automate the manual ownership transfer task documented here. You might want to read this blog here which has some useful background information.
The only way to achieve what you want is to use the Drive API (not to be confused with the Drive SDK).
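For example, with the Drive v3 REST API you can transfer ownership of one specific file by creating an owner permission for the new user. A sketch assuming Java 11+ and a pre-obtained OAuth access token with a Drive scope; the file ID and email address are placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TransferOneFile {
    public static void main(String[] args) throws Exception {
        String accessToken = "ACCESS_TOKEN"; // placeholder OAuth token
        String fileId = "FILE_ID";           // the specific file to hand over

        // Granting role=owner with transferOwnership=true moves ownership of
        // just this file, unlike the Data Transfer API's whole-service move.
        String body = "{\"type\": \"user\", \"role\": \"owner\","
            + " \"emailAddress\": \"new.owner@example.com\"}";

        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://www.googleapis.com/drive/v3/files/" + fileId
                + "/permissions?transferOwnership=true"))
            .header("Authorization", "Bearer " + accessToken)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(body))
            .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
            .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```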

What is the "proper" way to use DynamoDB for an iOS app?

I've just started messing around with AWS DynamoDB in my iOS app and I have a few questions.
Currently, I have my app communicating directly with my DynamoDB database. I've been reading around lately, and people are saying this isn't the proper way to go about getting data from my database.
By this I mean I just have a function in my code that queries my DynamoDB database and returns the result.
The way I do it works, but is there a better way I should be going about this?
Amazon DynamoDB itself is a highly scalable service, and standing up another server in front of it means you also have to scale that server in line with the RCU/WCU configured for your tables, which you can and should avoid.
If your mobile application doesn't need a backend server and you can perform all the business functions from the mobile device, then you should probably think about:
* Using the AWS DynamoDB SDK for iOS to write the client application that runs on the mobile device.
* Using the AWS Token Vending Machine to authenticate your mobile users and grant them credentials for operations on DynamoDB tables.
* Controlling access (i.e., which operations are allowed on which tables, etc.) using IAM policies; see the sketch after this list.
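A fine-grained policy of that kind might look like the following. This is a sketch only: it assumes Amazon Cognito identities (the successor to the Token Vending Machine approach), a hypothetical UserData table whose partition key is the caller's identity ID, and placeholder region/account values, so each user can only read and write their own items:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/UserData",
      "Condition": {
        "ForAllValues:StringEquals": {
          "dynamodb:LeadingKeys": ["${cognito-identity.amazonaws.com:sub}"]
        }
      }
    }
  ]
}
```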
HTH.
From what you say, I gather that you are talking about a way to distribute data to many clients (iOS apps).
There are a few integration patterns (a very good book on this: Enterprise Integration Patterns), one of which is called shared database. It is essentially about using a common database through which multiple clients share data. The main drawback of that pattern (in your case) is that you are making assumptions about what the database schema looks like. That can bring you headaches supporting the schema in the future if your business logic changes.
The more advanced approach would be sending events on every change in your data instead of writing changes directly to the database from client apps. This way you can add additional processing to the events before the data they carry is written to the database. For example, you may want to change the event format in a new version of your app but still support legacy users, so you add a translation procedure that transforms both types of events into the format that fits the database schema. It's basically a question of whether to work with diffs vs. snapshots.
You should be aware of added complexity of working with events, and it can be an overkill if your app is simple and changes in schema are unlikely.
Also consider that you can do data preprocessing using DynamoDB Streams, which gives you some of the advantages of using events while keeping the implementation simple.
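For instance, a stream-triggered Lambda that post-processes every write could look roughly like this (a sketch using the aws-lambda-java-events types; the processing step is a placeholder):

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent;
import com.amazonaws.services.lambda.runtime.events.DynamodbEvent.DynamodbStreamRecord;

public class StreamProcessor implements RequestHandler<DynamodbEvent, Void> {
    @Override
    public Void handleRequest(DynamodbEvent event, Context context) {
        for (DynamodbStreamRecord record : event.getRecords()) {
            // eventName is INSERT, MODIFY or REMOVE.
            String eventName = record.getEventName();
            // NewImage holds the item as written by the client app;
            // validate/translate it here before it fans out further.
            context.getLogger().log(eventName + ": "
                + record.getDynamodb().getNewImage());
        }
        return null; // returning acknowledges the batch
    }
}
```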

Use cases for web application API?

Nowadays a lot of web applications provide APIs for other applications to use.
I am new to using APIs, so I want to understand the use cases for them.
Let's take Basecamp as an example.
What are the use cases for using their API in my web application?
For inserting the current data in my web application into a newly created Basecamp account, instead of inserting everything manually, which could take days or weeks if the data is huge?
For updating my application's data when the user changes something in Basecamp? If so, how do I know when a user adds/edits/removes a contact in Basecamp, for example? Do I make a request and check every minute from the backend?
For making backup of the Basecamp data so I can move it to other applications if necessary?
Are all the above examples good use cases for the usage of API?
Are there more use cases?
I want to have a clear picture of why it's good to use another web service's API and how I can leverage that in my application.
Thanks.
I've found the biggest reason to use and provide web services is to be able to programmatically drive the application with another process. This allows the coupling of different actions in different applications driven by one event/process/trigger.
For example, I could use a web service provided by Basecamp, my bug-tracking database, and the continuous integration server. I could tie all those things together and kick them off from a commit-hook script.
I can have a monitor in production automatically open a ticket in our ticket tracker. This could trigger an autoremediation process from the ticket tracker which logs into the box remotely and restarts the service.
The other major reason I've seen to use and provide web services is to reduce double entry. If you do change management in your production environment, that usually means you create change tickets. The changes that occur may also need to be reflected in the Change Management Database, which is usually a model of how production is supposed to look. Most of these systems don't automatically update your configuration items with the data from the change. Using web services you can stitch them together to eliminate the double (manual) entry that would normally occur.
APIs are used any time you want to get data to/from an application without using the default interface.
* I'd bet there's a mobile app that uses the Basecamp API.
* You could use the API to pull information from Basecamp into another application (like project management software or an individual's todo webpage).
* The geekiest of us may prefer to update Basecamp from a script/command line rather than interrupting our workflow to open a web page and click around.