I'm looking to have all my data under one cloud and don't want to split it between AWS and Google Cloud. Is there a way to use the Gmail and the Google Drive interface while hosting the entire operation on AWS?
If your goal is to "store data on Google Drive but actually store it in your own S3 account", then no -- Google stores data in its own systems.
Interestingly, there are companies that offer storage services similar to Amazon S3 and use the normal S3 API, so that the services look identical (although they might be missing some of the advanced features offered by S3).
Related
They both pretty much offer the same service and serve the same purpose; I don't see any reason why one would use WorkDocs over S3.
Amazon WorkDocs is an online document editor and file sharing system. AWS describes it as a "fully managed, secure enterprise storage and sharing service with strong administrative controls and feedback capabilities that improve user productivity."
Amazon Simple Storage Service (S3) is a data storage service. AWS describes it as "storage for the internet".
To give an example:
Amazon WorkDocs is like Google Docs
Amazon S3 is like Google Drive (sort of)
People would use WorkDocs if they wish to collaborate on documents with other people (eg review documents, add comments, share documents in a team, mount a drive of shared documents). Amazon S3 does not provide those capabilities.
I am developing an application with Django REST Framework, and one of the features is to let users store ID cards and driver's licenses. I am thinking of using Amazon S3 to store the files.
Is that secure enough for that functionality? What is usually used for that type of files?
Amazon Simple Storage Service (S3)
It allows you to store a virtually unlimited amount of data that can be accessed programmatically via different methods like the REST API, SOAP, the web interface, and more. It is an ideal storage option for videos, images, and application data.
Features:
Fully managed
Store in buckets
Versioning
Access control lists and bucket policies
AES-256 bit encryption at rest
Private by default
Best used for:
Hosting entire static websites
Static web content and media
Storing data for computation and large-scale analytics, like analyzing financial transactions, clickstream analytics, and media transcoding
Disaster recovery solutions for business continuity
Secure solution for backup & archival of sensitive data
Use encryption to protect your data:
If your use case requires encryption during transmission, Amazon S3 supports the HTTPS protocol, which encrypts data in transit to and from Amazon S3. All AWS SDKs and AWS tools use HTTPS by default
Restrict access to your S3 resources:
By default, all S3 buckets are private and can be accessed only by users that are explicitly granted access. When using AWS, it's a best practice to restrict access to your resources to only the people who absolutely need it; see the documentation for details.
I would go with Amazon S3 for a use case where I want to store this kind of information.
Set the default server-side encryption behavior for the Amazon S3 bucket. Depending on the type of setup and the amount of money I am willing to spend, I would choose a customer managed KMS key for encrypting the bucket.
I would also go through all the security checks AWS recommends in "How can I secure the files in my Amazon S3 bucket?".
Enable replication, versioning, logging, and perhaps IP-based access restrictions for good measure.
S3 provides all kinds of bells and whistles for security in this case.
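To make the "default encryption with a customer managed key" step concrete, here is the configuration payload that `put_bucket_encryption` expects. The bucket name and KMS key ARN are placeholders:

```python
import json

# The ServerSideEncryptionConfiguration payload for
# s3.put_bucket_encryption, using a customer managed KMS key.
# The key ARN below is a placeholder, not a real key.
encryption_config = {
    "Rules": [
        {
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "arn:aws:kms:us-east-1:111122223333:key/placeholder",
            },
            "BucketKeyEnabled": True,
        }
    ]
}

# With a boto3 client this would be applied as:
#   s3.put_bucket_encryption(
#       Bucket="my-id-documents",
#       ServerSideEncryptionConfiguration=encryption_config,
#   )
print(json.dumps(encryption_config, indent=2))
```

Once set, every object written to the bucket without an explicit encryption header is encrypted with that key automatically.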
What is the equivalent Google Cloud Platform product to AWS Data Exchange?
https://aws.amazon.com/data-exchange/
I've looked in the official Google Cloud Platform documentation that compares AWS and GCP products and they don't mention the AWS Data Exchange product at all.
So, most likely there isn't an equivalent GCP product to AWS Data Exchange.
That said, BigQuery supports third party data transfer, so maybe this is closest to what you're looking for:
Third party transfers for BigQuery Data Transfer Service allow you to automatically schedule and manage recurring load jobs for external data sources such as Salesforce CRM, Adobe Analytics, and Facebook Ads.
We were doing most of our cloud processing (and still do) using AWS. However, we also now have some credits on GCP and would like to use and want to explore interoperability between the cloud providers.
In particular, I was wondering if it is possible to use AWS S3 from within GCP. I am not talking about migrating the data but whether there is some API which will allow AWS S3 to work seamlessly from within GCP. We have a lot of data and databases that are hosted on AWS S3 and would prefer to keep everything there as it still does the bulk of our compute.
I guess one way would be to transfer the AWS keys to the GCP VM and then use the boto3 library to download content from AWS S3 but I was wondering if GCP, by itself, provides some other tools for this.
From an AWS perspective, an application running on GCP should appear logically as an on-premises computing environment. This means that you should be able to leverage the services of AWS that can be invoked from an on-premises solution. The GCP environment will have Internet connectivity to AWS which should be a pretty decent link.
In addition, there is a migration service (Storage Transfer Service) that will move S3 data to Google Cloud Storage (GCS), but this is distinct from what you were asking.
See also:
Getting started with Amazon S3
Storage Transfer Service Overview
I am trying to find a way to archive the application logs of an app deployed on Cloud Foundry to external cloud storage like Amazon S3.
Is there a service within Cloud Foundry that caters for such a requirement? If not, is there any third-party utility providing the same?
Thanks a lot for the help in advance.
Kinjal
At present there is no such thing, as far as I know. However, via the Cloud Foundry API you have full access to all files deployed as part of an application, including the logs. This means building an application that could transfer all those files to S3 at a regular interval would be fairly trivial.