I am very new to AWS and was wondering if it is possible to use AWS Rekognition in a Google Cloud Function that pulls images from Google Cloud Storage. If so, what are the key steps to getting this to work?
Thanks
Many Amazon Rekognition API calls involve interaction with files stored in Amazon S3. These API calls will not work with Google Cloud Storage.
Some API calls allow the image to be passed as bytes directly in the request. These will work from any environment.
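For example, here is a minimal sketch of a Cloud Function that downloads an image from Cloud Storage and passes the bytes directly to Rekognition. The region, label count, and credential handling are assumptions; AWS credentials would need to be supplied to the function (e.g. via environment variables), and images passed as bytes are limited to 5 MB.

```python
import boto3
from google.cloud import storage

def detect_labels(event, context):
    """Cloud Function triggered by a GCS event; sends the image bytes to Rekognition."""
    # Download the image bytes from Google Cloud Storage.
    gcs = storage.Client()
    blob = gcs.bucket(event["bucket"]).blob(event["name"])
    image_bytes = blob.download_as_bytes()

    # Pass the bytes directly to Rekognition -- no S3 involved.
    # AWS credentials are assumed to be available via environment variables.
    rekognition = boto3.client("rekognition", region_name="us-east-1")
    response = rekognition.detect_labels(Image={"Bytes": image_bytes}, MaxLabels=10)

    for label in response["Labels"]:
        print(label["Name"], label["Confidence"])
```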
Related
I'm looking to have all my data under one cloud and don't want to split it between AWS and Google Cloud. Is there a way to use the Gmail and Google Drive interfaces while hosting the entire operation on AWS?
If your goal is to "store data on Google Drive but actually store it in your own S3 account", then no -- Google stores data in its own systems.
Interestingly, there are companies that offer storage services similar to Amazon S3 and use the normal S3 API, so that the services look identical (although they might be missing some of the advanced features offered by S3).
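As a rough illustration of what "look identical" means in practice, an S3-compatible provider can usually be reached with the ordinary AWS SDK simply by overriding the endpoint. The endpoint URL, credentials, and bucket name below are placeholders:

```python
import boto3

# Point the standard S3 client at an S3-compatible provider instead of AWS.
# The endpoint URL and credentials here are placeholders for your provider's values.
s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.example-provider.com",
    aws_access_key_id="PROVIDER_ACCESS_KEY",
    aws_secret_access_key="PROVIDER_SECRET_KEY",
)

# The same API calls work as they would against Amazon S3.
s3.upload_file("report.csv", "my-bucket", "reports/report.csv")
```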
I am working on a multi-cloud (AWS and GCP) pet project built on a serverless architecture.
There are files generated by the business logic within GCP (using Cloud Functions and Pub/Sub) which are stored in Google Cloud Storage. I want to ingest these files dynamically into an AWS S3 bucket from Cloud Storage.
One possible way is using the gsutil tool (Exporting data from Google Cloud Storage to Amazon S3), but this would require a compute instance and running the gsutil commands manually, which I want to avoid.
In answering this I'm reminded a bit of a Rube Goldberg-type setup, but I don't think this is too bad.
From the Google side you would create a Cloud Function that is notified when a new file is created. You would use the Object Finalize event. This function would get the information about the file and then call an AWS Lambda fronted by AWS API Gateway.
The GCP function would pass the bucket and file information to the AWS Lambda. On the AWS side, the Lambda would use your GCP credentials and the GCP client library to download the file and then upload it to S3.
Something like:
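A rough sketch of the GCP side, assuming a hypothetical API Gateway URL and API key for authentication:

```python
# GCP Cloud Function (Python) triggered by the Object Finalize event on a bucket.
# The API Gateway URL and API key are placeholders for your own deployment.
import requests

API_GATEWAY_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/ingest"
API_KEY = "your-api-key"

def notify_aws(event, context):
    """Forward the bucket and object name of the newly finalized file to AWS."""
    payload = {"bucket": event["bucket"], "name": event["name"]}
    resp = requests.post(
        API_GATEWAY_URL,
        json=payload,
        headers={"x-api-key": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
```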
All serverless on both GCP and AWS. Testing isn't bad, as you can keep the two sides separate: make sure that GCP is sending what you want, and make sure that AWS is parsing it and doing the correct thing. There is likely some authentication that needs to happen from the GCP Cloud Function to API Gateway. Additionally, the API Gateway can be eliminated if you're OK with pulling the AWS client libraries into the GCP function. Since you've already got to pull GCP libraries into the AWS Lambda, this shouldn't be much of a problem.
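And a sketch of the AWS side, assuming the GCP service-account key has been made available to the Lambda (for example via GOOGLE_APPLICATION_CREDENTIALS) and using a placeholder destination bucket:

```python
# AWS Lambda behind API Gateway: download the file from GCS and upload it to S3.
import json
import boto3
from google.cloud import storage

DEST_BUCKET = "my-aws-destination-bucket"  # placeholder

def handler(event, context):
    body = json.loads(event["body"])
    gcs_bucket, gcs_name = body["bucket"], body["name"]

    # Download from Google Cloud Storage to the Lambda's temporary storage.
    gcs = storage.Client()
    local_path = f"/tmp/{gcs_name.split('/')[-1]}"
    gcs.bucket(gcs_bucket).blob(gcs_name).download_to_filename(local_path)

    # Upload to S3, keeping the original object name as the key.
    boto3.client("s3").upload_file(local_path, DEST_BUCKET, gcs_name)

    return {"statusCode": 200, "body": json.dumps({"copied": gcs_name})}
```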
We do most of our cloud processing using AWS (and have for some time). However, we now also have some credits on GCP that we would like to use, and we want to explore interoperability between the cloud providers.
In particular, I was wondering if it is possible to use AWS S3 from within GCP. I am not talking about migrating the data, but whether there is some API which will allow AWS S3 to work seamlessly from within GCP. We have a lot of data and databases hosted on AWS S3 and would prefer to keep everything there, as AWS still does the bulk of our compute.
I guess one way would be to transfer the AWS keys to the GCP VM and then use the boto3 library to download content from AWS S3 but I was wondering if GCP, by itself, provides some other tools for this.
From an AWS perspective, an application running on GCP should appear logically as an on-premises computing environment. This means that you should be able to leverage the services of AWS that can be invoked from an on-premises solution. The GCP environment will have Internet connectivity to AWS which should be a pretty decent link.
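For example, a minimal sketch of the boto3 approach you mention, assuming AWS credentials have been configured on the GCP VM and using placeholder bucket and object names:

```python
import boto3

# Assumes AWS credentials are configured on the GCP VM
# (e.g. via ~/.aws/credentials or environment variables).
s3 = boto3.client("s3")

# Download an object from S3 to the local (GCP) filesystem.
s3.download_file("my-aws-bucket", "data/input.csv", "/tmp/input.csv")

# Upload results back to S3.
s3.upload_file("/tmp/results.csv", "my-aws-bucket", "data/results.csv")
```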
In addition, there is a migration service which will move data from S3 into Google Cloud Storage ... but this is distinct from what you were asking.
See also:
Getting started with Amazon S3
Storage Transfer Service Overview
All of Amazon's documentation on the Rekognition Video API uses examples of videos stored in an S3 bucket. Has anyone out there tried using the API without storing the videos in S3, i.e. on a local machine or GCS?
All video-related Amazon Rekognition API calls (e.g. start_face_detection() and start_face_search()) require the input video to be provided from Amazon S3.
Calls related to still images can alternatively accept the image as bytes passed directly in the call.
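For illustration, a minimal sketch (placeholder bucket, key, and file names) showing that the video calls only accept an S3 object reference, while the still-image calls accept raw bytes:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Video operations only accept an S3 object reference -- there is no Bytes option.
job = rekognition.start_face_detection(
    Video={"S3Object": {"Bucket": "my-s3-bucket", "Name": "videos/clip.mp4"}}
)
print(job["JobId"])

# Still-image operations can take the image content directly as bytes.
with open("photo.jpg", "rb") as f:
    faces = rekognition.detect_faces(Image={"Bytes": f.read()})
print(len(faces["FaceDetails"]), "faces found")
```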
I am trying to find a way to archive application logs of an app deployed on Cloud Foundry to external cloud storage like Amazon S3.
Is there a service within Cloud Foundry that caters for this requirement? If not, is there any third-party utility providing the same?
Thanks a lot in advance for the help.
Thanks,
Kinjal
At present there is no such thing, as far as I know. However, via the Cloud Foundry API you have full access to all files deployed as part of an application, including the logs. This means building an application that transfers those files to S3 at regular intervals would be fairly trivial.
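A rough sketch of that idea; note that this uses the cf CLI's `cf logs --recent` command rather than the raw Cloud Foundry API mentioned above, and the app name and bucket are placeholders:

```python
import subprocess
import time
import boto3

APP_NAME = "my-cf-app"            # placeholder
BUCKET = "my-log-archive-bucket"  # placeholder

s3 = boto3.client("s3")

while True:
    # Pull the recent logs via the cf CLI (a simple alternative to the raw API).
    logs = subprocess.run(
        ["cf", "logs", APP_NAME, "--recent"],
        capture_output=True, text=True, check=True,
    ).stdout

    # Archive them to S3 under a timestamped key.
    key = f"{APP_NAME}/logs-{int(time.time())}.txt"
    s3.put_object(Bucket=BUCKET, Key=key, Body=logs.encode("utf-8"))

    time.sleep(3600)  # run once an hour
```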