Trigger file download to EC2 Windows instance after edit in S3

Attempting to architect a solution that's as automated as possible: after a script is edited in S3, download it to a specific drive and directory on a Windows EC2 instance.
As I see it at the moment, I believe I can use S3 Event Notifications to trigger whenever a file has been edited in the bucket itself. What I'm struggling with is how to download it automatically from there. I know SNS would be an option to subscribe to the S3 notifications, and this instance would have a public IP address.
I've automated tasks exporting TO S3 using batch files in the Windows Task Scheduler with:
aws s3 cp or aws s3 sync
But I want to do it the other way around and don't quite see another question with a similar ask.
What's the missing cog in the machine here? Thanks in advance for your help!

I would do it this way:
Use the S3 Event Notification with SNS as you mentioned.
Create a web service running on the Windows EC2 instance that you can register to receive the S3 notifications (e.g. using Python, or whatever you are comfortable with).
On receipt of the notification, have the web service use the AWS SDK (which you should also install on the Windows EC2 instance) to GET the S3 object and store it in the directory of your choice.
Look at this AWS documentation article for how to register the web service for SNS notifications.
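As a rough illustration, here is a minimal sketch of such a web service using Python's standard library plus boto3. It assumes the endpoint has been registered as an HTTP subscriber on the SNS topic; TARGET_DIR and the port are placeholders:

import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

import boto3

TARGET_DIR = r"D:\scripts"  # hypothetical drive & directory
s3 = boto3.client("s3")

class SnsHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        msg = json.loads(body)
        if msg.get("Type") == "SubscriptionConfirmation":
            # Visiting the SubscribeURL confirms the SNS subscription
            urllib.request.urlopen(msg["SubscribeURL"])
        elif msg.get("Type") == "Notification":
            # The Message field carries the S3 event as a JSON string
            for record in json.loads(msg["Message"]).get("Records", []):
                bucket = record["s3"]["bucket"]["name"]
                key = record["s3"]["object"]["key"]
                filename = key.split("/")[-1]
                s3.download_file(bucket, key, TARGET_DIR + "\\" + filename)
        self.send_response(200)
        self.end_headers()

HTTPServer(("", 8080), SnsHandler).serve_forever()

SNS first POSTs a SubscriptionConfirmation message; fetching its SubscribeURL completes the registration, after which Notification messages carry the S3 event payload.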

Related

Accessing Amazon S3 via FTP?

I have done a number of searches and can't seem to understand if this is doable at all.
I have a data logger that has an FTP-push function. The FTP-push function has the following settings:
FTP server
Port
Upload directory
User name
Password
In general, I understand that a FileZilla client (I have a Pro edition) is able to drop files into my AWS S3 bucket, and I have done this successfully on my local PC.
Is it possible to remove the Filezilla client requirement and input my S3 information directly into my data logger? Something like the below diagram:
Data logger ----FTP----> S3 bucket
If not, what will be the most sensible method to have my data logger JSON files drop into AWS S3 via FTP?
Frankly, you'd be better off with:
Logging to local files
Using a schedule to copy the log files to Amazon S3 using the aws s3 sync command
The schedule could be triggered by cron (Linux) or a Scheduled Task (Windows).
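For example, a crontab entry along these lines would sync new log files every five minutes (the local path and bucket name are placeholders):

*/5 * * * * aws s3 sync /var/log/datalogger s3://my-log-bucket/datalogger/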
Amazon recently added FTP support to AWS Transfer. This will provide an integration with Amazon S3 via FTP without setting up any additional infrastructure; however, you should review the pricing first.
As an alternative, you could create an intermediary server that syncs between itself and Amazon S3 using the AWS CLI's aws s3 sync command.

Move data from S3 bucket to external vendor SFTP

I have a requirement to send files from an S3 bucket to an external client. FTP or SFTP can be used for this. Based on some research I found this can be done using Lambda or EC2, but couldn't find detailed steps for it. Please let me know how this can be done.
Amazon S3 cannot "send" files anywhere.
Therefore, you will need some code running 'somewhere' that will:
Download the file(s) from Amazon S3
Send the file(s) to the external client via SFTP
This is all easily scriptable. The difficulty probably comes in deciding which files to send and how to handle any errors.
You probably couldn't find any documentation on the topic because sending files via SFTP has nothing specifically related to AWS. Just do it the way you would from anywhere.
For example, let's say you wanted to do it via a Python program running either on an Amazon EC2 instance or as an AWS Lambda function:
Download the desired files by using the AWS SDK for Python (boto3). See: Amazon S3 examples
Send the files via SFTP. See: Transfer file from AWS S3 to SFTP using Boto 3
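A minimal sketch of those two steps, assuming paramiko is installed; the host, credentials, bucket, and paths are all placeholders:

import boto3
import paramiko

s3 = boto3.client("s3")

# Open the SFTP connection (host, port, and credentials are placeholders)
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)

# Stream the object from S3 straight into the remote file
obj = s3.get_object(Bucket="my-bucket", Key="outgoing/report.csv")
with sftp.open("/upload/report.csv", "wb") as remote:
    remote.write(obj["Body"].read())

sftp.close()
transport.close()

Streaming the object body straight to the remote file avoids staging a local copy, which also suits Lambda's limited /tmp space.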
Came across a similar requirement, and this can be done very easily with a Lambda function.
The functional requirement for our use case was automated transfer of the files as soon as they are ready to send back to the customer.
Architecture
We came up with this simplistic architecture for the basic use case.
Workflow
Upload a file to the S3 bucket
The upload triggers a push event notification for the Lambda function. Prefer a separate Lambda function for each client so that all SFTP connection details can be stored in environment variables.
Environment variables will be used to store server details, credentials, file paths, etc.
The Lambda function fetches the file from the S3 bucket.
Lambda transfers the file to the external server (see the sketch after this list).
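A hypothetical handler for such a function, assuming paramiko is packaged with the Lambda deployment and the environment variable names are placeholders:

import os

import boto3
import paramiko

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # One record per object uploaded to the bucket
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        filename = key.split("/")[-1]
        local_path = "/tmp/" + filename  # Lambda's writable scratch space
        s3.download_file(bucket, key, local_path)

        # Per-client connection details come from environment variables
        transport = paramiko.Transport((os.environ["SFTP_HOST"], 22))
        transport.connect(username=os.environ["SFTP_USER"],
                          password=os.environ["SFTP_PASSWORD"])
        sftp = paramiko.SFTPClient.from_transport(transport)
        sftp.put(local_path, os.environ["SFTP_PATH"] + "/" + filename)
        sftp.close()
        transport.close()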
Worthy Addition
Changes worth considering on top of this simple approach:
If the Lambda function fails to fetch a file, it should retry a couple of times; if it still fails, it should send a notification to the client who uploaded the file to the S3 bucket.
If the external transfer fails, Lambda should add the failure to an SQS queue, from which any application can process the messages and notify the system; a retry can also be scheduled a few days later.

How to transfer a file from S3 to someones SFTP server

I have a workflow need. I have a customer that does not want to deal with our S3 folders where we drop their files. They want us to send the files directly to their SFTP account. When I unload files from my backend, they automatically unload to S3 from AWS services. As this is a one-time request per customer, I don't wish to set up an automated transfer protocol in a Lambda or bash script, nor do I wish to go through the hassle of copying the file to my local server only to post it to the SFTP site. I would prefer to just right-click on the file and select to transfer it to an SFTP location. Does anyone know if AWS has any plans to add file transfer protocol support to the S3 console UI? (SFTP, FTP, etc.)
What would be even better is if AWS S3 allowed all files dropped in an S3 bucket location to be automatically transferred to the SFTP location defined -- in the scenario where the customer never wishes to deal with S3, but we need to use it.
Given the current capabilities of Amazon S3, automating a send of files from Amazon S3 to an SFTP target would require the use of an AWS Lambda function.
There are a few ways to do this. Since you are looking for the easiest way, I would suggest installing s3fs-fuse on a Linux server; this enables you to mount S3 as a file system. You can mount it directly on the SFTP server and copy the files locally. Below is a URL describing using Amazon S3 as a file system.
https://cloud.netapp.com/blog/amazon-s3-as-a-file-system
The other method would be to use the AWS CLI to do a recursive copy; this involves installing the AWS CLI and generating API keys. Below is an example of the command for a single file, and the --recursive flag copies an entire prefix:
aws s3 cp s3://mybucket/test.txt test2.txt
aws s3 cp s3://mybucket/ . --recursive
You can revoke the API keys once you are done with the transfer!

For the open data in s3 from AWS, how can I register some notification when new open data landed in its bucket?

I'm trying to achieve the ask from the title. For example, my architecture design involves triggering a Lambda function whenever new data lands on the open data S3 bucket (say this one: https://registry.opendata.aws/sentinel-2/).
I read https://docs.aws.amazon.com/AmazonS3/latest/dev/NotificationHowTo.html and Amazon S3 triggering another Lambda function in another account, but none of those have really helped me so far. Has anyone done a similar task before? Thanks in advance!
You can configure Amazon S3 events to send a message to an Amazon SNS topic, or have them trigger an AWS Lambda function.
If you wish to do any further logic (e.g. checking specific permissions), you would need to do that within the Lambda function.
See: Configuring Amazon S3 Event Notifications - Amazon Simple Storage Service
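For the Lambda route, the handler receives the S3 event directly. A minimal sketch, where the prefix check is a hypothetical example of such further logic:

def lambda_handler(event, context):
    # One record per object created in the bucket
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        if not key.startswith("tiles/"):  # hypothetical filter
            continue
        print(f"New object: s3://{bucket}/{key}")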

Voice message save in aws s3 bucket using Amazon Connect

How do I save a voice message from a customer's number and store it in an S3 bucket using Amazon Connect? I made a contact flow, but I am not understanding how to save the voice message to an S3 bucket.
We've tried many ways to build a voicemail solution, including many of the things you might have found on the web. After much iteration we realized that we had a product that would be useful to others.
For voicemail in Amazon Connect, take a look at https://amazonconnectvoicemail.com as a simple, no-code integration that can be customized to meet the needs of your customers and organization!
As soon as you enable voice recording, all recordings are placed automatically in the bucket you defined at the very beginning when you set up your Amazon Connect instance. Just check your S3 bucket to see if you can spot the recordings.
By default, AWS creates a new Amazon S3 bucket during the configuration process, with built-in encryption. You can also use existing S3 buckets. There are separate buckets for call recordings and exported reports, and they are configured independently. (https://docs.aws.amazon.com/connect/latest/adminguide/what-is-amazon-connect.html)
The recording in S3 only starts when an agent takes the call. Currently, there is no direct voicemail feature in Amazon Connect. You can forward the call to a service that supports it, such as Twilio.