I need to export some data from DynamoDB or Postgres as a CSV and email that file to someone every month. Is there a scheduler I can set up that will do this for me, or will I need to create a Lambda function to export the data and send the email every month?
To automate this as much as possible, your best bet is to write a Lambda function.
You can set up a recurring monthly event trigger using CloudWatch Events so that nothing needs to be kicked off manually.
Inside the Lambda, your code would query the data from DynamoDB or Postgres, generate the CSV, and email it directly to the recipient as an attachment using Amazon SES (Simple Email Service). You could also store the CSV in S3 before emailing it, as a backup in case email delivery fails for one reason or another.
Tutorial: Schedule AWS Lambda Functions Using CloudWatch Events
Using File Attachments with Amazon SES
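Putting those pieces together, a minimal sketch of such a Lambda might look like the following (the table name, sender and recipient addresses, and subject line are placeholder assumptions; the From address must be a verified SES identity, and the function's role needs DynamoDB read and `ses:SendRawEmail` permissions):

```python
import csv
import io
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def items_to_csv(items):
    """Serialize a list of dicts (DynamoDB items) into CSV text."""
    fieldnames = sorted({key for item in items for key in item})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(items)
    return buf.getvalue()


def handler(event, context):
    import boto3  # available in the Lambda runtime

    table = boto3.resource("dynamodb").Table("monthly-report-table")  # assumed name

    # Paginate through a full scan (fine for small monthly exports)
    items, response = [], table.scan()
    items.extend(response["Items"])
    while "LastEvaluatedKey" in response:
        response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
        items.extend(response["Items"])

    # Build a raw MIME message so SES can deliver the CSV as an attachment
    msg = MIMEMultipart()
    msg["Subject"] = "Monthly export"
    msg["From"] = "reports@example.com"  # must be a verified SES identity
    msg["To"] = "team@example.com"
    msg.attach(MIMEText("Monthly CSV export attached."))
    attachment = MIMEApplication(items_to_csv(items).encode(), Name="export.csv")
    attachment["Content-Disposition"] = 'attachment; filename="export.csv"'
    msg.attach(attachment)

    boto3.client("ses").send_raw_email(RawMessage={"Data": msg.as_string()})
```

You'd then point the CloudWatch Events monthly schedule at this function as its target.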
Related
I would normally handle the task of sending an email after a new DynamoDB entry with Lambda and SES but I'm required to not use Lambda for it.
There's a 'Contact us' section on the website, and an email needs to be sent every time a new entry is made. We use API Gateway to post the data to DynamoDB.
Is there a way to carry this out without Lambda?
It's not possible without writing code. Furthermore, you will probably want to tailor each email to make it more personal to the user, so Lambda is effectively a must.
You could design something using EventBridge Pipes, which can trigger when a new user is added and can have SNS as a destination to send an email. But that email may not be customizable, and it will only reach people subscribed to the topic.
DynamoDB triggers Lambda functions by feeding the records of a DynamoDB Stream to the Lambda function. That is by far the easiest way to process updates to a DynamoDB table, but you can also write other code that processes a DynamoDB stream outside of Lambda: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
If you really want to do it without Lambda, you can use the following pattern and glue a few AWS services together.
You can consume the DynamoDB stream with EventBridge Pipes and send it to a Step Functions state machine, where you use the SDK integration to call SES. For simpler setups, you can use SNS directly instead of Step Functions.
You will not need Lambda anywhere in this setup. Furthermore, you can transform your message either in the pipe or in the state machine.
Be aware that EventBridge Pipes currently has no AWS CDK L2 construct, which might make it harder to configure if you use the CDK.
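As a rough sketch of that pattern (all ARNs, the pipe name, and the insert-only filter are placeholder assumptions), creating such a pipe from a DynamoDB stream to an SNS topic with boto3 could look like:

```python
def build_pipe_request(stream_arn, topic_arn, role_arn):
    """Assemble the CreatePipe request for a DynamoDB stream -> SNS pipe."""
    return {
        "Name": "new-entry-email-pipe",  # assumed pipe name
        "RoleArn": role_arn,
        "Source": stream_arn,
        "SourceParameters": {
            "DynamoDBStreamParameters": {"StartingPosition": "LATEST"},
            # Only react to newly inserted items, not updates or deletes
            "FilterCriteria": {
                "Filters": [{"Pattern": '{"eventName": ["INSERT"]}'}]
            },
        },
        "Target": topic_arn,
    }


def create_pipe(stream_arn, topic_arn, role_arn):
    import boto3  # assumes credentials and a role with pipes:CreatePipe

    client = boto3.client("pipes")
    return client.create_pipe(**build_pipe_request(stream_arn, topic_arn, role_arn))
```

Swapping the SNS target for a Step Functions state machine ARN (plus `StepFunctionStateMachineParameters` in `TargetParameters`) gives you the more customizable variant described above.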
We have an AWS account with some S3 buckets that are used by several people. I would like to know if there is a way (and if so, which ones) to configure an SNS email notification when a restore from S3 Glacier Deep Archive completes, so that the email goes to the person who initiated the retrieval.
Yes, in fact there are multiple ways to accomplish this. What you want to do is create an Event Notification for your bucket. In the console, this can be found by going into the bucket's Properties and pressing the Create event notification button.
You will be prompted to select the type of action for which you want to send notifications. You would want to select s3:ObjectRestore:Completed.
You will need a destination for your events. For sending emails, one option is to create an SNS topic and use it as the destination. This topic can be integrated with Amazon Simple Email Service (SES), which will be able to send the emails.
Another option is to select a Lambda function as the destination and use it to send emails and handle other automation.
The problem is more: how, as an admin, can I configure the email to be sent to the user who initiated the restore, and only to them?
In this case you would want to enable the EventBridge integration for your bucket. If you enable it, S3 will send events that contain the requester who initiated the restore for an object (see the full event format here). EventBridge can also be integrated with SNS or Lambda.
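A hedged sketch of wiring that up with boto3 (the rule name is an assumption, and the Lambda target is whatever function maps the event's requester to an email address):

```python
import json


def restore_completed_pattern(bucket_name):
    """Event pattern matching S3 'Object Restore Completed' events for one bucket."""
    return json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["Object Restore Completed"],
        "detail": {"bucket": {"name": [bucket_name]}},
    })


def create_restore_rule(bucket_name, lambda_arn):
    import boto3  # assumes credentials with events:PutRule / events:PutTargets

    events = boto3.client("events")
    events.put_rule(
        Name="glacier-restore-completed",  # assumed rule name
        EventPattern=restore_completed_pattern(bucket_name),
    )
    # The target function can read event["detail"]["requester"] from each
    # delivered event and decide whom to notify.
    events.put_targets(
        Rule="glacier-restore-completed",
        Targets=[{"Id": "notify-requester", "Arn": lambda_arn}],
    )
```

Note that the bucket itself must have EventBridge notifications enabled for these events to flow at all.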
I know that we can set up an SNS SMS subscription, which saves the MessageId and a few other pieces of information.
At first I thought it would also save the message content, but when reading the records, it is not included.
Is there a way to include it, or is it just something we have to record manually while sending the SMS?
Thanks in advance for any help.
The message content is not part of your delivery receipt metadata, so it's good practice to do your own book-keeping and track all the messages you've sent. A sample schema could be something like:
Destination Phone Number (e.g. +12065551234)
ISO Country Code (e.g. US)
Sent Date (epoch)
Message Id (auto-generated from request)
MCCMNC carrier information
Message Content
Then you can store all this data in a data storage of your choice.
An example architecture (that is cost efficient for high traffic) could be:
Create metadata POJO and serialize into JSON
Use Kinesis Firehose to stream/batch your serialized metadata into S3
Use AWS Glue to determine the schema of your S3 objects
Use AWS Athena to query your stored S3 data
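The publish-and-record step of the architecture above could be sketched like this (the Firehose delivery stream name is an assumption, and the record layout simply mirrors the sample schema; MCCMNC data would come later from the delivery receipt):

```python
import json
import time


def build_sms_record(phone_number, message, message_id, country_code="US"):
    """Metadata record to persist alongside the SNS delivery receipt."""
    return {
        "destination": phone_number,
        "iso_country_code": country_code,
        "sent_date": int(time.time()),
        "message_id": message_id,
        "message_content": message,  # SNS does not store this for you
    }


def send_and_record(phone_number, message):
    import boto3  # assumes credentials for sns:Publish and firehose:PutRecord

    sns = boto3.client("sns")
    firehose = boto3.client("firehose")

    response = sns.publish(PhoneNumber=phone_number, Message=message)

    record = build_sms_record(phone_number, message, response["MessageId"])
    firehose.put_record(
        DeliveryStreamName="sms-metadata",  # assumed stream, batches into S3
        Record={"Data": (json.dumps(record) + "\n").encode()},
    )
```

Newline-delimited JSON in S3 is what lets Glue crawl the schema and Athena query it afterwards.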
You can read the documentation on how to set up CloudWatch to store metrics and logs for SMS deliveries here.
Another recommendation is to check your SMS usage reports. Here is the documentation for that.
Best of luck.
Disclosure: I work in AWS and I worked in the Mobile Messaging (SMS/Push) team.
I am trying to understand AWS data sync on Xamarin. They have a nice SDK to use with Xamarin.Forms. I am using OneSignal notifications instead of AWS notifications.
My question is: I want to fire a notification after a data insert or sync. OneSignal has an API that works over HTTP POST.
So how do I make a POST to the OneSignal endpoint when:
1. user data is synchronized with AWS Cognito
2. a new shared data item is inserted into DynamoDB
Of course I could do that on the client side, but that is not good practice. I would like to do it on the server side.
For #1, you can use Cognito events, which execute a Lambda function, whenever the data is synchronized by the user. In this Lambda function you can fire the notification.
For #2, I do not see how that is related to Cognito Sync, as the data is stored in datasets, not DynamoDB. If you are talking independently about data being inserted into DynamoDB and getting notified about that, then again DynamoDB has nice integration with AWS Lambda to send notifications on table updates.
Is there a way to be notified when an upload to an S3 bucket is complete? The requirement is that I need to provide a link to users after a video upload finishes. Right now I provide the link 30 minutes after the video upload starts, whether the upload takes 5 minutes or 40 minutes. So is there an API that reports that the upload has been completed?
Notifications can be triggered in Amazon S3 when any of the following occur:
s3:ObjectCreated:*
s3:ObjectCreated:Put
s3:ObjectCreated:Post
s3:ObjectCreated:Copy
s3:ObjectCreated:CompleteMultipartUpload
s3:ObjectRemoved:*
s3:ObjectRemoved:Delete
s3:ObjectRemoved:DeleteMarkerCreated
s3:ReducedRedundancyLostObject
Notifications can be sent via three destinations:
Amazon Simple Notification Service (SNS), which in turn can send notifications via email, HTTP/S endpoint, SMS, or mobile push notification
Amazon Simple Queueing Service (SQS)
AWS Lambda (not currently available in all regions)
See: Configuring Amazon S3 Event Notifications
The most appropriate choice depends on your programming preference and how your app is written:
Use SNS to push to an HTTP endpoint to trigger some code in your app
Write some code to periodically check an SQS queue
Write a Lambda function in Node.js or Java
Once triggered, your code would then need to identify who uploaded the video, retrieve their user details, then send them an email notification. This would be easiest if you control the key (filename) of the object being uploaded, since this will assist in determining the user to notify.
You can use AWS Lambda to post a message to Amazon SNS (or notify you in any other way) when a file is uploaded to S3.
Set up an S3 trigger for your Lambda function. See this tutorial: http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser.html
Inside your Lambda function, send out your notification. You can use SNS, SES, SQS, etc.
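A minimal sketch of such a handler using SNS (the topic ARN is a placeholder assumption; S3 URL-encodes object keys in the event, hence the decoding step):

```python
import json
import urllib.parse


def object_key_from_event(event):
    """Extract the (URL-decoded) object key from an S3 event record."""
    record = event["Records"][0]
    return urllib.parse.unquote_plus(record["s3"]["object"]["key"])


def handler(event, context):
    import boto3  # available in the Lambda runtime

    key = object_key_from_event(event)
    boto3.client("sns").publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:upload-complete",  # assumed
        Subject="Upload complete",
        Message=json.dumps({"key": key}),
    )
```

If the key encodes the uploading user (as suggested above), this is also the place to look up their contact details before notifying.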
There is no direct method that tells you whether an upload to an S3 bucket is complete. You can do a simple thing which I have followed after a lot of research, and it works correctly.
Follow this link, and read the size of the file every 30 seconds or so, as per your requirement. When the file size has not changed across two consecutive readings, check the size once more to be sure, because network congestion might be why the size did not change between those two readings.
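A hedged sketch of that polling idea with boto3 (the 30-second interval and three-reading stabilization window are assumptions; `head_object` will fail until the object is visible, which the loop treats as "not there yet"):

```python
import time


def sizes_stable(sizes, window=3):
    """True once the last `window` polled sizes are all present and identical."""
    recent = sizes[-window:]
    return len(recent) == window and len(set(recent)) == 1 and recent[0] >= 0


def wait_for_upload(bucket, key, interval=30):
    import boto3  # assumes credentials with s3:GetObject / HeadObject access

    s3 = boto3.client("s3")
    sizes = []
    while not sizes_stable(sizes):
        try:
            sizes.append(s3.head_object(Bucket=bucket, Key=key)["ContentLength"])
        except s3.exceptions.ClientError:
            sizes.append(-1)  # object not visible yet
        time.sleep(interval)
```

That said, the event-notification approaches in the earlier answers avoid polling entirely and are generally preferable.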