Not receiving notifications from Amazon S3 Put events

I have two buckets (bucket1 and bucket2) and two SNS notification topics (alert_bkp_1 and alert_bkp_2) created and configured in the S3 event properties.
The setup is for receiving email alerts whenever a Put occurs inside the bucket.
I'm encountering two problems with this setup:
Bucket 1
Bucket1 receives .tar files from 4 servers within a backup window scheduled in the servers' crontabs, but I receive email notifications for only 3 of them.
Details:
When I upload manually, I get notifications for all 4 servers normally.
The 4 backups are being stored daily in the bucket normally.
Bucket 2
Bucket 2 is receiving the .dump backup files normally, but I do not receive the email notifications.
Detail:
When I upload manually, I get notifications for both servers normally.
The 2 backups are being stored daily in the bucket normally.
I have already made several changes to the S3 event configuration, but without success. I don't know what else to do; does anyone have any tips to help me with this problem?
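One thing worth double-checking is which event types the notifications are configured for: large .tar files sent via multipart upload fire s3:ObjectCreated:CompleteMultipartUpload rather than s3:ObjectCreated:Put, so a topic subscribed only to Put events will silently miss them. A minimal sketch of subscribing a topic to every ObjectCreated variant with boto3 (the account ID and region in the topic ARN are placeholders):

    import boto3

    s3 = boto3.client("s3")

    # Subscribe the topic to every ObjectCreated variant, not just Put:
    # files uploaded via multipart upload emit CompleteMultipartUpload,
    # which a Put-only subscription silently misses.
    s3.put_bucket_notification_configuration(
        Bucket="bucket1",
        NotificationConfiguration={
            "TopicConfigurations": [
                {
                    "TopicArn": "arn:aws:sns:us-east-1:123456789012:alert_bkp_1",  # placeholder ARN
                    "Events": ["s3:ObjectCreated:*"],
                }
            ]
        },
    )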

Related

Lambda invocation on two SNS events at the same time

I have a use case where I need to read two files that live in a different account. I will receive an SNS event with the filename, and I need to create an EMR cluster from the Lambda only when both files are available in the other S3 bucket.
Currently I write a dummy file to an S3 bucket every time I receive an SNS event, and then create the EMR cluster only after verifying, on the second SNS event, that the first file is already available in my account's S3 bucket. This approach is working fine.
But I am unable to solve the issue of what happens if the two files arrive in the other S3 bucket at the same time and we get two SNS events around the same time, since each event thinks the other file hasn't arrived yet.
How would I solve this problem?
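One common way to resolve this kind of race is an atomic counter: both invocations increment the same DynamoDB item, and only the invocation that observes the final count launches the cluster. A minimal sketch, assuming a hypothetical table named file_arrivals with partition key batch_id:

    import boto3

    dynamodb = boto3.client("dynamodb")

    def record_arrival_and_check(batch_id):
        # Atomically increment the arrival counter. UpdateItem with ADD is
        # atomic, so two near-simultaneous events can never both read a
        # stale count: exactly one caller will see the counter reach 2.
        resp = dynamodb.update_item(
            TableName="file_arrivals",        # hypothetical table name
            Key={"batch_id": {"S": batch_id}},
            UpdateExpression="ADD arrivals :one",
            ExpressionAttributeValues={":one": {"N": "1"}},
            ReturnValues="UPDATED_NEW",
        )
        return int(resp["Attributes"]["arrivals"]["N"]) == 2

    # In the Lambda handler: if record_arrival_and_check(batch_id) returns
    # True, this invocation is the one that should create the EMR cluster.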

How to receive an alert if no file is received in aws S3

I am running an ETL pipeline whose final output is dropped into an AWS S3 bucket. Sometimes the pipeline succeeds but I don't receive a file in S3; after a bit of debugging and looking into the logs I can handle that problem. But what I want to do is set up an alert if no file is received in the S3 bucket. Consider that the pipeline is scheduled to run every 24 hours. I am a noob in AWS, so an answer in layman's terms will be appreciated. Thanks
I accomplished this by doing the following:
I am keeping track of the last file.
I created an event on the S3 bucket to trigger an AWS Lambda function whenever a new file is added to the bucket, and stored the current time in a DynamoDB table. This updates the date whenever a new file is added.
I configured Amazon CloudWatch to trigger an AWS Lambda function every few hours (in my case every 24 hours) that checks the last-updated date in DynamoDB. If it is older than the time we are expecting, it triggers an alert.
Code for both Lambdas:
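A minimal sketch of the two handlers, assuming a hypothetical DynamoDB table named last_file and an SNS topic for the alert (both names are placeholders, not from the original answer):

    import os
    import time
    import boto3

    dynamodb = boto3.client("dynamodb")
    sns = boto3.client("sns")

    TABLE = "last_file"                                   # hypothetical table
    ALERT_TOPIC = os.environ.get("ALERT_TOPIC_ARN", "")   # hypothetical topic
    MAX_AGE_SECONDS = 24 * 60 * 60

    def on_new_file(event, context):
        # Triggered by the S3 event: record the arrival time of the latest file.
        dynamodb.put_item(
            TableName=TABLE,
            Item={
                "pk": {"S": "latest"},
                "updated_at": {"N": str(int(time.time()))},
            },
        )

    def check_last_file(event, context):
        # Triggered by the CloudWatch schedule: alert if the last file is too old.
        item = dynamodb.get_item(
            TableName=TABLE, Key={"pk": {"S": "latest"}}
        ).get("Item")
        age = time.time() - int(item["updated_at"]["N"]) if item else float("inf")
        if age > MAX_AGE_SECONDS:
            sns.publish(
                TopicArn=ALERT_TOPIC,
                Subject="No file received in S3",
                Message=f"Last file arrived {age / 3600:.1f} hours ago.",
            )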
If you know the name of the file that is uploaded to S3, you can use the GetObjectMetadata method to check whether the file is present in S3. If the file isn't there, you will get a 404 Not Found error.
You can set an alert that checks the status code of the response you get from calling GetObjectMetadata.
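In Python's boto3, the equivalent call is head_object, which surfaces the 404 as a ClientError when the key is absent. A minimal sketch, with the bucket and key names as placeholders:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def file_exists(bucket, key):
        # HEAD the object (boto3's counterpart to GetObjectMetadata);
        # a missing key raises a ClientError with a 404 code rather
        # than returning a status to inspect.
        try:
            s3.head_object(Bucket=bucket, Key=key)
            return True
        except ClientError as e:
            if e.response["Error"]["Code"] == "404":
                return False
            raise

    if not file_exists("my-etl-bucket", "output/latest.dump"):  # placeholder names
        print("File missing - raise the alert here")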

S3 trigger sending bulk upload notification to AWS SNS

I want to execute two Lambda functions when a new object is uploaded to S3.
I am using an S3 trigger at the bucket level that sends notifications to an AWS SNS topic, and 2 Lambda functions are subscribed to this topic to receive the notifications.
I tried uploading 100 files, and I could see every file name printed in the logs of both functions.
But in one function the output was spread across 2 log streams, whereas in the other it was spread across 3; combined, the streams show the names of all 100 files.
How does SNS work when it gets too many object-upload notifications?
How does the Lambda function work if, for example, it is still processing the first file and it gets notifications to process the second and third files?
Is there a chance of a few files getting skipped by Lambda because it received too many notifications?
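Each SNS delivery invokes Lambda independently, so concurrent notifications run in parallel execution environments rather than queuing behind the first file; the multiple log streams you see are those separate environments. A minimal sketch of a handler that unwraps the SNS envelope around the S3 event:

    import json

    def handler(event, context):
        # An SNS-triggered invocation carries one or more SNS records;
        # each record's Message field is itself a JSON-encoded S3 event.
        for sns_record in event["Records"]:
            s3_event = json.loads(sns_record["Sns"]["Message"])
            for s3_record in s3_event.get("Records", []):
                bucket = s3_record["s3"]["bucket"]["name"]
                key = s3_record["s3"]["object"]["key"]
                print(f"Processing s3://{bucket}/{key}")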

Access denied for S3 triggered SQS event record

I currently have the following scenario:
An S3 bucket is set up to receive files and publish an event record to an SQS queue for each file that is created or updated.
A bunch of Kubernetes pods read this queue, extracting the file keys from the events and processing the files.
The problem:
Almost 50% of the time, when a k8s pod tries to load the file after reading the event record, it receives an "Access Denied" permission error. When the same message comes back from in-flight and the pod tries to load the file again, it always succeeds without problems.
I've already tried setting the delivery delay to a longer time; it didn't work.
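One pattern that usually absorbs this is retrying the read with backoff. Note that S3 returns 403 Access Denied instead of 404 for a missing key when the caller lacks s3:ListBucket permission, so a key that is not yet readable can surface as a permission error. A minimal sketch:

    import time
    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def load_with_retry(bucket, key, attempts=5, base_delay=1.0):
        # Retry GetObject: a just-written key can briefly come back as
        # AccessDenied (S3's stand-in for 404 when the caller cannot
        # list the bucket), and a short backoff usually rides it out.
        for attempt in range(attempts):
            try:
                return s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            except ClientError as e:
                code = e.response["Error"]["Code"]
                if code not in ("AccessDenied", "NoSuchKey") or attempt == attempts - 1:
                    raise
                time.sleep(base_delay * (2 ** attempt))  # exponential backoff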

Amazon S3 Event Notification not triggering sometimes

We have an Amazon S3 bucket with event notifications set up for POST and multipart-upload-completed events. Initially we had them trigger a Lambda directly, but due to error-handling concerns we changed it to SQS to get the "backout" feature of SQS and more easily capture any message in error.
The files are put to S3 from an SFTP server (EC2 instance), and the events reach SQS in about 99.9% of cases, but every so often a file is missed...
We can easily spot this because SQS in turn triggers a Lambda, and the first thing the Lambda does is rename the file with a ".processing" extension; as soon as processing is completed, the file is moved to another bucket.
Now and again we find files with the original file name that never got the ".processing" extension, and there are no SQS messages or logs showing that the Lambda picked them up. This happens roughly once in a thousand files...
Files are always transferred to the bucket the same way, but sometimes there are large batches, and the misses seem to happen more frequently in large batches...
What could be the reason some files are not triggering a notification?
Or what can I check to find what could possibly cause this?
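Whatever the root cause turns out to be, a periodic reconciliation sweep gives a safety net: list objects that still carry their original name after a grace period and re-enqueue them. A minimal sketch, with the bucket name and queue URL as placeholders; the consumer would need to accept this simplified message shape (or the sweep could mimic the S3 event format instead):

    import json
    from datetime import datetime, timedelta, timezone

    import boto3

    s3 = boto3.client("s3")
    sqs = boto3.client("sqs")

    BUCKET = "sftp-landing"                                                # placeholder
    QUEUE_URL = "https://sqs.eu-west-1.amazonaws.com/123456789012/ingest"  # placeholder

    def sweep(event, context):
        # Re-enqueue any object that never got the ".processing" rename
        # and has been sitting untouched past the grace period.
        cutoff = datetime.now(timezone.utc) - timedelta(minutes=15)
        paginator = s3.get_paginator("list_objects_v2")
        for page in paginator.paginate(Bucket=BUCKET):
            for obj in page.get("Contents", []):
                if obj["Key"].endswith(".processing"):
                    continue
                if obj["LastModified"] < cutoff:
                    sqs.send_message(
                        QueueUrl=QUEUE_URL,
                        MessageBody=json.dumps({"bucket": BUCKET, "key": obj["Key"]}),
                    )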