AWS SNS - how to customize the S3 event

I created an S3 bucket with event notifications enabled. When a new object is uploaded to the bucket, SNS triggers an event to a Lambda function that extracts content from the file.
Is there any way to customize the SNS event into a generic format?

There is no out-of-the-box way to customize the notification format (without involving a middleman like Lambda). I would suggest using AWS Lambda as the direct message destination instead of letting the S3 notification flow through SNS. This AWS documentation will be useful for you: https://docs.aws.amazon.com/AmazonS3/latest/dev/notification-walkthrough-2.html
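If the notification does flow through SNS and you only need to reshape it, a middleman Lambda can republish a simplified payload. Below is a minimal sketch, assuming the Lambda is subscribed to the SNS topic that receives the S3 notification; the target topic ARN and the fields chosen for the "generic" format are placeholders to adapt.

```python
import json
import boto3

sns = boto3.client("sns")

# Hypothetical topic that should receive the reformatted message.
CUSTOM_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:my-generic-events"

def handler(event, context):
    """Receives the S3 notification delivered via SNS and republishes it
    in a simplified, custom format."""
    for record in event["Records"]:
        # The SNS envelope carries the original S3 event as a JSON string.
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            generic = {
                "bucket": s3_record["s3"]["bucket"]["name"],
                "key": s3_record["s3"]["object"]["key"],
                "size": s3_record["s3"]["object"].get("size"),
                "eventTime": s3_record["eventTime"],
            }
            sns.publish(TopicArn=CUSTOM_TOPIC_ARN, Message=json.dumps(generic))
```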

Related

I want to trigger a GitLab CI pipeline whenever an object is put in an AWS S3 bucket

I am looking for a way to trigger my GitLab CI pipeline whenever an object (with a specific name) is added to an S3 bucket.
I am new to AWS Lambda, can someone please help?
You can use S3 event notifications and deliver these events to other AWS services, including AWS Lambda or SQS queues.
Leveraging these event notifications, you can send the event directly to a Lambda function that subscribes to PutObject events, parses the event content to determine whether an object with the specific name was created, and then uses the create pipeline API to trigger the pipeline on GitLab; see the sketch below.
Architecture overview: S3 event notification -> Lambda -> GitLab pipeline API.
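As a rough sketch of the Lambda step, using GitLab's pipeline trigger endpoint (POST /projects/:id/trigger/pipeline) as one way to start the pipeline; the project ID, trigger token, branch, and key prefix below are placeholders:

```python
import urllib.parse
import urllib.request

# Placeholder values -- substitute your own project ID, trigger token, and branch.
GITLAB_TRIGGER_URL = "https://gitlab.com/api/v4/projects/12345/trigger/pipeline"
TRIGGER_TOKEN = "glptt-xxxxxxxxxxxxxxxx"
REF = "main"
KEY_PREFIX = "exports/"  # only react to objects whose key starts with this

def handler(event, context):
    """Invoked directly by the S3 PutObject notification; starts a GitLab
    pipeline when the uploaded object's key matches the expected prefix."""
    for record in event["Records"]:
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        if not key.startswith(KEY_PREFIX):
            continue
        data = urllib.parse.urlencode({"token": TRIGGER_TOKEN, "ref": REF}).encode()
        req = urllib.request.Request(GITLAB_TRIGGER_URL, data=data, method="POST")
        with urllib.request.urlopen(req) as resp:
            print("GitLab responded with status", resp.status)
```

The key-prefix check stands in for whatever "specific name" rule you need; S3 event notification filters can also do prefix/suffix matching before the Lambda is even invoked.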

Is there an AWS equivalent to `Stackdriver sinks to BigQuery`?

I use Stackdriver sinks to BigQuery a lot. They help retain the data and keep it in a convenient, queryable form.
I am searching for the equivalent on AWS using CloudWatch, but it seems only S3 is integrated. Are there any workarounds, or should I code it myself?
This can be done with a workaround. It needs only one code component that reads from S3 and posts to Stackdriver; GCP sinks can then pick the data up.
Create a CloudWatch export (sink) to an S3 bucket.
Configure the S3 bucket to send SNS messages on object creation.
Have a process listen to those SNS notifications and copy the contents of the new objects into Stackdriver.
Since the delivery guarantees of SNS, and of CloudWatch to S3, are already in place, a Lambda function handling the SNS notification messages makes the solution more seamless; a sketch follows.
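A minimal sketch of that Lambda, assuming it is subscribed to the SNS topic, the google-cloud-logging package is bundled with the function, GCP credentials are supplied separately, and the exported objects are plain text (real CloudWatch exports are typically gzipped, which this sketch ignores):

```python
import json
import boto3
from google.cloud import logging as gcp_logging  # google-cloud-logging package

s3 = boto3.client("s3")
gcp_client = gcp_logging.Client()                # assumes GCP credentials are configured
logger = gcp_client.logger("cloudwatch-export")  # hypothetical log name

def handler(event, context):
    """Triggered by the SNS notification for a new export object in S3;
    copies each line of the object into Cloud Logging (Stackdriver)."""
    for record in event["Records"]:
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode()
            for line in body.splitlines():
                logger.log_text(line)
```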

AWS S3 sending content as attachment

Is there a direct way, after saving something in an S3 bucket, to send the content as an attachment via SNS (email), or do I have to create a Lambda function in order to tell SNS what I want to send?
S3 bucket -> SNS -> Email (my preferred way, if possible)
Or is this not possible without Lambda?
The event that S3 sends to Lambda, SNS or SQS only contains a reference to the item that was created, not the actual content.
If you want to pass on the content, you have to download it in whichever code responds to that event and then send it to your destination.
There is no mechanism that sends the content of a newly uploaded object to an SNS topic.
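If email with the actual file is the goal, the usual pattern is S3 event -> Lambda -> SES, since SNS email subscriptions cannot carry attachments. A minimal sketch, with placeholder SES-verified addresses:

```python
import boto3
from email.mime.application import MIMEApplication
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

s3 = boto3.client("s3")
ses = boto3.client("ses")

# Placeholder, SES-verified addresses.
SENDER = "sender@example.com"
RECIPIENT = "recipient@example.com"

def handler(event, context):
    """Triggered by the S3 event; downloads the new object and emails it
    as an attachment via SES."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        content = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        msg = MIMEMultipart()
        msg["Subject"] = f"New object uploaded: {key}"
        msg["From"] = SENDER
        msg["To"] = RECIPIENT
        msg.attach(MIMEText("See attached file."))
        attachment = MIMEApplication(content)
        attachment.add_header("Content-Disposition", "attachment", filename=key)
        msg.attach(attachment)

        ses.send_raw_email(
            Source=SENDER,
            Destinations=[RECIPIENT],
            RawMessage={"Data": msg.as_string()},
        )
```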

Want to fetch the name of every image uploaded to S3 and process it in Lambda

I am trying to set up AWS Rekognition using Lambda, so I created an S3 event trigger for each image uploaded to S3: when a new image is uploaded, S3 sends an SNS notification > SNS sends it to SQS > SQS triggers a Lambda to process that image.
My main question is: how do I pass the uploaded object's key name to Lambda? If anyone knows or has any idea, please let me know.
I'm not sure why you are using SNS and SQS to reach Lambda from S3.
You can link the S3 event directly to Lambda, and you can filter the event to PUT/POST/DELETE etc. too. This is the better approach, instead of using SNS and SQS in between.
The S3 event has pretty much all the data required for processing, including the object key; see the sketch after the links below.
You may look at
https://docs.aws.amazon.com/lambda/latest/dg/with-s3.html
https://docs.aws.amazon.com/lambda/latest/dg/with-s3-example.html
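For the direct S3 -> Lambda route, the object key is right in the event record. A minimal sketch (the Rekognition call is shown only as a commented-out possible next step):

```python
import urllib.parse

def handler(event, context):
    """Invoked directly by the S3 event notification; logs the bucket and
    object key of each uploaded image so it can be handed to Rekognition."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Keys arrive URL-encoded (e.g. spaces become '+'), so decode them first.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        print(f"New object: s3://{bucket}/{key}")
        # One possible next step (shown here as a comment):
        # boto3.client("rekognition").detect_labels(
        #     Image={"S3Object": {"Bucket": bucket, "Name": key}})
```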

S3 event notification body content

I've configured an event notification on an AWS s3 bucket, putting a message on an SQS queue.
The body of that event contains an array of records.
I would like to understand in which conditions there are multiple records in the body.
Is it when we upload files immediately after each other?
Or only when uploading multiple files at once?
So is this generated on a time basis, collecting all the requests in X amount of time and sending a message to SQS, or is it a separate event for each request to the bucket?
There are different ways you can inspect the S3 event message structure.
AWS documents the event message structure in the S3 documentation.
To see the exact structure of your own S3 events, you can take one of the following practical approaches:
Enable AWS CloudTrail logs, perform some operations on the S3 bucket, and then view the S3 events in the CloudTrail Event history or Insights for that bucket.
Map a simple AWS Lambda function to your S3 events that just prints the events associated with your bucket, as in the sketch below.
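The second approach can be as small as this sketch, which simply dumps the incoming event to CloudWatch Logs so you can see how many entries the Records array carries per invocation:

```python
import json

def handler(event, context):
    """Minimal event-echo Lambda: map it to the bucket's notifications and it
    prints the full event to CloudWatch Logs, including how many entries the
    Records array carries per invocation."""
    print("Number of records:", len(event.get("Records", [])))
    print(json.dumps(event, indent=2))
```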