Passing variable values through S3 and SQS event trigger message - amazon-web-services

I have set up an AWS pipeline as S3 -> SQS -> Lambda: an S3 PutObject event generates an event message and passes it to SQS, and SQS triggers the Lambda. I have a requirement to pass a variable value from S3 to SQS, and finally to Lambda, as part of the event message. The value could be the file name or some string value.
Can we customize the event message JSON generated by the S3 event to pass some more information along with the message?
Does SQS just pass the event message received from S3 on to Lambda, or does it alter the message or generate its own?
How can I display or see the message generated by S3 in SQS or Lambda?

You can't manipulate the S3 event data; its schema is fixed (a sample appears in the next answer). That event is passed on to the SQS queue, which wraps it in some metadata of its own and passes it along to Lambda. The AWS tutorial on using Lambda with Amazon SQS includes a full sample SQS record.
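To see what actually arrives, log the records from inside the function. Here is a minimal sketch of a Node.js handler for the S3 -> SQS -> Lambda pipeline (the handler wiring and log statements are illustrative); each SQS record's body field carries the S3 event JSON as a string:

// Minimal sketch: each SQS record's "body" is the S3 event JSON, verbatim.
exports.handler = async (event) => {
  for (const sqsRecord of event.Records) {
    // SQS-level metadata wrapped around the S3 message
    console.log('SQS messageId:', sqsRecord.messageId);
    const s3Event = JSON.parse(sqsRecord.body);
    // S3 also sends an "s3:TestEvent" message when the notification is first
    // configured; it has no Records array, so skip it.
    if (!s3Event.Records) continue;
    for (const s3Record of s3Event.Records) {
      console.log('Bucket:', s3Record.s3.bucket.name);
      console.log('Key:', s3Record.s3.object.key);
    }
  }
};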

When Amazon S3 triggers an event, a message is sent to the desired destination (AWS Lambda, Amazon SNS, Amazon SQS). The message includes the bucket name and key (filename) of the object that triggered the event.
Here is a sample event (from Using AWS Lambda with Amazon S3 - AWS Lambda):
{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-2",
      "eventTime": "2019-09-03T19:37:27.192Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "AWS:AIDAINPONIXQXHT3IKHL2"
      },
      "requestParameters": {
        "sourceIPAddress": "205.255.255.255"
      },
      "responseElements": {
        "x-amz-request-id": "D82B88E5F771F645",
        "x-amz-id-2": "vlR7PnpV2Ce81l0PRw6jlUpck7Jo5ZsQjryTjKlc5aLWGVHPZLj5NeC6qMa0emYBDXOo6QBU0Wo="
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "828aa6fc-f7b5-4305-8584-487c791949c1",
        "bucket": {
          "name": "lambda-artifacts-deafc19498e3f2df",
          "ownerIdentity": {
            "principalId": "A3I5XTEXAMAI3E"
          },
          "arn": "arn:aws:s3:::lambda-artifacts-deafc19498e3f2df"
        },
        "object": {
          "key": "b21b84d653bb07b05b1e6b33684dc11b",
          "size": 1305107,
          "eTag": "b21b84d653bb07b05b1e6b33684dc11b",
          "sequencer": "0C0F6F405D6ED209E1"
        }
      }
    }
  ]
}
The bucket can be obtained from Records[].s3.bucket.name and the key can be obtained from Records[].s3.object.key.
There is no capability to send a custom value, however, since Amazon S3 itself generates the event. But you could possibly derive a value: for example, if events from several different buckets trigger the Lambda function, the function could look at the bucket name to determine why it was triggered and then substitute the desired value, as sketched below.
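Here is a minimal sketch of that bucket-to-value mapping (the bucket names and values are invented; with the SQS pipeline from the question, you would first unwrap each SQS record's body as shown earlier):

// Minimal sketch: derive a value from the bucket name, since the S3 event
// itself cannot carry custom fields. Bucket names and values are invented.
const VALUE_BY_BUCKET = {
  'incoming-invoices': 'invoice',
  'incoming-reports': 'report'
};

exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    const derived = VALUE_BY_BUCKET[bucket] || 'unknown';
    console.log(`s3://${bucket}/${record.s3.object.key} -> ${derived}`);
  }
};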

Related

How to send scheduled custom message using Amazon EventBridge

I'm trying to build an Amazon EventBridge rule that runs on a schedule (weekly) to put an event on SQS.
There are multiple options to choose from as to what message is sent as the event.
I understand that it's essentially a JSON object, which can be set to a custom JSON, the default, or some selective fields from it. Something like:
{
  "version": "0",
  "id": "6a7e8feb-b491-4cf7-a9f1-bf3703467718",
  "detail-type": "EC2 Instance State-change Notification",
  "source": "aws.ec2",
  "account": "111122223333",
  "time": "2017-12-22T18:43:48Z",
  "region": "us-west-1",
  "resources": [
    "arn:aws:ec2:us-west-1:123456789012:instance/i-1234567890abcdef0"
  ],
  "detail": {
    "instance-id": " i-1234567890abcdef0",
    "state": "terminated"
  }
}
AWS EventBridge: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-create-rule-schedule.html
EB Events: https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-events.html
My question is: how can I send a JSON object that has different parameter values every time?
Say I want to publish this object with a different date range each time, with:
activeFrom: today minus 7 days
activeTill: today's date
{
  "dummyId": "xyz",
  "activeFrom": "2021-07-09T18:43:48Z",
  "activeTill": "2021-07-15T18:43:48Z"
}
You can let EventBridge trigger a Lambda function on the schedule. In that Lambda, you can bake your JSON and send the event to SQS.
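A minimal sketch of such a function (Node.js with AWS SDK v3, which is bundled in recent Lambda runtimes; the QUEUE_URL environment variable is an assumption):

// Minimal sketch: compute the date range at run time and send it to SQS.
const { SQSClient, SendMessageCommand } = require('@aws-sdk/client-sqs');
const sqs = new SQSClient({});

exports.handler = async () => {
  const now = new Date();
  const weekAgo = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);
  const message = {
    dummyId: 'xyz',
    activeFrom: weekAgo.toISOString(),
    activeTill: now.toISOString()
  };
  await sqs.send(new SendMessageCommand({
    QueueUrl: process.env.QUEUE_URL, // assumed to be set on the function
    MessageBody: JSON.stringify(message)
  }));
};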

Cloudwatch: event type syntax for monitoring S3 files

I need to create a CloudWatch event that runs a Lambda function every time my file in S3 gets updated/re-uploaded. What "eventName" should I use? I tried using "ObjectCreated" but it doesn't seem to work. Perhaps the syntax is incorrect.
https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetObject.html
{
  "source": [
    "aws.s3"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "s3.amazonaws.com"
    ],
    "eventName": [
      "ObjectCreated:*"
    ],
    "requestParameters": {
      "bucketName": [
        "mynewbucket"
      ],
      "key": [
        "file.csv"
      ]
    }
  }
}
CloudWatch Events (or EventBridge) does not automatically track data events for S3 objects. You need CloudTrail for this: it tracks data events on a particular S3 bucket and emits CloudWatch Events (or EventBridge) events for them: https://aws.amazon.com/blogs/compute/using-dynamic-amazon-s3-event-handling-with-amazon-eventbridge/
Alternatively, you can use S3 Event Notifications with an SNS topic and a Lambda subscription on that topic.
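If you take the CloudTrail route, note that the eventName values in the rule are the S3 API call names (e.g. PutObject), not S3 notification names like ObjectCreated:*. Assuming a trail with data events enabled for the bucket, a pattern along these lines should match:

{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["PutObject", "CompleteMultipartUpload", "CopyObject"],
    "requestParameters": {
      "bucketName": ["mynewbucket"],
      "key": ["file.csv"]
    }
  }
}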

AWS account Id in SNS topic event

Is there any way to get the AWS account ID in an SNS topic event delivered to the subscriber? In my case, I want multiple customer accounts to be able to send their S3 putObject notifications to a given SNS topic ARN in my account, and I have a Lambda function subscribed to that topic. I'm getting an event payload in my Lambda handler whenever a customer puts an object into an S3 bucket. But since there will be many customers, my Lambda needs to determine which customer an incoming event is from. So I need the customer account ID to be available in the SNS event payload. Is that possible?
The message received by the subscriber already contains the ARNs of both the subscription and the topic, and the account ID can be parsed from them:
"TopicArn": "arn:aws:sns:us-east-2:123456789012:sns-lambda"
"EventSubscriptionArn": "arn:aws:sns:us-east-2:123456789012:sns-lambda:21be56ed-a058-49f5-8c98-aedd2564c486"
It appears that your situation is:
Multiple AWS Accounts have Amazon S3 buckets with an Amazon S3 Event configured to trigger your AWS Lambda function
You want the Lambda function to be able to detect which account triggered the event
I don't think that this information is available. Here is a sample S3 Put event from the AWS Lambda "Test" console:
{
  "Records": [
    {
      "eventVersion": "2.0",
      "eventSource": "aws:s3",
      "awsRegion": "ap-southeast-2",
      "eventTime": "1970-01-01T00:00:00.000Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "EXAMPLE"
      },
      "requestParameters": {
        "sourceIPAddress": "127.0.0.1"
      },
      "responseElements": {
        "x-amz-request-id": "EXAMPLE123456789",
        "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "testConfigRule",
        "bucket": {
          "name": "example-bucket",
          "ownerIdentity": {
            "principalId": "EXAMPLE"
          },
          "arn": "arn:aws:s3:::example-bucket"
        },
        "object": {
          "key": "test/key",
          "size": 1024,
          "eTag": "0123456789abcdef0123456789abcdef",
          "sequencer": "0A1B2C3D4E5F678901"
        }
      }
    }
  ]
}
There does not appear to be a field containing the Account ID of the source bucket.
To confirm this, I triggered an event on an S3 bucket and logged the event. I could not find any reference to an AWS Account ID.

Get input in Lambda from S3 event trigger

I am trying to trigger a Lambda once I get any file in an S3 bucket. For that, I have configured the event in the S3 bucket. But I need to give an input to the Lambda (event) that will be triggered. How do I do that?
Add a new Lambda function handler to the project. Create a new function handler name, say 'S3FunctionHandler'; use the default input type that is already selected, S3 Event, and leave the output type as object.
This will create some boilerplate code with a Lambda function handler that takes an S3 event as input.
Select S3FunctionHandler and select an IAM role.
Then switch over to the AWS Management Console to test the Lambda function with a dummy S3 event. To do this, configure a test event: select the S3 Put event, which you can find under the Actions tab. This simulates somebody uploading a new object to an S3 bucket.
If an Amazon S3 Event is configured to trigger an AWS Lambda function, then S3 will provide information to the Lambda function about the S3 object that triggered the function.
From Using AWS Lambda with Amazon S3 - AWS Lambda:
{
  "Records": [
    {
      "eventVersion": "2.1",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-2",
      "eventTime": "2019-09-03T19:37:27.192Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "AWS:AIDAINPONIXQXHT3IKHL2"
      },
      "requestParameters": {
        "sourceIPAddress": "205.255.255.255"
      },
      "responseElements": {
        "x-amz-request-id": "D82B88E5F771F645",
        "x-amz-id-2": "vlR7PnpV2Ce81l0PRw6jlUpck7Jo5ZsQjryTjKlc5aLWGVHPZLj5NeC6qMa0emYBDXOo6QBU0Wo="
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "828aa6fc-f7b5-4305-8584-487c791949c1",
        "bucket": {
          "name": "my-bucket",
          "ownerIdentity": {
            "principalId": "A3I5XTEXAMAI3E"
          },
          "arn": "arn:aws:s3:::my-bucket"
        },
        "object": {
          "key": "foo.jpg",
          "size": 1305107,
          "eTag": "b21b84d653bb07b05b1e6b33684dc11b",
          "sequencer": "0C0F6F405D6ED209E1"
        }
      }
    }
  ]
}
This information includes the bucket name, the key (filename) of the object, the event that triggered the function, and various other tidbits of information. The Lambda function can then use this information to process the object appropriately.
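A minimal sketch of a Node.js handler pulling those fields out (note that the key arrives URL-encoded in the event, so decode it before use):

exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // Keys are URL-encoded in the event, with spaces as '+'
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));
    console.log(`${record.eventName}: s3://${bucket}/${key} (${record.s3.object.size} bytes)`);
  }
};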

Amazon Cognito Streams gives garbage data in the Lambda events

I have configured Lambda to read from the Kinesis stream that Cognito Sync writes to. I see an event in the Lambda logs for every Cognito Sync call, but the event does not contain the data key-values that I have set in the dataset. I do see the key-values being sent to Cognito in the request.
The event that Lambda gets looks like the following. How do I get the key-values of the dataset from this?
2015-03-07T16:18:40.082Z 9be3582e-c4e5-11e4-be53-6f01632e7b6d
{
  "Records": [
    {
      "eventSource": "aws:kinesis",
      "kinesis": {
        "partitionKey": "us-east-1:d4bfff5d-9605-484d-9aab-0e63829b1e54-Fia",
        "kinesisSchemaVersion": "1.0",
        "data": "eyJpZGVudGl0eVBvb2xJZCI6InVzLWVhc3QtMTowMmFiM2JiYi04N2RlLTQyMzUtYWEyZS1kNzliYzQ1YmFmOTciLCJpZGVudGl0eUlkIjoidXMtZWFzdC0xOmQ0YmZmZjVkLTk2MDUtNDg0ZC05YWFiLTBlNjM4MjliMWU1NCIsImRhdGFzZXROYW1lIjoiRmlhciIsIm9wZXJhdGlvbiI6InJlcGxhY2UiLCJwYXlsb2FkVHlwZSI6IklubGluZSIsImtpbmVzaXNTeW5jUmVjb3JkcyI6W3sia2V5IjoiU3RhdGUiLCJ2YWx1ZSI6IltbXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCJdLFtcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIl0sW1wiT1wiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiXSxbXCJYXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCJdLFtcIk9cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIl0sW1wiWFwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiXV0iLCJzeW5jQ291bnQiOjYsImxhc3RNb2RpZmllZERhdGUiOjE0MjU3NDUxMTQ3NjMsImRldmljZUxhc3RNb2RpZmllZERhdGUiOjE0MjU3NDUxMTE0NDAsIm9wIjoicmVwbGFjZSJ9XSwia2luZXNpc1N5bmNSZWNvcmRzVVJMIjpudWxsLCJsYXN0TW9kaWZpZWREYXRlIjoxNDI1NzQ1MTE0NzYzLCJzeW5jQ291bnQiOjZ9",
        "sequenceNumber": "49548516359756600751834810213344902796782628138546888706"
      },
      "eventID": "shardId-000000000000:49548516359756600751834810213344902796782628138546888706",
      "invokeIdentityArn": "arn:aws:iam::111111111111:role/LambdaKinesisInvocationRole-funcog",
      "eventName": "aws:kinesis:record",
      "eventVersion": "1.0",
      "eventSourceARN": "arn:aws:kinesis:us-east-1:111111111111:stream/funcog",
      "awsRegion": "us-east-1"
    }
  ]
}
It appears that the data you get from Kinesis (in the .Records[0].kinesis.data element) is Base64-encoded. Decoding gives the following:
{"identityPoolId":"us-east-1:02ab3bbb-87de-4235-aa2e-d79bc45baf97","identityId":"us-east-1:d4bfff5d-9605-484d-9aab-0e63829b1e54","datasetName":"Fiar","operation":"replace","payloadType":"Inline","kinesisSyncRecords":[{"key":"State","value":"[[\"-\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"-\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"O\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"X\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"O\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"X\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"]]","syncCount":6,"lastModifiedDate":1425745114763,"deviceLastModifiedDate":1425745111440,"op":"replace"}],"kinesisSyncRecordsURL":null,"lastModifiedDate":1425745114763,"syncCount":6}
So in your Lambda function, you will need to parse this data. One way to do so might be the following:
// Decode the Base64 payload, then parse the JSON it contains
var data = JSON.parse(Buffer.from(event.Records[0].kinesis.data, 'base64').toString('utf8'));
console.log("Key: " + data.kinesisSyncRecords[0].key);
// etc...