I have a Step Function and I want to send a notification to an SNS topic once the state machine completes its execution. I know the state machine can trigger a Lambda and I could implement the logic there, but I wanted to check whether the state machine can notify SNS directly.
Step Functions does support SNS. You could have a Lambda function publish to SNS, but you can also use SNS with Step Functions directly, without any Lambda. Below you can see the implementation:
{
  "Publish to SNS to notify DataDog": {
    "Type": "Task",
    "Resource": "arn:aws:states:::sns:publish",
    "Parameters": {
      "TopicArn": "<SNS topic ARN>",
      "Message.$": "$",
      "MessageAttributes": {
        "notify": {
          "DataType": "String",
          "StringValue": "Send to SNS"
        }
      }
    },
    "Next": "<Next state>"
  }
}
You can read more about it in the AWS Step Functions documentation for the SNS service integration.
I want to schedule events via EventBridge so that EventBridge sends the events to SNS, with an SQS queue subscribed to the topic; my Spring Boot application will then listen to SQS.
The problem is that I cannot find a way to provide details in this event.
I want to send something like this:
{
  "version": "0",
  "id": "89d1a02d-5ec7-412e-82f5-13505f849b41",
  "detail-type": "Scheduled Event",
  "source": "aws.events",
  "time": "2016-12-30T18:44:49Z",
  "detail": {"use-case-name": "Update all customers"}
}
Is there any way I can put details in there?
I tried to configure it, but the event still does not have any information in detail:
{
  "version": "0",
  "id": "7e62a5fa-2f75-d89d-e212-40dad2b9ae43",
  "detail-type": "Scheduled Event",
  "source": "aws.events",
  "resources": [
    "..."
  ],
  "detail": {}
}
You can use the target's Input or InputTransformer attribute to send information to the target (SNS/SQS in your scenario). You can pass a static JSON message, or transform the input message based on the event data.
Note: the AWS EventBridge console has these fields, so you can test them without writing code. You won't see the target input on the sample event details, but if you go to the SQS console and poll for messages, you can confirm that the messages passed to SQS include the JSON string you defined on the EventBridge side.
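A minimal sketch of the same configuration with boto3 (the rule name, topic ARN, and payload here are assumptions, not values from the question):

import json

import boto3

events = boto3.client("events")

# Attach a static JSON Input to the scheduled rule's SNS target.
# With Input set, the target receives this JSON instead of the
# default scheduled event (whose "detail" is empty).
events.put_targets(
    Rule="my-scheduled-rule",  # hypothetical rule name
    Targets=[
        {
            "Id": "sns-target",
            "Arn": "arn:aws:sns:us-east-1:123456789012:my-topic",  # hypothetical topic
            "Input": json.dumps({"use-case-name": "Update all customers"}),
        }
    ],
)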
In my pipeline I have an event notification on an S3 bucket which triggers an SNS topic. That SNS topic in turn has a Lambda function subscribed to it. I need the SNS topic to send a hard-coded message body to the Lambda because it gets used in that function.
Since the SNS topic publishes the message automatically when the S3 event notification fires, I am wondering if and how I can edit the message that gets sent to the Lambda.
To be clear: I want the same message sent every time. The goal is for the Lambda to get a variable that depends only on which topic triggered it.
Currently I am building this through the UI but will eventually code it in Terraform for production.
When Amazon SNS triggers an AWS Lambda function, the information it sends includes the SNS TopicArn.
You could use that ARN to determine which SNS Topic triggered the Lambda function, and therefore which action it should process.
{
  "Records": [
    {
      "EventSource": "aws:sns",
      "EventVersion": "1.0",
      "EventSubscriptionArn": "arn:aws:sns:us-east-1:{{{accountId}}}:ExampleTopic",
      "Sns": {
        "Type": "Notification",
        "MessageId": "95df01b4-ee98-5cb9-9903-4c221d41eb5e",
        "TopicArn": "arn:aws:sns:us-east-1:123456789012:ExampleTopic",
        "Subject": "example subject",
        "Message": "example message",
        "Timestamp": "1970-01-01T00:00:00.000Z",
        "SignatureVersion": "1",
        "Signature": "EXAMPLE",
        "SigningCertUrl": "EXAMPLE",
        "UnsubscribeUrl": "EXAMPLE",
        "MessageAttributes": {
          "Test": {
            "Type": "String",
            "Value": "TestString"
          },
          "TestBinary": {
            "Type": "Binary",
            "Value": "TestBinary"
          }
        }
      }
    }
  ]
}
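For example, a minimal handler sketch that branches on the TopicArn (the ARNs and action names below are hypothetical):

# Map each subscribed topic to the hard-coded value this function needs.
TOPIC_ACTIONS = {
    "arn:aws:sns:us-east-1:123456789012:TopicA": "reprocess",
    "arn:aws:sns:us-east-1:123456789012:TopicB": "archive",
}

def lambda_handler(event, context):
    for record in event["Records"]:
        topic_arn = record["Sns"]["TopicArn"]
        action = TOPIC_ACTIONS.get(topic_arn, "default")
        print(f"Triggered by {topic_arn}, action: {action}")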
Rather than having Amazon S3 send a message to Amazon SNS directly, you might be able to configure an Amazon CloudWatch Events rule that triggers on object creation and sends a Constant as part of the message to Amazon SNS.
If large files are being uploaded, you might also need to trigger it on CompleteMultipartUpload.
You could also have the rule trigger the AWS Lambda function directly (without going via Amazon SNS), depending upon your use case. A Constant can also be specified for this, as in the sketch below.
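A sketch of that rule in boto3 (the bucket, rule, and function names are assumptions; the function also needs a resource-based permission allowing events.amazonaws.com to invoke it):

import json

import boto3

events = boto3.client("events")

# Match object-level S3 API calls recorded by CloudTrail.
events.put_rule(
    Name="s3-upload-rule",
    EventPattern=json.dumps({
        "source": ["aws.s3"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {
            "eventSource": ["s3.amazonaws.com"],
            "eventName": ["PutObject", "CompleteMultipartUpload"],
            "requestParameters": {"bucketName": ["my-bucket"]},
        },
    }),
)

# The console's "Constant (JSON text)" corresponds to the target's Input.
events.put_targets(
    Rule="s3-upload-rule",
    Targets=[
        {
            "Id": "lambda-target",
            "Arn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
            "Input": json.dumps({"origin": "s3-upload-rule"}),
        }
    ],
)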
I have the following event pattern that I want my Lambda function to be triggered from, but I'm unable to find any documentation on setting this in the zappa_settings.json file. I'm only able to find SNS and EventBridge cron expressions.
{
  "source": ["aws.ec2"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["ec2.amazonaws.com"],
    "eventName": ["RunInstances"]
  }
}
How can I get my lambda function to create and trigger from this event?
I can add an SNS topic as a trigger to the Lambda function and point EventBridge at the topic. Not perfect, since the UI allows EventBridge to trigger Lambda functions directly, but it works. Just add the following to the zappa_settings file and set up the EventBridge rule outside Zappa (a sketch of that follows the settings).
"events": [
{
"function": "auto_tagger.lambda_handler",
"event_source": {
"arn": "arn:aws:sns:eu-west-1:1234567890:trigger-test",
"events": [
"sns:Publish"
]
}
}
]
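A sketch of the EventBridge side with boto3 (the rule name is an assumption; the topic ARN matches the settings above):

import json

import boto3

events = boto3.client("events")

# Rule matching the event pattern from the question.
events.put_rule(
    Name="run-instances-rule",
    EventPattern=json.dumps({
        "source": ["aws.ec2"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {
            "eventSource": ["ec2.amazonaws.com"],
            "eventName": ["RunInstances"],
        },
    }),
)

# Point the rule at the SNS topic that triggers the Zappa function.
# The topic's access policy must also allow events.amazonaws.com to publish.
events.put_targets(
    Rule="run-instances-rule",
    Targets=[{"Id": "sns", "Arn": "arn:aws:sns:eu-west-1:1234567890:trigger-test"}],
)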
Question
How can I find out the format of the events coming to Lambda from AWS services?
The AWS Lambda console includes a Test function, which can provide a sample event for most of the events that are generated by AWS.
You can modify these sample events to include your specific data.
For example, the Amazon S3 Put sample event simulates a new object being added to an Amazon S3 bucket. You can modify the event to include your own Bucket and Object names, then use it to test the function without actually using Amazon S3.
Generating Sample Event Payloads
$ sam local generate-event --help
Usage: sam local generate-event [OPTIONS] COMMAND [ARGS]...
You can use this command to generate sample payloads from different event
sources such as S3, API Gateway, and SNS. These payloads contain the
information that the event sources send to your Lambda functions.
Commands:
alexa-skills-kit
alexa-smart-home
apigateway
batch
cloudformation
cloudfront
cloudwatch
codecommit
codepipeline
cognito
config
connect
dynamodb
kinesis
lex
rekognition
s3
sagemaker
ses
sns
sqs
stepfunctions
S3 put
$ sam local generate-event s3 put
{
  "Records": [
    {
      "eventVersion": "2.0",
      "eventSource": "aws:s3",
      "awsRegion": "us-east-1",
      "eventTime": "1970-01-01T00:00:00.000Z",
      "eventName": "ObjectCreated:Put",
      "userIdentity": {
        "principalId": "EXAMPLE"
      },
      "requestParameters": {
        "sourceIPAddress": "127.0.0.1"
      },
      "responseElements": {
        "x-amz-request-id": "EXAMPLE123456789",
        "x-amz-id-2": "EXAMPLE123/5678abcdefghijklambdaisawesome/mnopqrstuvwxyzABCDEFGH"
      },
      "s3": {
        "s3SchemaVersion": "1.0",
        "configurationId": "testConfigRule",
        "bucket": {
          "name": "example-bucket",
          "ownerIdentity": {
            "principalId": "EXAMPLE"
          },
          "arn": "arn:aws:s3:::example-bucket"
        },
        "object": {
          "key": "test/key",
          "size": 1024,
          "eTag": "0123456789abcdef0123456789abcdef",
          "sequencer": "0A1B2C3D4E5F678901"
        }
      }
    }
  ]
}
Firehose
$ sam local generate-event kinesis kinesis-firehose
{
  "invocationId": "invocationIdExample",
  "deliveryStreamArn": "arn:aws:kinesis:EXAMPLE",
  "region": "us-east-1",
  "records": [
    {
      "recordId": "49546986683135544286507457936321625675700192471156785154",
      "approximateArrivalTimestamp": 1495072949453,
      "data": "SGVsbG8sIHRoaXMgaXMgYSB0ZXN0IDEyMy4="
    }
  ]
}
Update
As per the comment by @John Rotenstein, the Lambda console can generate many more sample events.
For Go, the aws-lambda-go/events package provides sample code showing how to handle events from different sources, along with sample test data. It's a good resource to have a look at.
Just print out the event the first time you start your development. For Python, the command is:
print(json.dumps(event))
The output should be available in the CloudWatch log group for your Lambda. This is most useful for debugging and testing with real live events.
Sometimes you can also find the format in the documentation, but I found that just printing it out is the fastest and most reliable way to get to know the event format.
I have created a Lambda function and I want to trigger it from Amazon SQS. For the event value in handler(event, context), I want to pass a value from SQS. I want to send a big JSON payload. How can I do that?
From Sample Events Published by Event Sources - AWS Lambda, Amazon SQS will send this event information to the AWS Lambda function:
{
  "Records": [
    {
      "messageId": "c80e8021-a70a-42c7-a470-796e1186f753",
      "receiptHandle": "...",
      "body": "{\"foo\":\"bar\"}",
      "attributes": {
        "ApproximateReceiveCount": "3",
        "SentTimestamp": "1529104986221",
        "SenderId": "594035263019",
        "ApproximateFirstReceiveTimestamp": "1529104986230"
      },
      "messageAttributes": {},
      "md5OfBody": "9bb58f26192e4ba00f01e2e7b136bbd8",
      "eventSource": "aws:sqs",
      "eventSourceARN": "arn:aws:sqs:us-west-2:594035263019:NOTFIFOQUEUE",
      "awsRegion": "us-west-2"
    }
  ]
}
The body of the SQS message is provided in the body parameter.
The maximum size of an SQS message is 256 KB, but I'm not sure you'd be able to pass something that big to Lambda. I recommend you try it and see!
Worst case, store the content in Amazon S3 and pass a reference to the S3 object in the message.
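A rough sketch of both options with boto3 (the queue URL, bucket, and keys are hypothetical):

import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-west-2.amazonaws.com/123456789012/my-queue"  # hypothetical

# Option 1: send the JSON itself (must stay under the 256 KB SQS limit).
sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps({"foo": "bar"}))

# Option 2: store the big JSON in S3 and send only a reference.
s3 = boto3.client("s3")
s3.put_object(Bucket="my-bucket", Key="payloads/big.json",
              Body=json.dumps({"big": "payload"}))
sqs.send_message(QueueUrl=QUEUE_URL,
                 MessageBody=json.dumps({"s3_bucket": "my-bucket",
                                         "s3_key": "payloads/big.json"}))

# In the Lambda function, parse each record's body back into JSON:
def lambda_handler(event, context):
    for record in event["Records"]:
        payload = json.loads(record["body"])
        print(payload)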
Create an SQS queue. This SQS queue should take S3 bucket names as input. Maybe it should also take the region of the S3 bucket as well? You might want to have it take a JSON object:
{"bucketname": "this_is_my_bucket", "region": "us-west-2"}