CloudWatch alarm to SNS in different region - amazon-web-services

I'm trying to notify an SNS topic from a CloudWatch alarm that's in a different region. The reason is that I want SMS alerting, which isn't available in the region where my services are. If I enter the ARN of the subscription and save the changes in the console, I get "There was an error saving the alarm. Please try again." Trying again does not help. Using a topic in the local region does work, but that's not what I need.
Is there a way to notify a topic in a different region? If not, is there another easy way I can achieve my goal?

I didn't find any docs that explicitly say this can't be done, but I tried to set an SNS topic from us-east-1 as an action of an alarm in eu-west-1 using the CLI and got this:
An error occurred (ValidationError) when calling the PutMetricAlarm operation: Invalid region us-east-1 specified. Only eu-west-1 is supported.
So I'll assume it's not supported.
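For reference, the equivalent call through boto3 is rejected the same way; a quick sketch with made-up alarm, metric and topic names:

import boto3

# Alarm lives in eu-west-1, topic in us-east-1 (placeholder names/ARNs).
cloudwatch = boto3.client('cloudwatch', region_name='eu-west-1')

cloudwatch.put_metric_alarm(
    AlarmName='cross-region-test',
    Namespace='AWS/EC2',
    MetricName='CPUUtilization',
    Statistic='Average',
    Period=300,
    EvaluationPeriods=1,
    Threshold=80.0,
    ComparisonOperator='GreaterThanThreshold',
    AlarmActions=['arn:aws:sns:us-east-1:123456789012:my-sms-topic'],
)
# Raises botocore.exceptions.ClientError with the ValidationError shown above.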
To get the functionality you need, you can use AWS Lambda. Let's say your service is in a region where SMS is not supported; I'll use eu-central-1 as an example.
Setup would go like this:
[us-east-1] Create your SNS topic that can send SMS messages, in a region where SMS is supported.
[eu-central-1] Create a Lambda function, in the region where your service is, that publishes messages to the SNS topic from step 1.
[eu-central-1] Create an SNS topic in the region where your service is. Configure a subscription on it with the AWS Lambda protocol and point it at the Lambda from step 2.
[eu-central-1] Create your alarm in the region where your service is and set the SNS topic from step 3 as an action.

To add to @Tartaglia's answer, here's the source of such a Lambda function using Python 3, cobbled together from various sources because I don't have time to do it properly:
import json
import logging

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Client in the region that hosts the SMS-capable topic.
session = boto3.Session(region_name='eu-west-1')  # EU (Ireland)
sns_client = session.client('sns')


def lambda_handler(event, context):
    logger.info('Received event: %s', event)
    for record in event['Records']:
        sns_message = record['Sns']
        # Forward the incoming notification to the remote topic.
        publish_kwargs = {
            'TopicArn': 'YOUR TOPIC ARN HERE',
            'Message': sns_message.get('Message', ''),
        }
        # Subject is optional and must be a string if supplied.
        if sns_message.get('Subject'):
            publish_kwargs['Subject'] = sns_message['Subject']
        response = sns_client.publish(**publish_kwargs)
        logger.info('Publish response: %s', response)
    return 'OK'

Related

Lambda boto3 function to verify if a CloudTrail trail is created

I need help with creating a Lambda function to verify that certain AWS services are enabled or created. I need a script that verifies Macie, CloudTrail and Config are all configured/enabled. I started with CloudTrail but it's erroring out.
import boto3

def lambda_handler(event, context):
    client = boto3.client('cloudtrail')
    response = client.get_trail_status(Name='Test')
    print(response)
Based on the information given so far, I believe the issue is that you are not providing the Name properly.
As per the docs, Name accepts either the trail name or the CloudTrail ARN of the trail, and for a shadow trail you must pass the ARN:
Request Parameters
For information about the parameters that are common to all actions, see Common Parameters.
The request accepts the following data in JSON format.
Name
Specifies the name or the CloudTrail ARN of the trail for which you are requesting status. To get the status of a shadow trail (a replication of the trail in another region), you must specify its ARN.
The following is the format of a trail ARN.
arn:aws:cloudtrail:us-east-2:123456789012:trail/MyTrail
Type: String
Required: Yes
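A minimal sketch of the corrected call, using a hypothetical trail ARN (substitute your own region, account id and trail name):

import boto3

def lambda_handler(event, context):
    client = boto3.client('cloudtrail')
    # Hypothetical ARN; replace with the ARN of your trail.
    response = client.get_trail_status(
        Name='arn:aws:cloudtrail:us-east-2:123456789012:trail/Test'
    )
    print(response)
    return response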

boto3 publish message in SNS

I want to run a script to publish a message to my SNS topic. I followed the documentation and my code is this:
import boto3

client = boto3.client('sns')
response = client.publish(
    TopicArn='my topic arn',
    Message='ptt message '
)
print("Response: {}".format(response))
It prints the response JSON, but I don't see any message in my AWS SNS console. I have a pretty basic Lambda function subscribed to this SNS topic.
The Lambda code:
import json

def lambda_handler(event, context):
    # TODO implement
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
How can I be sure that my message "ptt message" has been published successfully? How can I check it?
For testing the integration you can configure delivery status logging for your SNS topic. There is a simple setting for this in the SNS console (GUI). This exports the logs of your subscription deliveries to CloudWatch.
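The same can be scripted; here is a sketch with boto3, assuming you already have an IAM role that SNS can assume to write to CloudWatch Logs (the topic and role ARNs below are placeholders):

import boto3

sns = boto3.client('sns')
topic_arn = 'arn:aws:sns:eu-west-1:123456789012:my-topic'      # placeholder
feedback_role = 'arn:aws:iam::123456789012:role/SNSFeedback'   # placeholder role for SNS to assume

# Enable delivery status logging for Lambda endpoints on this topic.
for name, value in [
    ('LambdaSuccessFeedbackRoleArn', feedback_role),
    ('LambdaFailureFeedbackRoleArn', feedback_role),
    ('LambdaSuccessFeedbackSampleRate', '100'),  # log 100% of successful deliveries
]:
    sns.set_topic_attributes(TopicArn=topic_arn, AttributeName=name, AttributeValue=value)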
Secondly, your Lambda is not doing anything with the event, so your "ptt message" just sits inside the event dictionary. You can, for example, log the event and see the structure in which the actual message body arrives.
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)  # Lambda's default log level would hide INFO messages

def lambda_handler(event, context):
    logger.info(event)
    ...
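Once you know that structure, pulling the body out is straightforward; a minimal sketch assuming the standard SNS-to-Lambda event shape:

def lambda_handler(event, context):
    # Each SNS-triggered invocation carries one or more records.
    for record in event['Records']:
        message = record['Sns']['Message']     # this is the "ptt message" text
        subject = record['Sns'].get('Subject')
        print(subject, message)
    return {'statusCode': 200}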
And simply to see that the Lambda was triggered by the subscription, you can open CloudWatch Logs, find the log group automatically created by AWS for your Lambda (it has the name of the function in it) and search that log group. There are convenient filters such as "last X minutes".
To test this integration you can also publish test messages to the SNS topic from the web console, if that feels more convenient than running a script.

ansible sns - why is subscriber optional

When I subscribe a Lambda function to an SNS topic through either the AWS console or the AWS CLI, there is a column shown in the subscription information called "subscriber". It contains my account id.
I don't seem to be able to control this field through the CLI or console, however I can through Ansible:
---
- name: set up topic and subscribe lambda
  sns_topic:
    name: "topic_name"
    state: present
    display_name: "Display Name"
    subscriptions:
      - endpoint: "arn:aws:lambda:ap-southeast-2:123456789:function:functionName"
        protocol: "lambda"
        subscriber: 123456789
The subscriber field is optional, however SNS messages do not seem to reach the Lambda without it.
What is this field for?
Can I subscribe Lambdas in other accounts to my SNS topic?
How can I change this field with the AWS CLI?
Why is it optional in Ansible 2.4?
I couldn't find any reference to this ghost field, i.e. subscriber. Maybe it was supported in an older SDK/API and has since been dropped.
Research:
I don't see a subscriber field in the console when subscribing a Lambda function.
Even through the CLI there is no such field (see the SNS subscribe command in the CLI documentation).
[Extra research]: Even the Java SDK asks for only three parameters in order to subscribe (see the Javadoc):
subscribe(String topicArn, String protocol, String endpoint)
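For comparison, boto3 exposes the same three-parameter shape; a minimal sketch with placeholder ARNs:

import boto3

sns = boto3.client('sns')

# Only topic, protocol and endpoint are needed; there is no "subscriber" parameter.
sns.subscribe(
    TopicArn='arn:aws:sns:ap-southeast-2:123456789:topic_name',                 # placeholder
    Protocol='lambda',
    Endpoint='arn:aws:lambda:ap-southeast-2:123456789:function:functionName',   # placeholder
)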

How to subscribe an SNS topic of one account by SQS of another account using boto3?

I'm trying to create an SNS topic in one account and attach it to Config Rules.
I have 3 such accounts and want to create an SNS topic in each of the accounts.
Now I want to subscribe an SQS queue in a fourth account to all 3 topics in the 3 different accounts.
I'm able to do it manually. Can somebody please tell me how it can be done via boto3?
Thanks in advance.
In order to subscribe an SQS queue in Account B to an SNS topic in Account A using boto3, the procedure is as follows.
In Account A, create the SNS topic and add the proper permission.
For example,
import boto3

sns_client = boto3.client('sns')
topics = sns_client.create_topic(Name='SNS topic name')

sns_client.add_permission(
    TopicArn=topics['TopicArn'],
    Label='AccountB-access',  # any label identifying this permission statement
    AWSAccountId=[
        "AccountB_Id",
    ],
    ActionName=[
        "GetTopicAttributes",
        "SetTopicAttributes",
        "AddPermission",
        "RemovePermission",
        "DeleteTopic",
        "Subscribe",
        "ListSubscriptionsByTopic",
        "Publish",
        "Receive"
    ]
)
Now, to subscribe to the topic created above from Account B, execute the following code in Account B.
import boto3

subscription_client = boto3.client('sns')
subscription_client.subscribe(
    TopicArn="ARN of the topic created",
    Protocol="sqs",
    Endpoint="ARN of the SQS present in Account B"
)
Now you should see the SQS queue in Account B subscribed to the SNS topic in Account A.

AWS: Boto SQS writing isn't saving

I'm writing to SQS using boto as follows.
Printing the return value gives me a Message object:
<boto.sqs.message.Message object at 0x102dd4790>
But I'm not seeing that message in the AWS SQS console.
SQS is region specific, so initialising the connection with the correct region fixes this:
sqs = boto.sqs.connect_to_region("ap-southeast-2", aws_access_key_id=AWS_ACCESS_KEY_ID, aws_secret_access_key=AWS_SECRET_ACCESS_KEY)
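For completeness, a minimal end-to-end sketch with boto 2, assuming a queue named my-queue already exists in ap-southeast-2 and credentials are picked up from the usual boto configuration:

import boto.sqs
from boto.sqs.message import Message

# Connect to the region where the queue actually lives.
conn = boto.sqs.connect_to_region("ap-southeast-2")

queue = conn.get_queue('my-queue')  # hypothetical queue name

msg = Message()
msg.set_body('hello from boto')
queue.write(msg)  # returns the written Message object, like the one printed above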