I need help creating a Lambda function to verify that certain AWS services are enabled or created. I need a script that verifies Macie, CloudTrail, and Config are all configured/enabled. I started with CloudTrail, but it's erroring out and I could use some help.
import boto3
def lambda_handler(event, context):
    client = boto3.client('cloudtrail')
    response = client.get_trail_status(Name='Test')
    print(response)
Based on the information given so far, I believe the issue is the value you are passing for 'Name':
As per the docs, you should pass the trail's ARN rather than just the trail name:
Request Parameters
For information about the parameters that are common to all actions,
see Common Parameters.
The request accepts the following data in JSON format.
Name
Specifies the name or the CloudTrail ARN of the trail for which you are requesting status. To get the status of a shadow trail (a
replication of the trail in another region), you must specify its ARN.
The following is the format of a trail ARN.
arn:aws:cloudtrail:us-east-2:123456789012:trail/MyTrail
Type: String
Required: Yes
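For reference, here is a minimal sketch of what such a check could look like once the trail's ARN is used; the Config and Macie calls are assumptions added for the other services you mentioned, not something from your original code:
import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    cloudtrail = boto3.client('cloudtrail')

    # describe_trails() returns the TrailARN for each trail; passing the ARN
    # to get_trail_status() avoids the naming issue described above.
    for trail in cloudtrail.describe_trails().get('trailList', []):
        status = cloudtrail.get_trail_status(Name=trail['TrailARN'])
        print(trail['Name'], 'IsLogging:', status['IsLogging'])

    # AWS Config: check whether the configuration recorder is recording.
    config = boto3.client('config')
    for recorder in config.describe_configuration_recorder_status().get(
            'ConfigurationRecordersStatus', []):
        print(recorder['name'], 'recording:', recorder['recording'])

    # Macie: get_macie_session() fails if Macie is not enabled in the account/region.
    macie = boto3.client('macie2')
    try:
        print('Macie status:', macie.get_macie_session()['status'])
    except ClientError:
        print('Macie does not appear to be enabled')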
I'm trying to create an Amazon CloudWatch rule which triggers whenever an object is uploaded into a bucket. I know that to do this I need to trigger on the PutObject event; however, as best I can tell, that requires enabling object-level logging on the bucket. I will be using a multitude of buckets and want to be able to automate that process, and because of how most of the system is set up, using boto3 seems to make the most sense. So how can I turn object-level logging on using boto3?
The only official AWS resource I've been able to find so far is: How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events?
It explains how to enable object-level logging through the GUI.
I've also looked through the boto3 library documentation.
Neither has ultimately been helpful, based on my understanding.
My chief goal is to enable object-level logging through boto3, if that's something that can be done.
You can configure an Amazon S3 Event so that, when a new object is created, it can:
Trigger an AWS Lambda function
Put a message in an Amazon SQS queue
Send a message to an Amazon SNS topic
See: Configuring Amazon S3 Event Notifications
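If you go the S3 Event Notifications route, a minimal boto3 sketch could look like the following (the bucket name and Lambda ARN are placeholders, and the Lambda must already allow s3.amazonaws.com to invoke it):
import boto3

s3 = boto3.client('s3')

# Invoke a Lambda function whenever any object is created in the bucket.
s3.put_bucket_notification_configuration(
    Bucket='your_bucket_name',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [
            {
                'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:123456789012:function:your-function',
                'Events': ['s3:ObjectCreated:*'],
            },
        ],
    },
)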
You can use the put_event_selectors() function of the CloudTrail service (note it is a CloudTrail API, so the client must be created for 'cloudtrail', not 's3'):
# put_event_selectors() belongs to CloudTrail, not S3.
client = boto3.client('cloudtrail')
client.put_event_selectors(
    TrailName='TrailName',
    EventSelectors=[
        {
            'ReadWriteType': 'All',
            'IncludeManagementEvents': True,
            'DataResources': [
                {
                    'Type': 'AWS::S3::Object',
                    'Values': [
                        'arn:aws:s3:::your_bucket_name/',
                    ],
                },
            ],
        },
    ],
)
I'm creating a logs aggregator lambda to send Cloudwatch logs to a private log analysis service. Given the number of resources used by my employer, it was decided to create a subscription lambda that handles log group subscription to the aggregator.
The solution works fine, but it requires manually searching for a resource's log group in the Amazon console and then invoking the subscription lambda with it.
My question:
Is there a way, given a resource ARN, to find which log group is mapped to it? Since I'm using CloudFormation to create resources, it is easy to export a resource's ARN.
UPDATE
To present an example:
Let's say I have the following ARN:
arn:aws:appsync:<REGION>:<ACCOUNTID>:apis/z3pihpr4gfbzhflthkyjjh6yvu
which is an AppSync GraphQL API.
What I want is a method (using the API or some automated solution) to get the CloudWatch log group of that resource.
You can try the describe-log-groups command. It is available in the CLI, and the same operation exists in the API/SDKs.
To get the names of the log groups you can go with:
aws logs describe-log-groups --query 'logGroups[*].logGroupName' --log-group-name-prefix '/aws/appsync/[name-of-the-resource]'
Output will look like this:
[
    "/aws/appsync/[name-of-your-resource]"
]
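The boto3 equivalent would be something along these lines (the /aws/appsync/apis/<api-id> prefix is an assumption based on how AppSync typically names its log groups; the API ID is the one from your example ARN):
import boto3

logs = boto3.client('logs')

# Look up log groups whose names start with the AppSync API's conventional prefix.
api_id = 'z3pihpr4gfbzhflthkyjjh6yvu'
response = logs.describe_log_groups(
    logGroupNamePrefix='/aws/appsync/apis/' + api_id)
for group in response.get('logGroups', []):
    print(group['logGroupName'])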
I am creating infrastructure with Terraform, with API Gateway connecting to the DynamoDB API. I am creating an aws_api_gateway_integration resource to define the integration with DynamoDB, with the type attribute set to AWS.
But somehow I am unable to get the uri value right for the DB.
Documentation says it should be of format arn:aws:apigateway:{region}:{subdomain.service|service}:{path|action}/{service_api}.
Current value configured is arn:aws:apigateway:us-east-1:dynamodb:GetItem.
I am not sure what service_api is. Has anyone encountered this before? Kindly help.
This also took me some time to figure out; I think the DynamoDB API Gateway integration deserves a dedicated example in the Terraform docs, similar to the Lambda one. service_api just refers to the DynamoDB action (which you would also enter in the Action field of the API Gateway console), prefixed with the literal action/. So the following block eventually worked for my PUT request mapping:
resource "aws_api_gateway_integration" "my-integration" {
  type                    = "AWS"
  integration_http_method = "POST"
  uri                     = "arn:aws:apigateway:eu-central-1:dynamodb:action/PutItem"
  # (...)
}
I'm trying to notify an SNS topic from a CloudWatch alarm that's in a different region. The reason is that I want SMS alerting, which isn't available in the region where my services are. If I enter the ARN of the subscription and save the changes in the console, I get "There was an error saving the alarm. Please try again." Trying again does not help. Using a topic in the local region does work, but that's not what I need.
Is there a way to notify a topic in a different region? If not, is there another easy way I can achieve my goal?
I didn't find any docs that explicitly say this can't be done, but I tried to set an SNS topic from us-east-1 as an action of an alarm in eu-west-1 using the CLI and got this:
An error occurred (ValidationError) when calling the PutMetricAlarm operation: Invalid region us-east-1 specified. Only eu-west-1 is supported.
So I'll assume it's not supported.
To get the functionality you need, you can use AWS Lambda. Let's say your service is in a region where SMS is not supported; I'll use eu-central-1 as an example.
Setup would go like this:
[us-east-1] Create your SNS topic that can send SMS messages, in the region where SMS is supported.
[eu-central-1] Create a Lambda function that sends messages to the SNS topic from step 1 in the region where your service is.
[eu-central-1] Create an SNS topic in the region where your service is. For this topic, configure a subscription with the AWS Lambda protocol and point it at the Lambda from step 2.
[eu-central-1] Create your alarm in the region where your service is and put the SNS topic from step 3 as an action.
To add to #Tartaglia's answer, here's the source of such a lambda function using Python 3, cobbled together from various sources because I don't have time to do it properly:
import json
import logging

import boto3

logger = logging.getLogger()
logger.setLevel(logging.INFO)

session = boto3.Session(region_name='eu-west-1')  # EU (Ireland)
sns_client = session.client('sns')

def lambda_handler(event, context):
    logger.info('Received event: %s', event)
    for record in event['Records']:
        sns_message = record['Sns']
        response = sns_client.publish(
            TopicArn='YOUR TOPIC ARN HERE',
            Subject=sns_message.get('Subject', None),
            Message=sns_message.get('Message', None))
        logger.info('Publish response: %s', response)
    return 'OK'
Background
AWS services are regional (e.g. us-west-2, us-east-1), and the boto3 library requires you to set a default region before accessing a Client or Resource. However, the documentation here shows that you can have an SNS topic ARN with a wildcard substituted for the region. The documentation says:
Documentation: Amazon Simple Notification Service (Amazon SNS)
Syntax:
arn:aws:sns:region:account-id:topicname
arn:aws:sns:region:account-id:topicname:subscriptionid
Examples:
arn:aws:sns:*:123456789012:my_corporate_topic
arn:aws:sns:us-east-1:123456789012:my_corporate_topic:02034b43-fefa-4e07-a5eb-3be56f8c54ce
Code
When I use boto3's SNS Resource/Client to Publish to a Topic ARN (that has a wildcard for the region), I get the below error. When I don't have the wildcard for the region (e.g. I specify us-west-2), everything works. I looked into the boto3 library and it seems to just replace values in a JSON mapping (e.g. inserts Topic string) so I don't understand why this would be an invalid parameter if the documentation above shows that it's valid.
import boto3
client = boto3.client('sns', region_name='us-west-2')
client.publish(TopicArn='arn:aws:sns:*:123456789:some-topic', Message='SomeMessage')
Error Message
File "/Users/wliu/.virtualenvs/myenv/lib/python2.7/site-packages/botocore/client.py", line 548, in _make_api_call
raise ClientError(parsed_response, operation_name)
ClientError: An error occurred (InvalidParameter) when calling the Publish operation: Invalid parameter: TopicArn Reason: A * ARN must begin with arn:null, not arn:aws:sns:*:123456789:my_topic
The documentation does not show that it's valid for the context in which you are using it. You're misapplying or misinterpreting the documentation, confusing the applicability of patterns and literals. Publish requires a literal ARN, and the relevant section of the underlying API's documentation doesn't mention wildcards.
You can use wildcards as part of the resource ARN when specifying the resource to which an IAM policy statement applies, when the particular service supports resource-level policies.
From the SNS-specific policy language documentation:
For Amazon SNS, topics are the only resource type you can specify in a policy. Following is the Amazon Resource Name (ARN) format for topics.
Example
If you had a topic named my_topic in each of the different Regions that Amazon SNS supports, you could specify the topics with the following ARN.
arn:aws:sns:*:123456789012:my_topic
http://docs.aws.amazon.com/sns/latest/dg/UsingIAMwithSNS.html#SNS_ARN_Format
However, this is all applicable only to policies, which also support patterns like arn:aws:sns:*:123456789012:bob_*, and such a pattern would (perhaps more intuitively) not be a valid topic for a Publish request.
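To illustrate, here is the same Publish call from the question with a concrete region in the topic ARN (the account ID and topic name are the question's placeholders), which is what the API expects:
import boto3

# Publish needs a literal topic ARN with a concrete region; wildcards only
# belong in IAM policy resource patterns, not in API parameters.
client = boto3.client('sns', region_name='us-west-2')
client.publish(
    TopicArn='arn:aws:sns:us-west-2:123456789:some-topic',
    Message='SomeMessage')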