Pulling AWS accountID from CloudTrail to use in Lambda function

I'm working to pull the accountID of a newly created AWS account within an organization. I'm using a CloudWatch rule that triggers the Lambda function on the CreateAccountResult event name. This event gives me a createAccountStatus of "SUCCEEDED" as well as the accountID of the new account.
I want to pull JUST the accountID and insert it into a variable within my Lambda function.
This lambda function is being used to create an AWS connector to link the account to Trend Micro. Essentially, what I'm using in this script is:
account = 'accountID'  # hard-coded placeholder today
payload = "{\n \"crossAccountRoleArn\": \"arn:aws:iam::" + account + ":role/TrendMicroDSM\",\n \"workspacesEnabled\": true\n}"
I want the account variable to automatically update with the newest account's accountID.
Is this even possible?

If you are using Python, the create_account() function returns:
{
    'CreateAccountStatus': {
        'Id': 'string',
        'AccountName': 'string',
        'State': 'IN_PROGRESS'|'SUCCEEDED'|'FAILED',
        'RequestedTimestamp': datetime(2015, 1, 1),
        'CompletedTimestamp': datetime(2015, 1, 1),
        'AccountId': 'string',
        'GovCloudAccountId': 'string',
        'FailureReason': 'ACCOUNT_LIMIT_EXCEEDED'|'EMAIL_ALREADY_EXISTS'|'INVALID_ADDRESS'|'INVALID_EMAIL'|'CONCURRENT_ACCOUNT_MODIFICATION'|'INTERNAL_FAILURE'|'GOVCLOUD_ACCOUNT_ALREADY_EXISTS'
    }
}
Therefore, you could simply use:
import boto3

client = boto3.client('organizations')
response = client.create_account(...)

# create_account() is asynchronous, so AccountId may be missing until the
# status reaches SUCCEEDED (poll describe_create_account_status if needed)
account_id = response['CreateAccountStatus']['AccountId']
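Note that in the setup described in the question, the Lambda function is invoked by the CloudWatch rule rather than by calling create_account() itself, so the accountID has to come out of the incoming event. A minimal handler sketch, assuming the standard CloudTrail-delivered shape of the CreateAccountResult event (the detail.serviceEventDetails path follows the AWS Organizations CloudTrail documentation; verify it against a logged event):

def lambda_handler(event, context):
    # CreateAccountResult is delivered through CloudTrail, so the payload
    # sits under detail.serviceEventDetails
    status = event['detail']['serviceEventDetails']['createAccountStatus']

    if status['state'] != 'SUCCEEDED':
        return  # no account ID to link yet

    account = status['accountId']

    # Build the Trend Micro connector payload with the new account's ID;
    # note the '::' separator a valid IAM role ARN needs
    payload = ('{\n "crossAccountRoleArn": "arn:aws:iam::' + account
               + ':role/TrendMicroDSM",\n "workspacesEnabled": true\n}')
    return payload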

Related

How to subscribe to SNS topic with lambda ARN?

I created a lambda function using the aws_cdk.aws_lambda.Function constructor. Following that, I aim to subscribe it to an SNS topic I created with boto3. However, one of the required arguments is the Lambda function ARN, which I try to get with dynamodb_lambda.function_arn, but the attribute returns an unresolved token, specifically ${Token[TOKEN.240]}.
Here is a portion of the code to further clarify what I'm doing.
dynamodb_lambda = lambda_.Function(
    self,      # scope
    "foobar",  # id
    runtime=lambda_.Runtime.PYTHON_3_7,
    handler="lambda_handlers.dynamodb_lambda_handler",  # filename.methodname at path
    code=lambda_.Code.from_asset(path),
    role=iam_.Role(  # need this for cloudwatch access
        self,           # scope
        "foobar-role",  # id -- must differ from the function's id within this scope
        assumed_by=iam_.ServicePrincipal('lambda.amazonaws.com'),
        managed_policies=[
            iam_.ManagedPolicy.from_aws_managed_policy_name('AmazonDynamoDBFullAccess')
        ]
    )
)
client_sns = boto3.client("sns")
response = client_sns.create_topic(
    Name=c.SNS_TOPIC_NAME,
    Tags=[  # for easier filtering and searching
        {
            'Key': 'CohortStudent',
            'Value': 'anon'
        }
    ])
client_sns.subscribe(
    TopicArn=response['TopicArn'],
    Protocol='lambda',  # usually "email" or "sms", see link above for possible values
    Endpoint=dynamodb_lambda.function_arn
)
The last parameter is where I'm facing trouble. The lambda function isn't created yet so the ARN is a token, but the subscribe function doesn't accept that.
botocore.errorfactory.InvalidParameterException: An error occurred (InvalidParameter) when calling the Subscribe operation: Invalid parameter: Lambda endpoint ARN
Please help me understand how to figure this out. Any help is appreciated.
As luk2302 has commented, using boto3 calls to subscribe to an SNS topic created alongside aws_cdk resources is wrong: that code runs while the stack is being synthesized, before the Lambda function (and its ARN) exists. A simple fix is to create the topic and the subscription with aws_cdk as well.
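A sketch of that CDK-native approach, using the aws_sns and aws_sns_subscriptions modules (CDK v1-style imports to match the question's code; adjust for v2):

from aws_cdk import aws_sns as sns_
from aws_cdk import aws_sns_subscriptions as subs_

# Create the topic in the stack instead of with boto3
topic = sns_.Topic(self, "DynamoDbTopic", topic_name=c.SNS_TOPIC_NAME)

# LambdaSubscription resolves the function ARN at deploy time, so the
# unresolved token never reaches an API call; it also adds the invoke
# permission for SNS automatically
topic.add_subscription(subs_.LambdaSubscription(dynamodb_lambda))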

How to trigger IoT Events with Lambda

I have an AWS IoT Events model. I need to trigger that model from a Lambda Python function, along the lines of "if this happens, trigger the model".
So, how can I trigger an IoT Events model from Lambda?
Assuming your model is listening for data from an Input, you can invoke the BatchPutMessage API from your Lambda code (see the boto3 documentation for batch_put_message). The request skeleton is:
response = client.batch_put_message(
    messages=[
        {
            'messageId': 'string',
            'inputName': 'string',
            'payload': b'bytes',
            'timestamp': {
                'timeInMillis': 123
            }
        },
    ]
)
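Wrapped in a handler, the "if this happens, trigger the model" logic might look like the following sketch (the input name and the condition are placeholders; 'iotevents-data' is the data-plane client that exposes batch_put_message):

import json
import uuid

import boto3

client = boto3.client('iotevents-data')

def lambda_handler(event, context):
    # Placeholder condition: only forward events that should trigger the model
    if event.get('temperature', 0) > 90:
        client.batch_put_message(
            messages=[{
                'messageId': str(uuid.uuid4()),   # must be unique per message
                'inputName': 'MyIotEventsInput',  # placeholder: your Input's name
                'payload': json.dumps(event).encode('utf-8'),
            }]
        )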

Uploading item in Amazon s3 bucket from React Native with user's info

I am uploading images to AWS S3 from a React Native mobile app built with AWS Amplify. Many users use my app.
Whenever a user uploads an image to S3 through the app, I want to capture that user's ID along with the image, so that later I can tell which image on S3 belongs to which user. How can I achieve this?
I am using AWS Cognito (Auth) for user registration and sign-in. I came to know that when a user registers in Cognito for the first time, they get a unique ID in the user pool. Can I pass this user ID along with the image whenever the user uploads one?
Basically, I want some way to trace an image on S3 back to the user who uploaded it, because after the image is uploaded I later process it and need to send the result back ONLY to that user.
You can store the data in S3 in a structure similar to the one below:
users/
    123userId/
        image1.jpeg
        image2.jpeg
    anotherUserId456/
        image1.png
        image2.png
Then, if you need all the files for a given user, you can call the ListObjectsV2 API from a Lambda function (see the S3 docs):
// for a lambda function
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const objects = await s3.listObjectsV2({
    Bucket: 'STRING_VALUE',  /* required */
    Prefix: 'STRING_VALUE',  // e.g. "users/123userId/" to list one user's files
    Delimiter: 'STRING_VALUE',
    EncodingType: 'url',
    ExpectedBucketOwner: 'STRING_VALUE',
    MaxKeys: 'NUMBER_VALUE',
    RequestPayer: 'requester'
}).promise();

// the object keys live in the Contents array of the response
objects.Contents.forEach(item => {
    console.log(item.Key);
});
Or, if you are using an S3 Lambda trigger, you can parse the userId from the "key" / filename in the received event (in case you used the structure above).
{
    "key": "public/users/e1e0858f-2ea1-90f892b68e0c/item.jpg",
    "size": 269582,
    "eTag": "db8aafcca5786b62966073f59152de9d",
    "sequencer": "006068DC0B344DA9E9"
}
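For example, a sketch of a Python S3-trigger handler that pulls the user ID out of the key (assuming the "public/users/&lt;userId&gt;/&lt;filename&gt;" layout above):

from urllib.parse import unquote_plus

def lambda_handler(event, context):
    for record in event['Records']:
        # e.g. "public/users/e1e0858f-2ea1-90f892b68e0c/item.jpg";
        # keys in S3 events are URL-encoded, hence unquote_plus
        key = unquote_plus(record['s3']['object']['key'])

        # with the layout above, the user ID is the third path segment
        user_id = key.split('/')[2]
        print(user_id)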
Another option is to write the "userId" into the metadata of the file that is uploaded to S3.
You can take the "sub" property from Cognito's currently logged-in user, so the S3 Lambda trigger function can read the userId from the metadata.
import Auth from "@aws-amplify/auth";

const user = await Auth.currentUserInfo();
const userId = user.attributes.sub;
return userId; // e.g. from a small helper that resolves the current user's id

// use userId from Cognito and put it into custom metadata
import {Storage} from "aws-amplify";

const userId = "userIdHere";
const filename = "filename"; // or use uuid()
const ref = `public/users/${userId}/${filename}`;
const response = await Storage.put(ref, blob, {
    contentType: "image/jpeg",
    metadata: {userId: userId},
});
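One caveat with the metadata approach: the S3 event itself does not include object metadata, so the trigger function has to fetch it with a HeadObject call. A sketch (note that S3 lowercases user metadata keys, so "userId" comes back as "userid"):

import boto3
from urllib.parse import unquote_plus

s3 = boto3.client('s3')

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])

        # head_object returns the user metadata without downloading the body
        head = s3.head_object(Bucket=bucket, Key=key)
        user_id = head['Metadata'].get('userid')
        print(user_id)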
AWS Amplify can do all of the above automatically (create the folder structure, etc.) if you do not need any special structure for how files are stored (see the Amplify Storage docs).
You only need to configure Storage ('globally' or per action) with the "level" property.
Storage.configure({ level: 'private' });

await Storage.put(ref, blob, {
    contentType: "image/jpeg",
    metadata: {userId: userId},
});

// or set the level only for a given action
const ref = "userCollection";
await Storage.put(ref, blob, {
    contentType: "image/jpeg",
    metadata: {userId: userId},
    level: "private"
});
So, for example, if you use level "private", the file "124.jpeg" will be stored in S3 at
"private/us-east-1:6419087f-d13e-4581-b72e-7a7b32d7c7c1/userCollection/124.jpeg"
However, as you can see, "us-east-1:6419087f-d13e-4581-b72e-7a7b32d7c7c1" looks different from the "sub" in Cognito (the "sub" property does not contain a region).
There is a related discussion with a few workarounds, but basically you need to decide on your own how you will manage user identification in your project: either use "sub" everywhere as the userId, or go with the other ID (I think it is called identityId) and treat that as the userId.
PS: If you are using React Native, I guess you will go with push notifications for sending updates from the backend. If that is the case, I was doing something similar ("moderation control"): I added another Lambda function, Cognito's Post-Confirmation trigger, that creates a user in DynamoDB keyed by Cognito's "sub" property.
The user can then save the device token needed for push notifications, so when AWS Rekognition finished its detection on the image the user uploaded, I queried DynamoDB and used SNS to send the notification to the end user.

Export DynamoDb metrics logs to S3 or CloudWatch

I'm trying to use DynamoDB metrics logs in an external observability tool.
To do that, I'll need to get this log data from S3 or from CloudWatch log groups (not from Insights or CloudTrail).
So, if there isn't a way to read it from CloudWatch directly, I'll need to export these metric logs from DynamoDB to S3, and from there either export them to CloudWatch or have the tool read the data directly from S3.
Do you know if this is possible?
You could try using Logstash; it has input plugins for CloudWatch and S3:
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-cloudwatch.html
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-s3.html
AWS publishes DynamoDB metrics (at the table-operation, table, and account level) to CloudWatch, and you can also create as many custom metrics as you need. If you use Python, you can read them with boto3; the CloudWatch client has the get_metric_data method.
Try this with your metrics:
from datetime import datetime, date, timedelta

import boto3

cloudwatch_client = boto3.client('cloudwatch')

yesterday = date.today() - timedelta(days=1)
today = date.today()

response = cloudwatch_client.get_metric_data(
    MetricDataQueries=[
        {
            'Id': 'some_request',
            'MetricStat': {
                'Metric': {
                    'Namespace': 'AWS/DynamoDB',  # DynamoDB metrics live in the AWS/DynamoDB namespace
                    'MetricName': 'metric_name',  # e.g. ConsumedReadCapacityUnits
                    'Dimensions': []
                },
                'Period': 3600,
                'Stat': 'Sum',
            }
        },
    ],
    StartTime=datetime(yesterday.year, yesterday.month, yesterday.day),
    EndTime=datetime(today.year, today.month, today.day),
)
print(response)
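If the external tool needs to read from S3 instead, the same script could then write the result out, e.g. as follows (the bucket name is a placeholder; default=str takes care of the datetime objects in the response):

import json

s3_client = boto3.client('s3')

# dump the raw metric data where the observability tool can pick it up
s3_client.put_object(
    Bucket='my-metrics-export-bucket',  # placeholder
    Key=f'dynamodb-metrics/{today.isoformat()}.json',
    Body=json.dumps(response, default=str).encode('utf-8'),
)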

Boto3 - Create S3 'object created' notification to trigger a lambda function

How do I use boto3 to simulate the Add Event Source action on the AWS GUI console, in the Event Sources tab?
I want to programmatically create a trigger such that whenever an object is created in MyBucket, it calls my Lambda function (qualified with an alias).
The relevant API call that I see in the boto3 documentation is create_event_source_mapping, but it states explicitly that it is only for the AWS pull model, while I think S3 belongs to the push model. Anyway, I tried using it and it didn't work.
Being able to pass a prefix filter would be nice too.
I was looking in the wrong place. This is configured on the S3 side:
import boto3

s3 = boto3.resource('s3')
bucket_name = 'mybucket'
bucket_notification = s3.BucketNotification(bucket_name)

response = bucket_notification.put(
    NotificationConfiguration={'LambdaFunctionConfigurations': [
        {
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:033333333:function:mylambda:staging',
            'Events': [
                's3:ObjectCreated:*'
            ],
        },
    ]})
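Two things worth adding, both sketched below with placeholder names: S3 validates that it may invoke the function when you call put(), so the Lambda needs a resource policy statement allowing s3.amazonaws.com first, and the prefix filter from the question goes into a Filter block of the same configuration:

import boto3

lambda_client = boto3.client('lambda')

# Grant S3 permission to invoke the (alias-qualified) function; without
# this, putting the bucket notification fails validation
lambda_client.add_permission(
    FunctionName='arn:aws:lambda:us-east-1:033333333:function:mylambda:staging',
    StatementId='s3-invoke-mylambda',  # any unique statement id
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::mybucket',
)

# The notification configuration also accepts key filters
response = bucket_notification.put(
    NotificationConfiguration={'LambdaFunctionConfigurations': [
        {
            'LambdaFunctionArn': 'arn:aws:lambda:us-east-1:033333333:function:mylambda:staging',
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'prefix', 'Value': 'uploads/'}  # placeholder prefix
            ]}},
        },
    ]})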