AWS CloudWatch Python SDK API not working - amazon-web-services

I have a root AWS account that is linked with three other sub AWS accounts. In my root account I created a Lambda function that gets billing metrics from CloudWatch using the Python SDK and its APIs. It works: I am using an IAM user's access key and secret key, which has billing access and full admin access. But when I copied the Lambda code into a sub account's Lambda function, it doesn't retrieve any data. I can't understand why it's not working in the sub account.
import boto3
from datetime import datetime

def get_metrics(event, context):
    # Hard-coded credentials, as in the question; in Lambda the
    # function's execution role is normally used instead.
    ACCESS_KEY = 'accesskey'
    SECRET_KEY = 'secretkey'
    client = boto3.client('cloudwatch',
                          aws_access_key_id=ACCESS_KEY,
                          aws_secret_access_key=SECRET_KEY)
    response = client.get_metric_statistics(
        Namespace='AWS/Billing',
        MetricName='EstimatedCharges',
        Dimensions=[
            {
                'Name': 'LinkedAccount',
                'Value': '12 digit account number'
            },
            {
                'Name': 'Currency',
                'Value': 'USD'
            },
        ],
        # StartTime/EndTime take datetime objects, not strings
        StartTime=datetime(2017, 12, 19),
        EndTime=datetime(2017, 12, 21),
        Period=86400,
        Statistics=[
            'Maximum',
        ],
    )
    print(response)
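For what it's worth, two things commonly cause empty results here: the AWS/Billing metrics are published only in the us-east-1 region, and the LinkedAccount dimension exists only in the payer (root) account's CloudWatch. A minimal sketch for querying a sub account's own charges, assuming the execution role has cloudwatch:GetMetricStatistics:

import boto3
from datetime import datetime

# Billing metrics live only in us-east-1; pin the client there.
client = boto3.client('cloudwatch', region_name='us-east-1')
response = client.get_metric_statistics(
    Namespace='AWS/Billing',
    MetricName='EstimatedCharges',
    # No LinkedAccount dimension: a sub account only sees its own charges.
    Dimensions=[{'Name': 'Currency', 'Value': 'USD'}],
    StartTime=datetime(2017, 12, 19),
    EndTime=datetime(2017, 12, 21),
    Period=86400,
    Statistics=['Maximum'],
)
print(response['Datapoints'])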

Related

How to trigger an AWS Lambda function that takes input from a Flutter form when a submit button is pressed

I have a Lambda function, which currently works. It takes three strings as input and makes an entry in DynamoDB.
I have a Flutter app (which is also an AWS Amplify project).
The code for the Lambda function is as follows:
import json
import logging
import os
import time
import uuid

import boto3

dynamodb = boto3.resource('dynamodb')

def create(event, context):
    timestamp = str(time.time())
    table = dynamodb.Table(os.environ['DYNAMODB_TABLE'])
    item = {
        'id': str(uuid.uuid1()),
        'name': event['name'],
        'description': event['description'],
        'price': event['price'],
        'createdAt': timestamp,
        'updatedAt': timestamp,
    }
    # write the item to the database
    table.put_item(Item=item)
    # create a response
    response = {
        "statusCode": 200,
        "body": json.dumps(item)
    }
    return response
The form in my Flutter application consists of three text fields and a submit button. I want the three fields to take the item's name, description, and price, and trigger the AWS Lambda function to create an entry in the DynamoDB table when the button is pressed.
However, I cannot find any documentation or guide on how to use Dart's onPressed callback to trigger the function that takes these three strings as input and creates the entry.
I am open to alternate ways of achieving this if this is impossible using my current design.
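The usual route in a Flutter/Amplify app is to put the function behind an HTTP endpoint (for example via Amplify's API category or API Gateway) and call it from the button's onPressed handler. Independent of that, here is a minimal sketch that invokes the function directly with boto3 and shows the event shape the handler expects (the function name my-create-function is a placeholder):

import json
import boto3

lambda_client = boto3.client('lambda')

# Payload keys match what the handler reads from the event.
payload = {
    'name': 'widget',
    'description': 'a test item',
    'price': '9.99',
}
result = lambda_client.invoke(
    FunctionName='my-create-function',  # placeholder name
    InvocationType='RequestResponse',
    Payload=json.dumps(payload),
)
print(json.loads(result['Payload'].read()))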

Lambda assume role returns empty results

I'm trying to get EC2 instances from Account B using Lambda in Account A. Not sure what I'm missing.
Account A: Lambda code is running.
Account B: EC2 Instances are running.
The AssumeRole call below prints an access key and session token, but describe_instance_status does not return any results.
The IAM role in Account B has the AmazonEC2ReadOnlyAccess policy attached, and its trust relationship includes arn:aws:iam::ACCOUNT_A:role/role-name_ACCOUNT_A.
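For reference, the trust relationship described above corresponds to a trust policy in Account B along these lines (the ARN is the question's placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::ACCOUNT_A:role/role-name_ACCOUNT_A"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}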
This is the code:
import json

import boto3
from pprint import pprint

def lambda_handler(event, context):
    # Assume a role in the other account
    sts_connection = boto3.client('sts')
    acct_b = sts_connection.assume_role(
        RoleArn="arn:aws:iam::ACCOUNT_B:role/role_name_account_B",
        RoleSessionName="cross_acct_lambda"
    )
    ACCESS_KEY = acct_b['Credentials']['AccessKeyId']
    SECRET_KEY = acct_b['Credentials']['SecretAccessKey']
    SESSION_TOKEN = acct_b['Credentials']['SessionToken']

    # Create a service client using the assumed-role credentials, e.g. EC2.
    # NOTE: no region_name is passed here; see the resolution below.
    ec2 = boto3.client(
        "ec2",
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY,
        aws_session_token=SESSION_TOKEN,
    )
    status = ec2.describe_instance_status()
    pprint(status)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
Result:
Response:
{
  "statusCode": 200,
  "body": "\"Hello from Lambda!\""
}
Request ID:
"ZZZZZZZZZZZZZZZZZZZZ"
Function logs:
START RequestId: ZZZZZZZZZZZZZZZZZZ Version: $LATEST
{'InstanceStatuses': [],
Thanks.
Once I added the region, I could see the results. Thanks John Rotenstein and Jarmod for your guidance.
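For reference, a minimal sketch of the fix: pass region_name when creating the EC2 client, so it queries the region where Account B's instances actually run (us-east-1 below is only an assumption):

ec2 = boto3.client(
    "ec2",
    region_name="us-east-1",  # assumption: substitute the instances' region
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)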

How do I format the Key in my Boto3 Lambda function to update DynamoDB?

I need help figuring out how to format the Key in my Lambda function to update an item in DynamoDB. Below is the code I have, but I can't figure out how to format the Key.
My table looks as follows:
import json
import boto3

dynamodb = boto3.resource('dynamodb')
client = boto3.client('dynamodb')

def lambda_handler(event, context):
    response = client.update_item(
        TableName='hitcounter',
        Key={????????},
        UpdateExpression='ADD visits :incr',
        ExpressionAttributeValues={':incr': 1}
    )
    print(response)
ERROR MESSAGE:
{
  "errorMessage": "'path'",
  "errorType": "KeyError",
  "stackTrace": [
    "  File \"/var/task/lambda_function.py\", line 11, in lambda_handler\n    Key={'path': event['path']},\n"
  ]
}
The AWS docs provide an example for updating an item:
table.update_item(
    Key={
        'username': 'janedoe',
        'last_name': 'Doe'
    },
    UpdateExpression='SET age = :val1',
    ExpressionAttributeValues={
        ':val1': 26
    }
)
I'm not sure from your question whether the AWS examples are unclear, or what the issue is specifically?
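Two observations, offered as a hedged sketch rather than a definitive fix: the KeyError means the incoming event has no 'path' key at all, and the low-level client requires typed attribute values (for example {'S': 'foo'} and {'N': '1'}), whereas the Table resource accepts plain Python values. Assuming the table's partition key is a string attribute named path:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('hitcounter')

def lambda_handler(event, context):
    # Guard against events without 'path' (the KeyError in the question).
    path = event.get('path', '/')

    # The Table resource takes plain values; with the low-level client this
    # would be Key={'path': {'S': path}} and {':incr': {'N': '1'}}.
    response = table.update_item(
        Key={'path': path},
        UpdateExpression='ADD visits :incr',
        ExpressionAttributeValues={':incr': 1},
    )
    print(response)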

AWS CloudWatch Logs to Azure Log Analytics

I am aware of the HTTP Data Collector API that can be used to push data into Azure Log Analytics; my question here is about getting AWS CloudWatch data into Azure. We have an Azure-hosted application and external AWS-hosted serverless Lambda functions, and we want to import the logs of those 13 serverless functions into Azure. I know from the documentation that there is a Python function that can be used as an AWS Lambda function, and the Python example is in the MSFT documentation. But what I am failing to understand is what JSON format the AWS collector needs to create so the logs can be sent to Azure Log Analytics. Any examples of this? Any help on how this can be done? I have also come across this blog, but it is Splunk-specific: https://www.splunk.com/blog/2017/02/03/how-to-easily-stream-aws-cloudwatch-logs-to-splunk.html
Never mind; I was able to dig a little deeper and found that in AWS I can stream the logs from one Lambda to another Lambda function through a subscription. Once that was set, all I did was consume the stream, create the JSON on the fly, and send it to Azure Log Analytics. In case you or anyone else is interested, the code follows:
import json
import hashlib
import hmac
import base64
import gzip
from datetime import datetime

# NOTE: botocore stopped vendoring requests in newer versions; on current
# runtimes, package the requests library and import it directly instead.
from botocore.vendored import requests

# Update the customer ID to your Log Analytics workspace ID
customer_id = "XXXXXXXYYYYYYYYYYYYZZZZZZZZZZ"

# For the shared key, use either the primary or the secondary Connected Sources client authentication key
shared_key = "XXXXXXXXXXXXXXXXXXXXXXXXXX"

# The log type is the name of the event that is being submitted
log_type = 'AWSLambdafuncLogReal'

json_data = [{
    "slot_ID": 12345,
    "ID": "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
    "availability_Value": 100,
    "performance_Value": 6.954,
    "measurement_Name": "last_one_hour",
    "duration": 3600,
    "warning_Threshold": 0,
    "critical_Threshold": 0,
    "IsActive": "true"
},
{
    "slot_ID": 67890,
    "ID": "b6bee458-fb65-492e-996d-61c4d7fbb942",
    "availability_Value": 100,
    "performance_Value": 3.379,
    "measurement_Name": "last_one_hour",
    "duration": 3600,
    "warning_Threshold": 0,
    "critical_Threshold": 0,
    "IsActive": "false"
}]
# body = json.dumps(json_data)

#####################
###### Functions ######
#####################

# Build the API signature
def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    x_headers = 'x-ms-date:' + date
    string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
    bytes_to_hash = bytes(string_to_hash, encoding="utf-8")
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()).decode()
    authorization = "SharedKey {}:{}".format(customer_id, encoded_hash)
    return authorization

# Build and send a request to the POST API
def post_data(customer_id, shared_key, body, log_type):
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    print(rfc1123date)
    content_length = len(body)
    signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
    uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'
    headers = {
        'content-type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date
    }
    response = requests.post(uri, data=body, headers=headers)
    if 200 <= response.status_code <= 299:
        print("Accepted")
    else:
        print("Response code: {}".format(response.status_code))
        print(response.text)

def lambda_handler(event, context):
    # CloudWatch Logs delivers the payload base64-encoded and gzipped
    cloudwatch_event = event["awslogs"]["data"]
    decode_base64 = base64.b64decode(cloudwatch_event)
    decompress_data = gzip.decompress(decode_base64)
    log_data = json.loads(decompress_data)
    print(log_data)
    awslogdata = json.dumps(log_data)
    post_data(customer_id, shared_key, awslogdata, log_type)
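For anyone wondering about the payload shape: a CloudWatch Logs subscription delivers the event base64-encoded and gzipped under event["awslogs"]["data"], and the decoded JSON includes fields such as logGroup, logStream, and logEvents. A minimal sketch for exercising the handler locally with a fabricated event (the names are placeholders, and the sample omits some fields of the real payload):

import base64
import gzip
import json

sample = {
    "logGroup": "/aws/lambda/example-function",  # placeholder
    "logStream": "2020/01/01/[$LATEST]abc123",   # placeholder
    "logEvents": [
        {"id": "1", "timestamp": 1577836800000, "message": "hello"},
    ],
}
data = base64.b64encode(gzip.compress(json.dumps(sample).encode("utf-8")))
test_event = {"awslogs": {"data": data.decode("utf-8")}}
# lambda_handler(test_event, None)  # decodes the sample and posts it to Azure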

AWS EC2 Boto3: Create Instance Tag while Launching EC2 Instance

I am using the script below to launch an instance, but is there any way to provide a tag (instance name)?
import boto3

ec2 = boto3.resource('ec2', region_name='us-east-1')

def lambda_handler(event, context):
    ec2.create_instances(ImageId='ami-0cf6b4320f9bf5529',
                         InstanceType='t2.micro',
                         MinCount=1, MaxCount=1)
Yes, by providing a TagSpecifications option, as detailed in the official documentation for the create_instances method.
See the documentation; there is an example request format:
instance = ec2.create_instances(
    ...
    TagSpecifications=[
        {
            'ResourceType': 'client-vpn-endpoint'|'customer-gateway'|'dedicated-host'|'dhcp-options'|'elastic-ip'|'fleet'|'fpga-image'|'host-reservation'|'image'|'instance'|'internet-gateway'|'launch-template'|'natgateway'|'network-acl'|'network-interface'|'reserved-instances'|'route-table'|'security-group'|'snapshot'|'spot-instances-request'|'subnet'|'traffic-mirror-filter'|'traffic-mirror-session'|'traffic-mirror-target'|'transit-gateway'|'transit-gateway-attachment'|'transit-gateway-route-table'|'volume'|'vpc'|'vpc-peering-connection'|'vpn-connection'|'vpn-gateway',
            'Tags': [
                {
                    'Key': 'string',
                    'Value': 'string'
                },
            ]
        },
    ],
    ...
)
You can select the resource type and set the tags as you wish.
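For instance naming specifically, a minimal sketch building on the script from the question (the Name value is a placeholder; the console displays the 'Name' tag as the instance name):

import boto3

ec2 = boto3.resource('ec2', region_name='us-east-1')

def lambda_handler(event, context):
    # Tag the instance at launch time via TagSpecifications.
    ec2.create_instances(
        ImageId='ami-0cf6b4320f9bf5529',
        InstanceType='t2.micro',
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[
            {
                'ResourceType': 'instance',
                'Tags': [
                    {'Key': 'Name', 'Value': 'my-instance-name'},  # placeholder
                ],
            },
        ],
    )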