I'm not sure how the formatting works when using json and boto3 in the same file. The function writes to DynamoDB as it should, but I can't get a response back from the API without an internal server error.
I don't know whether it is a permissions issue or the code is wrong.
import boto3
import json

def lambda_handler(event, context):
    client = boto3.resource('dynamodb')
    table = client.Table('Visit_Count')
    input = {'Visits': 1}
    table.put_item(Item=input)
    return {
        'statusCode': 200
        body: json.dumps("Hello, World!")
    }
Instead of body it should be 'body' (quoted), and there is a missing comma after 'statusCode': 200. Other than that, check the CloudWatch logs for any further Lambda errors.
Anon and Marcin were right, I just tried it and it worked
Your Lambda execution role also needs the dynamodb:PutItem permission.
import boto3
import json

def lambda_handler(event, context):
    client = boto3.resource('dynamodb')
    table = client.Table('Visit_Count')
    input = {'Visits': 1}
    table.put_item(Item=input)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
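If the execution role is missing the dynamodb:PutItem permission mentioned above, one way to grant it is an inline policy. A minimal sketch using boto3 (the role and policy names here are placeholders, not from the original post):

import json
import boto3

iam = boto3.client('iam')

# Hypothetical role/policy names -- substitute your Lambda's execution role.
iam.put_role_policy(
    RoleName='my-lambda-execution-role',
    PolicyName='AllowVisitCountPutItem',
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "dynamodb:PutItem",
            "Resource": "arn:aws:dynamodb:*:*:table/Visit_Count"
        }]
    })
)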
Related
I have a very simple Lambda function that returns the same event object it receives.
import json

def lambda_handler(event, context):
    print('json.dumps(event)', json.dumps(event))
    return json.dumps(event)
I am invoking it with the following
import json
import boto3

lambda_client = boto3.client('lambda', region_name=REGION)
response = lambda_client.invoke(
    FunctionName='mylambdafunction',
    InvocationType='Event',
    Payload=json.dumps({'key111': 'value1', 'key2': 'value2', 'key3': 'value3'})
)
However, when I do response['Payload'].read() I get an empty bytestring b'' back. Does anyone know what the issue might be? I can see in the logs that the function receives the event and attempts to return it, but the payload comes back empty no matter what I return (it is a botocore.response.StreamingBody object that I call .read() on).
What is the issue here?
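One likely explanation (not confirmed in the thread): InvocationType='Event' invokes the function asynchronously, so the Invoke call returns immediately with an empty payload. A minimal sketch of a synchronous invocation that does return the function's result:

import json
import boto3

lambda_client = boto3.client('lambda', region_name=REGION)

# 'RequestResponse' waits for the function to finish and returns its result.
response = lambda_client.invoke(
    FunctionName='mylambdafunction',
    InvocationType='RequestResponse',
    Payload=json.dumps({'key111': 'value1', 'key2': 'value2', 'key3': 'value3'})
)

# The payload is a StreamingBody; read and decode it to get the returned JSON.
result = json.loads(response['Payload'].read())
print(result)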
I have a website with an AWS Amplify backend. For a post-payment step, I am creating a Lambda function to update the database. I am trying to query certain fields through AppSync and then run a mutation. This is my function code:
import json
import boto3
import os
import decimal
import requests
from requests_aws4auth import AWS4Auth

def lambda_handler(event, context):
    dynamoDB = boto3.resource('dynamodb', region_name='ap-northeast-1')
    # load event data (hidden)
    userid = sentData.get("userid")
    slots = sentData.get("slots")
    url = os.environ.get("AWS_GRAPHQL_API_ENDPOINT")
    api_key = os.environ.get("AWS_GRAPHQL_API_KEY")
    session = requests.Session()
    query = """
        query MyQuery {
          getUserPlan(id: "9ddf437a-55b1-445d-8ae6-254c77493c30") {
            traits
            traitCount
          }
        }
    """
    credentials = boto3.session.Session().get_credentials()
    session.auth = AWS4Auth(
        credentials.access_key,
        credentials.secret_key,
        'ap-northeast-1',
        'appsync',
        session_token=credentials.token
    )
    # response = session.request(
    #     url=url,
    #     method="POST",
    #     json={"query": query},
    #     headers={"Authorization": api_key},
    # )
    # response = requests.post(
    #     url=url,
    #     json={"query": query},
    #     headers={"x-api-key": api_key}
    # )
    response = session.request(
        url=url,
        method="POST",
        json={"query": query},
    )
    print(response.json())
    return {
        "statusCode": 200,
    }
I get the following error when I execute the function:
{'data': {'getUserPlan': None}, 'errors': [{'path': ['getUserPlan'], 'data': None, 'errorType': 'Unauthorized', 'errorInfo': None, 'locations': [{'line': 3, 'column': 9, 'sourceName': None}], 'message': 'Not Authorized to access getUserPlan on type UserPlan'}]}
I have referred to this and this. I have tried their solutions, but they haven't worked for me. I have confirmed that all the environment variables are set correctly, and I even added the local aws-cli IAM user to the custom-roles.json file for admin privileges in Amplify. When I was trying with the API key, I made sure it hadn't expired as well.
I figured out how to fix it. I had to create the function through the Amplify CLI, give it access to the API, push the function, and then add the name of its role to adminRoleNames in custom-roles.json.
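For reference, once the function's role is listed in adminRoleNames, the mutation mentioned in the question can be sent through the same IAM-signed session. A minimal sketch reusing the session, url, and userid from the handler above (the updateUserPlan mutation name and its fields are hypothetical; adjust to the actual schema):

# Hypothetical mutation name and input fields -- adjust to the actual schema.
mutation = """
    mutation MyMutation($id: ID!, $traitCount: Int) {
      updateUserPlan(input: {id: $id, traitCount: $traitCount}) {
        id
        traitCount
      }
    }
"""

response = session.request(
    url=url,
    method="POST",
    json={
        "query": mutation,
        "variables": {"id": userid, "traitCount": 5},
    },
)
print(response.json())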
I am trying to send an event-driven SMS notification using AWS SNS.
To start, I am writing a script that sends a message to a single number.
Here is my code for that:
import json
import boto3

def lambda_handler(event, context):
    client = boto3.client("sns")
    response = client.publish(
        PhoneNumber="+91 xxxxxxxxxx",
        Message="Hello World!"
    )
    print(response)
    # TODO implement
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
The problem is that I am not receiving any message from this.
Could anyone help me achieve this use case?
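A few things worth checking (assumptions, not confirmed in the thread): the destination number should be in E.164 format with no spaces, the account may still be in the SNS SMS sandbox or have hit its SMS spending limit, and the execution role needs sns:Publish. A minimal sketch that publishes and prints the MessageId so delivery can be traced in the SNS delivery status logs:

import json
import boto3

def lambda_handler(event, context):
    client = boto3.client("sns")

    # E.164 format: country code immediately followed by the number, no spaces.
    response = client.publish(
        PhoneNumber="+91xxxxxxxxxx",  # placeholder number from the question
        Message="Hello World!"
    )

    # A MessageId only means SNS accepted the message; if nothing arrives,
    # check the SMS sandbox status, spending limit, and delivery logs.
    print(response["MessageId"])

    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }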
I'm trying to list EC2 instances in Account B from a Lambda function in Account A, and I'm not sure what I'm missing.
Account A: where the Lambda code runs.
Account B: where the EC2 instances run.
The assume-role call below prints an access key and session token, but the describe call does not return any results.
The IAM role in Account B has the AmazonEC2ReadOnlyAccess policy attached, and its trust relationship includes arn:aws:iam::ACCOUNT_A:role/role-name_ACCOUNT_A.
This is the code:
import json
import boto3
from collections import OrderedDict
from pprint import pprint
import time
from time import sleep
from datetime import date
import datetime

def lambda_handler(event, context):
    # Assume a role in the other account
    sts_connection = boto3.client('sts')
    acct_b = sts_connection.assume_role(
        RoleArn="arn:aws:iam::ACCOUNT_B:role/role_name_account_B",
        RoleSessionName="cross_acct_lambda"
    )
    ACCESS_KEY = acct_b['Credentials']['AccessKeyId']
    SECRET_KEY = acct_b['Credentials']['SecretAccessKey']
    SESSION_TOKEN = acct_b['Credentials']['SessionToken']
    # Create a service client using the assumed-role credentials, e.g. EC2
    ec2 = boto3.client(
        "ec2",
        aws_access_key_id=ACCESS_KEY,
        aws_secret_access_key=SECRET_KEY,
        aws_session_token=SESSION_TOKEN,
    )
    status = ec2.describe_instance_status()
    pprint(status)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
Result:
Response:
{
"statusCode": 200,
"body": "\"Hello from Lambda!\""
}
Request ID:
"ZZZZZZZZZZZZZZZZZZZZ"
Function logs:
START RequestId: ZZZZZZZZZZZZZZZZZZ Version: $LATEST
{'InstanceStatuses': [],
Thanks.
Once I added the region I could see the results. Thanks John Rotenstein and Jarmod for your guidance.
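For anyone landing here, a minimal sketch of the fix described above: pass region_name when creating the EC2 client with the assumed-role credentials (the region value below is only an example):

# Create the EC2 client in the region where Account B's instances actually run.
# 'ap-south-1' is an example region, not from the original post.
ec2 = boto3.client(
    "ec2",
    region_name="ap-south-1",
    aws_access_key_id=ACCESS_KEY,
    aws_secret_access_key=SECRET_KEY,
    aws_session_token=SESSION_TOKEN,
)

# Note: describe_instance_status only includes running instances unless
# IncludeAllInstances=True is passed.
status = ec2.describe_instance_status(IncludeAllInstances=True)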
I am aware of the HTTP Data Collector API that can be used to push data into Azure Log Analytics; my question here is about getting AWS CloudWatch data into Azure. We have an Azure-hosted application and external AWS-hosted serverless Lambda functions, and we want to import the logs of those 13 serverless functions into Azure. I know from the documentation that there is a Python function that can run as an AWS Lambda function, and the Python example is in the MSFT documentation. What I'm failing to understand is what JSON format the AWS collector needs to produce so it can be sent to Azure Log Analytics. Any examples of this? Any help on how this can be done? I have also come across this blog, but it is Splunk-specific: https://www.splunk.com/blog/2017/02/03/how-to-easily-stream-aws-cloudwatch-logs-to-splunk.html
Never mind, I was able to dig a little deeper and found that in AWS I can stream the logs from one Lambda to another Lambda function through a CloudWatch Logs subscription. Once that was set up, all I did was consume that stream, build the JSON on the fly, and send it to Azure Log Analytics. In case you or anyone else is interested, the code is below:
import json
import hashlib
import hmac
import base64
import boto3
import gzip
from botocore.vendored import requests  # note: bundle 'requests' yourself on newer runtimes
from datetime import datetime

# Update the customer ID to your Log Analytics workspace ID
customer_id = "XXXXXXXYYYYYYYYYYYYZZZZZZZZZZ"

# For the shared key, use either the primary or the secondary Connected Sources client authentication key
shared_key = "XXXXXXXXXXXXXXXXXXXXXXXXXX"

# The log type is the name of the event that is being submitted
log_type = 'AWSLambdafuncLogReal'

json_data = [{
    "slot_ID": 12345,
    "ID": "5cdad72f-c848-4df0-8aaa-ffe033e75d57",
    "availability_Value": 100,
    "performance_Value": 6.954,
    "measurement_Name": "last_one_hour",
    "duration": 3600,
    "warning_Threshold": 0,
    "critical_Threshold": 0,
    "IsActive": "true"
},
{
    "slot_ID": 67890,
    "ID": "b6bee458-fb65-492e-996d-61c4d7fbb942",
    "availability_Value": 100,
    "performance_Value": 3.379,
    "measurement_Name": "last_one_hour",
    "duration": 3600,
    "warning_Threshold": 0,
    "critical_Threshold": 0,
    "IsActive": "false"
}]
# body = json.dumps(json_data)

#####################
###### Functions ######
#####################

# Build the API signature
def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    x_headers = 'x-ms-date:' + date
    string_to_hash = method + "\n" + str(content_length) + "\n" + content_type + "\n" + x_headers + "\n" + resource
    bytes_to_hash = bytes(string_to_hash, encoding="utf-8")
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, bytes_to_hash, digestmod=hashlib.sha256).digest()).decode()
    authorization = "SharedKey {}:{}".format(customer_id, encoded_hash)
    return authorization

# Build and send a request to the POST API
def post_data(customer_id, shared_key, body, log_type):
    method = 'POST'
    content_type = 'application/json'
    resource = '/api/logs'
    rfc1123date = datetime.utcnow().strftime('%a, %d %b %Y %H:%M:%S GMT')
    print(rfc1123date)
    content_length = len(body)
    signature = build_signature(customer_id, shared_key, rfc1123date, content_length, method, content_type, resource)
    uri = 'https://' + customer_id + '.ods.opinsights.azure.com' + resource + '?api-version=2016-04-01'
    headers = {
        'content-type': content_type,
        'Authorization': signature,
        'Log-Type': log_type,
        'x-ms-date': rfc1123date
    }
    response = requests.post(uri, data=body, headers=headers)
    if (response.status_code >= 200 and response.status_code <= 299):
        print("Accepted")
    else:
        print("Response code: {}".format(response.status_code))
        print(response.text)

def lambda_handler(event, context):
    # CloudWatch Logs delivers subscribed log events base64-encoded and gzip-compressed
    cloudwatch_event = event["awslogs"]["data"]
    decode_base64 = base64.b64decode(cloudwatch_event)
    decompress_data = gzip.decompress(decode_base64)
    log_data = json.loads(decompress_data)
    print(log_data)
    awslogdata = json.dumps(log_data)
    post_data(customer_id, shared_key, awslogdata, log_type)
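For completeness, the log subscription mentioned above can also be created programmatically. A minimal sketch with boto3 (the log group name and function ARN are placeholders, not from the original post):

import boto3

logs = boto3.client("logs")
lambda_client = boto3.client("lambda")

# Placeholder names/ARNs -- substitute the source log group and the forwarder function.
log_group = "/aws/lambda/source-function"
forwarder_arn = "arn:aws:lambda:us-east-1:123456789012:function:azure-forwarder"

# Allow CloudWatch Logs to invoke the forwarder Lambda.
lambda_client.add_permission(
    FunctionName=forwarder_arn,
    StatementId="cloudwatch-logs-invoke",
    Action="lambda:InvokeFunction",
    Principal="logs.amazonaws.com",
)

# Stream every event from the log group to the forwarder.
logs.put_subscription_filter(
    logGroupName=log_group,
    filterName="to-azure-forwarder",
    filterPattern="",  # empty pattern = all events
    destinationArn=forwarder_arn,
)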