Send Image over AWS SNS Notification with Boto3 - amazon-web-services

I am trying to send an image stored in the AWS Lambda /tmp/ folder.
I extract frames from an AWS Kinesis Video Stream and write them to /tmp/ using OpenCV.
I am testing the code locally with boto3, but it would normally run in a Lambda function.
import boto3
import base64

if __name__ == '__main__':
    client = boto3.client('sns')
    phone_number = '+12345678900'
    img = open('tmp/image10.jpeg', 'rb').read()
    response = client.publish(
        PhoneNumber=phone_number,
        Message="here is a picture: ",
        MessageAttributes={
            'store': {"DataType": "Binary", "BinaryValue": base64.b64encode(img)}
        }
    )
    print(response)
I am getting a success response and the text from the Message argument, but not the image:
{'MessageId': '2d435caa-bb24-5198-9bdc-04ecef05c0ef', 'ResponseMetadata': {'RequestId': '5t97d18b-p15f-554b-b93f-819df53e64bc', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': '5c97d98b-a15f-554b-bf3f-819df53e64bc', 'content-type': 'text/xml', 'content-length': '294', 'date': 'Fri, 30 Oct 2020 17:49:20 GMT'}, 'RetryAttempts': 0}}
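For what it's worth, SNS SMS delivers plain text only, so a binary message attribute will not turn into a picture on the phone. A common workaround is to upload the image to S3 and text a presigned link to it instead. Here is a minimal sketch of that idea; the helper name is made up for illustration, and `s3` and `sns` are boto3 clients passed in by the caller:

```python
def send_image_link(s3, sns, bucket, key, local_path, phone_number):
    """Upload the image to S3, then text a presigned link to it.

    SNS SMS cannot carry the picture itself, so the recipient opens
    the link instead. `s3` and `sns` are boto3 clients.
    """
    s3.upload_file(local_path, bucket, key)
    url = s3.generate_presigned_url(
        ClientMethod='get_object',
        Params={'Bucket': bucket, 'Key': key},
        ExpiresIn=3600,  # link stays valid for one hour
    )
    return sns.publish(PhoneNumber=phone_number,
                       Message=f'Here is a picture: {url}')
```

Called from the script above it would look roughly like `send_image_link(boto3.client('s3'), boto3.client('sns'), 'my-image-bucket', 'image10.jpeg', '/tmp/image10.jpeg', phone_number)`, where the bucket name is hypothetical.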

Related

Using Postman and AWS Lambdas to upload and download to S3

I have 2 Lambda functions, to upload and download a file from an S3 bucket. I'm using Postman, configuring the GET request to send a filename via JSON (GET JSON payload: {"thefile" : "test_upload.txt"}) and the POST request with a form-data key of "thefile" whose value is the test file selected from my working directory on the computer.
The issue comes when sending the API requests via Postman: it gives me 'Internal Server Error'.
The code for my lambdas is below:
**UPLOAD**
import json
import boto3
import os

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    data = json.loads(event["body"])
    file = data["thefile"]
    print(file)
    try:
        s3.upload_file(file, os.environ['BUCKET_NAME'], file)
        url = s3.generate_presigned_url(
            ClientMethod='get_object',
            Params={
                'Bucket': os.environ['BUCKET_NAME'],
                'Key': file
            },
            ExpiresIn=24 * 3600
        )
        print("Upload Successful", url)
        return {
            "headers": {"Content-Type": "application/json"},
            "statusCode": 200,
            "isBase64Encoded": False,
            "body": str(url)
        }
    except FileNotFoundError:
        print("The file was not found")
        return {
            "headers": {"Content-Type": "application/json"},
            "statusCode": 404,
            "isBase64Encoded": False,
            "body": "File not found"
        }
**DOWNLOAD**
import json
import boto3
import botocore
import base64
from botocore.vendored import requests

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    data = json.loads(event["body"])
    file = data["thefile"]
    try:
        response = s3.get_object(Bucket=BUCKET_NAME, Key=file)
        download_file = response["Body"].read()
        filereturn = base64.b64encode(download_file).decode("utf-8")
        return {
            "headers": {"Content-Type": "application/json"},
            "statusCode": 200,
            "body": json.dumps(filereturn),
            "isBase64Encoded": True,
            "File Downloaded": "File Downloaded successfully"
        }
    except Exception as e:
        return {
            "headers": {"Content-Type": "application/json"},
            "statusCode": 404,
            "body": "Error: File not found!"
        }
Can anyone tell me what I'm doing wrong? The Lambdas have full access to S3 in their policy; I've even switched off authorisation and made the bucket public in case it was a permissions error, but still nothing. It must be something stupid I'm either forgetting or have mis-coded, but I can't for the life of me figure it out!
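One frequent cause of 'Internal Server Error' with this setup is the shape of the request body: Postman form-data is not JSON (and API Gateway may deliver it base64-encoded), so `json.loads(event["body"])` throws before the try block is even reached. A defensive body parser, sketched under the assumption of the Lambda proxy integration, would look something like:

```python
import base64
import json

def parse_body(event):
    """Decode an API Gateway proxy body before reading fields from it.

    API Gateway sets isBase64Encoded when the payload is binary
    (e.g. Postman form-data), so decode first, then try JSON.
    """
    body = event.get("body") or "{}"
    if event.get("isBase64Encoded"):
        body = base64.b64decode(body).decode("utf-8")
    try:
        return json.loads(body)
    except json.JSONDecodeError:
        # Not JSON (e.g. multipart form-data): return it raw for inspection
        return {"raw": body}
```

With this, a handler can at least log what it actually received instead of crashing on the first line.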

Why is Records: [] empty when I consume data from a Kinesis stream with a Python script?

I created a Kinesis data stream and can produce data to it successfully, but when I run the consumer script in Python:
import boto3
import json
from datetime import datetime
import time

my_stream_name = 'test'
kinesis_client = boto3.client('kinesis', region_name='us-east-1')

response = kinesis_client.describe_stream(StreamName=my_stream_name)
my_shard_id = response['StreamDescription']['Shards'][0]['ShardId']

shard_iterator = kinesis_client.get_shard_iterator(StreamName=my_stream_name,
                                                   ShardId=my_shard_id,
                                                   ShardIteratorType='LATEST')
my_shard_iterator = shard_iterator['ShardIterator']

record_response = kinesis_client.get_records(ShardIterator=my_shard_iterator,
                                             Limit=2)

while 'NextShardIterator' in record_response:
    record_response = kinesis_client.get_records(ShardIterator=record_response['NextShardIterator'],
                                                 Limit=2)
    print(record_response)
    # wait for 5 seconds
    time.sleep(5)
But the Records field in the output is empty ('Records': []):
{'Records': [], 'NextShardIterator':
'AAAAAAAAAAFFVFpvvveOquLUe7WO9nZAcYNQdcS6f6a+YGrrrjZo1gULcu/ZYxC7AB+xVlUhgL9UFPrQ22qmcQa6iIsmuKWl26buBk3utXlVqiGuDUYSgqMOtkp0Y7pJwa6N/I0fYfl2PLTXp5Qz8+5ZYuTW1KDt+PeSU3992bwgdOm7744cxcSnYFaQuHqfa0vLlaRBTOACVz4fwjggUBN01WdsoEjKmgtfNmuHSA7s9LLNzAapMg==',
'MillisBehindLatest': 0, 'ResponseMetadata': {'RequestId':
'e451dd27-c867-cf3d-be83-edbe95e9da9f', 'HTTPStatusCode': 200,
'HTTPHeaders': {'x-amzn-requestid':
'e451dd27-c867-cf3d-be83-edbe95e9da9f', 'x-amz-id-2':
'ClSlC3gRJuEqL9YJcHgC2N/TLSv56o+6406ki2+Zohnfo/erFVMDpPqkEWT+XAeeHXCdhYBbnOeZBPyesbXnVs45KQG78eRU',
'date': 'Thu, 14 Apr 2022 14:23:21 GMT', 'content-type':
'application/x-amz-json-1.1', 'content-length': '308'},
'RetryAttempts': 0}}
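A likely cause: a 'LATEST' shard iterator only returns records written *after* the iterator is created, so if the producer ran earlier, every batch comes back empty. Starting from 'TRIM_HORIZON' reads from the oldest record still retained in the stream. A sketch of the consumer loop with the client passed in (in the real script this would be the boto3 kinesis client):

```python
def read_records(client, stream_name, iterator_type='TRIM_HORIZON',
                 batches=3, limit=2):
    """Drain a few batches from the first shard of a stream.

    'TRIM_HORIZON' starts at the oldest untrimmed record, which is
    usually what you want when the producer wrote data before the
    consumer started; 'LATEST' would skip everything already written.
    """
    shard_id = client.describe_stream(StreamName=stream_name)[
        'StreamDescription']['Shards'][0]['ShardId']
    iterator = client.get_shard_iterator(
        StreamName=stream_name, ShardId=shard_id,
        ShardIteratorType=iterator_type)['ShardIterator']
    records = []
    for _ in range(batches):
        resp = client.get_records(ShardIterator=iterator, Limit=limit)
        records.extend(resp['Records'])
        iterator = resp.get('NextShardIterator')
        if iterator is None:  # shard closed
            break
    return records
```

With the original script, changing `ShardIteratorType='LATEST'` to `'TRIM_HORIZON'` (or producing data while the consumer loop is already running) should make records appear.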

No message is showing up in lambda log in aws

I have an SQS queue that triggers a Lambda as soon as it gets a message. The Lambda is getting triggered, but I can't see any messages in the response when I use the receive_message API call.
import json
import urllib.parse
import boto3

sqs = boto3.client('sqs')

def lambda_handler(event, context):
    try:
        response = sqs.receive_message(
            QueueUrl="https://sqs.us-east-1.amazonaws.com/*****/test-queue"
        )
        print("Response", response)
        return
    except Exception as e:
        print(e)
        raise e
Output
{'ResponseMetadata': {'RequestId': 'd4d364b9-ac8c-5dcd-a174-33b4aae995c9', 'HTTPStatusCode': 200, 'HTTPHeaders': {'x-amzn-requestid': 'd4d364b9-ac8c-5dcd-a174-33b4aae995c9', 'date': 'Sun, 12 Apr 2020 12:05:57 GMT', 'content-type': 'text/xml', 'content-length': '240'}, 'RetryAttempts': 0}}
Can someone help me here?
If you have your lambda function being automatically triggered by SQS, then the message(s) will be in the event object.
You don't have to call sqs.receive_message explicitly to read messages from the queue. Just use the event object.
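A minimal handler along those lines, assuming the standard SQS-to-Lambda event shape (each record carries the message text in its `body` field):

```python
def lambda_handler(event, context):
    """Messages delivered by the SQS trigger arrive in event['Records']."""
    bodies = []
    for record in event.get('Records', []):
        print('MessageId:', record['messageId'])
        print('Body:', record['body'])  # the message text that was sent
        bodies.append(record['body'])
    return bodies
```

The trigger has already received (and will delete) the messages on your behalf, which is why a separate `receive_message` call finds the queue empty.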

How to create a data catalog in Amazon Glue externally?

I want to create a data catalog externally in AWS Glue. Is there any way?
The AWS Glue Data Catalog consists of metadata about various data sources within AWS, e.g. S3, DynamoDB, etc.
Instead of using Crawlers or the AWS Console, you can populate the Data Catalog directly through the AWS Glue API, which covers the different structures such as Database, Table, etc. AWS provides SDKs for several languages, e.g. boto3 for Python with an easy-to-use object-oriented API. As long as you know your data structure, you can use the methods
create_database()
create_table()
create_partition()
batch_create_partition()
Create Database definition:
from pprint import pprint
import boto3

client = boto3.client('glue')

response = client.create_database(
    DatabaseInput={
        'Name': 'my_database',  # Required
        'Description': 'Database created with boto3 API',
        'Parameters': {
            'my_param_1': 'my_param_value_1'
        },
    }
)
pprint(response)

# Output
{
    'ResponseMetadata': {
        'HTTPHeaders': {
            'connection': 'keep-alive',
            'content-length': '2',
            'content-type': 'application/x-amz-json-1.1',
            'date': 'Fri, 11 Oct 2019 12:37:12 GMT',
            'x-amzn-requestid': '12345-67890'
        },
        'HTTPStatusCode': 200,
        'RequestId': '12345-67890',
        'RetryAttempts': 0
    }
}
Create Table definition:
response = client.create_table(
    DatabaseName='my_database',
    TableInput={
        'Name': 'my_table',
        'Description': 'Table created with boto3 API',
        'StorageDescriptor': {
            'Columns': [
                {
                    'Name': 'my_column_1',
                    'Type': 'string',
                    'Comment': 'This is very useful column',
                },
                {
                    'Name': 'my_column_2',
                    'Type': 'string',
                    'Comment': 'This is not as useful',
                },
            ],
            'Location': 's3://some/location/on/s3',
        },
        'Parameters': {
            'classification': 'json',
            'typeOfData': 'file',
        }
    }
)
pprint(response)

# Output
{
    'ResponseMetadata': {
        'HTTPHeaders': {
            'connection': 'keep-alive',
            'content-length': '2',
            'content-type': 'application/x-amz-json-1.1',
            'date': 'Fri, 11 Oct 2019 12:38:57 GMT',
            'x-amzn-requestid': '67890-12345'
        },
        'HTTPStatusCode': 200,
        'RequestId': '67890-12345',
        'RetryAttempts': 0
    }
}
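For create_partition(), the PartitionInput payload follows the same pattern as the table definition above. A small helper for building it, sketched with an illustrative partition value and S3 prefix:

```python
def partition_input(values, location, columns):
    """Build the PartitionInput payload for glue.create_partition().

    `values` holds one entry per partition key defined on the table;
    `location` is the S3 prefix holding that partition's data;
    `columns` is a list of (name, type) pairs.
    """
    return {
        'Values': values,
        'StorageDescriptor': {
            'Columns': [{'Name': name, 'Type': col_type}
                        for name, col_type in columns],
            'Location': location,
        },
    }
```

Used against the table above, it would look roughly like `client.create_partition(DatabaseName='my_database', TableName='my_table', PartitionInput=partition_input(['2019-10-11'], 's3://some/location/on/s3/dt=2019-10-11', [('my_column_1', 'string'), ('my_column_2', 'string')]))`, where the date value and prefix are made up for the example.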

How to call an external API webservice from Python in Amazon Lambda?

I am pretty new to AWS Lambda. I have Python code that makes an external API call with a POST request:
import requests
import json

url = "http://sample/project"
headers = {
    'content-type': "application/json"
}

r = requests.post(url, headers=headers)
I tried putting it into an AWS Lambda handler like this, but it didn't work:
import requests
import json

url = "http://sample/project"
headers = {
    'content-type': "application/json"
}

def lambda_handler(event, context):
    response = requests.request("POST", url, headers=headers)
    return response
But I am not getting any response, whereas running it from my local machine gives the output. Please help me make a POST call from AWS Lambda.