Using Postman and AWS Lambda to upload to and download from S3

I have two Lambda functions, one to upload a file to an S3 bucket and one to download a file from it. In Postman I've configured the POST and GET requests so that the GET sends the filename as JSON (payload: {"thefile" : "test_upload.txt"}) and the POST uses a form-data key of "thefile" with the test file from my working directory on the computer selected as the value.
The issue comes when sending the API requests via Postman: I get 'Internal Server Error'.
The code for my Lambdas is below:
**UPLOAD**

import json
import boto3
import os

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    data = json.loads(event["body"])
    file = data["thefile"]
    print(file)
    try:
        s3.upload_file(file, os.environ['BUCKET_NAME'], file)
        url = s3.generate_presigned_url(
            ClientMethod='get_object',
            Params={
                'Bucket': os.environ['BUCKET_NAME'],
                'Key': file
            },
            ExpiresIn=24 * 3600
        )
        print("Upload Successful", url)
        return {
            "headers": { "Content-Type": "application/json" },
            "statusCode": 200,
            "isBase64Encoded": False,
            "body": str(url)
        }
    except FileNotFoundError:
        print("The file was not found")
        return {
            "headers": { "Content-Type": "application/json" },
            "statusCode": 404,
            "isBase64Encoded": False,
            "body": "File not found"
        }
**DOWNLOAD**

import json
import boto3
import botocore
import base64
from botocore.vendored import requests

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    data = json.loads(event["body"])
    file = data["thefile"]
    try:
        response = s3.get_object(Bucket=BUCKET_NAME, Key=file)
        download_file = response["Body"].read()
        filereturn = base64.b64encode(download_file).decode("utf-8")
        return {
            "headers": { "Content-Type": "application/json" },
            "statusCode": 200,
            "body": json.dumps(filereturn),
            "isBase64Encoded": True,
            "File Downloaded": "File Downloaded successfully"
        }
    except Exception as e:
        return {
            "headers": { "Content-Type": "application/json" },
            "statusCode": 404,
            "body": "Error: File not found!"
        }
Can anyone tell me what I'm doing wrong? The Lambdas have full access to S3 in their policy; I've even switched off authorisation and made the bucket public in case it was a permissions error, but still nothing. It must be something stupid I'm either forgetting or have mis-coded, but I can't for the life of me figure it out!
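Since nothing in the question shows what API Gateway actually hands to the functions, a minimal diagnostic sketch (assuming a proxy integration) that logs the raw event and returns the real exception text instead of a bare 'Internal Server Error' can help narrow this down:

import json
import traceback

def lambda_handler(event, context):
    # Diagnostic sketch (assumes API Gateway proxy integration): log the raw
    # event so CloudWatch shows exactly what Postman sent, and surface the
    # real exception text instead of a bare "Internal Server Error".
    print(json.dumps(event))
    try:
        data = json.loads(event["body"])
        return {"statusCode": 200, "body": json.dumps(data)}
    except Exception as exc:
        print(traceback.format_exc())
        return {"statusCode": 500, "body": str(exc)}

Two things such a dump tends to reveal in setups like this: a multipart form-data body is not JSON, so json.loads(event["body"]) raises before s3.upload_file is ever reached, and s3.upload_file(file, ...) expects a path that exists on the Lambda's own filesystem, not a file attached to the request.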

Related

FastAPI response body is not shown in Postman when importing the Swagger JSON

I have defined my response model like this:

@router.post("/test", responses=response.test)

where the other responses are defined like this:

class Example_200(BaseModel):
    class Config:
        schema_extra = {
            "example": {
                "data": {
                    "id": "918814b7d3934655aafb2c1ade2f5554",
                    "deleted": True
                }
            }
        }

test = {
    "200": {"model": Example_200},
    "400": {"model": Example_400},
    "404": {"model": Example_404},
}
But when I import the OpenAPI JSON into Postman, the 200 response example is not shown; it displays like this. What is wrong here?
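For comparison, here is a minimal self-contained sketch of how the responses parameter is usually wired up; the data field declaration and the endpoint body are assumptions added only to make the example runnable (in the question the model carries just Config.schema_extra):

from typing import Any, Dict
from fastapi import APIRouter
from pydantic import BaseModel

router = APIRouter()

class Example_200(BaseModel):
    # Assumed field, so the generated schema has a property for the example to attach to
    data: Dict[str, Any]

    class Config:
        schema_extra = {
            "example": {
                "data": {
                    "id": "918814b7d3934655aafb2c1ade2f5554",
                    "deleted": True
                }
            }
        }

test = {
    200: {"model": Example_200},
}

@router.post("/test", responses=test)
def test_endpoint():
    # Assumed endpoint body, returning data in the documented shape
    return {"data": {"id": "918814b7d3934655aafb2c1ade2f5554", "deleted": True}}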

Problem importing a dashboard: The response shows a login page

When I use Superset's API to import a dashboard, the response shows me a login page.
I am doing the request using Python.
import requests

headers = {
    'accept': 'application/json',
    'Authorization': f'Bearer {jwt_token}',
    'X-CSRFToken': csrf_token,
    'Referer': url
}
files = {
    'formData': (
        dashboard_path,
        open(dashboard_path, 'rb'),
        'application/json'
    )
}
response = requests.get(url, files=files, headers=headers)
Does anyone know how to solve this problem?
I had some trouble with the Superset API myself, mostly because I did not handle the CSRF token correctly:
It seems to be important that the retrieval of the JWT token, the CSRF token, and the actual request all happen in the same session.
If I don't do that, I can reproduce your error and am also sent to the login page (also, you use a GET request in this example, but it should be POST).
Here is an example from my local test setup:
import requests

session = requests.session()

jwt_token = session.post(
    url='http://localhost:8088/api/v1/security/login',
    json={
        "username": "admin",
        "password": "admin",
        "refresh": False,
        "provider": "db"
    }
).json()["access_token"]

csrf_token = session.get(
    url='http://localhost:8088/api/v1/security/csrf_token/',
    headers={
        'Authorization': f'Bearer {jwt_token}',
    }
).json()["result"]

headers = {
    'accept': 'application/json',
    'Authorization': f'Bearer {jwt_token}',
    'X-CSRFToken': csrf_token,
}

response = requests.post(
    'http://localhost:8088/api/v1/dashboard/import',
    headers=headers,
    files={
        "formData": ('dashboards.json', open("dashboards.json", "rb"), 'application/json')
    },
)

session.close()

AWS Lambda: key error when sending a POST message

I have a very simple problem: my Lambda function works fine as long as I do not write something like a = event["key1"] and instead use a = "test".
This is from CloudWatch:
[ERROR] KeyError: 'key1'
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 5, in lambda_handler
    a = event["key1"]
This is what I sent with Postman (I even tried curl) in the body as raw data:
{
    "key1": "value1",
    "key2": "value2",
    "key3": "value3"
}
My Lambda function looks like this:
import json

def lambda_handler(event, context):
    # TODO implement
    a = event["key1"]
    return {
        'statusCode': 200,
        'body': json.dumps(a)
    }
With a REST API, the LAMBDA integration type passes the request as-is, whereas LAMBDA_PROXY appends additional metadata such as query parameters, API keys, etc. The input request body is therefore passed as a JSON string in the body attribute, and json.loads(event['body']) will give you the actual request body.
More details on changing the integration type are here.
The code below extracts key1 from the input JSON object for LAMBDA_PROXY.
import json

def lambda_handler(event, context):
    print(event)
    a = json.loads(event['body'])['key1']
    return {
        'statusCode': 200,
        'body': json.dumps(a)
    }
The fastest way for me was to use an HTTP API with form-data (key1=test). I then printed event["body"] and found out that my body was base64-encoded. I used the following code to make that visible:
import json
import base64

def lambda_handler(event, context):
    # TODO implement
    a = event["body"]
    print(a)
    message_bytes = base64.b64decode(a)
    message = message_bytes.decode('ascii')
    return {
        'statusCode': 200,
        'body': json.dumps(message)
    }
The output was:
"----------------------------852732202793625384403314\r\nContent-Disposition: form-data; name=\"key1\"\r\n\r\ntest\r\n----------------------------852732202793625384403314--\r\n"

DynamoDB using the Serverless Python Template gives KeyError for body

The code for the lambda function is the following:
import json
import logging
import os
import time
import uuid
import boto3

dynamodb = boto3.resource('dynamodb')

def create(event, context):
    data = json.loads(event['body'])
    if 'text' not in data:
        logging.error("Validation Failed")
        raise Exception("Couldn't create the todo item.")
    timestamp = str(time.time())
    table = dynamodb.Table(os.environ['DYNAMODB_TABLE'])
    item = {
        'id': str(uuid.uuid1()),
        'name': data['text'],
        'description': data['text'],
        'price': data['text'],
        'createdAt': timestamp,
        'updatedAt': timestamp,
    }
    # write the todo to the database
    table.put_item(Item=item)
    # create a response
    response = {
        "statusCode": 200,
        "body": json.dumps(item)
    }
    return response
The test event, using AWS Lambda's testing feature, is:
{
    "name": "Masks",
    "description": "A box of 50 disposable masks",
    "price": "$10"
}
The log output is:
START RequestId: 5cf1c00a-dba5-4ef6-b5e7-b692d8235ffe Version: $LATEST
[ERROR] KeyError: 'body'
Traceback (most recent call last):
  File "/var/task/todos/create.py", line 12, in create
    data = json.loads(event['body'])
END RequestId: 5cf1c00a-dba5-4ef6-b5e7-b692d8235ffe
Why is "body" giving me a key error? How do I fix this? The template is directly from www.severless.com, and based off of online tutorials, people have used the exact same code, albie with different values, successfully?
I've tried changing variable names and value to no avail.
sls deploy
Does successfully create the table, but I am unable to input any data into it.
Edit 1: For those of you unfamiliar with AWS' Lambda Test feature, using Postman to input the same data is leading either to a 502 Gateway Error.
Assuming that this is the correct event object:
{
    "name": "Masks",
    "description": "A box of 50 disposable masks",
    "price": "$10"
}
your code which matches this event should be:
import json
import logging
import os
import time
import uuid
import boto3

dynamodb = boto3.resource('dynamodb')

def create(event, context):
    timestamp = str(time.time())
    table = dynamodb.Table(os.environ['DYNAMODB_TABLE'])
    item = {
        'id': str(uuid.uuid1()),
        'name': event['name'],
        'description': event['description'],
        'price': event['price'],
        'createdAt': timestamp,
        'updatedAt': timestamp,
    }
    # write the todo to the database
    table.put_item(Item=item)
    # create a response
    response = {
        "statusCode": 200,
        "body": json.dumps(item)
    }
    return response
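If the same handler must also serve API Gateway proxy requests, where the payload arrives as a JSON string under 'body', one possible sketch (an assumption about the setup, not part of the serverless template) is to accept both event shapes:

import json

def create(event, context):
    # Assumption: a console test event carries the fields at the top level,
    # while an API Gateway proxy event wraps them as a JSON string under 'body'.
    data = json.loads(event['body']) if 'body' in event else event
    item = {
        'name': data['name'],
        'description': data['description'],
        'price': data['price'],
    }
    return {
        "statusCode": 200,
        "body": json.dumps(item)
    }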

How to call an external API webservice from Python in Amazon Lambda?

I am pretty new to AWS Lambda. I have Python code that makes a POST request to an external API:
import requests
import json

url = "http://sample/project"
headers = {
    'content-type': "application/json"
}

r = requests.post(url, headers=headers)
I tried putting it into an AWS Lambda handler like this, but it didn't work:
import requests
import json

url = "http://sample/project"
headers = {
    'content-type': "application/json"
}

def lambda_handler(event, context):
    response = requests.request("POST", url, headers=headers)
    return response
But I am not getting any response, whereas when I run it from my local machine I get the output. Please help me: how can I make a POST call from AWS Lambda?
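A minimal sketch of one way to get this working (assumptions: the requests library is not bundled with the Lambda Python runtime, so this uses urllib3, which ships alongside boto3, and the handler returns a JSON-serialisable dict rather than the raw response object):

import json
import urllib3

# urllib3 is available in the Lambda Python runtime because boto3 depends on it
http = urllib3.PoolManager()
URL = "http://sample/project"
HEADERS = {'content-type': "application/json"}

def lambda_handler(event, context):
    # POST to the external service and return a plain dict so the invoker
    # (or API Gateway) can serialise the result
    resp = http.request("POST", URL, headers=HEADERS, body=json.dumps({}))
    return {
        "statusCode": resp.status,
        "body": resp.data.decode("utf-8")
    }

If requests itself is preferred, it has to be packaged with the deployment (for example in a layer), since it is no longer vendored inside botocore.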