Error when sending message from Lambda to DLQ

I am following this article in order to send from a lambda to a DLQ:
Using dead-letter queues in Amazon SQS — Boto3 documentation
The code is as follows:
from datetime import datetime
import json
import os
import boto3
from botocore.vendored import requests

QUEUE_NAME = os.environ['QUEUE_NAME']
MAX_QUEUE_MESSAGES = os.environ['MAX_QUEUE_MESSAGES']
dead_letter_queue_arn = os.environ['DEAD_LETTER_QUEUE_ARN']
sqs = boto3.resource('sqs')
queue_url = os.environ['SQS_QUEUE_URL']

redrive_policy = {
    'deadLetterTargetArn': dead_letter_queue_arn,
    'maxReceiveCount': '10'
}

def lambda_handler(event, context):
    # Receive messages from SQS queue
    queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)
    response = requests.post("http://httpbin.org/status/500", timeout=10)
    if response.status_code == 500:
        sqs.set_queue_attributes(
            QueueUrl=queue_url,
            Attributes={
                'RedrivePolicy': json.dumps(redrive_policy)
            }
        )
I am doing it this way because I need to implement exponential backoff, but I cannot even send to the DLQ because of this error:
[ERROR] AttributeError: 'sqs.ServiceResource' object has no attribute 'set_queue_attributes'
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 24, in lambda_handler
    sqs.set_queue_attributes(QueueUrl=queue_url,
According to the set_queue_attributes() documentation, the object should have a set_queue_attributes attribute.

Well, in case anyone has the same problem: there is a difference between a client and a resource. I suppose the error message contains the necessary information, but for me it was difficult to spot. According to this:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sqs.html#SQS.Client.set_queue_attributes
response = client.set_queue_attributes(
    QueueUrl='string',
    Attributes={
        'string': 'string'
    }
)
You should be using the client:
import boto3
client = boto3.client('sqs')
My mistake was that I already had something from boto related to SQS:
sqs = boto3.resource('sqs')
That is the reason for the error
[ERROR] AttributeError: 'sqs.ServiceResource' object has no attribute 'set_queue_attributes'
because I needed to use a client instead of a resource for SQS.
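Putting the fix together, a minimal corrected handler might look like the sketch below. It reuses the question's environment variable names; the helper-function split is mine, not part of the original code:

```python
import json
import os


def build_redrive_policy(dead_letter_queue_arn, max_receive_count=10):
    """Serialize the RedrivePolicy queue attribute as a JSON string."""
    return json.dumps({
        'deadLetterTargetArn': dead_letter_queue_arn,
        'maxReceiveCount': str(max_receive_count),
    })


def attach_dlq(sqs_client, queue_url, dead_letter_queue_arn):
    """Attach a dead-letter queue via set_queue_attributes.

    set_queue_attributes exists on the SQS *client*, not on the
    resource, so sqs_client must come from boto3.client('sqs').
    """
    sqs_client.set_queue_attributes(
        QueueUrl=queue_url,
        Attributes={'RedrivePolicy': build_redrive_policy(dead_letter_queue_arn)},
    )


def lambda_handler(event, context):
    import boto3  # imported here so the helpers above stay testable offline
    client = boto3.client('sqs')  # client, not boto3.resource('sqs')
    attach_dlq(client,
               os.environ['SQS_QUEUE_URL'],
               os.environ['DEAD_LETTER_QUEUE_ARN'])
```

If you also need the resource API (e.g. for get_queue_by_name), you can keep both: the resource and the client can coexist, they just expose different methods.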

Related

Error sending MIME Multipart message through AWS Lambda via Amazon SES

I am trying to send an email using Amazon SES, AWS S3, and AWS Lambda together. I have been hitting an error like this for a while now and I am not completely sure what to do. I have the stack trace from the error below.
Edit: I have a fully verified Amazon SES domain and I am receiving emails to trigger the Lambda function. I am also able to successfully send emails using the built-in testing features, just not using this function.
{
    "errorMessage": "'list' object has no attribute 'encode'",
    "errorType": "AttributeError",
    "stackTrace": [
        " File \"/var/task/lambda_function.py\", line 222, in lambda_handler\n message = create_message(file_dict, header_from, header_to)\n",
        " File \"/var/task/lambda_function.py\", line 175, in create_message\n \"Data\": msg.as_string()\n",
        " File \"/var/lang/lib/python3.7/email/message.py\", line 158, in as_string\n g.flatten(self, unixfrom=unixfrom)\n",
        " File \"/var/lang/lib/python3.7/email/generator.py\", line 116, in flatten\n self._write(msg)\n",
        " File \"/var/lang/lib/python3.7/email/generator.py\", line 195, in _write\n self._write_headers(msg)\n",
        " File \"/var/lang/lib/python3.7/email/generator.py\", line 222, in _write_headers\n self.write(self.policy.fold(h, v))\n",
        " File \"/var/lang/lib/python3.7/email/_policybase.py\", line 326, in fold\n return self._fold(name, value, sanitize=True)\n",
        " File \"/var/lang/lib/python3.7/email/_policybase.py\", line 369, in _fold\n parts.append(h.encode(linesep=self.linesep, maxlinelen=maxlinelen))\n"
    ]
}
Additionally, here is the relevant code; the start of it is within a create_message() method:
import os
import boto3
import email
import re
from botocore.exceptions import ClientError
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.application import MIMEApplication

. . .

    # Create a MIME container.
    msg = MIMEMultipart()
    # Create a MIME text part.
    text_part = MIMEText(body_text, _subtype="html")
    # Attach the text part to the MIME message.
    msg.attach(text_part)
    # Add subject, from and to lines.
    msg['Subject'] = subject
    msg['From'] = sender
    msg['To'] = recipient
    # Create a new MIME object.
    att = MIMEApplication(file_dict["file"], filename)
    att.add_header("Content-Disposition", 'attachment', filename=filename)
    # Attach the file object to the message.
    msg.attach(att)
    message = {
        "Source": sender,
        "Destinations": recipient,
        "Data": msg.as_string()  # The error occurs here
    }
    return message

def send_email(message):
    aws_region = os.environ['Region']
    # Create a new SES client.
    client_ses = boto3.client('ses', aws_region)
    # Send the email.
    try:
        # Provide the contents of the email.
        response = client_ses.send_raw_email(
            Source=message['Source'],
            Destinations=[
                message['Destinations']
            ],
            RawMessage={
                'Data': message['Data']
            }
        )
If you have any insight as far as what should be done, that would be greatly appreciated. I've looked at similar questions but they did not resolve my issue. Thanks for your help!
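The traceback points at header folding: msg.as_string() calls policy.fold on every header, and h.encode fails because a header value is a list rather than a string. The most likely culprit is a recipient list being assigned directly to msg['To'] (or a similar header). A minimal reproduce-and-fix sketch, independent of SES; the function signature and names here are illustrative, not taken from the original code:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def create_message(sender, recipients, subject, body_text):
    """Build a MIME message; every header value must be a string."""
    msg = MIMEMultipart()
    msg.attach(MIMEText(body_text, _subtype="html"))
    msg['Subject'] = subject
    msg['From'] = sender
    # Assigning a list here makes msg.as_string() raise
    # AttributeError: 'list' object has no attribute 'encode',
    # so join multiple recipients into one comma-separated string.
    msg['To'] = ', '.join(recipients) if isinstance(recipients, list) else recipients
    return msg


msg = create_message("from@example.com",
                     ["a@example.com", "b@example.com"],
                     "Hello", "<p>Hi there</p>")
flattened = msg.as_string()  # flattens cleanly now
```

Note that send_raw_email's Destinations parameter, by contrast, does expect a Python list; only the MIME headers need to be strings.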

Describe listener rule count using Boto3

I need to list the listener rule count, but I get an output of null without any error. The goal of my project is to get an email notification when listener rules are created on an Elastic Load Balancer.
import json
import boto3
def lambda_handler(event, context):
client = boto3.client('elbv2')
response = client.describe_listeners(
ListenerArns=[
'arn:aws:elasticloadbalancing:ap-south-1:my_alb_listener',
],
)
print('response')
Here is the output of my code
Response
null
Response is null because your indentation is incorrect and you are not returning anything from your handler (note also that print('response') prints the literal string 'response', not the variable). It should be:
import json
import boto3

def lambda_handler(event, context):
    client = boto3.client('elbv2')
    response = client.describe_listeners(
        ListenerArns=[
            'arn:aws:elasticloadbalancing:ap-south-1:my_alb_listener',
        ],
    )
    return response
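Separately, note that describe_listeners describes the listeners themselves; if the goal is the number of rules on a listener, describe_rules is the matching call. A sketch, reusing the placeholder ARN from the question and counting only the first page of results:

```python
def count_listener_rules(elbv2_client, listener_arn):
    """Count the rules on a listener; describe_rules returns them under 'Rules'."""
    response = elbv2_client.describe_rules(ListenerArn=listener_arn)
    return len(response['Rules'])


def lambda_handler(event, context):
    import boto3  # imported here so count_listener_rules stays testable offline
    client = boto3.client('elbv2')
    count = count_listener_rules(
        client, 'arn:aws:elasticloadbalancing:ap-south-1:my_alb_listener')
    return {'rule_count': count}
```

The count includes the default rule every listener has; for listeners with many rules you would also follow the Marker field to page through the remainder.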

Missing handler error on Lambda but handler exists in script

My lambda_function.py file looks like this:
from urllib.request import urlopen
from google.cloud import bigquery
import json

url = "https://data2.unhcr.org/population/get/timeseries?widget_id=286725&sv_id=54&population_group=5460&frequency=day&fromDate=1900-01-01"
bq_client = bigquery.Client()

def lambda_helper(event, context):
    response = urlopen(url)
    data_json = json.loads(response.read())
    bq_client.load_table_from_json(data_json['data']['timeseries'], "xxx.xxxx.tablename")
But every time I zip it up and upload it to my Lambda I get this error:
{
"errorMessage": "Handler 'lambda_handler' missing on module 'lambda_function'",
"errorType": "Runtime.HandlerNotFound",
"stackTrace": []
}
Is there a reason this error would be thrown even though I clearly have that function written in this module? This is driving me nuts. Thanks for any help!
It should be:
def lambda_handler(event, context):
not
def lambda_helper(event, context):
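The runtime does not scan the file for just any function; it looks up exactly the name configured in the handler setting ("module.function", here lambda_function.lambda_handler). A small sketch of that lookup, which is roughly what produces Runtime.HandlerNotFound:

```python
import types

# A stand-in for the deployed lambda_function module, which defines
# lambda_helper instead of lambda_handler.
lambda_function = types.ModuleType("lambda_function")
exec("def lambda_helper(event, context):\n    return 'ok'",
     lambda_function.__dict__)

# The Python runtime resolves the configured handler string with an
# attribute lookup on the imported module, roughly like this:
handler = getattr(lambda_function, "lambda_handler", None)

print(handler is None)  # True: the configured name is not found
```

Renaming the function (or changing the function's handler setting to lambda_function.lambda_helper) makes the lookup succeed.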

Invoking an endpoint in AWS with a multidimensional array

I have deployed a Tensorflow-Model in SageMaker Studio following this tutorial:
https://aws.amazon.com/de/blogs/machine-learning/deploy-trained-keras-or-tensorflow-models-using-amazon-sagemaker/
The Model needs a Multidimensional Array as input. Invoking it from the Notebook itself is working:
import numpy as np
import json
data = np.load("testValues.npy")
pred=predictor.predict(data)
But I wasn't able to invoke it from a boto3 client using this code:
import json
import boto3
import numpy as np
import io
client = boto3.client('runtime.sagemaker')
datain = np.load("testValues.npy")
data = datain.tolist()
response = client.invoke_endpoint(EndpointName=endpoint_name, Body=json.dumps(data))
response_body = response['Body']
print(response_body.read())
This throws the Error:
An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (415) from model with message "{"error": "Unsupported Media Type: Unknown"}".
I guess the reason is the JSON media type, but I have no clue how to get it back into shape.
I tried this: https://github.com/aws/amazon-sagemaker-examples/issues/644 but it doesn't seem to change anything.
This fixed it for me:
The Content Type was missing.
import json
import boto3
import numpy as np
import io

client = boto3.client('runtime.sagemaker', aws_access_key_id=..., aws_secret_access_key=..., region_name=...)
endpoint_name = '...'
data = np.load("testValues.npy")
payload = json.dumps(data.tolist())
response = client.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType='application/json',
    Body=payload
)
result = json.loads(response['Body'].read().decode())
res = result['predictions']
print(res)

Connecting to SNS in Boto and Python

I am new to boto and Python, and I am trying to connect to SNS. Here is my sample code:
import boto
sns = boto.connect_sns(aws_access_key_id="my access", aws_secret_access_key="mysecret", region_name='us-east-1')
I am getting error:
Traceback (most recent call last):
  File "sns.py", line 5, in <module>
    sns = boto.connect_sns(aws_access_key_id="XXXXXXXX", aws_secret_access_key="XXXXXXX", region_name='us-east-1')
AttributeError: 'module' object has no attribute 'connect_sns'
Any help in this regard is greatly appreciated.
These days, it is recommended that you use boto3 rather than boto (v2). Here's some sample code:
import boto3
client = boto3.client('sns', region_name='ap-southeast-2')
response = client.list_topics()
See: boto3 documentation
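As a next step after list_topics, publishing a message goes through the client's publish call. A sketch; the helper name is mine, and the topic ARN would come from list_topics or create_topic rather than being hard-coded:

```python
def publish_message(sns_client, topic_arn, subject, body):
    """Publish to an SNS topic and return the MessageId from the response."""
    response = sns_client.publish(TopicArn=topic_arn,
                                  Subject=subject,
                                  Message=body)
    return response['MessageId']


# With real credentials this would be used roughly like:
#   client = boto3.client('sns', region_name='ap-southeast-2')
#   topic_arn = client.list_topics()['Topics'][0]['TopicArn']
#   publish_message(client, topic_arn, 'Test subject', 'Hello from boto3')
```

Credentials are best left to the default provider chain (environment variables, shared config, or an IAM role) instead of being passed in code as in the boto v2 snippet above.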