Describe listener rule count using Boto3 - amazon-web-services

I need to get the count of listener rules, but I keep getting a null output without any error. The overall goal of my project is to get an email notification when listener rules are created on an Elastic Load Balancer.
import json
import boto3

def lambda_handler(event, context):
    client = boto3.client('elbv2')
    response = client.describe_listeners(
        ListenerArns=[
            'arn:aws:elasticloadbalancing:ap-south-1:my_alb_listener',
        ],
    )
    print('response')
Here is the output of my code
Response
null

The response is null because you are not returning anything from your handler; the Lambda console shows the handler's return value, and print('response') only writes the literal string to the logs. It should be:
import json
import boto3

def lambda_handler(event, context):
    client = boto3.client('elbv2')
    response = client.describe_listeners(
        ListenerArns=[
            'arn:aws:elasticloadbalancing:ap-south-1:my_alb_listener',
        ],
    )
    return response
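If what you actually need is the number of rules on the listener, a minimal sketch (assuming a full, valid listener ARN in place of the shortened placeholder above) could call describe_rules and count the entries. Note that describe_rules is paginated, so for listeners with many rules you would also follow NextMarker:
import boto3

def lambda_handler(event, context):
    client = boto3.client('elbv2')
    # Placeholder ARN -- replace with your real listener ARN.
    listener_arn = 'arn:aws:elasticloadbalancing:ap-south-1:123456789012:listener/app/my-alb/xxxx/yyyy'
    response = client.describe_rules(ListenerArn=listener_arn)
    rule_count = len(response['Rules'])
    return {'rule_count': rule_count}
From there, publishing the count (or a change in the count) to an SNS topic with an email subscription could cover the notification part of the project.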


How can I listen to Google Cloud Pub/Sub messages continuously in a FastAPI app?

I'm using Google Cloud Scheduler to send messages to a Pub/Sub topic. I want to listen for those messages continuously, but my code executes only once and then stops listening.
main.py
from fastapi import FastAPI, Depends
from typing import List
from core.config import get_db
from sqlalchemy.orm import Session

app = FastAPI()

from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("project_id", "subscription_id")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received {message}.")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
print(f"Listening for messages on {subscription_path}..\n")

with subscriber:
    try:
        streaming_pull_future.result(timeout=5)
    except TimeoutError:
        streaming_pull_future.cancel()  # Trigger the shutdown.
        streaming_pull_future.result()  # Block until the shutdown is complete.

@app.get("/")
def home(db: Session = Depends(get_db)):
    return {
        "message": "Welcome!"
    }
Is there a way to listen for Pub/Sub messages continuously in FastAPI?
Thanks!
https://cloud.google.com/pubsub/docs/publish-receive-messages-client-library
https://fastapi.tiangolo.com/advanced/websockets/
I've solved the problem just by deleting:
with subscriber:
    try:
        streaming_pull_future.result(timeout=5)
    except TimeoutError:
        streaming_pull_future.cancel()  # Trigger the shutdown.
        streaming_pull_future.result()  # Block until the shutdown is complete.
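Removing that block works because subscriber.subscribe() already starts pulling on background threads; the with block was closing the subscriber and cancelling the pull after five seconds. Another pattern, sketched here under the assumption that the subscriber should live for the lifetime of the app, is to start the streaming pull in a FastAPI startup event and cancel it on shutdown, so the web app keeps serving requests while messages are handled in the background:
from fastapi import FastAPI
from google.cloud import pubsub_v1

app = FastAPI()

subscriber = pubsub_v1.SubscriberClient()
# Placeholder project/subscription IDs -- replace with real values.
subscription_path = subscriber.subscription_path("project_id", "subscription_id")
streaming_pull_future = None

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received {message}.")
    message.ack()

@app.on_event("startup")
def start_pubsub_listener():
    # The returned future keeps pulling on background threads until cancelled.
    global streaming_pull_future
    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)

@app.on_event("shutdown")
def stop_pubsub_listener():
    if streaming_pull_future is not None:
        streaming_pull_future.cancel()   # Trigger the shutdown.
        streaming_pull_future.result()   # Block until the shutdown is complete.
    subscriber.close()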

Missing handler error on Lambda but handler exists in script

My lambda_function.py file looks like this:
from urllib.request import urlopen
from google.cloud import bigquery
import json

url = "https://data2.unhcr.org/population/get/timeseries?widget_id=286725&sv_id=54&population_group=5460&frequency=day&fromDate=1900-01-01"
bq_client = bigquery.Client()

def lambda_helper(event, context):
    response = urlopen(url)
    data_json = json.loads(response.read())
    bq_client.load_table_from_json(data_json['data']['timeseries'], "xxx.xxxx.tablename")
But every time I zip it up and upload it to my Lambda I get this error:
{
    "errorMessage": "Handler 'lambda_handler' missing on module 'lambda_function'",
    "errorType": "Runtime.HandlerNotFound",
    "stackTrace": []
}
Is there a reason this error would be thrown even though I clearly have that function written in this module? This is driving me nuts. Thanks for any help!
It should be:
def lambda_handler(event, context):
not
def lambda_helper(event, context):
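Alternatively, if you want to keep the name lambda_helper, you could point the function's handler setting at it instead of renaming the function. A minimal sketch using boto3 follows (the function name below is a placeholder); the same setting can also be changed in the Lambda console under Runtime settings:
import boto3

client = boto3.client('lambda')
# 'my-function' is a placeholder -- use your actual Lambda function name.
client.update_function_configuration(
    FunctionName='my-function',
    Handler='lambda_function.lambda_helper',  # module_name.function_name
)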

Error when sending message from lambda to DLQ

I am following this article in order to send messages from a Lambda to a DLQ:
Using dead-letter queues in Amazon SQS — Boto3 documentation
The code is as follows
from datetime import datetime
import json
import os
import boto3
from botocore.vendored import requests

QUEUE_NAME = os.environ['QUEUE_NAME']
MAX_QUEUE_MESSAGES = os.environ['MAX_QUEUE_MESSAGES']
dead_letter_queue_arn = os.environ['DEAD_LETTER_QUEUE_ARN']

sqs = boto3.resource('sqs')
queue_url = os.environ['SQS_QUEUE_URL']

redrive_policy = {
    'deadLetterTargetArn': dead_letter_queue_arn,
    'maxReceiveCount': '10'
}

def lambda_handler(event, context):
    # Receive messages from SQS queue
    queue = sqs.get_queue_by_name(QueueName=QUEUE_NAME)
    response = requests.post("http://httpbin.org/status/500", timeout=10)
    if response.status_code == 500:
        sqs.set_queue_attributes(QueueUrl=queue_url,
            Attributes={
                'RedrivePolicy': json.dumps(redrive_policy)
            }
        )
I am doing it this way because I need to implement exponential backoff, but I cannot even send to the DLQ because of this error:
[ERROR] AttributeError: 'sqs.ServiceResource' object has no attribute 'set_queue_attributes'
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 24, in lambda_handler
    sqs.set_queue_attributes(QueueUrl=queue_url,
According to the set_queue_attributes() documentation, the object has the attribute set_queue_attributes.
Well, in case anyone has the same problem: there is a difference between a client and a resource. I suppose the error contains the necessary information, but with AWS it was difficult for me to spot. According to this:
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sqs.html#SQS.Client.set_queue_attributes
response = client.set_queue_attributes(
    QueueUrl='string',
    Attributes={
        'string': 'string'
    }
)
You should be using the client
import boto3
client = boto3.client('sqs')
My mistake was that I already had something from boto3 related to SQS:
sqs = boto3.resource('sqs')
That is the reason for the error
[ERROR] AttributeError: 'sqs.ServiceResource' object has no attribute 'set_queue_attributes'
because I needed to use the client instead of the resource from SQS.
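For completeness, a minimal sketch of the two working variants (the queue name, URL and ARN below are placeholders): either use the low-level client, or keep the resource and call set_attributes on a Queue object:
import json
import boto3

redrive_policy = {
    'deadLetterTargetArn': 'arn:aws:sqs:us-east-1:123456789012:my-dlq',  # placeholder ARN
    'maxReceiveCount': '10',
}

# Option 1: the low-level client exposes set_queue_attributes directly.
client = boto3.client('sqs')
client.set_queue_attributes(
    QueueUrl='https://sqs.us-east-1.amazonaws.com/123456789012/my-queue',  # placeholder URL
    Attributes={'RedrivePolicy': json.dumps(redrive_policy)},
)

# Option 2: with the resource, fetch a Queue object and call set_attributes on it.
sqs = boto3.resource('sqs')
queue = sqs.get_queue_by_name(QueueName='my-queue')  # placeholder name
queue.set_attributes(Attributes={'RedrivePolicy': json.dumps(redrive_policy)})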

Lambda-API gateway : "message": "Internal server error"

I am using AWS CodeStar (Lambda + API Gateway) to build my serverless API. My lambda function works well in the Lambda console but strangely throws this error when I run the code on AWS CodeStar:
"message": "Internal server error"
Kindly help me with this issue.
import json
import os
import bz2
import pprint
import hashlib
import sqlite3
import re
from collections import namedtuple
from gzip import GzipFile
from io import BytesIO
from botocore.vendored import requests
import logging

logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

def handler(event, context):
    logger.info('## ENVIRONMENT VARIABLES')
    logger.info(os.environ)
    logger.info('## EVENT')
    logger.info(event)
    n = get_package_list()
    n1 = str(n)
    dat = {"total_pack": n1}
    return {'statusCode': 200,
            'headers': {'Content-Type': 'application/json'},
            'body': json.dumps(dat)
            }

def get_package_list():
    url = "http://amazonlinux.us-east-2.amazonaws.com/2/core/2.0/x86_64/c60ceaf6dfa3bc10e730c9e803b51543250c8a12bb009af00e527a598394cd5e/repodata/primary.sqlite.gz"
    db_filename = "dbfile"
    resp = requests.get(url, stream=True)
    remote_data = resp.raw.read()
    cached_fh = BytesIO(remote_data)
    compressed_fh = GzipFile(fileobj=cached_fh)
    with open(os.path.join('/tmp', db_filename), "wb") as local_fh:
        local_fh.write(compressed_fh.read())
    package_obj_list = []
    db = sqlite3.connect(os.path.join('/tmp', db_filename))
    c = db.cursor()
    c.execute('SELECT name FROM packages')
    for package in c.fetchall():
        package_obj_list.append(package)
    no_of_packages = len(package_obj_list)
    return no_of_packages
Expected Result: should return an Integer (no_of_packages).

How to configure authorization mechanism inline with boto3

I am using boto3 in AWS Lambda to fetch an object from S3 located in the Frankfurt region.
Signature v4 is necessary, otherwise the following error is returned:
"errorMessage": "An error occurred (InvalidRequest) when calling
the GetObject operation: The authorization mechanism you have
provided is not supported. Please use AWS4-HMAC-SHA256."
I realized there are ways to configure signature_version: http://boto3.readthedocs.org/en/latest/guide/configuration.html
But since I am using AWS Lambda, I do not have access to the underlying configuration profiles.
The code of my AWS Lambda function:
from __future__ import print_function
import boto3

def lambda_handler(event, context):
    input_file_bucket = event["Records"][0]["s3"]["bucket"]["name"]
    input_file_key = event["Records"][0]["s3"]["object"]["key"]
    input_file_name = input_file_bucket + "/" + input_file_key
    s3 = boto3.resource("s3")
    obj = s3.Object(bucket_name=input_file_bucket, key=input_file_key)
    response = obj.get()
    return event  # echo first key values
Is it possible to configure signature_version within this code, using a Session for example? Or is there any workaround for this?
Instead of using the default session, try using a custom session and Config from boto3.session:
import boto3
import boto3.session
session = boto3.session.Session(region_name='eu-central-1')
s3client = session.client('s3', config= boto3.session.Config(signature_version='s3v4'))
s3client.get_object(Bucket='<Bkt-Name>', Key='S3-Object-Key')
I tried the session approach, but I had issues. This method worked better for me; your mileage may vary:
s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
You will need to import Config from botocore.client in order to make this work. See below for a functional method to test a bucket (list objects). This assumes you are running it from an environment where your authentication is managed, such as Amazon EC2 or Lambda with an IAM role:
import boto3
from botocore.client import Config
from botocore.exceptions import ClientError

def test_bucket(bucket):
    print('testing bucket: ' + bucket)
    try:
        s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
        b = s3.Bucket(bucket)
        objects = b.objects.all()
        for obj in objects:
            print(obj.key)
        print('bucket test SUCCESS')
    except ClientError as e:
        print('Client Error')
        print(e)
        print('bucket test FAIL')
To test it, simply call the method with a bucket name. Your role will have to grant proper permissions.
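For example (the bucket name here is just a placeholder):
test_bucket('my-example-bucket')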
Using a resource worked for me.
from botocore.client import Config
import boto3

def get_presigned_url(key, expTime):
    # Hypothetical wrapper: in the original answer these lines sit inside the
    # author's own function, with AIRFLOW_BUCKET defined elsewhere.
    s3 = boto3.resource("s3", config=Config(signature_version="s3v4"))
    return s3.meta.client.generate_presigned_url(
        "get_object", Params={"Bucket": AIRFLOW_BUCKET, "Key": key}, ExpiresIn=expTime
    )