Cloud Function with pubsub trigger topic is not working - google-cloud-platform

I have written a simple function to print the data and context from a Pub/Sub-triggered Cloud Function:
def main(event, context):
    """Background Cloud Function to be triggered by Pub/Sub.
    Args:
        event (dict): The dictionary with data specific to this type of
            event. The data field contains the PubsubMessage message. The
            attributes field will contain custom attributes if there are any.
        context (google.cloud.functions.Context): The Cloud Functions event
            metadata. The event_id field contains the Pub/Sub message ID. The
            timestamp field contains the publish time.
    """
    import base64

    print("""This Function was triggered by messageId {} published at {}
    """.format(context.event_id, context.timestamp))

    if 'data' in event:
        name = base64.b64decode(event['data']).decode('utf-8')
    else:
        name = 'World'
    print('Hello {}!'.format(name))
The Cloud Function deploys successfully, but whenever I publish a message to the trigger topic I cannot see any function execution statement in the logs.
I have already verified that I am calling the main function and publishing to the right Pub/Sub topic.
I don't see any error statements either, so I am unable to debug.
Any suggestion would be helpful.
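For what it's worth, the function body can be sanity-checked locally with a fake event and context before looking at the deployment; the message data and context values below are made up for illustration:

```python
import base64
from types import SimpleNamespace

# the deployed entry point, reproduced from the question
def main(event, context):
    print("This Function was triggered by messageId {} published at {}".format(
        context.event_id, context.timestamp))
    if 'data' in event:
        name = base64.b64decode(event['data']).decode('utf-8')
    else:
        name = 'World'
    print('Hello {}!'.format(name))

# a real Pub/Sub event carries the payload base64-encoded in 'data'
event = {"data": base64.b64encode(b"Pub/Sub").decode("utf-8")}
context = SimpleNamespace(event_id="1234567890", timestamp="2020-08-20T21:26:30.600Z")
main(event, context)  # prints "Hello Pub/Sub!"
```

If this prints as expected, the problem is in the deployment or trigger wiring rather than the code.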

I tested your function code on the Python 3.8 runtime and everything works fine. Are you using the same Pub/Sub topic to push new messages?
This is the code that I used on my computer to send Pub/Sub messages:
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# The `topic_path` method creates a fully qualified identifier
# in the form `projects/{project_id}/topics/{topic_id}`
topic_path = publisher.topic_path("myprojectID", "test")

for n in range(1, 10):
    data = u"Message number {}".format(n)
    # Data must be a bytestring
    data = data.encode("utf-8")
    # When you publish a message, the client returns a future.
    future = publisher.publish(topic_path, data=data)
    print(future.result())
print("Published messages.")
requirements.txt
google-cloud-pubsub
This is the full function code:
import base64

def hello_pubsub(event, context):
    print("""This Function was triggered by messageId {} published at {}
    """.format(context.event_id, context.timestamp))
    if 'data' in event:
        name = base64.b64decode(event['data']).decode('utf-8')
    else:
        name = 'World'
    print('Hello {}!'.format(name))
Expected output
This Function was triggered by messageId 1449686821351887 published at 2020-08-20T21:26:30.600Z
The logs may appear with a delay of 10-30 seconds in Stackdriver.

Related

Cloudformation "update your Lambda function code so that CloudFormation can attach the updated version"

I am deploying the CloudFormation template from this blog post. I had to update the Lambda functions from Python 3.6 to 3.9 to get it to work. Now, however, I get the following error message:
> CloudFormation did not receive a response from your Custom Resource.
> Please check your logs for requestId
> [029f4ea5-cd25-4593-b1ee-d805dd30463f]. If you are using the Python
> cfn-response module, you may need to update your Lambda function code
> so that CloudFormation can attach the updated version.
Below is the lambda code in question - what does it mean to update the Lambda function "so that CloudFormation can attach the updated version"?
import util.cfnresponse
import boto3
import uuid

client = boto3.client('s3')
cfnresponse = util.cfnresponse

def lambda_handler(event, context):
    response_data = {}
    try:
        if event["RequestType"] == "Create":
            bucket_name = uuid.uuid4().hex + '-connect'
            # response = client.create_bucket(
            #     Bucket=bucket_name,
            # )
            response_data["BucketName"] = bucket_name
            cfnresponse.send(event, context, cfnresponse.SUCCESS, response_data)
    except Exception as e:
        print(e)
        cfnresponse.send(event, context, cfnresponse.FAILED, response_data)
From what I can tell, the response format follows the current version of the response module API?
The cfnresponse lib has changed and been updated. Old versions of the lib used the requests library. This CloudFormation template is over 4 years old, so it probably doesn't work because of this.
You can read about the update in the last rows of the README here:
https://github.com/gene1wood/cfnresponse
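As a rough sketch, assuming the current cfnresponse API described in that README (send(event, context, status, response_data, physical_resource_id=None)), the handler could look like the code below. Note that it responds for all request types, not just Create; if send() is never called, CloudFormation waits until it reports exactly the "did not receive a response" error above:

```python
import uuid

def make_bucket_name():
    # kept as a separate helper purely so the naming logic is easy to test
    return uuid.uuid4().hex + '-connect'

def lambda_handler(event, context):
    import cfnresponse  # updated module; old copies relied on the requests lib
    response_data = {}
    try:
        if event["RequestType"] == "Create":
            response_data["BucketName"] = make_bucket_name()
        # respond for Create, Update and Delete alike, exactly once
        cfnresponse.send(event, context, cfnresponse.SUCCESS, response_data)
    except Exception as e:
        print(e)
        cfnresponse.send(event, context, cfnresponse.FAILED, response_data)
```

This is only an illustration of the shape of the fix, not a drop-in replacement for the blog post's template.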

Is there a way to pass the output of a Lambda function into another Lambda function to be used as a variable?

I have a Lambda function which returns email addresses into the function log of Lambda, and I have another Lambda function that sends scheduled emails.
I am trying to pass the result from the email-addresses function into the second, scheduled-email function to be used as a variable for the recipients of the scheduled emails.
Here is the code for anyone wondering:
This function retrieves the email(s) from the database:
import pymysql

# RDS config
endpoint = '*******'
username = '*******'
password = '*******'
database_name = '******'

# connection config
connection = pymysql.connect(host=endpoint, user=username, passwd=password, db=database_name)

def handler(event, context):
    cursor = connection.cursor()
    cursor.execute('SELECT `Presenters`.Email FROM `Main` INNER JOIN `Presenters` ON `Main`.`PresenterID` = `Presenters`.`PresentersID` WHERE `Main`.`Read Day` ="Tuesday"')
    rows = cursor.fetchall()
    for row in rows:
        print("{0}".format(row[0]))
This second function sends the emails using Python:
import os
import smtplib
from email.message import EmailMessage

def lambda_handler(event, context):
    EMAIL_ADDRESS = "**********"
    EMAIL_PASSWORD = os.environ.get('EMAIL_PASSWORD')

    msg = EmailMessage()
    msg['Subject'] = "*********"
    msg['From'] = EMAIL_ADDRESS
    msg['To'] = ['************']
    msg.set_content('Hi everyone, a new read timetable has been posted for next week so be sure to check it and keep up to date on your reads, Thank you!')

    with smtplib.SMTP_SSL('smtp.gmail.com', 465) as smtp:
        smtp.login(EMAIL_ADDRESS, EMAIL_PASSWORD)
        smtp.send_message(msg)
There is no such built-in way in the Lambda service.
If this is something that has to run sequentially, which it most probably does, I would strongly recommend looking into Step Functions. Step Functions are basically state machines that orchestrate your workflows by calling your Lambda functions (they also support other compute services) sequentially, and the output of one function can be passed in as the input of the function that is executed after it.
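To make that chaining work, the first handler has to return its result instead of only printing it. The sketch below shows the shape both handlers could take; the payload key "recipients" and the hardcoded rows are illustrative assumptions standing in for the pymysql query:

```python
def get_recipients_handler(event, context):
    # In the real function these rows come from cursor.fetchall();
    # hardcoded here so the sketch is self-contained.
    rows = [("alice@example.com",), ("bob@example.com",)]
    # Returning (rather than printing) makes the result available as the
    # next state's input in a Step Functions state machine.
    return {"recipients": [row[0] for row in rows]}

def send_emails_handler(event, context):
    recipients = event["recipients"]  # the previous state's output
    # ... build the EmailMessage with msg['To'] = recipients and send it ...
    return {"sent_to": recipients}

# Step Functions feeds the first state's output into the second state:
result = send_emails_handler(get_recipients_handler({}, None), None)
```

In the state machine definition, two Task states pointing at these two functions, chained with "Next", give exactly this data flow.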

How can I use specific data saved in GCP BigQuery with a Cloud Function, compare it with a threshold value, and send an alert to the user when the threshold is reached?

I am trying to use a Cloud Function to take specific data from BigQuery, and I also want to compare it with a threshold value that I set. When the data reaches the threshold value, it should send an email to the user.
How should I use the Cloud Function to get the data from BigQuery and compare it with the threshold value? Is there any suggestion for the coding part of the Cloud Function?
This is the code that I use to send an email when the value has reached the threshold:
def email(request):
    import os
    from sendgrid import SendGridAPIClient
    from sendgrid.helpers.mail import Mail, Email
    from python_http_client.exceptions import HTTPError

    sg = SendGridAPIClient(os.environ['EMAIL_API_KEY'])

    html_content = "<p>Hello World!</p>"
    message = Mail(
        to_emails="[Destination]@email.com",
        from_email=Email('[YOUR]@gmail.com', "Your name"),
        subject="Hello world",
        html_content=html_content
    )
    message.add_bcc("[YOUR]@gmail.com")
    try:
        response = sg.send(message)
        return f"email.status_code={response.status_code}"
        # expected 202 Accepted
    except HTTPError as e:
        return e.message
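One possible shape for the missing piece is sketched below: query BigQuery with the google-cloud-bigquery client, compare the result with the threshold, and only then call the email() function above. The project, dataset, table and column names, and the threshold value, are all placeholders:

```python
THRESHOLD = 100  # illustrative threshold value

def should_alert(current_value, threshold):
    # pure comparison, kept separate so it is easy to test on its own
    return current_value is not None and current_value >= threshold

def check_and_alert(request):
    from google.cloud import bigquery  # add google-cloud-bigquery to requirements.txt
    client = bigquery.Client()
    # project, dataset, table and column names here are placeholders
    query = """
        SELECT MAX(metric_value) AS current_value
        FROM `my_project.my_dataset.my_table`
    """
    row = list(client.query(query).result())[0]
    if should_alert(row.current_value, THRESHOLD):
        # reuse the SendGrid email() function above to notify the user
        return email(request)
    return "current value {} is below the threshold".format(row.current_value)
```

Deployed as the function's entry point (with the email code alongside it), this runs the query on every invocation; a Cloud Scheduler trigger is the usual way to run such a check periodically.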

SQS fifo trigger invoke Lambda Function (1 message - 1 invocation)

I have an SQS FIFO queue triggering a Lambda function.
I sent 10 messages (all different) and the Lambda was invoked just once.
Details:
SQS
Visibility timeout: 30 min
Delivery delay: 0 secs
Receive Message Wait Time: 0 secs
Lambda:
Batch size: 1
Timeout: 3 secs
I don't see any errors in the Lambda invocations.
I don't want to touch the delivery delay, but if I increase it, it seems to work.
The avg duration time is less than 1.5 ms.
Any ideas how I can achieve this (1 message - 1 invocation)?
Should I increase the delivery delay or the timeout?
The message is being sent from an ECS task with the following code:
from flask import Flask, request, redirect, url_for, send_from_directory, jsonify
from werkzeug.utils import secure_filename
import os
import random
import boto3

app = Flask(__name__)
s3 = boto3.client('s3')
sqs = boto3.client('sqs', region_name='eu-west-1')

@app.route('/', methods=['GET'])
def hello_world():
    return 'Hello World!'

@app.route('/upload', methods=['POST'])
def upload():
    print(str(random.randint(0, 9)))
    file = request.files['file']
    if file:
        filename = secure_filename(file.filename)
        file.save(filename)
        s3.upload_file(
            Bucket=os.environ['bucket'],
            Filename=filename,
            Key=filename
        )
        resp = sqs.send_message(
            QueueUrl=os.environ['queue'],
            MessageBody=filename,
            MessageGroupId=filename
        )
        return jsonify({
            'msg': "OK"
        })
    else:
        return jsonify({
            'msg': "NOT OK"
        })
Check if this helps:
The message deduplication ID is the token used for deduplication of sent messages. If a message with a particular message deduplication ID is sent successfully, any messages sent with the same message deduplication ID are accepted successfully but aren't delivered during the 5-minute deduplication interval.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/using-messagededuplicationid-property.html
At least it explains why it works when you increase delivery delay.
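Based on that, one way to rule deduplication out is to attach an explicitly unique MessageDeduplicationId to every send (or enable content-based deduplication on the queue). The helper below is a sketch of how the ECS task's send_message call could be parameterized; the helper name is an assumption, and only the kwargs it builds matter:

```python
import uuid

def build_sqs_params(queue_url, filename):
    """Build send_message kwargs for a FIFO queue.

    A unique MessageDeduplicationId per send stops SQS from silently
    accepting-but-not-delivering messages it treats as duplicates within
    the 5-minute deduplication window.
    """
    return {
        "QueueUrl": queue_url,
        "MessageBody": filename,
        "MessageGroupId": filename,
        "MessageDeduplicationId": uuid.uuid4().hex,  # unique per send
    }

# In the Flask handler:
# resp = sqs.send_message(**build_sqs_params(os.environ['queue'], filename))
```

Note also that with Batch size 1 and messages sharing a MessageGroupId, FIFO ordering forces those messages to be processed one at a time rather than in parallel.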

How to trigger an AWS Lambda function that takes input from a Flutter form when a submit button is pressed

I have a Lambda function, which currently works. It takes three strings as input and makes an entry in DynamoDB.
I have a Flutter app (which is additionally an AWS Amplify project.)
The code for the Lambda function is as follows:
import json
import logging
import os
import time
import uuid
import boto3

dynamodb = boto3.resource('dynamodb')

def create(event, context):
    timestamp = str(time.time())
    table = dynamodb.Table(os.environ['DYNAMODB_TABLE'])
    item = {
        'id': str(uuid.uuid1()),
        'name': event['name'],
        'description': event['description'],
        'price': event['price'],
        'createdAt': timestamp,
        'updatedAt': timestamp,
    }
    # write the todo to the database
    table.put_item(Item=item)
    # create a response
    response = {
        "statusCode": 200,
        "body": json.dumps(item)
    }
    return response
The form in my Flutter application consists of three text fields and a submit button. I want the three fields to take the item's name, description and price, and to trigger the AWS Lambda function that creates an entry in the DynamoDB table when the button is pressed.
However, I cannot find any documentation or guide on how I can use Dart's onPressed callback to trigger the function that takes these three strings as input and creates the entry.
I am open to alternate ways of achieving this if it is impossible with my current design.
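One common way to make a Lambda callable from a mobile app is to put it behind an API Gateway endpoint and have the app POST the three form fields as JSON. That changes the event shape: the fields arrive as a JSON string in event["body"] instead of as top-level keys. A sketch of the adjusted handler (the DynamoDB write is elided, and the handler name is illustrative):

```python
import json

def create_handler(event, context):
    # When invoked via API Gateway, the form fields arrive as a JSON
    # string in event["body"], not as top-level event keys.
    body = json.loads(event["body"])
    item = {
        "name": body["name"],
        "description": body["description"],
        "price": body["price"],
    }
    # ... add id/createdAt/updatedAt and table.put_item(Item=item) as above ...
    return {"statusCode": 200, "body": json.dumps(item)}
```

On the Flutter side, the onPressed callback would then just issue an HTTP POST with those three fields to the endpoint URL.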