How can GCP Cloud Functions handle Pub/Sub messages in batches?

I'm writing a GCP Cloud Function: is there a feature to handle a batch of messages put on a Pub/Sub topic? I mean that a single run of the Cloud Function could handle around 10-30 messages from the queue. From the examples I have seen, the Cloud Function gets invoked once per message, but in AWS I have seen an option to batch multiple messages into one Lambda invocation.

With the traditional setup of Cloud Functions + Pub/Sub receiving messages via push, you won't be able to work with batches, since every event triggers a separate function invocation.
You could instead create a different trigger mechanism, for example a Cloud Scheduler job that invokes the Cloud Function periodically and pulls all pending messages from the subscription (pull mechanism): https://cloud.google.com/pubsub/docs/pull
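As a rough sketch of that pattern (project, subscription name, and the processing logic below are placeholders, not part of the original answer), a Scheduler-triggered function could pull a batch synchronously with the google-cloud-pubsub client:

# Sketch: a Cloud Scheduler-triggered function that drains up to 30 messages
# from a pull subscription in a single run. Names are placeholders.
from google.cloud import pubsub_v1

PROJECT_ID = "my-project"          # placeholder
SUBSCRIPTION_ID = "my-batch-sub"   # placeholder

def process(data: bytes) -> None:
    print(data)  # placeholder for your own processing logic

def handle_batch(event, context):
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)
    # Synchronous pull: returns up to max_messages currently available.
    response = subscriber.pull(
        request={"subscription": sub_path, "max_messages": 30})
    for received in response.received_messages:
        process(received.message.data)
    # Acknowledge what was processed so it is not redelivered.
    ack_ids = [m.ack_id for m in response.received_messages]
    if ack_ids:
        subscriber.acknowledge(
            request={"subscription": sub_path, "ack_ids": ack_ids})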

Related

How to launch a script using an alert in GCP?

I want to create an alert in Cloud Monitoring/Logging, and when that alert is triggered I want my script (which I already have in my Cloud Source Repository) to be executed. How can I do it?
This is possible with Cloud Functions: a trigger is a declaration of what occurrence should cause your function to execute.
You can use Google Cloud Pub/Sub triggers, so that when the event occurs in the system, a message is published to a Pub/Sub topic that is specified when the function is deployed. Every message published to this topic triggers a function execution with the message contents passed as input data.
The guide “Alert-based event” walks through the steps to implement this solution.
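For illustration only, a minimal background function (1st gen Python signature) subscribed to the alerting topic might look like the sketch below. The function name, the script path, and the "incident"/"policy_name" fields (which assume the Cloud Monitoring notification payload) are assumptions:

import base64
import json
import subprocess

def handle_alert(event, context):
    # Pub/Sub message data arrives base64-encoded in event["data"].
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    # Assumes the Cloud Monitoring notification format with an "incident" field.
    print("Alert received:", payload.get("incident", {}).get("policy_name"))
    # Run a script packaged with the function (placeholder path).
    subprocess.run(["/bin/bash", "run_my_script.sh"], check=True)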

Access to the Pub/Sub topic behind Google Cloud Storage trigger events?

I have a Google Cloud Storage trigger set up on a Cloud Function with max instances of 5, to fire on the google.storage.object.finalize event of a Cloud Storage bucket. The docs state that these events are "based on" Cloud Pub/Sub.
Does anyone know:
Is there any way to see configuration of the topic or subscription in the console, or through the CLI?
Is there any way to get the queue depth (or equivalent)?
Is there any way to clear events?
No, no, and no. When you plug Cloud Functions into a Cloud Storage event, everything is handled behind the scenes by Google: you see nothing and you can't interact with anything.
However, you can change the notification mechanism. Instead of plugging your Cloud Function directly into the Cloud Storage event, plug a Pub/Sub topic into your Cloud Storage event.
From there, you have access to YOUR Pub/Sub topic: monitor the queue, purge it, create the subscriptions that you want, and so on.
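A minimal sketch of attaching such a notification with the google-cloud-storage client, assuming placeholder bucket and topic names:

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder

# Send OBJECT_FINALIZE events to a topic you own and can inspect.
notification = bucket.notification(
    topic_name="my-gcs-events",       # placeholder topic in your project
    event_types=["OBJECT_FINALIZE"],
    payload_format="JSON_API_V1",
)
notification.create()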
The recommended way to work with storage notifications is using Pub/Sub.
Legacy storage notifications still work, but with Pub/Sub you can "peek" into the message queue and clear it if you need to.
Also, you can process Pub/Sub events with Cloud Run, which is easier to develop and test (it's just a web service), easier to deploy (just a container), and can process several requests in parallel without you having to pay more (great when a lot of requests arrive together).
Where do Pub/Sub storage notifications go?
You can see where Cloud Storage notifications go with the gsutil command:
% gsutil notification list gs://__bucket_name__
projects/_/buckets/__bucket_name__/notificationConfigs/1
Cloud Pub/Sub topic: projects/__project_name__/topics/__topic_name__
Filters:
Event Types: OBJECT_FINALIZE
Is there any way to get the queue depth (or equivalent)?
In Pub/Sub you can have many subscriptions on a topic.
If there is no subscription, messages are lost.
To send data to a Cloud Function or Cloud Run service you set up a push subscription.
In my experience, you won't be able to see what happens because it is faster than you can click: you'll find the queue empty 99.9999% of the time.
You can check the "queue" depth in the console (Pub/Sub -> choose your topic -> choose the subscription).
If you need to troubleshoot this, set up a second subscription with a time-to-live low enough that it does not use a lot of space (you'll be billed for it).
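A sketch of such a cheap troubleshooting subscription with the google-cloud-pubsub client; the project, topic, and subscription names are placeholders, and the durations shown are the documented minimums:

from google.cloud import pubsub_v1
from google.protobuf import duration_pb2

subscriber = pubsub_v1.SubscriberClient()
topic_path = subscriber.topic_path("my-project", "my-topic")        # placeholders
sub_path = subscriber.subscription_path("my-project", "debug-sub")  # placeholder

# Keep messages only 10 minutes and let the subscription itself expire
# after a day of inactivity, so the debugging copy stays cheap.
subscriber.create_subscription(request={
    "name": sub_path,
    "topic": topic_path,
    "message_retention_duration": duration_pb2.Duration(seconds=600),
    "expiration_policy": {"ttl": duration_pb2.Duration(seconds=86400)},
})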
Is there any way to clear events?
You can empty the messages from the Pub/Sub subscription, but...
... if you're using a push subscription against a Cloud Function, messages will be consumed much faster than you can "click".
If you need it, it is in the web console (open the Pub/Sub subscription and click the vertical "..." at the top right).
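Programmatically, the same purge can be done with a seek to the current time, which acknowledges everything already in the backlog (a sketch; the names are placeholders):

from datetime import datetime, timezone
from google.cloud import pubsub_v1
from google.protobuf import timestamp_pb2

subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path("my-project", "my-sub")  # placeholders

# Seeking to "now" marks every message published before this instant
# as acknowledged, effectively emptying the subscription.
now = timestamp_pb2.Timestamp()
now.FromDatetime(datetime.now(timezone.utc))
subscriber.seek(request={"subscription": sub_path, "time": now})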

Is it possible to configure Cloud Scheduler to trigger multiple functions in one job?

I have two Cloud Functions that run every 5 minutes, currently using two different Cloud Scheduler jobs. Is it possible to configure Cloud Scheduler to run them both at the same time using only one job instead of two?
You have several options. The 2 easiest are:
With Cloud Scheduler, publish a message to Pub/Sub instead of calling a Cloud Function directly. Then add two push subscriptions to the topic to call your two Cloud Functions. A message published to the topic is duplicated into each subscription (here, two), and thus the functions are called in parallel. Note: the Pub/Sub push message format isn't the same as the payload you would POST to your function directly (if you have data to send), so you need to rework that entry point; see the sketch after this list.
With Cloud Scheduler you can call Workflows, and in your workflow you can run tasks in parallel. I wrote an article on that this week.
In both cases you can't do this out of the box; you need an intermediary component to perform the fan-out of the single scheduling event.
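To illustrate the entry-point rework mentioned in the first option, here is a minimal sketch of a push-subscribed HTTP function that unwraps the Pub/Sub envelope itself. The function name and payload are placeholders, but the envelope fields follow the documented push format:

import base64
import json

def scheduled_job(request):
    # Pub/Sub push wraps your payload in an envelope:
    # {"message": {"data": "<base64>", ...}, "subscription": "..."}
    envelope = request.get_json()
    data = base64.b64decode(envelope["message"]["data"]).decode("utf-8")
    payload = json.loads(data)  # assumes Cloud Scheduler published JSON
    print("Running job with payload:", payload)
    return ("", 204)  # any 2xx response acknowledges the message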

How to manage a burst of AWS CloudWatch Events triggering an AWS Lambda function

I have a service which generates a burst of CloudWatch Events once every hour. These CloudWatch events (which could number in the thousands) each trigger an AWS Lambda function, so the number of concurrently running Lambdas can cross the maximum limit. How can I modify my system so that these CloudWatch events are handled gracefully by the Lambda functions, or, if possible, distribute all these events over the rest period of the first service?
P.S. I do not want to modify the limit on concurrent running lambdas.
Have you thought about adding these events to SQS instead of invoking Lambda directly, and then configuring SQS to call the Lambda function?
This is how you can trigger a Lambda function from an SQS queue:
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqs-configure-lambda-function-trigger.html
And you can configure a delay queue in SQS using this article:
https://cloudaffaire.com/how-to-configure-delay-queue-in-sqs/
As stated above, you could use SQS, but I too would advise against it, because you would still have the same concurrency issue.
Depending on how quickly you need your processing done, it may be a good idea to have a Lambda "poll" the SQS queue every couple of minutes and then send the batch of messages to a processing Lambda via SNS, rather than have Lambda trigger directly off the SQS messages as they arrive.
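A rough boto3 sketch of that polling pattern, run from a scheduled Lambda; the queue URL and topic ARN are placeholders:

import json
import boto3

sqs = boto3.client("sqs")
sns = boto3.client("sns")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:my-topic"                # placeholder

def handler(event, context):
    # Drain up to 10 messages per invocation (the SQS per-call maximum).
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10)
    messages = resp.get("Messages", [])
    if not messages:
        return
    # Forward the whole batch in one SNS publish for downstream processing.
    sns.publish(TopicArn=TOPIC_ARN,
                Message=json.dumps([m["Body"] for m in messages]))
    # Delete what we forwarded so it is not received again.
    sqs.delete_message_batch(
        QueueUrl=QUEUE_URL,
        Entries=[{"Id": m["MessageId"], "ReceiptHandle": m["ReceiptHandle"]}
                 for m in messages])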

How to trigger a background Cloud Function after a specific delay?

I use Cloud Pub/Sub and Cloud Functions.
Now, I want to publish a message to a topic which will trigger a background Cloud Function.
But I want to trigger my Cloud Function after a specific duration, like 30 seconds later.
How can I do this?
Update:
Here is my architecture, is it correct?
Now, I want to publish a message to a topic which will trigger a background Cloud Function. But I want to trigger my Cloud Function after a specific duration, like 30 seconds later.
If you set up Pub/Sub to trigger Cloud Functions on publish events, Cloud Functions will be triggered almost immediately. There is no way to insert a delay.
You will need to implement your solution in several major steps:
Set up a Pub/Sub topic and subscription. Do not trigger Cloud Functions on new messages; messages will just sit waiting for delivery. Send messages to this topic.
Create a Cloud Function that processes the Pub/Sub subscription: pull messages and process them.
Use another service such as Cloud Tasks, Cloud Scheduler, or App Engine Tasks to trigger your Cloud Function after the desired delay.
You can use Cloud Tasks to schedule some work to happen on a delay.
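A sketch of scheduling such a delayed call with the google-cloud-tasks client; the project, location, queue, and target URL are placeholders:

from datetime import datetime, timedelta, timezone
from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")  # placeholders

# Fire the task 30 seconds from now.
schedule_time = timestamp_pb2.Timestamp()
schedule_time.FromDatetime(datetime.now(timezone.utc) + timedelta(seconds=30))

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        # Placeholder URL of the HTTP-triggered Cloud Function to call.
        "url": "https://REGION-my-project.cloudfunctions.net/my-function",
        "body": b'{"hello": "world"}',
    },
    "schedule_time": schedule_time,
}
client.create_task(request={"parent": parent, "task": task})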