I have an Eventarc trigger that is supposed to fire after each Cloud Scheduler invocation, i.e. on the google.cloud.scheduler.v1beta1.CloudScheduler.RunJob event.
However, it is not being triggered at all!
Other triggers, such as a forced run, are working.
I want to trigger a Cloud Run service after a job execution. Is this possible, or am I facing a bug?
If you are expecting your Cloud Run service to be executed at each scheduled invocation of Cloud Scheduler, that isn't possible through Eventarc and Cloud Audit Logs.
This is because Cloud Scheduler is not in the list of services that write audit logs. In addition, the RunJob event you are filtering on is only written when you manually execute a job (using the API), not when your CRON schedule runs.
A manual job run did trigger Eventarc when I tested this scenario, but I had to set my trigger as global.
If you would like to execute the Cloud Run service on a schedule, you can have Cloud Scheduler send a request to the service URL directly. Another alternative is to have Eventarc listen to messages on a Pub/Sub topic that Cloud Scheduler publishes to, instead of listening to audit logs. Let me know if this was helpful.
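If you go the Pub/Sub route, the sketch below shows one way to create such a scheduler job with the Python client library; the project ID, location, topic name, job name, and schedule are placeholder assumptions, not values from the question.

```python
# Hedged sketch: a Cloud Scheduler job whose target is a Pub/Sub topic.
# An Eventarc (or plain Pub/Sub push) trigger on that topic can then route
# the message to the Cloud Run service. All names below are placeholders.
from google.cloud import scheduler_v1

project = "my-project"    # hypothetical project ID
location = "us-central1"  # hypothetical region

client = scheduler_v1.CloudSchedulerClient()
parent = f"projects/{project}/locations/{location}"

job = scheduler_v1.Job(
    name=f"{parent}/jobs/publish-to-run-topic",
    schedule="*/5 * * * *",  # example schedule: every 5 minutes
    time_zone="UTC",
    pubsub_target=scheduler_v1.PubsubTarget(
        topic_name=f"projects/{project}/topics/run-my-service",
        data=b"scheduled trigger",
    ),
)

client.create_job(parent=parent, job=job)
```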
Related
I have a use case where one workflow needs to be executed once a certain number (n) of other workflows have completed. Can this be achieved using the Eventarc trigger feature of Google Cloud?
I am reading this https://cloud.google.com/scheduler/docs/tut-pub-sub
They use the setup like below:
Cloud Scheduler -> PubSub -> Cloud Function-> external Service
If I have a cron job that calls a service once a day, do I still need Pub/Sub in between?
I know there is an HTTP target type option in Cloud Scheduler, and I think the setup below, without Pub/Sub, is good enough.
Cloud Scheduler -> Cloud Function-> external Service
Could you give some advice on why I should or should not have Pub/Sub in between?
The example you are looking at is "Using Pub/Sub to trigger a Cloud Function", so it naturally includes Pub/Sub. Instead, you can deploy an HTTP Cloud Function and use its URL as the target URL of a Cloud Scheduler job with the HTTP target type.
Here, Cloud Scheduler will trigger the function directly, without Pub/Sub.
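For completeness, here is a minimal sketch of creating that kind of job programmatically with the Python client; the project ID, location, function URL, and schedule are placeholder assumptions.

```python
# Hedged sketch: a Cloud Scheduler job with an HTTP target that calls the
# HTTP Cloud Function once a day. All names and the URL are placeholders.
from google.cloud import scheduler_v1

project = "my-project"    # hypothetical project ID
location = "us-central1"  # hypothetical region
function_url = "https://us-central1-my-project.cloudfunctions.net/my-function"

client = scheduler_v1.CloudSchedulerClient()
parent = f"projects/{project}/locations/{location}"

job = scheduler_v1.Job(
    name=f"{parent}/jobs/call-function-daily",
    schedule="0 9 * * *",  # example schedule: every day at 09:00
    time_zone="UTC",
    http_target=scheduler_v1.HttpTarget(
        uri=function_url,
        http_method=scheduler_v1.HttpMethod.POST,
    ),
)

client.create_job(parent=parent, job=job)
```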
Currently, I have a service in Cloud Run that retrains ML models. The service runs the retraining process when there is an event from BigQuery called google.cloud.bigquery.v2.JobService.InsertJob. I use Eventarc in GCP to trigger the retrain service when that event happens. But there is a problem: the trigger requests the service multiple times for one event. So, sometimes when the retraining process is done, the trigger requests the service again, and the retraining process starts over. Is there something I missed? The picture below shows my Eventarc setup.
As we can see in this picture, other requests arrive while the first request is still being processed.
Eventarc is backed by Pub/Sub. By default, and unless you change the default Eventarc configuration, the delivery timeout (the acknowledgement deadline of the subscription) is set to 10 seconds. (You can manually update the Pub/Sub subscription created by Eventarc; the Eventarc engineering team is aware that this parameter is not customizable.)
That's why you see a retry every 10 seconds.
You have 2 solutions to that:
Either create an async process: receive the Pub/Sub message (the Eventarc event), acknowledge it immediately, and run your retraining in the background, as sketched below.
Or (not my preferred way), update the Eventarc Pub/Sub subscription and set the message retention duration to 5 seconds.
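A minimal sketch of that async pattern, assuming a Flask-based Cloud Run service; retrain_models() is a hypothetical placeholder for the actual retraining logic.

```python
# Hedged sketch of "ack immediately, retrain in the background".
# Returning a 2xx response quickly means Eventarc/Pub/Sub considers the
# event delivered and does not redeliver it while retraining is running.
import threading
from flask import Flask, request

app = Flask(__name__)

def retrain_models(payload):
    # Hypothetical placeholder for the long-running retraining work.
    ...

@app.route("/", methods=["POST"])
def handle_event():
    payload = request.get_json(silent=True)
    # Kick off the retraining in the background and acknowledge right away.
    threading.Thread(target=retrain_models, args=(payload,), daemon=True).start()
    return ("", 204)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

Note that on Cloud Run the background thread only gets CPU reliably if the service is configured with CPU always allocated; with the default request-based allocation, CPU may be throttled as soon as the response is returned.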
I'd like to find out what's the best architecture for a python app that gets triggered when a file is uploaded to Google Cloud Storage, does some processing and outputs a file to Google Drive?
I've tried using Cloud Functions, but I'm getting "Function invocation was interrupted. Error: memory limit exceeded." in the logs.
I've also followed this tutorial Trigger Cloud Run with events from Eventarc so I know that one way is with EventArc and Cloud Audit logs.
2 questions:
What other methods are there since I require higher memory limits?
How do I get the bucket name and file name from the Cloud Audit Logs event? Through protoPayload.resourceName?
You can use Pub/Sub: create a Pub/Sub notification on the Cloud Storage bucket and create a push subscription to the service that you want, such as:
Http cloud function
App Engine
Cloud Run
Any HTTP service running somewhere (VM, Kubernetes, on prem,...)
Eventarc is mainly a wrapper around this process and can only call Cloud Run services (for now).
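A minimal sketch of such a push endpoint, assuming a Flask-based service; with Cloud Storage Pub/Sub notifications, the bucket and object names arrive in the message attributes (bucketId / objectId), so there is no need to parse protoPayload.resourceName from audit logs.

```python
# Hedged sketch: a Pub/Sub push endpoint receiving Cloud Storage
# notifications. The bucket and object names come from the message
# attributes; the message data carries the object metadata as JSON.
import base64
import json
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def handle_notification():
    envelope = request.get_json()
    message = envelope["message"]
    attributes = message.get("attributes", {})
    bucket = attributes.get("bucketId")
    name = attributes.get("objectId")
    metadata = json.loads(base64.b64decode(message["data"]).decode("utf-8"))
    print(f"New object gs://{bucket}/{name}, size={metadata.get('size')}")
    # ... download the object, process it, upload the result to Google Drive ...
    return ("", 204)
```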
I use Cloud pub/sub and Cloud Functions.
Now, I want to publish a message to a topic which will trigger a background cloud function.
But I want to trigger my cloud function after a specific duration, like 30 seconds later.
How can I do this?
update:
Here is my architecture, is it correct?
Now, I want to publish a message to a topic which will trigger a background cloud function. But I want to trigger my cloud function after a specific duration, like 30 seconds later.
If you set up Pub/Sub to trigger Cloud Functions on publish events, the Cloud Function will be triggered almost immediately. There is no way to insert a delay.
You will need to implement your code in several major steps:
Set up a Pub/Sub topic and subscription. Do not trigger Cloud Functions on new messages; the messages will just sit waiting for delivery. Send your messages to this topic.
Create a Cloud Function that processes the Pub/Sub subscription: pull the messages and process them (see the sketch after this list).
Use another service such as Cloud Tasks, Cloud Scheduler, or App Engine Tasks to trigger your Cloud Function after the desired delay.
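A minimal sketch of step 2, assuming an HTTP-triggered Cloud Function that pulls and acknowledges pending messages when it is invoked; the subscription path and process_message() are placeholder assumptions.

```python
# Hedged sketch: pull pending messages from the subscription and process them.
# The subscription path and processing logic are placeholders.
from google.cloud import pubsub_v1

SUBSCRIPTION = "projects/my-project/subscriptions/delayed-work-sub"

def process_message(data: bytes) -> None:
    # Hypothetical placeholder for the real work.
    print(f"Processing: {data!r}")

def pull_and_process(request):
    """HTTP entry point, invoked by Cloud Tasks/Scheduler after the delay."""
    subscriber = pubsub_v1.SubscriberClient()
    response = subscriber.pull(subscription=SUBSCRIPTION, max_messages=10)
    ack_ids = []
    for received in response.received_messages:
        process_message(received.message.data)
        ack_ids.append(received.ack_id)
    if ack_ids:
        subscriber.acknowledge(subscription=SUBSCRIPTION, ack_ids=ack_ids)
    return f"Processed {len(ack_ids)} message(s)"
```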
You can use Cloud Tasks to schedule some work to happen on a delay.
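For illustration, a minimal sketch of enqueuing a Cloud Tasks HTTP task that fires roughly 30 seconds later; the project ID, location, queue name, and target URL are placeholder assumptions.

```python
# Hedged sketch: enqueue a Cloud Tasks HTTP task scheduled ~30 seconds
# in the future. All names and the URL below are placeholders.
import datetime

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

project = "my-project"      # hypothetical project ID
location = "us-central1"    # hypothetical region
queue = "delayed-triggers"  # hypothetical queue name
url = "https://us-central1-my-project.cloudfunctions.net/my-function"

client = tasks_v2.CloudTasksClient()
parent = client.queue_path(project, location, queue)

# Schedule the task 30 seconds from now.
schedule_time = timestamp_pb2.Timestamp()
schedule_time.FromDatetime(datetime.datetime.utcnow() + datetime.timedelta(seconds=30))

task = tasks_v2.Task(
    http_request=tasks_v2.HttpRequest(
        http_method=tasks_v2.HttpMethod.POST,
        url=url,
        body=b'{"source": "delayed trigger"}',
        headers={"Content-Type": "application/json"},
    ),
    schedule_time=schedule_time,
)

client.create_task(parent=parent, task=task)
```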