I would like to be able to listen for when a GCE instance is started, stopped, deleted. This is so that I can build a dashboard for users to view the status of machines. How can I do this?
You can use Cloud Functions to implement such a workflow. Cloud Functions can't "listen" to GCE events directly, but they can be triggered when a message is published to a specific Pub/Sub topic.
Now, GCE VM events are actually logged in Cloud Logging, and logs matching a particular filter can be exported to a PubSub topic.
So in Cloud Logging, you could set an advanced log filter like so:
resource.type="gce_instance"
jsonPayload.event_subtype="compute.instances.stop" OR jsonPayload.event_subtype="compute.instances.start"
This filter captures stop and start events from all VMs in your project; deletions can be included by adding compute.instances.delete to the same filter. You can see a list of available events here.
Once you've defined the log filter, you can create a sink and set it to send the filtered logs to a Pub/Sub topic of your choice. More info on how to set up an export sink here.
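For reference, the same export can also be set up from the command line; the topic and sink names below are placeholders:
# Create the destination topic for the exported log entries
gcloud pubsub topics create gce-vm-events

# Create the sink with the filter above; gcloud prints the sink's writer identity,
# which must be granted the Pub/Sub Publisher role on the topic
gcloud logging sinks create gce-vm-events-sink \
  pubsub.googleapis.com/projects/MY_PROJECT/topics/gce-vm-events \
  --log-filter='resource.type="gce_instance" AND (jsonPayload.event_subtype="compute.instances.start" OR jsonPayload.event_subtype="compute.instances.stop")'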
Now that your event logs are sent to the PubSub topic, you can go to your PubSub topic list, select your topic and click the "Trigger Cloud Function" button. You'll be guided through setting up the Cloud Function that'll be triggered for every new message in that topic. The suggested function code (in nodejs 8 for example):
exports.helloPubSub = (event, context) => {
  // event.data is the base64-encoded Pub/Sub message payload (here, the exported log entry)
  const pubsubMessage = event.data;
  console.log(Buffer.from(pubsubMessage, 'base64').toString());
};
will log the message data, where you'll find the event log info. You can then write your Cloud Function to perform whatever processing you want, for example updating a Firestore database with the VM instance status, as sketched below.
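As a rough sketch of that last step, the collection name, document fields, and log field paths below are assumptions; check them against a real exported message before relying on them:
const Firestore = require('@google-cloud/firestore');
const db = new Firestore();

exports.vmStatusToFirestore = async (event, context) => {
  // The Pub/Sub message data is the base64-encoded LogEntry exported by the sink
  const logEntry = JSON.parse(Buffer.from(event.data, 'base64').toString());

  // Field paths depend on the exact log format; these are illustrative
  const instanceId = logEntry.resource.labels.instance_id;
  const eventSubtype = logEntry.jsonPayload.event_subtype;
  const status = eventSubtype.endsWith('.start') ? 'RUNNING' : 'STOPPED';

  // 'instances' is a placeholder collection name for the dashboard data
  await db.collection('instances').doc(instanceId).set(
    { status, lastEvent: eventSubtype, updatedAt: new Date().toISOString() },
    { merge: true }
  );
};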
Related
Currently, I have a service in Cloud Run that retrains ML models. The service runs the retraining process when there is an event from BigQuery called google.cloud.bigquery.v2.JobService.InsertJob. I use Eventarc in GCP to trigger the retrain service when that event happens. But there is a problem: the trigger calls the service multiple times for one event, so sometimes when the retraining process is done, the trigger requests the service again and the retraining starts over. Is there something that I missed? The picture below shows my Eventarc setup.
As you can see in the picture, other requests arrive while the first request is still being processed.
Eventarc is backed by Pub/Sub. By default, and unless you change the default Eventarc configuration, the delivery (acknowledgement) deadline is set to 10s. (You can manually update the Pub/Sub subscription created by Eventarc; the Eventarc engineering team is aware that this parameter is not customizable at creation time.)
That's why you see a retry roughly every 10s.
You have two solutions:
Either create an async process: receive the Pub/Sub message (the Eventarc event), ack it immediately, and run your retraining in the background, as sketched below.
Or (not my preferred way), update the Eventarc Pub/Sub subscription and raise the acknowledgement deadline so it covers your processing time (the maximum is 600s).
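For the first option on a Cloud Run service behind an Eventarc (push) trigger, "ack immediately" means returning a 2xx response right away and carrying on with the work afterwards. A minimal sketch with Express, where runRetraining() is a hypothetical stand-in for your own retraining logic:
const express = require('express');
const app = express();
app.use(express.json());

app.post('/', (req, res) => {
  // Acknowledge the pushed event immediately so Pub/Sub does not redeliver it
  res.status(204).send();

  // Then run the long retraining job in the background
  runRetraining(req.body).catch(err => console.error('retraining failed', err));
});

app.listen(process.env.PORT || 8080);
Keep in mind that, by default, Cloud Run only allocates CPU while a request is being handled, so long background work may need the "CPU always allocated" setting or a hand-off to another service. For the second option, the subscription Eventarc created can be updated with something like gcloud pubsub subscriptions update SUBSCRIPTION_ID --ack-deadline=600.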
I have a Google Cloud Storage trigger set up on a Cloud Function with max instances of 5, firing on the google.storage.object.finalize event of a Cloud Storage bucket. The docs state that these events are "based on" Cloud Pub/Sub.
Does anyone know:
Is there any way to see configuration of the topic or subscription in the console, or through the CLI?
Is there any way to get the queue depth (or equivalent?)
Is there any way to clear events?
No, no, and no. When you plug a Cloud Function into a Cloud Storage event, everything is handled behind the scenes by Google: you see nothing and you can't interact with anything.
However, you can change the notification mechanism. Instead of plugging your Cloud Function directly into the Cloud Storage event, have Cloud Storage publish its events to a Pub/Sub topic and trigger your function from that topic.
From there, you have access to YOUR Pub/Sub resources: monitor the queue, purge it, create whichever subscriptions you want, and so on.
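A sketch of that setup from the command line; the bucket, topic, and function names are placeholders, and the runtime is only an example:
# Have Cloud Storage publish OBJECT_FINALIZE events to your own topic
gsutil notification create -t my-gcs-events -f json -e OBJECT_FINALIZE gs://my-bucket

# Trigger the function from the topic instead of directly from the bucket
gcloud functions deploy process_object \
  --runtime nodejs16 \
  --trigger-topic my-gcs-events \
  --max-instances 5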
The recommended way to work with Storage notifications is to use Pub/Sub.
Legacy Storage notifications still work, but with Pub/Sub you can "peek" into the message queue and clear it if you need to.
Also, you can process Pub/Sub events with Cloud Run, which is easier to develop and test (it's just a web service), easier to deploy (just a container), and it can process several requests in parallel without having to pay more (great when you have a lot of requests arriving together).
Where do Pub/Sub Storage notifications go?
You can see where your bucket's notifications go with the gsutil command:
% gsutil notification list gs://__bucket_name__
projects/_/buckets/__bucket_name__/notificationConfigs/1
Cloud Pub/Sub topic: projects/__project_name__/topics/__topic_name__
Filters:
Event Types: OBJECT_FINALIZE
Is there any way to get the queue depth (or equivalent?)
In Pub/Sub you can have many subscriptions to a topic.
If there is no subscription, messages are lost.
To send data to a Cloud Function or Cloud Run service, you set up a push subscription.
In my experience, you won't be able to see what happened because delivery is faster than you can click: you'll find the queue empty 99.9999% of the time.
You can check the "queue" depth in the console (Pub/Sub -> choose your topic -> choose the subscription).
If you need to troubleshoot this, set up a second subscription with a time to live low enough that it does not use a lot of space (you'll be billed for it).
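For example, a throwaway pull subscription you can inspect in the console (the names and durations are placeholders):
gcloud pubsub subscriptions create debug-peek \
  --topic my-gcs-events \
  --message-retention-duration=10m \
  --expiration-period=1d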
Is there any way to clear events?
You can purge the messages from the Pub/Sub subscription, but if you're using a push subscription against a Cloud Function, delivery will be much faster than you can "click".
If you need it, it is in the web console (open the Pub/Sub subscription and click the vertical "..." menu at the top right).
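The same purge can be done from the CLI by seeking the subscription to the current time, which marks everything published before that moment as acknowledged (the subscription name is a placeholder):
gcloud pubsub subscriptions seek my-subscription --time=$(date -u +%Y-%m-%dT%H:%M:%SZ)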
I want to deploy a Pubsub-triggered Cloud Function with message ordering:
https://cloud.google.com/pubsub/docs/ordering
gcloud functions deploy doesn't have an option to set an --enable-message-ordering option:
https://cloud.google.com/sdk/gcloud/reference/functions/deploy
Should I pre-create the subscription before deploying the function? If so, does Cloud Functions have a well-known format for how it matches to a subscription name? It seems maybe the format is: gcf-{function-name}-{region}-{topic-name}, but it also looks like the name format has changed over time, e.g. older deployed functions don't have the region name in the subscription. Is there a stable way to do this?
You must create the message-ordering Pub/Sub resources and the Cloud Function manually.
First, create a Pub/Sub topic, and then create a subscription to that topic with --enable-message-ordering.
Second, create a Cloud Function that will serve the ordered Pub/Sub messages.
Last, go back to the Pub/Sub subscription, edit the delivery type to push, and specify your Cloud Function's endpoint.
So the final flow looks like this:
Publisher -> Pub/sub topic -> Pub/sub subscriber -> Cloud function
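A minimal sketch of those steps with gcloud; all names, the runtime, and the endpoint URL are placeholders, and publishers must set an ordering key on their messages for ordering to apply:
# 1. Topic
gcloud pubsub topics create my-ordered-topic

# 2. HTTP-triggered function that will receive the pushed messages
gcloud functions deploy handleOrdered --runtime nodejs16 --trigger-http

# 3. Push subscription with ordering enabled, pointing at the function's URL
#    (invocation auth for the push endpoint is omitted here)
gcloud pubsub subscriptions create my-ordered-sub \
  --topic my-ordered-topic \
  --enable-message-ordering \
  --push-endpoint https://REGION-PROJECT_ID.cloudfunctions.net/handleOrdered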
You tried to connect the Pub/Sub topic to the Cloud Function directly.
But for message ordering, Pub/Sub needs an explicit topic -> subscription connection.
So only a Pub/Sub topic -> Pub/Sub subscription -> Cloud Function chain can deliver ordered messages to your function.
When declaring a Pub/Sub-triggered Cloud Function like:
exports.pubsub = functions.pubsub.topic('some-topic').onPublish((message, context) => { /* ... */ });
the problem is that message ordering is only available on the subscription, not on the topic. On deploy, a push subscription is automatically created at cloudpubsub/subscription/list (the one whose name starts with gcf-*). Ordering only appears to work when manually subscribing to a topic: Enabling message ordering. I haven't yet tried whether it would pick up a subscription with the same name; if everything fails, one could still record the messages and then order them by timestamp or a "sequence token": https://www.youtube.com/watch?v=nQ9_Xur2aM4.
I would like to receive a notification on my Notification Channel every time a Cloud Build build on master fails.
There were mentions of using the Logs Viewer, but it seems like there is no immediate way of accessing the branch.
Is there another way where I can create a Monitoring Alert/a Metric which is specific to master?
An easy solution might be to define a logging metric and link an alerting policy to it.
Configure Slack alerting in Notification channels of GCP.
Define your logging metric in Logs-based Metrics. Make a Counter with Units set to 1 and filter using the logging query language:
resource.type="build"
severity=ERROR
or
resource.type="build"
textPayload=~"^ERROR:"
Create an Alerting Policy based on the metric you've just defined and link it to the Slack notification channel you configured in step 1.
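For reference, the metric from step 2 can also be created from the command line (the metric name is a placeholder):
gcloud logging metrics create failed-builds \
  --description="Cloud Build errors" \
  --log-filter='resource.type="build" AND severity=ERROR'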
You can create Cloud Build notifications that send updates to your desired channels, such as Slack or your SMTP server. Cloud Build also publishes messages to a Pub/Sub topic when your build's state changes, such as when your build is created or when it transitions to a working state.
I just went through the pain of trying to get the official GCP slack integration via Cloud Run working. It was too cumbersome and didn't let me customize what I wanted.
The best solution I see is to get Cloud Build set up to send Pub/Sub messages to the cloud-builds topic. With that, you can use the repo below, which I just made public, to filter on the specific branch you want by looking at the data_json['substitutions']['BRANCH_NAME'] field.
https://github.com/Ucnt/gcp-cloud-build-slack-notifier
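As a rough sketch of that filtering logic (the Slack webhook environment variable and the status values checked here are illustrative, not taken from the repo), a function subscribed to the cloud-builds topic could look like:
const fetch = require('node-fetch');

exports.cloudBuildToSlack = async (event, context) => {
  // The cloud-builds topic carries the Build resource as base64-encoded JSON
  const build = JSON.parse(Buffer.from(event.data, 'base64').toString());

  const branch = (build.substitutions || {}).BRANCH_NAME;
  const failed = ['FAILURE', 'TIMEOUT', 'INTERNAL_ERROR'].includes(build.status);
  if (branch !== 'master' || !failed) return;

  // SLACK_WEBHOOK_URL is a placeholder environment variable
  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text: `Cloud Build ${build.id} failed on master` }),
  });
};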
Not sure how to search this; I'm looking for a way to trigger a Cloud Function whenever a new row is inserted into a database in Cloud SQL. The search for "google cloud function events" (or "triggers") turns up Firebase results, which is not what I want.
There is a series of Cloud Functions that receive data and transform it according to the clients' needs; in the end, after some manipulation, that data ends up in a table. Is there an event I can listen to so I can access the newly inserted rows? If not, I might end up using Cloud Scheduler to peek regularly into the DB. However, this solution doesn't seem viable long-term.
I'd appreciate any advice.
Currently there is no official Cloud Function event which could be triggered on changes to a Cloud SQL database. You can check the available events in the Events and Triggers documentation.
You could still achieve something like this with Cloud Pub/Sub, in two ways:
1 - The first would be to enable and export logs from the Cloud SQL instance to a Pub/Sub topic by creating a sink on Stackdriver, and have the Cloud Function listen to that topic.
Although this method does not require you to change the way you are inserting data to the DB, it might expose too much information, as all queries will be logged on Stackdriver. It also means you would not have full control of what information is passed to the function, as the message would be the contents of the log entry.
2 - The ideal solution would be to create the Pub/Sub topic and publish to it when you insert new data to the database. This way you have more control over the information sent to the topic. You can find more information about how to set up a new topic in the Cloud Pub/Sub documentation.
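A rough sketch of the second approach in Node.js (the topic name is a placeholder, and this assumes the @google-cloud/pubsub client):
const { PubSub } = require('@google-cloud/pubsub');
const pubsub = new PubSub();

// Call this right after your INSERT succeeds, with the row you just wrote;
// a Pub/Sub-triggered Cloud Function subscribed to 'new-rows' then picks it up.
async function notifyNewRow(row) {
  await pubsub.topic('new-rows').publishMessage({ json: row });
}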