Getting Logs from GCP Alerting - google-cloud-platform

I am setting up alerting for GCP VMs. It works fine for email, but I'm trying to use the webhook option. It shows that the incident was caught and the webhook triggered, but I don't see an alert on the receiving end. I don't know how to debug this since GCP webhooks seem like a black box. Does anybody know where I can see the log for the actual webhook call? I'm not sure the receiving end is getting an alert ID from the webhook call.
I'm using this document:
https://cloud.google.com/monitoring/alerts/using-channels-api#api-create-channels
Thanks!
Gary

You configure a webhook notification channel and expect to be notified when incidents occur, but you might not receive any notifications for one of the following reasons:
1. Private endpoint
You can't use webhooks for notifications unless the endpoint is public.
To resolve this situation, use Pub/Sub notifications combined with a pull subscription to that notification topic.
When you configure a Pub/Sub notification channel, incident notifications are sent to a Pub/Sub queue that has Identity and Access Management controls. Any service that can query for, or listen to, a Pub/Sub topic can consume these notifications. For example, applications running on App Engine, Cloud Run, or Compute Engine virtual machines can consume these notifications.
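As a minimal sketch of that approach, the following pulls incident notifications from such a subscription with the google-cloud-pubsub client library; the project and subscription IDs are placeholders:

from google.cloud import pubsub_v1

project_id = "my-project"                     # placeholder project ID
subscription_id = "alert-notifications-sub"   # placeholder pull subscription

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)

# Synchronously pull a small batch of incident notifications and acknowledge them.
response = subscriber.pull(
    request={"subscription": subscription_path, "max_messages": 10}
)
for received in response.received_messages:
    print(received.message.data.decode("utf-8"))  # JSON incident payload
    subscriber.acknowledge(
        request={"subscription": subscription_path, "ack_ids": [received.ack_id]}
    )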
2. Public endpoint
To identify why the delivery failed, examine your Cloud Logging log entries for failure information.
For example, you can search for log entries for the notification channel resource by using the Logs Explorer, with a filter like the following:
resource.type="stackdriver_notification_channel"
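If you prefer to query those log entries programmatically rather than through the Logs Explorer UI, a small sketch with the google-cloud-logging client library (the project ID is a placeholder) could look like this:

from google.cloud import logging

client = logging.Client(project="my-project")   # placeholder project ID
log_filter = 'resource.type="stackdriver_notification_channel"'

# Newest entries first; the payload carries the delivery failure details.
for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    print(entry.timestamp, entry.payload)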
NOTE: Also check whether you are included in the recipient list; if you are not, you will not see the alert on the receiving end.
Refer to Troubleshooting Alerting policies for more information.

Related

How to get google cloud platform alert status via REST apis

I have created an alert policy in Google Cloud Platform. I am getting email notifications based on the alert policy.
Now I want to configure an external custom monitoring system for this alert. I want to know which REST APIs this monitoring system can call every 10-20 seconds to get the status of this alert.
Please help
If you want to use only Cloud Monitoring and alerting policies, you won't be able to achieve this.
It is not possible to configure alert policies to notify (repeatedly) while the policy's conditions are met. Alert policies that are created through Google Cloud Console send a notification only when the condition is met. You can also receive a notification when the condition stops being met.
Additional information can be found in the Notifications per incident documentation.
In Cloud Monitoring API v3 - Alerting policies, you can find information confirming that a notification is sent only when an incident is created.
An alerting policy is a configuration resource that describes the criteria for generating incidents and how to notify you when those incidents are created.
In general, if you want to use notification channels to send notifications to external systems, you can use Webhooks or Pub/Sub.
Note
Webhooks only support public endpoints. If you need notifications sent to an endpoint that isn't public, then create a Pub/Sub notification channel and configure a subscription to the Pub/Sub topic. For more information, see Webhook notifications fail when configured for a private endpoint.
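On the webhook side, the receiving service just needs a public HTTPS endpoint that accepts the POSTed incident JSON. A minimal sketch using Flask (the route, port, and the exact payload fields read here are assumptions based on the documented incident format):

from flask import Flask, request

app = Flask(__name__)

@app.route("/gcp-alerts", methods=["POST"])
def gcp_alerts():
    payload = request.get_json(silent=True) or {}
    incident = payload.get("incident", {})
    # Log the policy name and state; a real handler would forward or store this.
    print("Alert:", incident.get("policy_name"), "state:", incident.get("state"))
    return "", 200

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)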
As you didn't provide more information, it's hard to say whether you could instead rely on built-in features of third-party software to integrate with GCP Cloud Monitoring. One example is Grafana:
Grafana ships with built-in support for Google Cloud Monitoring. Add it as a data source to build dashboards for your Google Cloud Monitoring metrics.
GCP might also use Prometheus features, which could give you something similar to what you want.
Prometheus is a monitoring tool often used with Kubernetes. If you configure Cloud Operations for GKE and include Prometheus support, then the metrics that are generated by services using the Prometheus exposition format can be exported from the cluster and made visible as external metrics in Cloud Monitoring.
There are some workarounds, however they won't fulfill what you want.
It is possible to create multiple conditions that identify the same issue; every time a condition is met, a notification will be received (see the sketch after these workarounds).
It is possible to have users notified when a condition is NOT met, however this might cause spammy notifications.
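For the first workaround, the alert-policy body passed to the alertPolicies.create REST method could look roughly like the sketch below; the display names, metric filter, thresholds, and channel ID are all placeholders:

# Placeholder alert-policy body with two conditions watching the same issue,
# combined with OR, to be sent to projects.alertPolicies.create.
policy_body = {
    "displayName": "High CPU (duplicate-conditions workaround)",
    "combiner": "OR",
    "conditions": [
        {
            "displayName": "CPU above 80% (condition A)",
            "conditionThreshold": {
                "filter": 'metric.type="compute.googleapis.com/instance/cpu/utilization" AND resource.type="gce_instance"',
                "comparison": "COMPARISON_GT",
                "thresholdValue": 0.8,
                "duration": "60s",
            },
        },
        {
            "displayName": "CPU above 80% (condition B)",
            "conditionThreshold": {
                "filter": 'metric.type="compute.googleapis.com/instance/cpu/utilization" AND resource.type="gce_instance"',
                "comparison": "COMPARISON_GT",
                "thresholdValue": 0.8,
                "duration": "120s",
            },
        },
    ],
    "notificationChannels": ["projects/my-project/notificationChannels/CHANNEL_ID"],
}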
The last thing I want to mention is that there is already a Feature Request to send repeated notifications until the condition is gone. More details in FR: Repeat Notifications until condition is gone.
Additional Documentation:
Monitoring Alerts in GCP by integrating Cloud Operations with Notification Channels
Conclusion
Alert policies that are created through the Google Cloud Console send a notification only when the condition is met. You can also enable a notification for when the incident is resolved.
There is a Feature Request to add repeated notifications - here
To send notifications to other apps/resources, you can use Webhooks or Pub/Sub.

Create an alerting policy based on pub/sub pull subscription

I am new to alert policy creation in google cloud.
I have set up a GKE cluster and enabled upgrade notifications so that a message is published to a Pub/Sub topic whenever the cluster gets upgraded. The Pub/Sub subscription uses the pull model. Now, whenever a message is published to the topic, I need an alerting policy to pull the message and send an email containing the message content to a distribution list. Can I achieve this without writing a Cloud Function, only through an alerting policy?
Can anyone please suggest how to achieve this? Thank you
An alerting policy can't read Pub/Sub messages. The product listens to the logs, and when they match a policy rule, an action (an alert) is generated.
If you need to send an email based on the Pub/Sub message content, you MUST read the message (with Cloud Functions, Cloud Run, App Engine or whatever) and:
Either send the email with the message content directly
Or, if you want to use Cloud Alerting, write the log in a special format (put a specific keyword in the log entry that you write along with the message content), so that Cloud Alerting can detect the log entries and send an email alert with the log trace (including your message content), as sketched below
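A rough sketch of that second option, assuming a Pub/Sub-triggered background Cloud Function; the function name and the "GKE_UPGRADE_EVENT" keyword are arbitrary choices, not anything GCP defines:

import base64
import json
import logging

def handle_upgrade_notification(event, context):
    """Background Cloud Function triggered by the GKE notification Pub/Sub topic."""
    payload = ""
    if "data" in event:
        payload = base64.b64decode(event["data"]).decode("utf-8")
    attributes = event.get("attributes") or {}

    # Emit a log entry with a fixed keyword so that a log-based alerting policy
    # (or a log-based metric) can match it and trigger the email notification.
    logging.warning(json.dumps({
        "keyword": "GKE_UPGRADE_EVENT",   # arbitrary marker
        "message": payload,
        "attributes": dict(attributes),
    }))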

Google Pubsub Subscription based on particular email id

I am relatively new to the GCP platform. I need to create a system wherein my team gets notifications whenever an email is received from their client.
I have to create a system that publishes messages to a Pub/Sub topic, filtered by email ID.
Example: I want to publish only those messages/emails in which the "To" and "From" fields contain "example@gmail.com"
I have referred to the online documentation but could not find a workaround.
Is this possible using GCP?
If not, is there any other service through which I can achieve the same?
We are using Gmail as the email client
Thanks
What you can use is Pub/Sub together with Cloud Functions, which enables you to access Gmail programmatically. Your specific scenario could look as follows:
The user sets up Gmail push notifications: every time a new message arrives in the inbox, Gmail will send a notification to Cloud Pub/Sub.
Cloud Pub/Sub delivers the new message notification to Google Cloud Functions.
Upon arrival of the new message notification, a Cloud Functions instance connects to Gmail and retrieves the new message.
Check who sent the message, and perform specific actions.
Before setting up a Cloud Function to automatically read your emails, you must authorize its access to Gmail. Have a look at the codelab scenario to see step by step how to perform specific actions and adjust the function to your needs. Additionally, you can take a look at the official documentation here.
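A rough sketch of steps 3-4, assuming a Pub/Sub-triggered background Cloud Function and a hypothetical get_gmail_service() helper that returns an authorized Gmail API client (the codelab covers the actual authorization setup); the target address is a placeholder:

import base64
import json

TARGET_ADDRESS = "example@gmail.com"   # placeholder address from the question

def handle_gmail_notification(event, context):
    """Background Cloud Function triggered by a Gmail push notification."""
    # The push notification only says "something changed" (emailAddress, historyId);
    # the actual mail has to be fetched through the Gmail API.
    notification = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    print("Change for:", notification.get("emailAddress"))

    service = get_gmail_service()   # hypothetical helper, see the codelab for auth

    # Inspect the headers of the most recent inbox messages.
    result = service.users().messages().list(userId="me", maxResults=10).execute()
    for ref in result.get("messages", []):
        msg = service.users().messages().get(
            userId="me", id=ref["id"], format="metadata",
            metadataHeaders=["To", "From"]).execute()
        headers = {h["name"]: h["value"] for h in msg["payload"]["headers"]}
        if TARGET_ADDRESS in headers.get("To", "") or TARGET_ADDRESS in headers.get("From", ""):
            print("Matched message:", ref["id"])  # e.g. republish to another Pub/Sub topic here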

How can I transfer logs from Logs Viewer of GCP to Slack or email?

I have been exploring how to transfer the jsonPayload message field from the GCP Logs Viewer (the syslogs of a service) to a Slack channel, but I am not able to find any predefined services (like alerting policies that transfer the payload) available in Stackdriver. I have been able to create counter or distribution user-defined metrics for logs, but these only provide an int64 value instead of a string value or the actual message body. Is there a way in GCP to actually send the payload of logs to Slack or any email?
We had a similar issue where we wanted to be able to send certain events to Slack and, for fatal issues, trigger an alert to our ops team via VictorOps.
Couldn't find anything out there to fit our needs, so we just created our own Slack/VictorOps Cloud Function.
https://github.com/patiently/gcloud-slack-logger
In GCP, you can export logs to Pub/Sub, Cloud Storage, or BigQuery. There is no other way within GCP to export logs at the moment.
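For the Pub/Sub route, the export is configured through a log sink. A sketch with the google-cloud-logging client library; the project, sink name, filter, and topic are placeholders:

from google.cloud import logging

client = logging.Client(project="my-project")   # placeholder project ID

# Route matching entries to a Pub/Sub topic; filter and names are placeholders.
sink = client.sink(
    "export-to-pubsub",
    filter_='resource.type="gce_instance" AND jsonPayload.message:*',
    destination="pubsub.googleapis.com/projects/my-project/topics/log-export",
)
sink.create()
# The sink's writer identity still needs the Pub/Sub Publisher role on the topic.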
As of 2022, I found this can be done as follows:
In the GCP Logs Viewer (not the legacy version), choose the Create alert button. One of the options here is a GCP notification channel, which supports Slack. Some points here:
The Slack channel can't be private, as far as I can tell
The Slack channel must be in the correct Slack workspace. If your org has multiple Slack workspaces, make sure GCP is trying to connect to the correct one.
Put in the log query criteria you want. Then go into Monitoring and you will see this under the Alerting dropdown.

Amazon SNS - Sending SMS, delivery status

I am trying to send messages using Amazon SNS, but it's showing atypical behavior. It sends messages to some of the numbers, while it may or may not deliver to others.
import boto3
client = boto3.client('sns', .....)  # credentials/region elided in the question
# publish takes keyword arguments; positional arguments raise a TypeError
client.publish(PhoneNumber=phone_number, Message=message)
I am using the publish API to directly send SMS for OTPs without using Topics. Is there a way I can get the delivery status for them? Would region/DND affect the delivery? This is for Indian numbers. I am using Transactional messages for the same.
In the console there is 'Text messaging preferences'. One needs to set up an IAM role for CloudWatch Logs access and a report storage bucket. Once they are set up, Amazon will start storing logs in CloudWatch, and the console UI will start showing delivery rates and status, with detailed logs in CloudWatch.
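Enabling this can also be done with boto3; a sketch with a placeholder region and IAM role ARN (the role must allow SNS to write to CloudWatch Logs):

import boto3

sns = boto3.client("sns", region_name="ap-south-1")   # placeholder region
sns.set_sms_attributes(attributes={
    "DefaultSMSType": "Transactional",
    "DeliveryStatusIAMRole": "arn:aws:iam::123456789012:role/SNSSMSDeliveryStatus",  # placeholder ARN
    "DeliveryStatusSuccessSamplingRate": "100",   # log 100% of successful deliveries
})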
There are many reasons for which delivery may fail, and the logs list all of them. DND can also affect delivery even if the message type is Transactional.
There is also an API available to query these logs accordingly.
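For example, a sketch of reading those delivery logs with boto3, assuming the documented log-group naming pattern sns/<region>/<account-id>/DirectPublishToPhoneNumber (region and account ID are placeholders):

import boto3

logs = boto3.client("logs", region_name="ap-south-1")   # placeholder region
response = logs.filter_log_events(
    logGroupName="sns/ap-south-1/123456789012/DirectPublishToPhoneNumber",  # placeholder account ID
    limit=20,
)
for event in response["events"]:
    print(event["message"])   # JSON with delivery status and provider response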