GCP Pub/Sub push endpoint with key

I am trying to route logs from GCP to Pub/Sub, but the endpoint I need to send the logs to requires a license key in the request header in order to accept the POST request. How can I add a header to the request that delivers the logs? Or how can I add this key on the push endpoint screen?

This answer was provided by @John Hanley in the comment section:
Subscriptions do not add data. Subscriptions direct how messages are delivered to subscribers. You will need to add your custom headers to the data being sent in the Pub/Sub messages themselves. For details on what you can do, read this link.

At this point in time it is not possible to include custom (Authorization) headers in a Pub/Sub push subscription. (1) You can use an endpoint within GCP as the target of your push subscription, which lets GCP act as the authentication provider. (2) The other option (not recommended at all) is to put an API key in the URL and send the messages to an endpoint of your choice.
(1) You can make use of other GCP services like Cloud Functions or Cloud Run to forward Pub/Sub messages: create a new request with the appropriate headers and pass the message on to the other endpoint. The downside of this approach is that compute costs may spin out of control, since a Cloud Functions or Cloud Run instance fires up every time a single message passes through. You could consider writing the Cloud Function in Golang to increase speed; Cloud Run, for its part, can handle up to 80 concurrent requests per instance.
(2) Although I still see it very often, using API keys is bad practice. API keys have no expiration, so when one is stolen, someone can access your API indefinitely. That's why Bearer tokens exist, with a limited lifetime.
Neither option is perfect, but I would write a small-footprint Cloud Function in Golang to pass messages on to the endpoint of your choice.
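If it helps, here is a minimal sketch (in Go) of such a forwarding function. It assumes a push subscription targets the function's HTTP URL; the FORWARD_URL and LICENSE_KEY environment variables and the X-License-Key header name are placeholders for whatever the receiving endpoint actually expects:

    package forwarder

    import (
        "bytes"
        "encoding/json"
        "io"
        "net/http"
        "os"
    )

    // pushEnvelope matches the JSON wrapper Pub/Sub uses for push deliveries.
    type pushEnvelope struct {
        Message struct {
            Data []byte `json:"data"` // encoding/json base64-decodes []byte fields
            ID   string `json:"messageId"`
        } `json:"message"`
        Subscription string `json:"subscription"`
    }

    // Forward relays each pushed message to the external endpoint, adding the
    // license-key header that Pub/Sub itself cannot attach.
    func Forward(w http.ResponseWriter, r *http.Request) {
        body, err := io.ReadAll(r.Body)
        if err != nil {
            http.Error(w, "unreadable body", http.StatusBadRequest)
            return
        }
        var env pushEnvelope
        if err := json.Unmarshal(body, &env); err != nil {
            http.Error(w, "not a push envelope", http.StatusBadRequest)
            return
        }

        // FORWARD_URL and the X-License-Key header name are assumptions.
        req, err := http.NewRequest(http.MethodPost, os.Getenv("FORWARD_URL"),
            bytes.NewReader(env.Message.Data))
        if err != nil {
            http.Error(w, "bad forward url", http.StatusInternalServerError)
            return
        }
        req.Header.Set("Content-Type", "application/json")
        req.Header.Set("X-License-Key", os.Getenv("LICENSE_KEY"))

        resp, err := http.DefaultClient.Do(req)
        if err != nil {
            http.Error(w, "forward failed", http.StatusBadGateway) // Pub/Sub retries
            return
        }
        resp.Body.Close()
        w.WriteHeader(http.StatusNoContent) // ack the message
    }

Deploy it as an HTTP-triggered Cloud Function and set its URL as the push endpoint of the subscription; returning a non-2xx status makes Pub/Sub redeliver the message.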

Related

Trigger a Google Cloud Function in one project on a publish to Google PubSub in another project

There is an application already publishing to Google PubSub topic T1 in a different project, and I would like to have a Google Cloud Function triggered by publishes to T1. I read in the official documentation (https://cloud.google.com/functions/docs/calling/pubsub) that it is not possible to trigger a function cross-project, and when I tried creating the Cloud Function from the GCP console I did not see the topics from other projects, even though I have access to those projects.
I am looking for alternatives. Is it possible to achieve it in a different way?
The trick is to use HTTP communication instead of native event communication. That is, you have to change your background function (the kind you plug into a PubSub topic) into an HTTP function.
To do that, you also have to create a PubSub push subscription on your topic and set the Cloud Function's HTTP URL as the target.
Don't forget to configure the security part to allow only secure and authenticated communication between PubSub and your function.
Alternatively, you can follow Piotr's advice (in the comments) and use another HTTP-oriented product, like Cloud Run. But the principle is still the same: a PubSub push subscription is the key!
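To make the conversion concrete, here is a rough Go sketch of such an HTTP function; process is a hypothetical stand-in for whatever your original background function did with the message:

    package consumer

    import (
        "encoding/json"
        "log"
        "net/http"
    )

    // ConsumeHTTP replaces a background function of the form
    //   func Consume(ctx context.Context, m PubSubMessage) error
    // Because it is plain HTTP, a push subscription in any project can
    // target its URL.
    func ConsumeHTTP(w http.ResponseWriter, r *http.Request) {
        var env struct {
            Message struct {
                Data []byte `json:"data"` // base64-decoded by encoding/json
            } `json:"message"`
        }
        if err := json.NewDecoder(r.Body).Decode(&env); err != nil {
            http.Error(w, "not a Pub/Sub push envelope", http.StatusBadRequest)
            return
        }
        if err := process(env.Message.Data); err != nil {
            // Non-2xx responses make Pub/Sub retry the delivery.
            http.Error(w, "processing failed", http.StatusInternalServerError)
            return
        }
        w.WriteHeader(http.StatusOK)
    }

    // process is a placeholder for the business logic of the old function.
    func process(data []byte) error {
        log.Printf("received %d bytes", len(data))
        return nil
    }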

Can I publish an event into Cloud Pub/Sub from outside of GCP

Can I publish an event into Cloud Pub/Sub from outside of GCP?
Let me clarify my query a bit. In AWS, we are able to publish events to SNS topics directly by invoking a REST API via API Gateway from a non-cloud client (https://github.com/cdk-patterns/serverless/blob/master/the-big-fan/README.md). Is there any such method in GCP to publish an event into Pub/Sub?
I can see there is a similar question on SO (Access Google Pub/Sub from outside of GCP), but I believe it does not fully answer my question. Yes, authentication is required and it is a cross-cutting concern, but what is the basic technique to publish an event into Pub/Sub from outside GCP?
Yes, you can publish a message into a topic and then pull it, or even have Pub/Sub deliver it to you through a push subscription.
When publishing a message, or pulling it as a subscriber, you can access Pub/Sub through the REST or RPC API. In addition, you can use one of the client libraries.
Here you can find examples of how to publish a message using the gcloud CLI tool, the REST API, and Python and Java among other programming languages.
As mentioned in the question referenced, you will need to authenticate in order to either publish a message, or pull it. You can use the quickstart as a reference on how to do so. Notice that you can follow the quickstart from any computer or VM outside GCP.
Finally, if you're using Push subscriptions to receive your messages, your endpoint will need to be a publicly accessible HTTPS address and have a valid SSL certificate signed by a certificate authority. Again, this endpoint can live outside GCP.
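To illustrate, here is a minimal Go publisher that can run on any machine outside GCP; my-project, my-topic, and service-account.json are placeholders for your own project ID, topic, and downloaded service account key file:

    package main

    import (
        "context"
        "fmt"
        "log"

        "cloud.google.com/go/pubsub"
        "google.golang.org/api/option"
    )

    func main() {
        ctx := context.Background()
        // Authenticate with a downloaded service account key; no GCP
        // runtime environment is required.
        client, err := pubsub.NewClient(ctx, "my-project",
            option.WithCredentialsFile("service-account.json"))
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        topic := client.Topic("my-topic")
        res := topic.Publish(ctx, &pubsub.Message{
            Data: []byte("hello from outside GCP"),
        })
        id, err := res.Get(ctx) // blocks until the server acknowledges the publish
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println("published message", id)
    }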

How can I transfer logs from Logs Viewer of GCP to Slack or email?

I have been exploring how to send the jsonPayload message field from the Logs Viewer service in GCP (these are syslogs of a service) to a Slack channel, but I am not able to find any predefined services (like alerting policies that forward the payload) available in Stackdriver. I have been able to create a counter or distribution user-defined metric for logs, but that only provides an int64 value instead of a string value or the actual message body. Is there a way in GCP to send a payload of logs over Slack or email?
We had a similar issue where we wanted to send certain events to Slack and, for fatal issues, trigger an incident with our ops team via VictorOps.
We couldn't find anything out there to fit our needs, so we just created our own Slack/VictorOps Cloud Function:
https://github.com/patiently/gcloud-slack-logger
In GCP, you can export logs to Pub/Sub, Cloud Storage, or BigQuery. There is no other way within GCP to export logs at the moment.
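If you go the Pub/Sub export route, the glue function is small. A sketch in Go, assuming a log sink publishes LogEntry JSON to a topic this function is subscribed to, and that SLACK_WEBHOOK_URL points to a Slack incoming webhook (both names are assumptions, not built-in defaults):

    package logslack

    import (
        "bytes"
        "context"
        "encoding/json"
        "fmt"
        "net/http"
        "os"
    )

    // PubSubMessage is the payload of a Pub/Sub event delivered to the function.
    type PubSubMessage struct {
        Data []byte `json:"data"`
    }

    // ToSlack receives LogEntry JSON from a log sink's Pub/Sub topic and posts
    // the jsonPayload.message field to a Slack incoming webhook.
    func ToSlack(ctx context.Context, m PubSubMessage) error {
        var entry struct {
            JSONPayload struct {
                Message string `json:"message"`
            } `json:"jsonPayload"`
            Severity string `json:"severity"`
        }
        if err := json.Unmarshal(m.Data, &entry); err != nil {
            return fmt.Errorf("unmarshal log entry: %w", err)
        }

        payload, _ := json.Marshal(map[string]string{
            "text": fmt.Sprintf("[%s] %s", entry.Severity, entry.JSONPayload.Message),
        })
        resp, err := http.Post(os.Getenv("SLACK_WEBHOOK_URL"),
            "application/json", bytes.NewReader(payload))
        if err != nil {
            return err // returning an error makes Pub/Sub retry the delivery
        }
        defer resp.Body.Close()
        return nil
    }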
As of 2022, I found this can be done as follows:
In the GCP Logs Viewer (not the legacy version), choose the Create Alert button. One of the options here is a GCP notification channel, which supports Slack. Some points here:
The Slack channel can't be private, as far as I can tell.
The Slack channel must be in the correct Slack workspace. If your org has multiple Slack workspaces, make sure GCP is connecting to the right one.
Put in the log query criteria you want, then go into Monitoring and you will see the alert in the Alerting dropdown.

Sharing my GCP Pub/Sub topic with a customer to publish to

I am working on an assignment where our customers will sync their CRM data to our systems. The sync will be an ongoing process. Any best practices or suggestions on Google Pub/Sub for sharing one of our existing topics (or a new one) that our customer will publish to and we consume? The idea here is to keep the sync asynchronous.
@Alex-hong's answer is right if your customer is on GCP. If not, you have 2 solutions:
You can generate a service account and send the customer its JSON key file. Define the right role on the service account (as described by Alex) and let your customer use the service account to publish to the Pub/Sub topic. This solution implies significant development work for your customer (using an external JSON key, using new libraries to push messages, ...).
You can deploy a Cloud Function/Cloud Run endpoint and let your customer simply perform an HTTP request. Of course, you can secure the call with Basic Authentication or an API key (or something like that, which you check in your Cloud Function). The function only checks the security and publishes to Pub/Sub. It's often simpler and more standard for your customer.
On the last point, it might be possible to set up an ESP in place of the function, but I have never tested publishing directly to Pub/Sub that way.
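A minimal sketch of that Cloud Function endpoint in Go, where the X-Api-Key header, the API_KEY and GCP_PROJECT environment variables, and the crm-sync topic name are all placeholder choices to agree on with the customer:

    package ingest

    import (
        "context"
        "crypto/subtle"
        "io"
        "net/http"
        "os"

        "cloud.google.com/go/pubsub"
    )

    var client *pubsub.Client

    func init() {
        // Set GCP_PROJECT yourself if your runtime does not provide it.
        var err error
        client, err = pubsub.NewClient(context.Background(), os.Getenv("GCP_PROJECT"))
        if err != nil {
            panic(err)
        }
    }

    // Ingest authenticates the caller with a shared API key, then republishes
    // the request body to the internal topic.
    func Ingest(w http.ResponseWriter, r *http.Request) {
        key := r.Header.Get("X-Api-Key")
        if subtle.ConstantTimeCompare([]byte(key), []byte(os.Getenv("API_KEY"))) != 1 {
            http.Error(w, "unauthorized", http.StatusUnauthorized)
            return
        }
        body, err := io.ReadAll(r.Body)
        if err != nil {
            http.Error(w, "unreadable body", http.StatusBadRequest)
            return
        }
        res := client.Topic("crm-sync").Publish(r.Context(), &pubsub.Message{Data: body})
        if _, err := res.Get(r.Context()); err != nil {
            http.Error(w, "publish failed", http.StatusInternalServerError)
            return
        }
        w.WriteHeader(http.StatusAccepted)
    }

Your customer then just POSTs their CRM payloads to the function's URL with the agreed key; no GCP libraries or key files needed on their side.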
You can give the other party the roles/pubsub.publisher role on your specific topic (or project) if you own the topic. Alternatively, if they own the topic, they can grant you the roles/pubsub.subscriber role, which will let you subscribe to that topic.
For more information, see Access Control.
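For reference, the grant can also be done programmatically with the client library's IAM handle; a sketch in Go with placeholder project, topic, and service account names:

    package main

    import (
        "context"
        "log"

        "cloud.google.com/go/iam"
        "cloud.google.com/go/pubsub"
    )

    func main() {
        ctx := context.Background()
        client, err := pubsub.NewClient(ctx, "my-project") // placeholder project
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        handle := client.Topic("crm-sync").IAM() // placeholder topic
        policy, err := handle.Policy(ctx)
        if err != nil {
            log.Fatal(err)
        }
        // Grant publish rights to the customer's service account (placeholder).
        policy.Add("serviceAccount:customer@their-project.iam.gserviceaccount.com",
            iam.RoleName("roles/pubsub.publisher"))
        if err := handle.SetPolicy(ctx, policy); err != nil {
            log.Fatal(err)
        }
    }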

Difference between two methods of AWS SNS topic subscription confirmation

I'm attempting to subscribe an SNS topic to an HTTPS endpoint I own. I'm reading the docs on how to process incoming messages from SNS and how the subscription confirmation needs to be done. I see two methods of confirmation:
Using the SubscribeURL (https://docs.aws.amazon.com/sns/latest/dg/sns-http-https-endpoint-as-subscriber.html): we can perform an HTTP GET request on the "SubscribeURL" attribute value, and that confirms the subscription.
Calling the ConfirmSubscription API: we pass the SNS topic ARN and the token received when SNS sends a confirm-subscription message to the endpoint.
I'm trying to understand the difference between the two methods. The most obvious one to me was this: choosing the API will require AWS credentials, since the request needs to be signed. But it seems the same call will succeed with just an HTTP GET request?
What's the best practice out there (if any) and/or which method is the one being followed by other folks using AWS/SNS?
There isn't a difference; these two alternatives are in fact the same thing.
The SubscribeURL attribute is a pre-constructed (by the service) link to the ConfirmSubscription action on the SNS API endpoint.
The API accepts GET or POST, and no signature is required in this case: the call requires an AWS signature only when the AuthenticateOnUnsubscribe flag is set to "true".
https://docs.aws.amazon.com/sns/latest/api/API_ConfirmSubscription.html
Before SNS will talk to an endpoint, you need to prove that you control that endpoint. So your options are to write some code that can do it automatically (most of the SDKs support this), or to capture the token and prove you control the endpoint by returning the token via the API call.
This is a one-time procedure, so you do not need to deploy any AWS credentials to your API endpoint; you can do it from a different system.
We generally build the confirmation handler into the application.
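For illustration, a minimal Go handler for that one-time handshake; the /sns route and port are arbitrary choices, and production code should also verify the SNS message signature before trusting the payload:

    package main

    import (
        "encoding/json"
        "log"
        "net/http"
    )

    // snsMessage carries the fields we need from an SNS HTTP(S) delivery.
    type snsMessage struct {
        Type         string `json:"Type"`
        Message      string `json:"Message"`
        SubscribeURL string `json:"SubscribeURL"`
    }

    func handleSNS(w http.ResponseWriter, r *http.Request) {
        var msg snsMessage
        if err := json.NewDecoder(r.Body).Decode(&msg); err != nil {
            http.Error(w, "bad message", http.StatusBadRequest)
            return
        }
        switch msg.Type {
        case "SubscriptionConfirmation":
            // A plain GET on SubscribeURL is the entire confirmation.
            resp, err := http.Get(msg.SubscribeURL)
            if err != nil {
                log.Printf("confirmation failed: %v", err)
                http.Error(w, "confirmation failed", http.StatusBadGateway)
                return
            }
            resp.Body.Close()
            log.Print("subscription confirmed")
        case "Notification":
            log.Printf("notification: %s", msg.Message)
        }
        w.WriteHeader(http.StatusOK)
    }

    func main() {
        http.HandleFunc("/sns", handleSNS)
        log.Fatal(http.ListenAndServe(":8080", nil))
    }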