Can I publish an event into Cloud Pub/Sub from outside GCP? - google-cloud-platform

Can I publish an event into Cloud Pub/Sub from outside GCP?
Let me clarify my query a bit. In AWS, we can publish events to SNS topics directly by invoking a REST API via API Gateway from a non-cloud client (https://github.com/cdk-patterns/serverless/blob/master/the-big-fan/README.md). Is there a similar method in GCP to publish an event into Pub/Sub?
I can see there is a similar question on SO (Access Google Pub/Sub from outside of GCP), but I believe it does not fully answer my question. Yes, authentication is required and it is a cross-cutting concern, but what is the basic technique for publishing an event into Pub/Sub from outside GCP?

Yes, you can publish a message into a topic, and then pull it, or even have PubSub deliver it to you through a Push subscription.
When publishing a message, or pulling it as a subscriber, you can access PubSub through the REST or RPC API. In addition, you can use one of the client libraries.
Here you can find examples of how to publish a message using the gcloud CLI tool, the REST API, and the Python and Java client libraries, among other programming languages.
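As a sketch of the REST route, the snippet below builds the JSON body for the `projects.topics.publish` method. The project and topic names are placeholders, and obtaining an OAuth access token (for example via `gcloud auth print-access-token`) is left out:

```python
import base64
import json

def build_publish_body(message, attributes=None):
    # Pub/Sub requires the message payload to be base64-encoded.
    msg = {"data": base64.b64encode(message.encode("utf-8")).decode("ascii")}
    if attributes:
        msg["attributes"] = attributes
    return json.dumps({"messages": [msg]})

body = build_publish_body("hello from outside GCP", {"origin": "on-prem"})
print(body)
# From any machine, POST this body to (placeholder names):
#   https://pubsub.googleapis.com/v1/projects/MY_PROJECT/topics/MY_TOPIC:publish
# with an "Authorization: Bearer <access token>" header.
```

The same body shape works from curl, a service account, or any HTTP client, which is what makes publishing from outside GCP possible.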
As mentioned in the question referenced, you will need to authenticate in order to either publish a message, or pull it. You can use the quickstart as a reference on how to do so. Notice that you can follow the quickstart from any computer or VM outside GCP.
Finally, if you're using Push subscriptions to receive your messages, your endpoint will need to be a publicly accessible HTTPS address and have a valid SSL certificate signed by a certificate authority. Again, this endpoint can live outside GCP.
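For the push case, Pub/Sub wraps each message in a JSON envelope and POSTs it to your endpoint. A minimal sketch of decoding that envelope (field names follow the documented push format; the IDs and names below are made up):

```python
import base64
import json

def decode_push_envelope(request_body):
    # Pub/Sub push deliveries wrap the message in a JSON envelope;
    # the payload sits base64-encoded under message.data.
    envelope = json.loads(request_body)
    message = envelope["message"]
    data = base64.b64decode(message["data"]).decode("utf-8")
    return data, message.get("attributes", {})

# Example envelope shaped like a push delivery (placeholder values):
example = json.dumps({
    "message": {
        "data": base64.b64encode(b"hello").decode("ascii"),
        "attributes": {"origin": "on-prem"},
        "messageId": "1234567890",
    },
    "subscription": "projects/MY_PROJECT/subscriptions/MY_SUB",
})
data, attrs = decode_push_envelope(example)
print(data)  # respond with an HTTP 2xx status to acknowledge the message
```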

Related

Trigger Google Cloud function in one project on publish in Google PubSub from another project

There is an application already publishing to Google PubSub topic T1 in a different project, and I would like a Google Cloud Function to be triggered by publishes on T1. I read in the official documentation (https://cloud.google.com/functions/docs/calling/pubsub) that cross-project triggering is not possible, and when I tried creating the Cloud Function from the GCP console I could not see the topics from other projects, even though I have access to those projects.
I am looking for alternatives. Is it possible to achieve it in a different way?
The trick is to use HTTP communication instead of native event communication. That is, you have to change your background function (the kind you plug into a PubSub topic) into an HTTP function.
To do that, you also have to create a PubSub push subscription on your PubSub topic and add the Cloud Functions HTTP URL as the target.
Don't forget to configure the security side to allow only secure and authenticated communication between PubSub and your function.
Alternatively, you can follow Piotr's advice (in the comments) and use another HTTP-oriented product, like Cloud Run. But the principle is still the same: a PubSub push subscription is the key!
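As a sketch, that push subscription could be created by sending a body like the one below to the `projects.subscriptions.create` REST method (the project, topic, and function names are placeholder assumptions; `gcloud pubsub subscriptions create` or a client library works just as well):

```python
import json

def build_push_subscription_body(topic_project, topic, function_url,
                                 service_account=None):
    # The topic lives in another project; the subscription references it
    # by its fully qualified name, which is what makes this cross-project.
    body = {
        "topic": f"projects/{topic_project}/topics/{topic}",
        "pushConfig": {"pushEndpoint": function_url},
    }
    if service_account:
        # Have Pub/Sub attach an OIDC token so the HTTP function can
        # require authenticated invocations.
        body["pushConfig"]["oidcToken"] = {
            "serviceAccountEmail": service_account,
        }
    return json.dumps(body)

body = build_push_subscription_body(
    "project-a", "T1",
    "https://REGION-project-b.cloudfunctions.net/my-http-function",
    "pubsub-invoker@project-b.iam.gserviceaccount.com",
)
print(body)
# PUT this body to (subscription created in the function's project):
#   https://pubsub.googleapis.com/v1/projects/project-b/subscriptions/MY_SUB
```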

GCP Deployment manager - Call HTTP endpoint when everything is deployed

I have a template that creates a few resources in GCP, and I want it to either call an HTTP endpoint or publish a message to a topic whenever the deployment completes. I've been checking different services all day, and couldn't find anything about it.
In AWS, it is quite easy to publish an SNS message to a topic that is subscribed to an SQS queue, which in turn triggers a Lambda function. In Azure ARM templates, we can use the az CLI to invoke a web request and call an endpoint directly.
I couldn't find anything similar in GCP. Is there any way of either calling an HTTP endpoint or a Cloud Function, or perhaps publishing a message to a topic, whenever a deployment is finished?
I really appreciate any help.
The best approach in GCP is to create a logging sink, using a filter written in the Logging query language, that sends only the Deployment Manager logs to a PubSub topic.
Here is an example of a sink sending deployment manager logs to a PubSub topic previously created:
gcloud logging sinks create $SINK_NAME pubsub.googleapis.com/projects/$YOUR_PROJECT/topics/$TOPIC \
--log-filter='resource.type="deployment"' --description="my sink"
Be careful to set the destination permissions, or you will not see the logs in the PubSub topic.
Once you are getting the logs in the PubSub topic, you can configure Cloud Pub/Sub Triggers to fire an HTTP call based on the content of the log.
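A sketch of how a Pub/Sub-triggered function could recognize a Deployment Manager log entry delivered by such a sink. The `resource.type == "deployment"` check mirrors the sink filter above; the entry structure below is a simplified assumption about the LogEntry payload:

```python
import base64
import json

def is_deployment_log(pubsub_data):
    # The sink delivers each LogEntry as the base64-encoded data of a
    # Pub/Sub message; decode it and check the resource type.
    entry = json.loads(base64.b64decode(pubsub_data))
    return entry.get("resource", {}).get("type") == "deployment"

# Fake entry shaped like what the filter above would let through:
fake_entry = json.dumps({"resource": {"type": "deployment",
                                      "labels": {"name": "my-deployment"}}})
encoded = base64.b64encode(fake_entry.encode("utf-8")).decode("ascii")
print(is_deployment_log(encoded))  # from here, call your HTTP endpoint
```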

gcp pub/sub push endpoint with key

I am trying to route logs from GCP to Pub/Sub, but the endpoint I need to send the logs to requires a license key in a header in order to accept the POST request. How can I add a header to the JSON body being routed with the logs? Or how can I add this key on the push endpoint screen?
This answer was provided by @John Hanley in the comment section:
Subscriptions do not add data. Subscriptions direct how messages are delivered to subscribers. You will need to add your custom headers in the data being sent in the Pub/Sub messages. For details on what you can do, read this link.
At this point in time it is not possible to include custom (Authorization) headers in a Pub/Sub push subscription. (1) Either you use an endpoint within GCP to send your Pub/Sub messages to, which makes use of GCP as an authentication provider, or (2) (not recommended at all) you put an API key in the URL and send the messages to an endpoint of your choice.
(1) You can make use of other GCP services, like Cloud Functions or Cloud Run, to forward Pub/Sub messages: create a new request with the appropriate headers and pass the message on to another endpoint. The downside of this approach is that compute costs may spin out of control, since you fire up a Cloud Function or Cloud Run instance every time a single message passes through.
You could consider writing the Cloud Function in Golang to increase the speed. On the other hand, Cloud Run can handle up to 80 concurrent requests per instance.
(2) Although I still see it very often, using API keys is bad practice. API keys have no expiration, so when they are stolen, someone can access your API indefinitely. That's why bearer tokens exist, with a limited lifetime.
Neither option is perfect, but I would write a small-footprint Cloud Function in Golang to pass messages on to the endpoint of your choice.
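A minimal sketch of the forwarding idea in option (1): unwrap the push envelope and prepare a request to the final endpoint with the license key as a header. The header name and endpoint URL here are hypothetical placeholders, and the actual POST (e.g. with `urllib.request`) is omitted:

```python
import base64
import json

def build_forward_request(envelope_json, license_key,
                          endpoint="https://logs.example.com/ingest"):
    # Unwrap the Pub/Sub push envelope and keep the raw log payload.
    envelope = json.loads(envelope_json)
    payload = base64.b64decode(envelope["message"]["data"])
    headers = {
        "Content-Type": "application/json",
        "X-License-Key": license_key,  # hypothetical header name
    }
    return endpoint, headers, payload

envelope = json.dumps({"message": {
    "data": base64.b64encode(b'{"log": "entry"}').decode("ascii")}})
url, headers, payload = build_forward_request(envelope, "my-secret-key")
print(url, headers["X-License-Key"])
```

In a real function the key would come from Secret Manager or an environment variable rather than being passed around in code.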

Google Cloud Functions: Pub/Sub vs Rest triggering

Is Pub/Sub a significantly faster way of communicating between, say, a Kubernetes Engine (GKE) API server and a Cloud Function (GCF)?
Is it possible to use Pub/Sub to have such communication between GKE from one Google Cloud Project and GCF from another Google Cloud Project?
What is the way to communicate with Cloud Functions from another Google Cloud Project with low latency?
I think a global answer will clarify your questions. For this particular case, there are two ways to trigger a Google Cloud Function (GCF). You can directly make an HTTP request, or you can subscribe the GCF to a topic by using Pub/Sub (https://cloud.google.com/functions/docs/calling).
If your requests are occasional, an HTTP request will be faster because you don't need an intermediary. If that's not the case, then the Pub/Sub subscription queues the messages and ensures the delivery by retrying them until it receives confirmation.
To communicate between Google Kubernetes Engine (GKE) in one Google Cloud project and a Google Cloud Function (GCF) in another Google Cloud project, you can use either option: trigger the GCF by the HTTP request directly, or do it by publishing a message. When publishing, specify the project you are sending to and the desired topic in that project.
You also need to grant the service account the proper permissions to access one project from the other:
For Pub/Sub: https://cloud.google.com/pubsub/docs/authentication
For HTTP requests: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Google Cloud Functions HTTP triggers documentation: https://cloud.google.com/functions/docs/calling/http
Pub/Sub documentation: https://cloud.google.com/pubsub/docs/reference/libraries (you can access GitHub through the links in the code and see function examples for each language)

AWS API gateway as subscriber to Google Cloud pubsub

I want to subscribe to a Google Cloud pub-sub topic (which will push notifications about changes to a user's inbox). Since Google requires an HTTPS endpoint, I created an AWS API Gateway endpoint. But I'm unable to add this endpoint as a subscriber.
I get an error saying - "Sorry, there’s a problem. If you entered information, check it and try again. Otherwise, the problem might clear up on its own, so check back later.".
Any ideas why this is happening and how I can fix this?
Azeem