I have a third-party API (external both to me and to GCP), and I want to know if it is possible to set up a pull that gets data from that API into Pub/Sub, either directly or through a wrapper API that I can set up in my environment.
Basic schema:
(third party API) -Pull-> PUBSUB -Cloud Function-> BigQuery
I think that in the code that works with the third-party API you can use the Google Pub/Sub client, pull the messages, and acknowledge them if there is no error.
The Cloud Pub/Sub client libraries use streaming pull to receive messages rather than unary pull.
For example, here is the link for the Pub/Sub Python client:
https://googleapis.dev/python/pubsub/latest/pubsub/subscriber/index.html#pulling-a-subscription-asynchronously
You can also check this topic: How to pull messages from a subscription with GCP Pub/Sub?
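For reference, here is a minimal streaming-pull sketch with the Python client that acknowledges only when processing succeeds; the project and subscription names are placeholders:

    from concurrent.futures import TimeoutError
    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    # Hypothetical project and subscription names.
    subscription_path = subscriber.subscription_path("my-project", "my-subscription")

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        try:
            print(f"Received: {message.data!r}")  # replace with your own processing
        except Exception:
            message.nack()  # redeliver later
        else:
            message.ack()   # acknowledge only if there was no error

    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
    with subscriber:
        try:
            streaming_pull_future.result(timeout=60)  # listen for 60 seconds
        except TimeoutError:
            streaming_pull_future.cancel()
            streaming_pull_future.result()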
You can write a cloud function, either scheduled via Cloud Scheduler or HTTP-triggered, that pulls the data and publishes to PubSub. Your next Cloud Function can then be PubSub triggered to move the data from your PubSub message to BigQuery.
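As a rough sketch (not the original answer's code), the two functions could look like this, assuming a hypothetical third-party endpoint, topic and BigQuery table:

    import base64
    import json

    import requests
    from google.cloud import bigquery, pubsub_v1

    PROJECT = "my-project"                    # placeholder names
    TOPIC = "third-party-data"
    API_URL = "https://api.example.com/data"  # placeholder third-party endpoint
    BQ_TABLE = "my-project.my_dataset.my_table"

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT, TOPIC)

    def fetch_and_publish(request):
        """HTTP function (call it from Cloud Scheduler): pull from the API, publish to Pub/Sub."""
        rows = requests.get(API_URL, timeout=30).json()
        for row in rows:
            publisher.publish(topic_path, json.dumps(row).encode("utf-8")).result()
        return f"published {len(rows)} messages"

    def pubsub_to_bigquery(event, context):
        """Background function triggered by the topic: insert the message into BigQuery."""
        row = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
        errors = bigquery.Client().insert_rows_json(BQ_TABLE, [row])
        if errors:
            raise RuntimeError(errors)  # raising lets the message be retried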
Related
There is an application already publishing to Google Pub/Sub topic T1 in a different project, and I would like to have a Google Cloud Function triggered by messages published on T1. I read in the official documentation (https://cloud.google.com/functions/docs/calling/pubsub) that it is not possible to trigger cross-project, and when I tried creating the Cloud Function from the GCP console I did not see the topics from other projects, even though I have access to those projects.
I am looking for alternatives. Is it possible to achieve it in a different way?
The trick is to use HTTP communication instead of native event communication. I mean you have to change your background function (the one you plug onto a Pub/Sub topic) into an HTTP function.
To do that, you also have to create a Pub/Sub push subscription on your Pub/Sub topic and add the Cloud Function's HTTP URL as the target.
Don't forget to configure the security part to allow only secure and authenticated communication between Pub/Sub and your function.
Alternatively, you can follow Piotr's advice (in the comments) and use another HTTP-oriented product, like Cloud Run. But the principle stays the same: a Pub/Sub push subscription is the key!
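For illustration, here is a sketch of creating that push subscription with the Python client; the project names, function URL and service account below are placeholders:

    from google.cloud import pubsub_v1

    # Placeholder names: the topic lives in project B, the HTTP function in project A.
    TOPIC_PROJECT = "project-b"
    TOPIC = "t1"
    SUB_PROJECT = "project-a"
    SUBSCRIPTION = "t1-to-function"
    FUNCTION_URL = "https://europe-west1-project-a.cloudfunctions.net/my-http-function"
    PUSH_SA = "pubsub-pusher@project-a.iam.gserviceaccount.com"

    subscriber = pubsub_v1.SubscriberClient()
    topic_path = subscriber.topic_path(TOPIC_PROJECT, TOPIC)
    subscription_path = subscriber.subscription_path(SUB_PROJECT, SUBSCRIPTION)

    # The OIDC token lets the function accept only authenticated push requests.
    push_config = pubsub_v1.types.PushConfig(
        push_endpoint=FUNCTION_URL,
        oidc_token=pubsub_v1.types.PushConfig.OidcToken(service_account_email=PUSH_SA),
    )

    with subscriber:
        subscriber.create_subscription(
            request={"name": subscription_path, "topic": topic_path, "push_config": push_config}
        )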
I'm trying to trigger a Cloud Function when I receive a message via a Pub/Sub subscriber. This subscriber is connected to a topic in another project (I'm able to see the messages in my project). But now I need to trigger a Cloud Function to consume the message.
Do you have any idea how I should proceed?
You can't plug a Cloud Function in project A into a Pub/Sub topic in project B. You need to deploy an HTTP function and create a push subscription that calls the function.
Note: you can create your push subscription in either project A or project B. It all depends on who will be charged for the subscription cost.
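On the receiving side, the HTTP function has to unwrap the Pub/Sub push envelope itself. A minimal sketch (the function name is arbitrary):

    import base64

    def handle_pubsub_push(request):
        """HTTP Cloud Function used as the target of a Pub/Sub push subscription."""
        envelope = request.get_json(silent=True)
        if not envelope or "message" not in envelope:
            return ("Bad Request: no Pub/Sub message received", 400)

        message = envelope["message"]
        payload = base64.b64decode(message.get("data", "")).decode("utf-8")
        print(f"Received message {message.get('messageId')}: {payload}")

        # A 2xx response acknowledges the message; any other status makes Pub/Sub retry.
        return ("", 204)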
Can I publish an event into Cloud Pub/Sub from outside GCP?
Let me clarify my query a bit. In AWS we are able to publish events to SNS topics directly by invoking a REST API via API Gateway from a non-cloud client (https://github.com/cdk-patterns/serverless/blob/master/the-big-fan/README.md). Is there any such method in GCP to publish an event into Pub/Sub?
I can see there is a similar question on SO (Access Google Pub/Sub from outside of GCP), but I believe it does not fully answer my question. Yes, authentication is required and it is a cross-cutting concern, but what is the basic technique to publish an event to Pub/Sub from outside GCP?
Yes, you can publish a message into a topic, and then pull it, or even have PubSub deliver it to you through a Push subscription.
When publishing a message, or pulling it from a subscription, you can access Pub/Sub through the REST or RPC API. In addition, you can use one of the client libraries.
Here you can find examples of how to publish a message using the gcloud CLI tool, the REST API, and the client libraries for Python and Java, among other programming languages.
As mentioned in the question referenced, you will need to authenticate in order to either publish a message, or pull it. You can use the quickstart as a reference on how to do so. Notice that you can follow the quickstart from any computer or VM outside GCP.
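For example, from any machine outside GCP you could publish with a service account key file and the Python client; the key file, project and topic names below are placeholders:

    from google.cloud import pubsub_v1
    from google.oauth2 import service_account

    # Key file of a service account that has roles/pubsub.publisher on the topic.
    credentials = service_account.Credentials.from_service_account_file("sa-key.json")
    publisher = pubsub_v1.PublisherClient(credentials=credentials)

    topic_path = publisher.topic_path("my-project", "my-topic")
    future = publisher.publish(topic_path, b"hello from outside GCP", origin="external-client")
    print(f"Published message {future.result()}")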
Finally, if you're using Push subscriptions to receive your messages, your endpoint will need to be a publicly accessible HTTPS address and have a valid SSL certificate signed by a certificate authority. Again, this endpoint can live outside GCP.
Is Pub/Sub a significantly faster way of communicating between, say, a Kubernetes Engine (GKE) API server and a Cloud Function (GCF)?
Is it possible to use Pub/Sub to have such communication between GKE from one Google Cloud Project and GCF from another Google Cloud Project?
What is the way to communicate with Cloud Functions from another Google Cloud Project with low latency?
I think a general answer will clarify your questions. For this particular case, there are two ways to trigger a Google Cloud Function (GCF): you can make an HTTP request directly, or you can subscribe the GCF to a topic by using Pub/Sub (https://cloud.google.com/functions/docs/calling).
If your requests are occasional, an HTTP request will be faster because you don't need an intermediary. If that's not the case, then the Pub/Sub subscription queues the messages and ensures the delivery by retrying them until it receives confirmation.
To communicate between Google Kubernetes Engine (GKE) in one Google Cloud Project and a Google Cloud Function (GCF) in another Google Cloud Project, you can use either option: trigger the GCF directly with an HTTP request, or do it by publishing a message. When publishing, specify the project you are sending it to and the desired topic in that project.
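For instance, the cross-project part is only the topic path; the project and topic names below are placeholders:

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()  # runs in project A (e.g. inside GKE)
    # Target a topic that lives in the other project (project B).
    topic_path = publisher.topic_path("project-b", "gcf-trigger-topic")
    publisher.publish(topic_path, b"payload from GKE").result()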
Also, you need to give the service account the proper permissions to access one project from the other:
For Pub/Sub: https://cloud.google.com/pubsub/docs/authentication
For HTTP requests: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Google Cloud Functions HTTP triggers documentation: https://cloud.google.com/functions/docs/calling/http
Pub/Sub documentation: https://cloud.google.com/pubsub/docs/reference/libraries (you can reach GitHub through the links in the code samples and see example functions for each language)
I would like to be able to forward a Google Cloud Platform (GCP) Pubsub subscription's messages to another GCP topic, which is possibly in another GCP Project.
Is this possible via a push endpoint URL? (What would the URL of the destination topic be?)
Or via some other easy configuration through the API?
(workaround) Alternatively, how would you provide this functionality?
There is a Dataflow template called Cloud_PubSub_to_Cloud_PubSub that reads messages from a Pub/Sub subscription (inputSubscription) and publishes the messages to a Pub/Sub topic (outputTopic).
Be sure that the service account used to launch the Dataflow job has the right permissions (roles/pubsub.subscriber in the project where inputSubscription is defined or directly on inputSubscription, and roles/pubsub.publisher in the project containing outputTopic or directly on outputTopic).
You can take a look at the code on GitHub if you are familiar with Java.
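As a sketch, you can also launch the template programmatically through the Dataflow API with google-api-python-client; the project, region, subscription and topic names below are placeholders, and the template is assumed to sit in the public gs://dataflow-templates bucket:

    from googleapiclient.discovery import build

    PROJECT = "project-a"   # project that runs the Dataflow job (placeholder)
    REGION = "europe-west1"

    dataflow = build("dataflow", "v1b3")
    request = dataflow.projects().locations().templates().launch(
        projectId=PROJECT,
        location=REGION,
        gcsPath="gs://dataflow-templates/latest/Cloud_PubSub_to_Cloud_PubSub",
        body={
            "jobName": "pubsub-to-pubsub-forwarder",
            "parameters": {
                "inputSubscription": "projects/project-a/subscriptions/source-sub",
                "outputTopic": "projects/project-b/topics/destination-topic",
            },
        },
    )
    response = request.execute()
    print(response["job"]["id"])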
The only way to do this would be to write your own subscriber that republishes the messages to the other topic when it receives them. There is no automated way to forward messages from one topic to another.
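If you go that route, a minimal forwarder with the Python client could look like this (project, subscription and topic names are placeholders):

    from google.cloud import pubsub_v1

    # Placeholder resource names.
    SOURCE_PROJECT = "project-a"
    SOURCE_SUBSCRIPTION = "source-sub"
    DEST_PROJECT = "project-b"
    DEST_TOPIC = "destination-topic"

    publisher = pubsub_v1.PublisherClient()
    subscriber = pubsub_v1.SubscriberClient()

    subscription_path = subscriber.subscription_path(SOURCE_PROJECT, SOURCE_SUBSCRIPTION)
    dest_topic_path = publisher.topic_path(DEST_PROJECT, DEST_TOPIC)

    def callback(message: pubsub_v1.subscriber.message.Message) -> None:
        # Republish the payload and attributes, then ack only after the publish succeeds.
        future = publisher.publish(dest_topic_path, message.data, **dict(message.attributes))
        future.result()
        message.ack()

    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
    with subscriber:
        try:
            streaming_pull_future.result()  # block forever; Ctrl+C to stop
        except KeyboardInterrupt:
            streaming_pull_future.cancel()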