Google Cloud Functions: Pub/Sub vs REST triggering - google-cloud-platform

Is Pub/Sub a significantly faster way of communicating between, say, the Kubernetes Engine (GKE) API server and a Cloud Function (GCF)?
Is it possible to use Pub/Sub to have such communication between GKE from one Google Cloud Project and GCF from another Google Cloud Project?
What is the way to communicate with Cloud Functions from another Google Cloud Project with low latency?

I think a general answer will clarify your questions. For this particular case, there are two ways to trigger a Google Cloud Function (GCF): you can make an HTTP request directly, or you can subscribe the GCF to a topic using Pub/Sub [https://cloud.google.com/functions/docs/calling].
If your requests are occasional, an HTTP request will be faster because you don't need an intermediary. If that's not the case, then the Pub/Sub subscription queues the messages and ensures the delivery by retrying them until it receives confirmation.
To communicate between Google Kubernetes Engine (GKE) in one Google Cloud project and a Google Cloud Function (GCF) in another Google Cloud project, you can use either option. Trigger the GCF with a direct HTTP request, or do it by publishing a message. When publishing, specify the project you are sending to and the desired topic in that project.
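As a rough sketch (assuming the Python client library; the project and topic names below are placeholders), publishing from code running in GKE in project A to a topic owned by project B only requires building the topic path with the other project's ID:

    # Hypothetical example: publish from project A (where GKE runs) to a
    # topic that lives in project B. All names are placeholders.
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    # The topic path carries the *destination* project ID, not the caller's.
    topic_path = publisher.topic_path("project-b-id", "gcf-trigger-topic")

    future = publisher.publish(topic_path, b'{"action": "process"}')
    print("Published message ID:", future.result())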
Also, you need to grant the service account the proper permissions to access one project from the other:
For Pub/Sub: https://cloud.google.com/pubsub/docs/authentication
For HTTP requests: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Google Cloud Functions HTTP trigger documentation is here: https://cloud.google.com/functions/docs/calling/http
Pub/Sub documentation is here: https://cloud.google.com/pubsub/docs/reference/libraries (you can reach GitHub through the links in the code samples and see function examples for each language)
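For the HTTP option, a minimal sketch of calling a non-public function in another project with an ID token, using the google-auth library, could look like this (the function URL is a placeholder, and the calling service account is assumed to have the Cloud Functions Invoker role on the target function):

    # Sketch: invoke an authenticated HTTP Cloud Function in another project.
    # The URL below is a placeholder for the target function's trigger URL.
    import google.auth.transport.requests
    import google.oauth2.id_token
    import requests

    url = "https://us-central1-project-b-id.cloudfunctions.net/my-function"

    # Fetch an OIDC ID token with the function's URL as the audience.
    auth_request = google.auth.transport.requests.Request()
    id_token = google.oauth2.id_token.fetch_id_token(auth_request, url)

    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {id_token}"},
        json={"action": "process"},
    )
    print(response.status_code, response.text)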

Related

Trigger Google Cloud function in one project on publish in Google PubSub from another project

There is an application already pushing to a Google Pub/Sub topic T1 in a different project, and I would like to have a Google Cloud Function triggered by publishing on T1. I read in the official documentation (https://cloud.google.com/functions/docs/calling/pubsub) that it is not possible to trigger cross-project. I also tried creating the Cloud Function from the GCP console, but I do not see the topics from other projects even though I have access to those projects.
I am looking for alternatives. Is it possible to achieve this in a different way?
The trick is to use HTTP communication instead of native event communication. I mean you have to change your background function (the one you plug into a Pub/Sub topic) into an HTTP function.
To do that, you also have to create a Pub/Sub push subscription on your Pub/Sub topic and add the Cloud Function's HTTP URL as the target.
Don't forget to configure the security side so that only secure, authenticated communication is allowed between Pub/Sub and your function.
Alternatively, you can follow Piotr's advice (in the comments) and use another HTTP-oriented product, like Cloud Run. But the principle is still the same: a Pub/Sub push subscription is the key!
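As an illustration (a sketch only; the function name is a placeholder), an HTTP function acting as the push target has to unwrap the push envelope itself, since it no longer receives the background-function event format:

    # Sketch of an HTTP Cloud Function (Python) used as the target of a
    # Pub/Sub push subscription. Names are illustrative.
    import base64

    def pubsub_push_handler(request):
        envelope = request.get_json(silent=True) or {}
        message = envelope.get("message", {})
        # Push deliveries wrap the payload in message.data, base64-encoded.
        data = base64.b64decode(message.get("data", "")).decode("utf-8")
        print(f"Received push message: {data}")
        # Return a 2xx status so Pub/Sub marks the message as acknowledged.
        return ("", 204)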

Can I publish an event into Cloud Pub/Sub from outside GCP

Can I publish an event into Cloud Pub/Sub from outside GCP?
Let me clarify my query a bit. In AWS, we are able to publish events into SNS topics directly by invoking a REST API via API Gateway from a non-cloud client (https://github.com/cdk-patterns/serverless/blob/master/the-big-fan/README.md). Is there any such method in GCP to publish an event into Pub/Sub?
I can see there is a similar question on SO (Access Google Pub/Sub from outside of GCP), but I believe it does not fully answer my question. Yes, authentication is required and it is a cross-cutting concern, but what is the basic technique to publish an event into Pub/Sub from outside GCP?
Yes, you can publish a message into a topic and then pull it, or even have Pub/Sub deliver it to you through a push subscription.
When publishing a message, or pulling it as a subscriber, you can access Pub/Sub through the REST or gRPC API. In addition, you can use one of the client libraries.
Here you can find an example of how to publish a message using the gcloud CLI tool, an example with the REST API, and examples in Python and Java among other programming languages.
As mentioned in the question referenced, you will need to authenticate in order to either publish a message, or pull it. You can use the quickstart as a reference on how to do so. Notice that you can follow the quickstart from any computer or VM outside GCP.
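As a sketch of the basic technique (assuming the Python client library and a downloaded service account key; the file path, project ID and topic name are placeholders), publishing from a machine outside GCP looks the same as publishing from inside, only the credentials come from the key file:

    # Sketch: publish to Pub/Sub from outside GCP with a service account key.
    from google.cloud import pubsub_v1
    from google.oauth2 import service_account

    credentials = service_account.Credentials.from_service_account_file(
        "service-account-key.json"
    )
    publisher = pubsub_v1.PublisherClient(credentials=credentials)
    topic_path = publisher.topic_path("my-project-id", "my-topic")

    future = publisher.publish(topic_path, b"Hello from outside GCP")
    print("Published message ID:", future.result())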
Finally, if you're using Push subscriptions to receive your messages, your endpoint will need to be a publicly accessible HTTPS address and have a valid SSL certificate signed by a certificate authority. Again, this endpoint can live outside GCP.

GCP Cloud Scheduler Permission Errors with Service Account

I have created a set of Cloud Functions that work to ingest data into Google Cloud Storage. The functions are triggered by HTTP GET requests and have been set to only accept internal traffic.
However, when I use Cloud Scheduler to invoke the functions, I continually get permission errors, even after specifying a service account with the proper permissions for each of the functions. I have set each of the functions to be in the us-central1 region and have researched the docs and Stack Overflow with no success so far. Can I receive some assistance with this?
Cloud Scheduler is a serverless product. This means it doesn't belong to your project and doesn't send the request to your Cloud Function through the VPC. In addition, Cloud Scheduler isn't yet supported in VPC Service Controls (VPC SC).
Thus, you can't do it that way. The workaround is to allow all ingress traffic on the Cloud Function and to uncheck allow-unauthenticated access. Your function is then callable from elsewhere (from the internet), BUT you need valid authentication to invoke it.
Use your service account and add it to Cloud Scheduler for invoking your function. Grant it a sufficient role for this (such as Cloud Functions Invoker).
Alternative
However, if you would prefer not to deploy your function publicly accessible on the internet (keeping the allow-internal-traffic-only ingress mode), there is an alternative.
Change your Cloud Scheduler job to publish a Pub/Sub message instead of calling your function directly. Then deploy your function linked to the Pub/Sub topic instead of in HTTP target mode.
You might have some updates to perform in your code, especially if you have parameters to handle (initially in the query string or the body, now everything is in the Pub/Sub message published by Cloud Scheduler). But your function is only callable through your Pub/Sub topic and in no other way.
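For example (a sketch with placeholder names), if Cloud Scheduler is configured to publish a JSON body, the Pub/Sub-triggered function would read its former query/body parameters out of the message data:

    # Sketch of a Pub/Sub-triggered (background) Cloud Function in Python.
    # Assumes Cloud Scheduler publishes a JSON payload such as
    # {"bucket": "my-bucket"}; the field names are placeholders.
    import base64
    import json

    def ingest(event, context):
        payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
        bucket = payload.get("bucket")
        print(f"Ingesting data into bucket {bucket}")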
According to the documentation, in order to trigger a Cloud Function from Cloud Scheduler you have to use Pub/Sub. These are the steps:
Create the Cloud Function and have it triggered by a Pub/Sub topic.
Create the Pub/Sub topic.
Create the Cloud Scheduler job that will invoke the Pub/Sub trigger.
Once you do that, you will be able to test-run the Cloud Scheduler job and verify whether it's working. The final schema is something like this:
Cloud Scheduler job => Pub/Sub topic => Cloud Function
Once it's working, remember to revert the roles granted to the Cloud Scheduler service account, as this method doesn't require them.
Here I found a blog post that does the same but with a more practical approach that you can follow from a CLI.
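If you prefer setting this up programmatically rather than from the console, a sketch with the Cloud Scheduler Python client could look like the following (project, location, topic, job name, schedule and payload are all placeholders):

    # Sketch: create a Cloud Scheduler job that publishes to a Pub/Sub topic.
    from google.cloud import scheduler_v1

    client = scheduler_v1.CloudSchedulerClient()
    parent = "projects/my-project-id/locations/us-central1"

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/trigger-my-function",
        schedule="*/15 * * * *",  # every 15 minutes
        time_zone="Etc/UTC",
        pubsub_target=scheduler_v1.PubsubTarget(
            topic_name="projects/my-project-id/topics/my-topic",
            data=b'{"bucket": "my-bucket"}',
        ),
    )

    created = client.create_job(parent=parent, job=job)
    print("Created job:", created.name)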

Does Google Cloud have something like "Lambda@Edge origin response"

AWS lets you set up a Lambda@Edge function that is executed after a response is received from the origin (see picture below). Can I do this with GCP?
UPDATE
This can now be done with the Cloud Functions service in Google Cloud Platform, as stated in the official documentation.
"Cloud Functions allows you to trigger your code from Google Cloud, Firebase, and Google Assistant, or call it directly from any web, mobile, or backend application via HTTP."
In this document you can find a comparison of AWS and Azure features with their matching features in GCP, including the one just mentioned: Cloud Functions as GCP's counterpart to AWS Lambda.
Finally, you can find documentation here on how to write Cloud Functions, the programming languages supported, and examples for the different usages.
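To give an idea of the shape of such a function (a minimal sketch only; it does not reproduce Lambda@Edge's origin-response hook, and the function name is a placeholder), an HTTP-callable Cloud Function in Python could be:

    # Minimal sketch of an HTTP-triggered Cloud Function (Python runtime).
    # It simply inspects the incoming JSON body and returns a JSON response.
    def handle_response(request):
        body = request.get_json(silent=True) or {}
        # ... inspect or post-process the payload here ...
        return {"processed": True, "received_keys": list(body.keys())}, 200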

google cloud analogy to AWS Lambda

Does Google Cloud have functionality analogous to AWS Lambda?
In particular, I would like compute resources to be spun up and jobs scheduled via HTTPS events.
I'm also interested in any other cloud hosting providers which have similar functionality.
I just found out that there is something that looks interesting in the latest documentation of the SDK's command line tool gcloud.
https://cloud.google.com/sdk/gcloud/reference/alpha/functions/
This sounds exciting.
UPDATE: Google just released some official documentation of an alpha version of Cloud Functions. For now, functions can be written in JavaScript using Node, and triggered by Pub/Sub, Cloud Storage, direct HTTP requests, or manually for debugging purposes.
Google Cloud Storage has Object Change Notification. Only webhooks are currently supported.
A client application can send a request to watch for a bucket's change notification events in order to be notified about changes to a bucket's objects. After a notification channel is initiated, Google Cloud Storage notifies the application any time an object is added, updated, or removed from the bucket.
For example, when you add a new picture to a bucket, an application could be notified to create a thumbnail.
More info can be found at: https://cloud.google.com/storage/docs/object-change-notification
Regarding other providers that have similar functionality, check out IronWorker. You can kick off IronWorker tasks via https endpoints using the webhook endpoint and you can run jobs on multiple clouds. Here's a comparison of Lambda vs IronWorker.
And yes, I work for Iron.io.
Google recently announced an alpha release of Google Cloud Functions, which supports an HTTP interface.
There are Google Cloud Functions and Microsoft Azure Functions; they are both fairly new (Microsoft announced Azure Functions on March 31, 2016).
If you need Lambda with an HTTP interface, then look at Nano Lambda.
They can deploy to any cloud and on premises.