I have created a set of Cloud Functions that ingest data into Google Cloud Storage. The functions are invoked by HTTP GET requests and are set to accept internal traffic only.
However, when I use Cloud Scheduler to invoke the functions I continually get permission errors, even after specifying a service account with the proper permissions for each function. I have set each function to the us-central1 region and have searched the docs and Stack Overflow with no success so far. Can I get some assistance with this?
Cloud Scheduler is a serverless product. This means it doesn't belong to your project and doesn't send requests to your Cloud Function through your VPC. In addition, Cloud Scheduler isn't yet supported in VPC Service Controls.
Thus, you can't do it this way. The workaround is to allow all ingress traffic on the Cloud Function and to uncheck "Allow unauthenticated invocations". Your function is then reachable from the internet, BUT a valid authentication token is required to invoke it.
Use your service account and add it to the Cloud Scheduler job to invoke your function. Grant it the Cloud Functions Invoker role (roles/cloudfunctions.invoker).
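As a minimal sketch (the function, project, job, and service account names are placeholders), the two pieces could look like this:
# Let the service account invoke the function
gcloud functions add-iam-policy-binding my-function \
--region=us-central1 \
--member="serviceAccount:scheduler-sa@my-project.iam.gserviceaccount.com" \
--role="roles/cloudfunctions.invoker"
# The OIDC flag makes Cloud Scheduler attach an identity token
# for this service account to every request
gcloud scheduler jobs create http my-job \
--schedule="*/10 * * * *" \
--http-method=GET \
--uri="https://us-central1-my-project.cloudfunctions.net/my-function" \
--oidc-service-account-email="scheduler-sa@my-project.iam.gserviceaccount.com"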
Alternative
However, if you would prefer not to deploy your function publicly accessible on the internet at all (i.e. keep the "allow internal traffic only" ingress mode), there is an alternative.
Change your Cloud Scheduler job to publish a Pub/Sub message instead of calling your function directly. Then deploy your function linked to the Pub/Sub topic instead of in HTTP target mode.
You might have some updates to make in your code, especially if you have parameters to handle (initially in the query string or the body, now everything is in the Pub/Sub message published by Cloud Scheduler; see the sketch below). But your function is then only callable through your Pub/Sub topic and in no other way.
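For example (the job and topic names and the JSON payload are illustrative), parameters that used to travel in the query string or body can be carried in the message body:
# The function now reads these values from the Pub/Sub message data
# instead of from an HTTP request
gcloud scheduler jobs create pubsub my-job \
--schedule="*/10 * * * *" \
--topic=ingest-topic \
--message-body='{"bucket": "my-bucket", "source": "daily"}'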
According to the documentation, in order to trigger a Cloud Function from Cloud Scheduler you have to use Pub/Sub. These are the steps:
Create the Cloud Function and have it triggered by a Pub/Sub topic.
Create the Pub/Sub topic.
Create the Cloud Scheduler job that will publish to the Pub/Sub topic.
Once you do that you will be able to test-run the Cloud Scheduler job and verify that it's working. The final schema is something like this:
Cloud Scheduler job => Pub/Sub topic => Cloud Function
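As a minimal sketch of those three steps (topic, function, and job names are placeholders):
gcloud pubsub topics create scheduler-topic
# Pub/Sub trigger instead of an HTTP trigger
gcloud functions deploy my-function \
--runtime python37 \
--trigger-topic scheduler-topic
# The job publishes to the topic on the given cron schedule
gcloud scheduler jobs create pubsub my-job \
--schedule="0 * * * *" \
--topic=scheduler-topic \
--message-body='{}'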
Once it's working, remember to revert the roles granted to the Cloud Scheduler service account, as this method doesn't require them.
I also found a blog post that does the same but with a more practical approach that you can follow from the CLI.
Related
There is an application in a different project already pushing to the Google Pub/Sub topic T1, and I would like to have a Google Cloud Function triggered by publishing on T1. I read in the official documentation (https://cloud.google.com/functions/docs/calling/pubsub) that it is not possible to trigger a function cross-project. I also tried creating the Cloud Function from the GCP console, but I do not see topics from other projects even though I have access to those projects.
I am looking for alternatives. Is it possible to achieve this in a different way?
The trick is to use HTTP communication instead of native event communication. That is, you have to change your background function (the kind you plug into a Pub/Sub topic) into an HTTP function.
Having done that, you also have to create a Pub/Sub push subscription on your Pub/Sub topic and add the Cloud Function's HTTP URL as the target.
Don't forget to configure the security part to allow only secure and authenticated communication between Pub/Sub and your function.
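A sketch of that push subscription, assuming the topic T1 lives in project-a and the HTTP function in project-b (the other names are placeholders; the service account must also hold roles/cloudfunctions.invoker on the function):
# --push-auth-service-account makes Pub/Sub attach an OIDC identity
# token to each push, covering the authentication requirement
gcloud pubsub subscriptions create t1-push \
--project=project-a \
--topic=T1 \
--push-endpoint="https://us-central1-project-b.cloudfunctions.net/my-function" \
--push-auth-service-account="push-sa@project-b.iam.gserviceaccount.com"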
Alternatively, you can follow Piotr's advice (in the comments) and use another HTTP-oriented product, like Cloud Run. But the principle stays the same: a Pub/Sub push subscription is the key!
I have a template that creates a few resources in GCP, and I want it to either call an HTTP endpoint or publish a message to a topic whenever the deployment completes. I've been checking different services all day, and couldn't find anything about it.
In AWS, it is quite easy: an SNS message published to an SNS topic, which an SQS queue is subscribed to, can trigger a Lambda function. In Azure ARM templates, we can use the az CLI to invoke a web request and call an endpoint directly.
I couldn't find anything similar in GCP. Is there any way of calling an HTTP endpoint or a Cloud Function, or perhaps publishing a message to a topic, whenever a deployment finishes?
I really appreciate any help.
The best approach in GCP is to create a logging sink, using a filter written in the Logging query language, so that only the Deployment Manager logs are sent to a Pub/Sub topic.
Here is an example of a sink sending Deployment Manager logs to a previously created Pub/Sub topic:
gcloud logging sinks create $SINK_NAME pubsub.googleapis.com/projects/$YOUR_PROJECT/topics/$TOPIC \
--log-filter='resource.type="deployment"' --description="my sink"
Be careful to set the destination permissions, or you will not see the logs in the Pub/Sub topic.
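The sink's writer identity (shown by gcloud logging sinks describe $SINK_NAME, and already prefixed with serviceAccount:) needs publish rights on the topic; a sketch:
# $WRITER_IDENTITY is the writerIdentity of the sink created above
gcloud pubsub topics add-iam-policy-binding $TOPIC \
--member=$WRITER_IDENTITY \
--role="roles/pubsub.publisher"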
Once the logs are arriving in the Pub/Sub topic, you can configure Cloud Pub/Sub triggers to fire an HTTP call based on the content of the log.
According to the "Authenticating service-to-service" documentation for Cloud Run, to use Pub/Sub or Cloud Scheduler with a service, unauthenticated access must be disabled; both rely on authenticated HTTP calls because Cloud Run services scale to zero.
My services allow internal and load-balancer traffic and must be publicly available for frontend clients, but they must also be able to communicate with each other privately via Pub/Sub.
Is there a way to achieve this? It feels unnatural to create a separate private service just for using Pub/Sub.
It's a missing piece. You can't plug a Pub/Sub push subscription or Cloud Scheduler (nor Cloud Tasks, Cloud Build, Workflows, ...) into your VPC. I asked Google Cloud a few months ago, and it should be fixed by a new networking feature soon, at least in 2021!
So, in your case, if your Cloud Run service is accessible from the public internet through a load balancer, you can use this public endpoint to call the path that you want on your service and thus perform the process.
If your Cloud Run service is only accessible with ingress=internal, you can't for now.
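As an illustration (the domain and path are placeholders), you can quickly test such an authenticated call to the public endpoint with your own credentials:
# The service keeps rejecting unauthenticated requests;
# the identity token satisfies the IAM check
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
"https://api.example.com/internal/handler"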
When deploying a background Cloud Function for Cloud Pub/Sub via:
gcloud functions deploy function_name --runtime python37 --trigger-topic some_topic
A subscription gets automatically created with a push endpoint (likely an App Engine standard endpoint, which is claimed not to need domain verification: https://cloud.google.com/pubsub/docs/push#other-endpoints).
For the generated subscription/endpoint there doesn't seem to be a way to register/verify the domain (https://www.google.com/webmasters/verification, https://console.cloud.google.com/apis/credentials/domainverification).
As a result the user cannot alter the subscription, for instance:
gcloud alpha pubsub subscriptions update some_subscription --ack-deadline=10
would yield something like: "INVALID_ARGUMENT: The supplied HTTP URL is not registered in the subscription's parent project".
Doing it via the web interface on GCP also yields an error: "pubsub error INVALID_ARGUMENT".
When you connect a Cloud Function to a Cloud Pub/Sub topic, a subscription is created on your behalf. This subscription is managed by Cloud Functions, and therefore it is not possible to update its acknowledgement deadline.
Is Pub/Sub a significantly faster way of communicating between, say, a Kubernetes Engine (GKE) API server and a Cloud Function (GCF)?
Is it possible to use Pub/Sub for such communication between GKE in one Google Cloud project and GCF in another Google Cloud project?
What is the way to communicate with Cloud Functions in another Google Cloud project with low latency?
I think a global answer will clarify your questions. For this particular case, there are two ways to trigger a Google Cloud Function (GCF): you can make an HTTP request directly, or you can subscribe the GCF to a topic using Pub/Sub (https://cloud.google.com/functions/docs/calling).
If your requests are occasional, an HTTP request will be faster because you don't need an intermediary. If that's not the case, a Pub/Sub subscription queues the messages and ensures delivery by retrying them until it receives confirmation.
To communicate between Google Kubernetes Engine (GKE) in one Google Cloud project and a Google Cloud Function (GCF) in another project, you can use either option: trigger the GCF by an HTTP request directly, or do it by publishing a message. When publishing, specify the project you are sending it to and the desired topic in that project, as sketched below.
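As a sketch (the project and topic names are placeholders), publishing to a topic in another project only requires its full resource name:
gcloud pubsub topics publish projects/other-project/topics/gcf-topic \
--message='{"source": "gke"}'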
You also need to give the service account the proper permissions to access one project from the other:
For Pub/Sub: https://cloud.google.com/pubsub/docs/authentication
For HTTP requests: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Google Cloud Functions HTTP triggers documentation: https://cloud.google.com/functions/docs/calling/http
Pub/Sub documentation: https://cloud.google.com/pubsub/docs/reference/libraries (you can get to GitHub via the links in the code and see function examples for each language)