I'm working on a Cloud Run docker application that handles a few long-running data integration processes.
I'm struggling to come up with a way to locally run/test my submissions to Cloud Tasks before actually deploying the container to Cloud Run.
Is there any way to do this?
A local emulator for Cloud Tasks is not available yet; in some cases you can substitute Cloud Tasks with Pub/Sub.
Also, consider using non-Google solutions such as Cloud-Tasks-In-Process-Emulator, gcloud-tasks-emulator 0.5.1, or Cloud tasks emulator.
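For illustration, here is a minimal sketch (not from the original answer) of pointing the Python Cloud Tasks client at a local emulator over an insecure gRPC channel; the emulator port, project, queue, and handler URL are assumptions:

```python
# Assumes a Cloud Tasks emulator (e.g. gcloud-tasks-emulator) is already
# listening on localhost:9090 and the google-cloud-tasks 2.x library is installed.
import grpc
from google.cloud import tasks_v2
from google.cloud.tasks_v2.services.cloud_tasks.transports import CloudTasksGrpcTransport

client = tasks_v2.CloudTasksClient(
    transport=CloudTasksGrpcTransport(channel=grpc.insecure_channel("localhost:9090"))
)

parent = client.queue_path("my-project", "us-central1", "my-queue")
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        # Point the task at the container running locally.
        "url": "http://localhost:8080/handle-task",
        "body": b'{"job": "integration-1"}',
    }
}
response = client.create_task(request={"parent": parent, "task": task})
print("Created task:", response.name)
```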
As I understand it, you want to test Cloud Tasks locally. Yes, it is possible using ngrok: ngrok exposes your local application at a public URL, and Cloud Tasks needs a public URL for the task handler.
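As a hedged sketch of that approach (the project, queue, handler path, and ngrok hostname are placeholders), you submit a task to a real queue but make the handler URL the public ngrok tunnel that forwards to the container running locally:

```python
# Assumes `ngrok http 8080` is running and forwarding to the local container,
# and that the queue "my-queue" already exists in project "my-project".
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        # Public ngrok URL that tunnels to http://localhost:8080/handle-task
        "url": "https://abc123.ngrok.io/handle-task",
        "body": b'{"job": "integration-1"}',
    }
}
client.create_task(request={"parent": parent, "task": task})
```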
I'd like to call a Cloud Run app from inside a Cloud Function multiple times, given some logic. I've googled this quite a lot and haven't found good solutions. Is this supported?
I've seen the Workflows tutorials, but AFAIK they are meant to pass messages in series between different GCP services. My Cloud Function runs on a schedule every minute, and it would only need to call the Cloud Run app a few times per day, given some event. I've thought about running the entire app in Cloud Run instead of the Cloud Function. However, I think having it all in Cloud Run would be more expensive than running the Cloud Function.
I went through your question and have an alternative in mind, if you agree to the solution: you can use Cloud Scheduler to securely trigger a Cloud Run service asynchronously on a schedule.
You need to create a service account to associate with Cloud Scheduler and give that service account permission to invoke your Cloud Run service, i.e. the Cloud Run Invoker role. (You can use an existing service account to represent Cloud Scheduler, or you can create a new one.)
Next, create a Cloud Scheduler job that invokes your service at specified times. Specify the frequency, or job interval, at which the job is to run, using a cron-style configuration string, and specify the fully qualified URL of your Cloud Run service, for example https://myservice-abcdef-uc.a.run.app. The job will send requests to this URL.
Next, specify the HTTP method; it must match what your previously deployed Cloud Run service is expecting. When you deploy the service that Cloud Scheduler will invoke, make sure you do not allow unauthenticated invocations. Please go through this documentation for details and implement the steps.
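If it helps, here is a minimal sketch of creating such a job with the Python client library (google-cloud-scheduler); the project, schedule, service account, and service URL are illustrative values, not from the question:

```python
from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path("my-project", "us-central1")

job = {
    "name": f"{parent}/jobs/trigger-my-service",
    "schedule": "0 */6 * * *",  # cron string: every six hours, adjust as needed
    "http_target": {
        "http_method": scheduler_v1.HttpMethod.POST,
        "uri": "https://myservice-abcdef-uc.a.run.app/",
        # OIDC token minted for the service account that holds the Cloud Run Invoker role.
        "oidc_token": {
            "service_account_email": "scheduler-invoker@my-project.iam.gserviceaccount.com"
        },
    },
}
client.create_job(request={"parent": parent, "job": job})
```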
Back to your question: yes, it's possible to call your Cloud Run service from inside a Cloud Function. Here, your Cloud Run service is called from another backend service, i.e. the Cloud Function, directly (synchronously) over HTTP, using its endpoint URL. For this use case, you should make sure that each service is only able to make requests to specific other services.
Go through this documentation suggested by @John Hanley, as it provides the steps you need to follow.
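For the synchronous case, the Cloud Function can mint an ID token for the Cloud Run service's URL and send it as a Bearer token. A minimal sketch, assuming the google-auth and requests libraries and a placeholder service URL and endpoint path:

```python
import google.auth.transport.requests
import google.oauth2.id_token
import requests

def call_cloud_run(request):
    """HTTP-triggered Cloud Function that invokes a private Cloud Run service."""
    audience = "https://myservice-abcdef-uc.a.run.app"
    # Fetch an OIDC ID token for the Cloud Run URL from the metadata server.
    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, audience)

    resp = requests.post(
        f"{audience}/start-job",                      # hypothetical endpoint
        headers={"Authorization": f"Bearer {token}"},
        json={"triggered_by": "cloud-function"},
    )
    return f"Cloud Run responded with {resp.status_code}"
```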
For instance, I want to run the following gcloud CLI command,
gcloud run services delete [SERVICE]
But from a triggered Google Cloud Function.
I've looked in a few places and have found a few similar things,
https://www.googlecloudcommunity.com/gc/Infrastructure-Compute-Storage/Automatic-Resource-Deletion/m-p/172865
https://github.com/PolideaInternal/cats-love-money
Create a Google function from a Google cloud function
But I find them a bit tricky to follow and/or replicate.
The Google Cloud CLI is a Python program. That means a lot of dependencies and a requirement for a shell and OS environment. Cloud Functions does not provide either.
A better option for running the CLI is Cloud Run. This provides the additional benefit of being able to test the Cloud Run container locally. You will need to wrap the CLI with an HTTP server that responds to HTTP requests and then executes the CLI.
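As a rough sketch of that wrapping (not a definitive implementation), a small Flask app in the container could shell out to gcloud; the route, region, and the assumption that gcloud is installed in the image (e.g. a google/cloud-sdk base image) are mine:

```python
import os
import subprocess

from flask import Flask, request

app = Flask(__name__)

@app.route("/delete", methods=["POST"])
def delete_service():
    # Expect a JSON body like {"service": "my-service"}.
    service = request.get_json(force=True).get("service", "")
    result = subprocess.run(
        ["gcloud", "run", "services", "delete", service,
         "--region", "us-central1", "--quiet"],
        capture_output=True,
        text=True,
    )
    return {"returncode": result.returncode, "stderr": result.stderr}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```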
However, most CLI commands can be easily duplicated with the Python SDKs and/or direct REST API calls which are supported by Cloud Functions. This will require a solid understanding of the services, APIs, and language.
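For example, deleting a Cloud Run service can be done from a Cloud Function with the google-cloud-run client library instead of the CLI. This is a hedged sketch with placeholder project, region, and service names, and it assumes the function's service account has permission to delete services:

```python
from google.cloud import run_v2

def delete_service(request):
    client = run_v2.ServicesClient()
    name = client.service_path("my-project", "us-central1", "my-service")
    operation = client.delete_service(name=name)
    operation.result()  # block until the long-running delete finishes
    return f"Deleted {name}"
```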
Recently I started working with GCP Workflows and Cloud Functions. We are using the Serverless Framework for the functions, and we can run them on our own computers with the command serverless invoke local --function <function_name>, so we don't use up cloud executions.
What I'm looking for now is whether there is a way to do the same thing with GCP Workflows: run them on our own computers instead of invoking them in the cloud.
I've already read Google's resources and many different articles, but I still can't find the trick (if it actually exists).
Today, there is no emulator for Cloud Workflows. But if you can afford to deploy your Cloud Functions on GCP, Cloud Workflows has a generous free tier: 5,000 steps for free.
I am building a Python app in Google Cloud. This involves delayed execution of tasks.
It seems Cloud Tasks is limited to App Engine.
Can we use Cloud Tasks from GCE VMs, or from containers running on GCP or other clouds' VMs?
Even the Google docs only cover push queues with App Engine.
Does Cloud Tasks support pull queues?
[EDIT]
I tried looking at the Cloud Tasks discovery files. v2beta1 has pull references but v2 does not. I believe GCP doesn't want to support this in the future :-(.
Cloud Tasks does not support pull queues, but it just launched a beta feature for HTTP targets, which allows Cloud Tasks to push tasks to any HTTP endpoint. There's even functionality for Cloud Tasks to include an authentication token based on an associated service account: https://cloud.google.com/tasks/docs/creating-http-target-tasks
This would allow you to push to GCE, or really any service that can operate as a webhook. If you were to use the new Cloud Run Beta product, verifying these tokens is handled for you.
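For illustration, here is a hedged sketch of creating such an HTTP target task with an OIDC token and a delayed schedule time; the endpoint URL, service account, queue, and delay are placeholders:

```python
import datetime

from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")

# Delayed execution: run the task roughly ten minutes from now.
schedule_time = timestamp_pb2.Timestamp()
schedule_time.FromDatetime(
    datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(minutes=10)
)

task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://my-vm.example.com/tasks/handle",  # any reachable HTTP endpoint
        "oidc_token": {
            "service_account_email": "tasks-invoker@my-project.iam.gserviceaccount.com"
        },
        "body": b'{"payload": "delayed work"}',
    },
    "schedule_time": schedule_time,
}
client.create_task(request={"parent": parent, "task": task})
```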
Cloud Pub/Sub provides support for pull-based processing.
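If you go that route, here is a minimal synchronous-pull sketch with the google-cloud-pubsub library (the project and subscription names are placeholders):

```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "my-subscription")

# Pull up to 10 messages, process them, then acknowledge.
response = subscriber.pull(request={"subscription": subscription, "max_messages": 10})
for received in response.received_messages:
    print("Got message:", received.message.data)
    subscriber.acknowledge(
        request={"subscription": subscription, "ack_ids": [received.ack_id]}
    )
```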
I want to schedule a restart for an app. Is there any way for applications to be restarted automatically after a specific time limit in PCF?
I am not sure if there is anything within PCF that can execute cf commands. My suggestion is to have a CI/CD job configured (a Jenkins job, for example) that executes cf restart <app_name> at scheduled intervals.
I've been working on a scheduler service that you can register in Cloud Foundry; it contains service plans for restarting apps, among others. There are also other service plans, such as simply triggering an arbitrary HTTP endpoint. I'd be happy if you tried it out and gave me feedback. Just check it out on Git: https://github.com/grimmpp/cloud-foundry-resource-scheduler
I've also started to describe what it provides and how it can be installed and configured. To use it, you just need to create service instances in the Cloud Foundry marketplace and specify some parameters, e.g. how often or when it should be called. ...