I'm trying to figure out if it's feasible to create a serverless web app in which an API function creates a job that is scheduled to run once at a specific time and date.
I've looked at the three main providers, AWS, Google Cloud and Microsoft Azure. All three provide everything needed for a serverless web app in general, but I'm not sure I understand if any of them support what I described above.
AWS has CloudWatch, which has an API. However, there is nothing about Events in the API docs; it looks like Events can only be created by hand in the console or via Terraform.
Google Cloud has Cloud Scheduler. However, there is no mention of an API in the docs, though it does support Terraform too.
Microsoft has the Azure Scheduler, and that one seems to support creating jobs via an API.
Doesn't Terraform itself require an API? Am I missing something?
I'm completely new to serverless web apps. Is this even the correct approach to do this?
Edit:
I just realized that it's possible to create Amazon CloudWatch events via an API; however, the API is called EventBridge... That makes me think I might have missed something in Google Cloud as well. However, I'm still wondering if this is the right approach?
To provide a little more detail on what I want to do:
A user creates an event in the web frontend.
My API function that the frontend calls creates some cloud version of a cron job that is to be run once at a specific time and date.
The job triggers another function that does something with a third-party API at the time specified by the user.
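For example, the kind of thing I imagine my API function doing on AWS would look roughly like this, using boto3's EventBridge (CloudWatch Events) API. This is just a sketch: the rule name, Lambda ARN and payload are made up, and the target Lambda would also need a resource-based permission allowing events.amazonaws.com to invoke it.

```python
import boto3

# Sketch only (hypothetical names/ARNs): create a rule that fires once by pinning
# the cron expression to a single date, then point it at the Lambda to run.
events = boto3.client("events")

# cron(minutes hours day-of-month month day-of-week year); the year makes it one-shot
events.put_rule(
    Name="run-user-job-42",
    ScheduleExpression="cron(30 14 15 6 ? 2025)",  # 14:30 UTC on 2025-06-15
    State="ENABLED",
)

events.put_targets(
    Rule="run-user-job-42",
    Targets=[{
        "Id": "user-job-42",
        "Arn": "arn:aws:lambda:eu-west-1:123456789012:function:run-user-job",
        "Input": '{"userId": 42, "thirdPartyAction": "notify"}',
    }],
)

# The rule is not removed automatically after it fires; the target function (or a
# cleanup job) would have to call remove_targets/delete_rule afterwards.
```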
On Google Cloud, you can deploy your app on serverless services (Cloud Run, Cloud Functions or App Engine). Then, you can set up Cloud Scheduler. Cloud Scheduler can call an HTTP URL to trigger your serverless service.
Regarding the API accessibility of Google Cloud services: "everything is an API". Anything you can do in the console or with the gcloud CLI, you can also do with API calls.
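For example, a minimal sketch of creating a Scheduler job through the API with the Python client library; the project, region, job name and target URL below are placeholders, and note that Cloud Scheduler jobs use the 5-field unix-cron format (no year field), so a "one-shot" job would need to be deleted after it runs.

```python
from google.cloud import scheduler_v1

# Sketch only (hypothetical project/region/URL): create a Cloud Scheduler job via
# the API that POSTs to an HTTP endpoint (e.g. a Cloud Run or Cloud Functions URL).
client = scheduler_v1.CloudSchedulerClient()
parent = "projects/my-project/locations/europe-west1"

job = scheduler_v1.Job(
    name=f"{parent}/jobs/user-job-42",
    schedule="30 14 15 6 *",  # 14:30 on June 15th (unix-cron, so it recurs yearly)
    time_zone="UTC",
    http_target=scheduler_v1.HttpTarget(
        uri="https://my-service-abc123-ew.a.run.app/run-job",
        http_method=scheduler_v1.HttpMethod.POST,
        body=b'{"userId": 42}',
    ),
)

created = client.create_job(parent=parent, job=job)
print(created.name)
```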
Related
AWS lets you set up a Lambda@Edge function that is executed after a response is received from the origin. Can I do the same with GCP?
UPDATE
This can be done now with the Cloud Functions service in Google Cloud Platform, as stated in the official documentation [1].
"Cloud Functions allows you to trigger your code from Google Cloud, Firebase, and Google Assistant, or call it directly from any web, mobile, or backend application via HTTP."
In this document [2] you can find a comparison of AWS and Azure features with their matching features in GCP, including the one just mentioned (AWS Lambda matching GCP Cloud Functions / Cloud Run).
Finally, you can find documentation on how to write Cloud Functions, the programming languages supported, and examples for the different usages here [3].
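As a rough illustration of the HTTP trigger mentioned in that quote, a minimal HTTP Cloud Function in the Python runtime could look like the sketch below; the entry-point name and payload fields are made up.

```python
import json

# Sketch of an HTTP-triggered Cloud Function (Python runtime).
# `request` is a Flask Request object; deploy with `schedule_job` as the entry point.
def schedule_job(request):
    payload = request.get_json(silent=True) or {}
    run_at = payload.get("run_at")  # hypothetical field sent by the caller
    if not run_at:
        return (json.dumps({"error": "run_at is required"}), 400,
                {"Content-Type": "application/json"})
    # ... create the scheduled job here ...
    return (json.dumps({"status": "accepted", "run_at": run_at}), 200,
            {"Content-Type": "application/json"})
```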
I'm creating a SaaS product which allows users to create their own apps. For each application, I need to deploy a serverless app, or just a Lambda function with CloudWatch Events and API Gateway configuration, through a CloudFormation stack when a particular request hits my backend API (which is also a serverless application). I need to know the best practice or the recommended way of achieving that task.
There are two approaches I found by doing a bit of research.
1. Using the AWS SDK to create a CloudFormation stack (without using the Serverless Framework); see the sketch after this list.
2. Using serverless-platform-sdk, which deploys an app on the Serverless Platform (beta), which I'm not completely sure about.
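For the first approach, a minimal sketch with boto3 could look like this; the template file, stack name and parameters are hypothetical, and the template itself would define the Lambda, CloudWatch Events rule and API Gateway resources.

```python
import boto3

# Sketch only: create a per-tenant CloudFormation stack from the backend API.
cloudformation = boto3.client("cloudformation")

with open("tenant-app-template.yaml") as f:  # hypothetical template file
    template_body = f.read()

cloudformation.create_stack(
    StackName="tenant-app-42",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # needed if the template creates IAM roles
    Parameters=[{"ParameterKey": "TenantId", "ParameterValue": "42"}],
)

# Optionally wait for completion (in a real API handler you would probably return
# immediately and track the stack status asynchronously instead of blocking).
waiter = cloudformation.get_waiter("stack_create_complete")
waiter.wait(StackName="tenant-app-42")
```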
Is Pub/Sub a significantly faster way of communicating between, say, a Kubernetes Engine (GKE) API server and a Cloud Function (GCF)?
Is it possible to use Pub/Sub to have such communication between GKE from one Google Cloud Project and GCF from another Google Cloud Project?
What is the way to communicate with Cloud Functions from another Google Cloud Project with low latency?
I think a global answer will clarify your questions. For this particular case, there are two ways to trigger a Google Cloud Function (GCF). You can directly make an HTTP request, or you can subscribe the GCF to a topic by using Pub/Sub [https://cloud.google.com/functions/docs/calling].
If your requests are occasional, an HTTP request will be faster because you don't need an intermediary. If that's not the case, then the Pub/Sub subscription queues the messages and ensures the delivery by retrying them until it receives confirmation.
To communicate between Google Kubernetes Engine (GKE) in one Google Cloud Project and a Google Cloud Function (GCF) in another Google Cloud Project, you can use either option: trigger the GCF by an HTTP request directly, or do it by publishing a message. When publishing, specify the project you are sending to and the desired topic in that project.
You also need to give the proper permissions to the service account so it can access one project from the other:
For Pub/Sub: https://cloud.google.com/pubsub/docs/authentication
For HTTP requests: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Google Cloud Functions HTTP triggers documentation: https://cloud.google.com/functions/docs/calling/http
Pub/Sub documentation: https://cloud.google.com/pubsub/docs/reference/libraries (you can reach GitHub via the links in the code samples and see function examples for each language)
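As a rough sketch of the cross-project Pub/Sub option with the Python client library (project and topic names are placeholders), publishing from the GKE side could look like this; the GCF in the other project would be deployed with a trigger on that topic.

```python
from google.cloud import pubsub_v1

# Sketch only: publish from GKE in project A to a topic that lives in project B.
# The publishing service account needs roles/pubsub.publisher on that topic.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("project-b", "gcf-trigger-topic")

future = publisher.publish(topic_path, b'{"action": "process"}', origin="gke-project-a")
print("Published message id:", future.result())
```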
Does Google Cloud have functionality analogous to AWS Lambda?
In particular, I would like compute resources to be spun up and jobs scheduled via HTTPS events.
I'm also interested in any other cloud hosting providers which have similar functionality.
I just found out that there is something that looks interesting in the latest documentation of the SDK's command line tool gcloud.
https://cloud.google.com/sdk/gcloud/reference/alpha/functions/
This sounds exciting.
UPDATE: Google just released some official documentation of an alpha version of Cloud Functions. For now, functions can be written in JavaScript using Node.js, and triggered by Pub/Sub, Cloud Storage, direct HTTP requests, or manually for debugging purposes.
Google Cloud Storage has Object Change Notification. Only webhooks are currently supported.
A client application can send a request to watch for a bucket's change notification events in order to be notified about changes to a bucket's objects. After a notification channel is initiated, Google Cloud Storage notifies the application any time an object is added, updated, or removed from the bucket.
For example, when you add a new picture to a bucket, an application could be notified to create a thumbnail.
More info can be found at: https://cloud.google.com/storage/docs/object-change-notification
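As a rough sketch of the thumbnail example, the webhook receiver could be a small Flask app like the one below; the route and field handling are hypothetical, and the X-Goog-Resource-State handling follows the documentation linked above.

```python
from flask import Flask, request

app = Flask(__name__)

# Sketch of a webhook receiver for Object Change Notification.
# X-Goog-Resource-State is "sync" for the initial handshake, and e.g. "exists" /
# "not_exists" when objects are added/updated or removed (see the docs above).
@app.route("/gcs-notifications", methods=["POST"])
def gcs_notification():
    state = request.headers.get("X-Goog-Resource-State", "")
    if state == "sync":
        return "", 200  # acknowledge the channel setup
    obj = request.get_json(silent=True) or {}
    print("Change in bucket:", obj.get("bucket"), "object:", obj.get("name"))
    # e.g. enqueue thumbnail generation for new pictures here
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)
```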
Regarding other providers that have similar functionality, check out IronWorker. You can kick off IronWorker tasks via HTTPS using the webhook endpoint, and you can run jobs on multiple clouds. Here's a comparison of Lambda vs IronWorker.
And yes, I work for Iron.io.
Google recently announced an alpha release of Google Cloud Functions, which supports an HTTP interface.
There are Google Cloud Functions and Microsoft Azure Functions; they are both fairly new (Microsoft announced Azure Functions on March 31, 2016).
If you need Lambda with an HTTP interface, then look at Nano Lambda.
They can deploy to any cloud and on-premises.
I would like to try Amazon SWF development on my PC. But it seems to me that I must register with the Amazon AWS cloud in order to use Amazon Simple Workflow. What I need is to run all the Amazon components on a single machine without needing Amazon-hosted services (or authentication).
The documentation for Amazon Simple Workflow Service is here.
Can you please help?
You could run the workflow locally using the AWS test framework. In this case, everything (activities and decision workers) runs locally on the machine running the test framework, except that no SWF API is invoked. So you can test the orchestration logic of the decision worker without needing an AWS account.
Using cloud services offline doesn't make much sense. You will definitely need to sign up for an AWS account to be able to play with SWF. There is a free tier which should be sufficient for your development.