I'm new to google cloud technologies.
If I have a streaming web API online that I want to read JSON data from and write into Cloud Pub/Sub, do I need to create a Python App Engine app (or something similar) that reads the web API and writes the data into Cloud Pub/Sub, or is there an easier way?
Basically, I just want to poll a web API and get JSON data.
There are multiple ways this could be achieved:
Cloud Function: With the help of triggers, you could have the Cloud Function grab the data and write it to Pub/Sub through its API. Cloud Functions supports several runtimes (Node.js, Python, Go, and others), so you can write the function in whichever of those you prefer.
Direct Stream: You can skip the extra work altogether and simply adapt your polling code to use the Pub/Sub API and write the data directly to a topic (see the sketch below).
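For the direct-stream option, here is a minimal sketch in Python, assuming a hypothetical JSON endpoint and placeholder project/topic IDs; it polls the API and publishes each response to Pub/Sub with the google-cloud-pubsub client:

```python
# Minimal sketch: poll a (hypothetical) JSON web API and publish each
# response to Pub/Sub. The endpoint URL, project ID, and topic ID below
# are placeholders, not real values.
import json
import time

import requests
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")  # placeholders

while True:
    resp = requests.get("https://example.com/api/data")  # hypothetical API
    resp.raise_for_status()
    # Pub/Sub messages are raw bytes, so serialize the JSON payload.
    payload = json.dumps(resp.json()).encode("utf-8")
    future = publisher.publish(topic_path, payload)
    print(f"Published message {future.result()}")
    time.sleep(10)  # poll interval in seconds
```

The same publishing code works whether it runs in App Engine, a Cloud Function, or anywhere else with credentials for the project.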
You can look into the Pub/Sub client API documentation, which offers samples and guidance on how to properly use the API.
Hope this is helpful.
Related
Does GCP Dataflow support a transform that calls an external API endpoint using a SOAP XML POST?
I couldn't find any Apache Beam library supporting this need directly.
If not SOAP, can I do REST API callouts through Apache Beam transformations?
It's possible to make API calls in a DoFn.
You only need to use your API client class inside the DoFn and make the call there.
You should be mindful of performance in this case, because each API call incurs network latency.
If you have the possibility to retrieve the data from the API directly in the input connector (IO), you will get better performance.
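As a minimal sketch (Beam Python SDK, with a placeholder endpoint), a REST call inside a DoFn could look like the following; reusing one HTTP session per worker in setup() limits the per-element connection overhead mentioned above:

```python
# Minimal sketch: calling a REST API from inside a Beam DoFn.
# The endpoint URL is a placeholder.
import apache_beam as beam
import requests

class EnrichViaApi(beam.DoFn):
    def setup(self):
        # One HTTP session per worker instead of one per element.
        self.session = requests.Session()

    def process(self, element):
        resp = self.session.get(
            "https://example.com/api/lookup",  # hypothetical endpoint
            params={"id": element},
        )
        resp.raise_for_status()
        yield resp.json()

with beam.Pipeline() as pipeline:
    (
        pipeline
        | beam.Create(["a", "b", "c"])
        | beam.ParDo(EnrichViaApi())
    )
```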
I'm trying to figure out if it's feasible to create a serverless web app in which an API function creates a job that is scheduled to run once at a specific time and date.
I've looked at the three main providers, AWS, Google Cloud and Microsoft Azure. All three provide everything needed for a serverless web app in general, but I'm not sure I understand if any of them support what I described above.
AWS has CloudWatch, which has an API. However, there is nothing about Events in the API doc; it looks like Events can only be created by hand in the console or via Terraform.
Google Cloud has the Scheduler. However, there is no mention of an API in the docs. It does support Terraform too, though.
Microsoft has the Azure Scheduler, and that one seems to support creating jobs via an API.
Doesn't Terraform require an API, though? So am I missing something?
I'm completely new to serverless web apps. Is this even the correct approach to do this?
Edit:
I just realized that it's possible to create Amazon CloudWatch events via an API, however, it's called EventBridge... That makes me think I might have missed something in Google Cloud as well. However, I'm still wondering if this is the right approach?
To provide a little more detail on what I want to do:
A user creates an event in the web frontend.
My API function that the frontend calls creates some cloud version of a cron job that is to be run once at a specific time and date.
The job triggers another function that does something with a third-party API at the time specified by the user.
On Google Cloud, you can deploy your app on serverless services (Cloud Run, Cloud Functions or App Engine). Then you can set up Cloud Scheduler, which can call an HTTP URL and thereby trigger your serverless service.
Regarding the API accessibility of Google Cloud services: everything is an API. Anything you can do in the console or with the gcloud CLI, you can also do with API calls (see the sketch below).
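As a minimal sketch of that API access, assuming placeholder project, location, and target URL values, a Cloud Scheduler job can be created from code with the google-cloud-scheduler client. Note that Scheduler jobs are cron-based, so a "run once" job would have to be deleted after it fires.

```python
# Minimal sketch: create a Cloud Scheduler job programmatically.
# Project, location, job name, target URL, and schedule are placeholders.
from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path("my-project", "us-central1")

job = scheduler_v1.Job(
    name=f"{parent}/jobs/user-event-123",  # hypothetical job name
    http_target=scheduler_v1.HttpTarget(
        uri="https://example.com/run-task",  # hypothetical target service
        http_method=scheduler_v1.HttpMethod.POST,
    ),
    schedule="30 14 25 12 *",  # cron: 14:30 on December 25
    time_zone="Etc/UTC",
)

response = client.create_job(parent=parent, job=job)
print(f"Created job: {response.name}")
```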
I am new to Google Cloud IoT Core. I just tried IoT Core, registered a device and published some data on a topic.
Now I am wondering: are Cloud Functions the only way to get/subscribe to data from it? Or can I also use the data published by a device directly in my applications, e.g. Node.js or Python, the way a normal MQTT topic is subscribed to? I don't need a serverless architecture; I want to consume the data from Google IoT on my own server and store/use it from there.
Thanks a lot!
When a message arrives from a device over MQTT or HTTP via IoT Core, that message is published on a GCP Pub/Sub topic using the GCP Pub/Sub infrastructure. What this means is that you can develop a consuming application however you like, as long as that application consumes the message from Pub/Sub.
The diagram shown here from the documentation is a good reference.
GCP exposes the API for Pub/Sub to be a subscriber. There are language bindings for a variety of languages.
Cloud Functions is typically seen as a serverless consumer of Pub/Sub messages as it scales to zero and can scale up if there is bursty load. However, there is nothing to prevent you from writing Compute Engine hosted applications or Container hosted applications which can act as subscribers.
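As a minimal sketch with placeholder project and subscription IDs, a plain Python process (no Cloud Functions involved) can consume those device messages like this:

```python
# Minimal sketch: a plain subscriber pulling the device messages that
# IoT Core publishes to Pub/Sub. Project and subscription IDs are
# placeholders.
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "iot-sub")

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Device payload: {message.data.decode('utf-8')}")
    message.ack()  # acknowledge so Pub/Sub does not redeliver

streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
try:
    streaming_pull.result()  # block and keep receiving messages
except KeyboardInterrupt:
    streaming_pull.cancel()
```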
References:
Cloud Pub/Sub documentation
Cloud Pub/Sub Client Libraries
I think your question is more on the architectural side of things. You can use IoT Core with something like Pub/Sub and write the data to Cloud Storage in order to later consume it with an application.
I would recommend checking some of the diagrams on this page if you are interested in knowing which products could best fit your needs.
I hope you find this useful!
Is Pub/Sub a significantly faster way of communicating between, say, a Kubernetes Engine (GKE) API server and a Cloud Function (GCF)?
Is it possible to use Pub/Sub to have such communication between GKE from one Google Cloud Project and GCF from another Google Cloud Project?
What is the way to communicate with Cloud Functions from another Google Cloud Project with low latency?
I think a general answer will clarify your questions. For this particular case, there are two ways to trigger a Google Cloud Function (GCF): you can make an HTTP request directly, or you can subscribe the GCF to a Pub/Sub topic (https://cloud.google.com/functions/docs/calling).
If your requests are occasional, an HTTP request will be faster because you don't need an intermediary. If that's not the case, then the Pub/Sub subscription queues the messages and ensures the delivery by retrying them until it receives confirmation.
To communicate between Google Kubernetes Engine (GKE) in one Google Cloud project and a Google Cloud Function (GCF) in another project, you can use either option: trigger the GCF by an HTTP request directly, or do it by publishing a message. When publishing, specify the project you are sending to and the desired topic in that project (see the sketch below).
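As a minimal sketch with placeholder IDs, publishing across projects only requires naming the other project in the topic path, provided the caller has publish permission there (see the links below):

```python
# Minimal sketch: publish to a Pub/Sub topic in a *different* project.
# Project and topic IDs are placeholders; the publishing service account
# needs the Pub/Sub Publisher role on the target topic.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
# The topic path names the other project explicitly.
topic_path = publisher.topic_path("other-project", "gcf-trigger-topic")

future = publisher.publish(topic_path, b"payload for the function")
print(f"Published {future.result()} to {topic_path}")
```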
You also need to grant the service account the proper permissions to access one project from the other:
For Pub/Sub: https://cloud.google.com/pubsub/docs/authentication
For HTTP requests: https://cloud.google.com/solutions/authentication-in-http-cloud-functions
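For the HTTP option, here is a minimal sketch, assuming a placeholder function URL, of calling an HTTP-triggered function in another project with an ID token fetched via the google-auth library (the caller needs invoke permission on the target function):

```python
# Minimal sketch: call an HTTP-triggered Cloud Function in another
# project with an ID token. The function URL is a placeholder; the
# calling service account needs the Cloud Functions Invoker role.
import google.auth.transport.requests
import google.oauth2.id_token
import requests

function_url = "https://us-central1-other-project.cloudfunctions.net/my-func"

auth_request = google.auth.transport.requests.Request()
# The token's audience must be the URL of the function being called.
token = google.oauth2.id_token.fetch_id_token(auth_request, function_url)

resp = requests.post(
    function_url,
    headers={"Authorization": f"Bearer {token}"},
    json={"payload": "hello"},
)
print(resp.status_code, resp.text)
```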
Google Cloud Function HTTP triggers documentation here: https://cloud.google.com/functions/docs/calling/http
Pub/Sub documentation here: https://cloud.google.com/pubsub/docs/reference/libraries (you can get to GitHub via the links in the code and see function examples for each language)
I was wondering if Google Cloud Print can be configured to send to a custom web service. I was thinking about sending documents that need to be OCR'd to my web service for processing. Can this be done directly with Google Cloud Print?
Yes. You'll essentially create a Google Cloud Print printer, in software. When the new print job arrives (a PDF with metadata) then you do whatever you want to with it.
The API documentation is very clear
Some example code (Go) can be found in the official Google Cloud Print CUPS Connector
More example code (Python) can be found in the Google Cloudprint Proxy
The GCP Developers mailing list is useful