I'm not actually whitelisted to use Google Cloud Functions yet, but I have this question:
Do HTTP functions have access to my Google Cloud context?
I want to run untrusted JavaScript code, so I'd like to use a function as a sandbox where users can just run simple JavaScript snippets.
If I understand your request correctly, you are looking to have HTTP Cloud Functions evaluate user-provided JavaScript code on the server side.
By your description, the only real ways the function could evaluate the user's code are essentially eval() or new Function(). To confirm the risks, I created a Cloud Function that simply passes the POST request body to eval. Even without any dependencies, I could issue HTTP requests on behalf of the Cloud Function, which could be quite bad.
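A rough sketch of that test function (the export name and body handling here are illustrative, not the exact code that was deployed):

```javascript
// DO NOT deploy this: it hands the raw request body to eval, so the caller's
// code runs with the function's service account and network access.
exports.runUserCode = (req, res) => {
  const result = eval(String(req.body));
  res.status(200).send(String(result));
};
```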
Given that most useful Cloud Functions would have "@google-cloud" as a dependency, the user could gain access to that context. I was able to require @google-cloud and read all the information accessible through that object (application credentials, application information, etc.). Having such information available to a malicious user is considerably worse than the first test. In addition, Cloud Functions are authenticated by default, presumably with default application credentials, so the attacker gains all the abilities of the gcloud client libraries.
In the end, the safest way to run user-provided code on the server would be within a container. This would essentially lock the user's code into a Linux box where the resources and networking capabilities can be entirely governed by you. On Google Cloud Platform, your best means of accomplishing this would likely be using App Engine as a front end to handle user requests and Compute Engine VMs to create and run containers for the user code. It's more complex, but it doesn't risk destroying your Google Cloud Platform project.
Our many end users will, through a web browser, read and write partly overlapping data.
When a user makes a change, a related change should be broadcast to the relevant other users.
Example use case: several end users, each on their own device, look at a calendar of available time blocks to make an appointment. One of them creates an appointment, so that time block is no longer available to the others. The calendar on the screens of those other users is updated accordingly and immediately.
Technically this would mean:
Browser sends 'create appointment' event through WebSocket
This event spins up a Cloud Function, which does the following (and then terminates):
Reserve the required capacity in the database
If this causes the used time block to no longer be available to other users: broadcast a 'not available anymore' event through the WebSockets of the other users currently viewing this time block (a rough sketch follows after this list).
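To make the intended flow concrete, here is a minimal sketch of the function from step 2. It assumes Firestore for the reservation and uses a hypothetical broadcastToViewers() stub, since an HTTP-triggered Cloud Function cannot hold the WebSocket connections itself; that is exactly the open question below.

```javascript
// Sketch only: Firestore is an assumption, and broadcastToViewers() is a
// hypothetical stand-in for whatever ends up holding the WebSocket connections.
const {Firestore} = require('@google-cloud/firestore');
const firestore = new Firestore();

async function broadcastToViewers(blockId, event) {
  // Hypothetical: push the event to every client currently viewing this block.
  console.log(`Would broadcast to viewers of ${blockId}:`, event);
}

exports.createAppointment = async (req, res) => {
  const {blockId, userId} = req.body;
  const blockRef = firestore.collection('timeBlocks').doc(blockId);

  // Reserve the required capacity in the database.
  const stillAvailable = await firestore.runTransaction(async (tx) => {
    const snap = await tx.get(blockRef);
    const remaining = snap.get('remainingCapacity') - 1;
    tx.update(blockRef, {remainingCapacity: remaining});
    return remaining > 0;
  });

  // If the block just became full, notify the other users viewing it.
  if (!stillAvailable) {
    await broadcastToViewers(blockId, {type: 'not-available-anymore', blockId});
  }
  res.status(200).json({reservedFor: userId});
};
```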
In Google Cloud this is possible using an Apigee Java callout, where the Java code (if needed) calls a Cloud Function, as described at https://cloud.google.com/apigee/docs/api-platform/develop/how-create-java-callout. However, Apigee runs on Kubernetes (https://cloud.google.com/apigee/docs/hybrid/kubernetes-resources), which incurs the overhead of containers running at moments when they are idle or only sparsely used.
Google Cloud's API Gateway (https://cloud.google.com/api-gateway) doesn't support WebSockets: https://issuetracker.google.com/issues/176472002?pli=1
Is there a way to accomplish our goal through a Cloud Function, without any container?
I'm studying GCP, and while reading about the different ways to communicate with and manage Cloud Functions, I end up wondering when to use each of the services that GCP offers.
So, I have been reading about GCP Composer, GCP Workflows, and Cloud Pub/Sub, and it isn't clear to me when to use each one, or when to just use simple HTTP calls.
I understand that it depends a lot on the application you are building, but take an example: if I'm building a payment gateway, some functions should be fired after the payment is verified, such as sending emails, running unrelated business logic, or adding the purchase to a sales platform. Which one should I use to manage this flow, and in which cases would the others be better? Should I use events to create an async flow with Pub/Sub, use more complex solutions like Composer and Workflows, or just make simple HTTP calls?
As always, it depends!! Even in your use case, it depends! OK, after a payment you want to send an email, run business logic, add the order to your databases, ...
But can all these actions be done in parallel, or do you need to execute them in a certain order and stop the process if a step fails?
In the first case, you can use Cloud Pub/Sub with one published message (payment OK) and then a fan-out to several functions in parallel. Otherwise, you can use Workflows to test the response of each function and then call, or not call, the following functions. With Composer you can perform many more checks and actions.
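As a rough sketch of the Pub/Sub fan-out variant (the topic name, message shape, and function names are placeholders):

```javascript
// Publisher side: the payment service publishes a single "payment OK" message.
const {PubSub} = require('@google-cloud/pubsub');
const pubsub = new PubSub();

async function paymentConfirmed(order) {
  // Every Cloud Function subscribed to this topic (email, sales platform, ...)
  // receives its own copy of the message and runs in parallel.
  await pubsub.topic('payment-ok').publishMessage({json: order});
}

// Subscriber side: one of the fanned-out Cloud Functions (Pub/Sub trigger).
exports.sendConfirmationEmail = async (message) => {
  const order = JSON.parse(Buffer.from(message.data, 'base64').toString());
  console.log(`Would send a confirmation email for order ${order.id}`);
};
```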
You can also imagine sending another email 24 hours later to thank the customer for their order, and use Cloud Tasks to delay an action.
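A minimal sketch of scheduling such a delayed action with Cloud Tasks (the project, location, queue, and target URL are placeholders):

```javascript
// Enqueue an HTTP task that fires roughly 24 hours later.
const {CloudTasksClient} = require('@google-cloud/tasks');
const tasksClient = new CloudTasksClient();

async function scheduleThankYouEmail(order) {
  // Placeholder project, location, and queue name.
  const parent = tasksClient.queuePath('my-project', 'europe-west1', 'email-queue');
  await tasksClient.createTask({
    parent,
    task: {
      httpRequest: {
        httpMethod: 'POST',
        url: 'https://example.com/sendThankYouEmail', // placeholder target
        headers: {'Content-Type': 'application/json'},
        body: Buffer.from(JSON.stringify(order)).toString('base64'),
      },
      // Execute roughly 24 hours from now.
      scheduleTime: {seconds: Math.floor(Date.now() / 1000) + 24 * 60 * 60},
    },
  });
}
```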
You talked about Cloud Functions, but you also have other solutions to host code on GCP: App Engine and Cloud Run. A Cloud Function is, most of the time, single purpose. Sending an email is perfect for a function.
Now, if you have a "set of functions" to browse your stock, view an object's details, review the price, and book an object (validating an order "books" its contents in your warehouse), the "functions" are all single purpose but related to the same domain: warehouse management. So you can create a web server that exposes different paths to manage the warehouse (a microservice for the warehouse, if you prefer) and host it on Cloud Run or App Engine, as sketched below.
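For instance, a minimal Express sketch of such a warehouse microservice (the paths and handlers are illustrative), which could be hosted on Cloud Run or App Engine as-is:

```javascript
// One web server grouping several single-purpose, warehouse-related paths.
const express = require('express');
const app = express();
app.use(express.json());

app.get('/stock', (req, res) => res.json({items: []}));                   // browse the stock
app.get('/items/:id', (req, res) => res.json({id: req.params.id}));       // view object details
app.get('/items/:id/price', (req, res) => res.json({price: 9.99}));       // review the price
app.post('/orders', (req, res) => res.status(201).json({booked: true}));  // book an object

// Cloud Run injects the port to listen on through the PORT environment variable.
app.listen(process.env.PORT || 8080);
```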
Each product has its strengths and weaknesses. You will also see this when you learn about storage on GCP. Most of the time you can achieve things with several products, but if you don't use the right one, it will be slower or cost much more.
I am familiar with the Firebase platform, but I am relatively new to the Google Cloud Platform as a whole.
I am working on a project built with a microservices structure, and I have so many questions for which I cannot find an answer, or rather, for which I cannot find any example.
Unfortunately, all the examples I am able to find are way too simple to extrapolate a viable answer for my issues.
I adopted the new Cloud Run offering, and I decided to play with the fully managed version (not Kubernetes). I built a few microservices (each service is built with Express for Node or Flask for Python, depending on what the service does). Each microservice exposes its own endpoint and has its own API to call its methods, and I use a service account to allow the application to perform the internal calls.
I now want to expose the application externally (specifically to my client built with Vue.js), and I was trying to leverage another Google product to create and expose an API: Google Cloud Endpoints.
My question (specifically referring to the Cloud Run structure) is how it is possible, and what I need to do, to create an API endpoint for communicating with the client app that internally calls multiple services and combines their responses into one.
Just to be clear, let's make an example:
Cloud Run service 1 -> CRUD user API
Cloud Run service 2 -> CRUD product API
Cloud Endpoints externally visible API -> get the user from service 1, then get products from service 2, and return the combined response: all green products for user Jane Doe.
How can I aggregate the responses directly in the endpoint gateway, check for failures, and, if everything goes smoothly, send the aggregated response to the client?
Do I need to build the aggregation endpoint in something else, like a Cloud Function for example, or can I do it directly in the Google Endpoints gateway?
Note that for Cloud Run, the Google Endpoints gateway is itself another Cloud Run container.
Thanks for any help; I'm running pretty much out of options here.
As per my understanding, an API gateway should just work as a proxy, presenting all microservices as a single endpoint. For this scenario, I think you can take either of the following two approaches:
1: Implement a new microservice (or extend any of the existing ones) which will do the invocations and aggregate the responses.
2: The client (e.g. the UI) can invoke the services and do the aggregation on its side as well.
I feel it is not a good idea to do it at the API gateway.
In my opinion, from an architectural point of view, the best option for you is to create a new microservice which takes the responses from the other two and then aggregates them.
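A rough sketch of what that aggregation service could look like (the service URLs, paths, and response shapes are assumptions, and authenticated service-to-service calls would additionally need an ID token attached):

```javascript
// Aggregator microservice: calls the user and product services, then merges
// the responses. URLs, paths, and payload shapes are placeholders.
const express = require('express');
const app = express();

app.get('/users/:id/green-products', async (req, res) => {
  try {
    // Node 18+ global fetch is assumed here.
    const [userRes, productRes] = await Promise.all([
      fetch(`https://user-service-xyz.a.run.app/users/${req.params.id}`),
      fetch('https://product-service-xyz.a.run.app/products?color=green'),
    ]);
    if (!userRes.ok || !productRes.ok) {
      return res.status(502).json({error: 'an upstream service failed'});
    }
    const user = await userRes.json();
    const products = await productRes.json();
    res.json({user, products});
  } catch (err) {
    res.status(500).json({error: err.message});
  }
});

app.listen(process.env.PORT || 8080);
```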
I understand that you want to aggregate the responses in an API gateway and are not able to find code examples for it. I was able to find a guide on what you are wanting to implement here. The full code implementation can be found in this repository.
Keep in mind, though, that this kind of implementation is not a best practice.
It is OK only if the two services being combined are independent, meaning there is no functional/business relation between them and no concurrency or inconsistency problem will occur while aggregating.
I'm looking through the Google Cloud Functions docs and I wonder if it is possible to restrict access to an HTTP Cloud Function to a given network. I would like to prevent anyone from exhausting the free quota.
Are there any firewall rules or a similar mechanism for Cloud Functions?
I don't believe there are any built-in security restrictions at the moment.
In terms of avoiding quota exhaustion, you could pass a header or parameter with some kind of shared secret. Even a fixed string value would help avoid this problem.
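For example, a minimal sketch of such a shared-secret check (the header name is arbitrary; the secret itself is assumed to live in an environment variable):

```javascript
// Reject requests that don't carry the expected shared secret.
exports.guardedFunction = (req, res) => {
  if (req.get('X-Shared-Secret') !== process.env.SHARED_SECRET) {
    return res.status(403).send('Forbidden');
  }
  res.status(200).send('Hello, authorized caller!');
};
```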
You can add authentication to a Cloud Function by using Firebase Authentication. Here's a GitHub example of how to do it: https://github.com/firebase/functions-samples/tree/master/authorized-https-endpoint
Note however that the authentication code is executed by your function, so rejecting unauthorized access would still consume a small portion of your free resource allowance.
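Condensed down, the idea in that sample is to verify the Firebase ID token sent in the Authorization header before running the function body; a rough sketch (CORS and detailed error handling trimmed):

```javascript
// Verify the caller's Firebase ID token before doing any work.
const admin = require('firebase-admin');
admin.initializeApp();

exports.authorizedEndpoint = async (req, res) => {
  const header = req.get('Authorization') || '';
  if (!header.startsWith('Bearer ')) {
    return res.status(403).send('Unauthorized');
  }
  try {
    const decoded = await admin.auth().verifyIdToken(header.slice('Bearer '.length));
    res.status(200).send(`Hello ${decoded.uid}`);
  } catch (err) {
    res.status(403).send('Unauthorized');
  }
};
```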
The Google Function Authorizer module might be what you're looking for. It provides "a simple user authentication and management system for Google Cloud HTTP Functions." It doesn't seem to have a lot of users yet, but the project seems simple enough that you could at least use it as a basis to modify or implement your own solution if you prefer.
This article was helpful for me.
https://cloud.google.com/solutions/authentication-in-http-cloud-functions
Anyone can still invoke the function, but the request must contain credentials from a user that has access to the resources accessed by the function.
Before that, I was doing something very simple that is probably not great for production but does provide a little more security than just leaving it open publicly: I call my function with a password in the payload, and if it doesn't match one of the passwords I hardcoded in the function, it just fails with a 403.
If you need to restrict to IP range then you can follow instructions here: https://sukantamaikap.com/posts/load-balancing-cloud-functions
The UI of Google Cloud has unfortunately changed, so you will need to do some searching before you get it all done, but I managed to set it up. Note, however, that the related services will cost roughly 25 EUR per month at minimum.
You can estimate the pricing here:
https://cloudpricingcalculator.appspot.com/
You need to search for "Cloud Load Balancing and Network Services" and then enable "Cloud Load Balancing", "Google Cloud Armor", and "IP addresses".
Alternatively, in some cases it might be sufficient to make the name of the function, or some suffix to it, complex enough that it effectively acts as a sort of password, something like MyGoogleCloudFunc-abracadabra. This does not restrict the network, but outsiders presumably would not know the secret name anyway.
I found this question. It is about calling other modules inside Google Cloud infrastructure.
How do I call other Google APIs from a Cloud Function?
So my question is: is it possible to trigger Google Cloud Functions using this approach? And if it is possible, how performant will this solution be?
I think it could probably be used as a code-sharing mechanism, because I didn't see any information about this issue regarding GCF.
Regarding your question about triggering: Cloud Functions can be triggered in a variety of ways, including HTTP (web calls such as REST), Pub/Sub, and changes to Cloud Storage (e.g. uploaded files). The set of triggers is likely to expand over time. The latest information can be found at https://cloud.google.com/functions/docs/calling/
Regarding your question about performance: Cloud Functions, at least the current iteration, run JavaScript inside a Node application. They auto-magically scale. New instances are spun up as demand grows. They should meet the performance needs of most use cases.
Regarding your comment on code sharing: Yes. You could create a function and expose it, such as with HTTP, so that it can be used by multiple applications. You'll need to do any authentication and authorization checking per call, though.
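A rough sketch of that pattern (the function URL, header name, and the normalization logic are placeholders):

```javascript
// Shared function, exposed over HTTP and reused by several applications.
exports.sharedUtility = (req, res) => {
  // Per-call authorization check (placeholder shared-secret scheme).
  if (req.get('X-Api-Key') !== process.env.API_KEY) {
    return res.status(403).send('Forbidden');
  }
  res.json({normalized: String(req.body.value || '').trim().toLowerCase()});
};

// Caller side, from any other application (Node 18+ global fetch assumed).
async function callSharedUtility(value) {
  const response = await fetch('https://REGION-PROJECT.cloudfunctions.net/sharedUtility', {
    method: 'POST',
    headers: {'Content-Type': 'application/json', 'X-Api-Key': process.env.API_KEY},
    body: JSON.stringify({value}),
  });
  return response.json();
}
```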