I have a web app that connects to a Cloud Function (not Firebase, but GCP). The Cloud Function is written in Python; it is massive and it connects to Cloud SQL on Google as well.
Now, I bought a domain from Google, and I need to host a simple static website that will access this Cloud Function through its Function URL and show data to the client.
It needs to be fast and serve many users (it is sort of a search engine).
I would like to avoid Firebase Hosting for multiple reasons, but mainly because I want to stay inside GCP, where I deploy and monitor everything.
I realized that my options for hosting this static(?) website with my custom domain in GCP are:
Load Balancer - which is an expensive, overkill solution.
Cloud Storage - which (I might be wrong) will be very limiting later if I need to manage paying users. (Or can I just send a user ID to the Function using parameters?)
Cloud Run - which I am not sure exactly what it does yet.
What is a solution that fits a light web app (HTML/JS) that can authenticate users and connect to a massive Cloud Function through its URL with simple REST?
Also - can I map that Cloud Function's URL to my domain without a Load Balancer? Currently it is something like project-348324
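On the parenthetical about sending a user ID to the Function as a parameter: mechanically that works, but a bare ID in a query string can be forged by any caller, so paying users would eventually need a verifiable token rather than a raw ID. A minimal Python sketch of the Function side (the `handler` name and fields are illustrative, not the actual Function):

```python
# Sketch (illustrative): reading a user id from the query string in a
# Python HTTP Cloud Function. A bare ?uid=... parameter can be spoofed,
# so treat it as a hint, not as authentication.

def handler(request):
    """HTTP Cloud Function entry point; `request` behaves like a Flask request."""
    uid = request.args.get("uid")
    if not uid:
        return ("Missing uid parameter", 400)
    # Here you would look up the user's entitlements in Cloud SQL.
    return ({"uid": uid, "status": "ok"}, 200)
```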
Related
I'm building a simple analytics service that needs to work for multiple countries. It's likely that someone from a restricted jurisdiction (e.g. Iran) will hit the endpoint. I am not offering any service that would fall under sanctions-related restrictions, but it seems that Cloud Run endpoints do not allow traffic from places like Iran. I tried various configurations (adding a domain mapping, an external HTTPS LB, calling from Firebase, etc.) and it doesn't work.
Is there a way to let read-only traffic through from these territories? Or is there another Google product that would allow this? It seems that the Google Maps prohibited-territory list applies to some services but not others (e.g. Firebase doesn't have this issue).
You should serve traffic through a Load Balancer with a Cloud Armor policy. Cloud Armor provides a feature for filtering traffic based on location.
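To sketch what such a policy matches on: Cloud Armor rules take CEL expressions over attributes like `origin.region_code`. The helper below only assembles that expression; attaching it to a security policy is a separate step (console or gcloud), and the function name is illustrative:

```python
# Sketch: building the CEL match expression a Cloud Armor rule uses to
# filter by location. The helper name is an assumption for illustration;
# the rule itself is attached to a security policy via the console or gcloud.

def geo_match_expression(region_codes):
    """Return a Cloud Armor CEL expression matching any of the ISO region codes."""
    clauses = [f"origin.region_code == '{code}'" for code in region_codes]
    return " || ".join(clauses)
```

For example, `geo_match_expression(["IR", "CU"])` produces an expression you can use in an allow or deny rule, depending on which direction you need.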
I have a Cloud Function that I want to secure by allowing access only from my domain, for all users. I have been exploring this for days.
Google seems to limit many options, and instead you are forced to buy and use more products. For example, for this you need a Load Balancer, which is a great product but a monster for smaller businesses, and not everyone needs it (or wants to pay for it).
So, how do you secure a Function from the Console, without IAM (no sign-in needed), so that only calls from a certain domain are allowed, before you expand to a Load Balancer?
I do see that Google has something called Organization Policies for a project, which is supposed to restrict a domain, but the docs are unclear and outdated (they reference UI that doesn't exist).
I know that Firebase has Anonymous Authentication, which allows a Function to check the Google ID of an anonymous user, but everything online is Firebase-specific, and there is no explanation anywhere of how to do this with a plain Python Function.
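From what I could piece together, something like the sketch below might do it with the firebase-admin package in a plain Python Function (untested assumption on my side; `handler` and the helper name are illustrative):

```python
# Hedged sketch: verifying a Firebase ID token (anonymous users included)
# in a plain Python Cloud Function. Assumes firebase-admin is listed in
# requirements.txt; `handler` is an illustrative entry-point name.

def extract_bearer_token(header):
    """Pull the raw token out of an 'Authorization: Bearer <token>' header."""
    prefix = "Bearer "
    if not header or not header.startswith(prefix):
        return None
    return header[len(prefix):]

def handler(request):
    # Lazy import so the pure helper above stays dependency-free.
    import firebase_admin
    from firebase_admin import auth

    if not firebase_admin._apps:  # initialize once per instance
        firebase_admin.initialize_app()

    token = extract_bearer_token(request.headers.get("Authorization"))
    if token is None:
        return ("Missing bearer token", 401)
    try:
        decoded = auth.verify_id_token(token)  # raises on bad/expired tokens
    except Exception:
        return ("Invalid token", 403)
    return ({"uid": decoded["uid"]}, 200)
```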
EDIT
I do use Firebase Hosting, but my Function is Python and it's managed from GCP; it is not a Firebase Function.
Solved: you can use API Gateway with an API key. Restrict the key to your domain only and upload a config with your Function URL, so you access it via the API URL plus the key, and nobody else can just run it.
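For reference, the uploaded config is an OpenAPI 2.0 spec in which `x-google-backend` points at the Function URL and the API key is declared as a security definition. The title, path, and project name below are placeholders, not the actual setup:

```yaml
swagger: "2.0"
info:
  title: my-api        # placeholder
  version: "1.0.0"
schemes:
  - https
paths:
  /search:             # placeholder path
    get:
      operationId: search
      x-google-backend:
        address: https://us-central1-my-project.cloudfunctions.net/my-function
      security:
        - api_key: []
      responses:
        "200":
          description: OK
securityDefinitions:
  api_key:
    type: apiKey
    name: key
    in: query
```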
See here Cloud API Gateway doesn't allow with CORS
I wish I could connect it to a domain as well, but we can't; Google seems to want everyone to use the expensive Load Balancer, or Firebase (which in this case is charged a Function invocation for every website visit).
I want to achieve the following configuration:
https://example.com - serve Google Cloud Storage BucketA
https://example.com/files/* - serve Google Cloud Storage BucketB
https://example.com/api/* - serve Google Cloud Functions -> https://us-central1-{my-app-name}.cloudfunctions.net/api
My issue is with the Cloud Functions mapping. How do I specify a Cloud Functions endpoint as a backend? How do I point to a Google Function in the backend configuration?
You need to register your domain with your Firebase project; these two articles should point you in the right direction:
Functions overview
Serverless Overview
You need to go to Hosting; after getting started, it will provide a way to attach your custom domain to your project. You will need to validate the domain; the last time I did it, it took 72 hours. The steps needed can be found here.
All the information you need is there.
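Once the domain is attached, the routing itself lives in `firebase.json`. A sketch for the `/api/*` mapping, assuming the Function is named `api` and is in a region Hosting rewrites support (the bucket mappings are a separate concern, not covered by rewrites):

```json
{
  "hosting": {
    "public": "public",
    "rewrites": [
      { "source": "/api/**", "function": "api" }
    ]
  }
}
```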
Assume I want to deploy multiple microservices using Google Cloud Run, and those microservices will be connected to each other. My questions are as follows:
Does each microservice deploy separately by creating a Google Cloud Run service?
Then how does each microservice call the others (by using public IPs)?
How to connect different microservices with different DBs such as MongoDB, CassandraDB. Is there a way to create a NoSQL DB in Compute Engine and access it through Google Cloud Run?
Does each microservice deploy separately by creating a Google Cloud Run service?
Yes, each microservice is individual and has its own HTTP(S) endpoint if you need it.
If you need to deploy more in bulk, you can always use a CI/CD tool.
Then how does each microservice call the others (by using public IPs)?
When you deploy your service for the first time with an HTTP trigger, you are provided with a unique URL (similar to what happens with Cloud Functions). You can then invoke your service via HTTP as usual.
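A sketch of one service invoking another over that URL, using a Google-signed ID token for service-to-service auth (the google-auth dependency and the function names here are assumptions, not the poster's code):

```python
# Sketch: one Cloud Run service calling another over its HTTPS URL, with a
# Google-signed ID token so the receiving service can require authentication.
import urllib.request

def build_request(url, token):
    """Attach the bearer token that the receiving service's IAM check verifies."""
    return urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})

def call_service(url):
    # Lazy import: google-auth is assumed to be listed in requirements.txt.
    import google.auth.transport.requests
    import google.oauth2.id_token

    auth_req = google.auth.transport.requests.Request()
    token = google.oauth2.id_token.fetch_id_token(auth_req, url)  # audience = service URL
    with urllib.request.urlopen(build_request(url, token)) as resp:
        return resp.read()
```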
Of course, if you have many services, calling them blindly is not the best option. I advise you to use a service mesh (Istio) and/or an API gateway (Cloud Endpoints) in order to have better control and flexibility over your APIs.
How to connect different microservices with different DBs such as MongoDB, CassandraDB. Is there a way to create a NoSQL DB in Compute Engine and access it through Google Cloud Run?
I don't see why not, but please consider this list of known limitations of Cloud Run (managed): here
Basically, it doesn't support a VPC connector, so you can't do it over a private IP. Also consider the many managed DBs GCP offers; maybe Datastore is good enough for your use case?
I was working on an Arduino project that provides data to the outside public, but I want to keep it only for family members and some guests. So I wanted to set up a login authentication page where a user can log in and see the data on a PHP website hosted locally (available over the internet via port forwarding). But since login requires a database to store usernames/passwords, I want them stored in DynamoDB. I think storing them online on AWS is a good idea, since the DB will grow over time, while the PHP page can be stored and moved easily. Another reason I would like to try this is that I will get to learn how to use NoSQL on DynamoDB!
Please guide me down the right path to host PHP locally using Amazon DynamoDB for storing logins.
You can certainly connect to DynamoDB from outside of the Amazon environment as long as you have your credentials. The PHP Getting Started Guide should give you most of what you need.
When you're ready to move to an EC2 instance, a t2.nano machine is about USD $4.32 per month. That would let you set up a full PHP server that could also talk to the database, and you wouldn't have to host it locally.