How to locate the IP address that Google Workflows uses? - google-cloud-platform

I am trying to use Google Workflows to make HTTP POST requests to a service that uses a whitelisted IP list. How can I find an IP, or range of IPs that I could give to the vendor?

You can whitelist all the IP ranges reserved by Google Cloud, but in the end that is like allowing everyone. Any user (or attacker) running on Google Cloud services will come from one of those Google IP ranges, so anyone would be able to reach your service.
The better solution is to use a Cloud Functions or Cloud Run service as a proxy: Cloud Workflows calls that proxy internally within Google Cloud. On the proxy service, you can plug in a Serverless VPC Access connector (with the egress parameter set to all traffic) and a Cloud NAT gateway with a reserved static public IP that only you use, so you can allowlist it securely because only you will be able to send traffic from it.
Here is the documentation for Cloud Run; it is very similar for Cloud Functions.
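For illustration, such a proxy can be tiny. Below is a minimal sketch in Python using the Functions Framework; the VENDOR_URL environment variable and the header handling are placeholders for whatever the vendor expects. With the Serverless VPC connector (egress: all traffic) and Cloud NAT attached, the outbound call leaves through the reserved static IP.

```python
# Minimal proxy sketch: Cloud Workflows calls this function, and the function
# forwards the POST body to the vendor. VENDOR_URL is a placeholder.
import os

import functions_framework
import requests

VENDOR_URL = os.environ["VENDOR_URL"]  # hypothetical env var holding the vendor endpoint


@functions_framework.http
def proxy(request):
    # Forward the body received from Cloud Workflows as-is.
    resp = requests.post(
        VENDOR_URL,
        data=request.get_data(),
        headers={"Content-Type": request.headers.get("Content-Type", "application/json")},
        timeout=30,
    )
    # Relay the vendor's status code and body back to the workflow.
    return (resp.content, resp.status_code)
```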

Related

Google Cloud Functions access control based on source IP address

I would like to configure access control based on source IP address with Google Cloud Functions.
(Only allowed IPs can reach Google Cloud Functions.)
I suppose there is no way for Google Cloud Functions itself to limit client IPs.
So my idea is to put a gateway in front of the Cloud Functions, such as Apigee, API Gateway or Cloud Endpoints.
I found that only Apigee has source IP access control, but I suspect Apigee is too heavyweight for my simple workload.
https://cloud.google.com/apigee/docs/api-platform/reference/policies/access-control-policy?hl=ja
Is it possible to use API Gateway or Cloud Endpoints to configure source IP based access control?

Allow traffic from certain machines - Google Cloud Armor

I have a Google Cloud Run service and I need to allow traffic from certain machines only.
I use Google Cloud Armor to allow IPs to access the Cloud Run service.
I have a problem adding the dynamic IPs of those machines, as they keep changing. I also looked into allowlisting by MAC address, but Cloud Armor does not have that feature.
You cannot use MAC addresses over the Internet. The service (Cloud Armor) will never see the client's MAC address, only the MAC address of the last router (which would be a Google router). Google Cloud VPCs do not expose layer 2 information.
Cloud Run is a public service with a public URL. Restricting traffic based upon IP address is not supported by Cloud Run. You can put an HTTP Load Balancer and Cloud Armor in front, but that would not prevent traffic that goes directly to the service.
There are much better techniques to control access to public services. Google Cloud implements authorization using OAuth via Identity Aware Proxy (IAP). That is the correct method to use. Given that your clients have changing IP addresses, that is your best solution.
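As a sketch of that approach: a client whose IP keeps changing can authenticate with an ID token instead of relying on its address. The example below uses the google-auth and requests libraries; the service URL is a placeholder, and for IAP the token audience would be the IAP OAuth client ID rather than the service URL.

```python
# Hedged sketch: call a Cloud Run service that requires IAM authentication,
# from a client whose IP keeps changing. SERVICE_URL is a placeholder.
import google.auth.transport.requests
import google.oauth2.id_token
import requests

SERVICE_URL = "https://my-service-abc123-ew.a.run.app"  # placeholder

# Fetch an ID token for the target audience using Application Default Credentials.
auth_request = google.auth.transport.requests.Request()
token = google.oauth2.id_token.fetch_id_token(auth_request, SERVICE_URL)

response = requests.get(SERVICE_URL, headers={"Authorization": f"Bearer {token}"})
print(response.status_code, response.text)
```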
If I needed access control based upon IP address, I would run my service on Compute Engine using either Container-Optimized OS, Docker, or natively with Apache/Nginx. With custom code, you can dynamically update VPC firewall rules as the client's IP address changes.
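For the "custom code" part, something along these lines could work with the google-cloud-compute client library; the project, rule name and IP are placeholders, and the firewall rule is assumed to already exist.

```python
# Hedged sketch: patch an existing VPC firewall rule so its allowed source
# range follows the client's current IP. All values are placeholders.
from google.cloud import compute_v1


def update_allowed_source(project_id: str, rule_name: str, client_ip: str) -> None:
    client = compute_v1.FirewallsClient()
    # patch() only touches the fields that are set; here, the source ranges.
    patch = compute_v1.Firewall(source_ranges=[f"{client_ip}/32"])
    operation = client.patch(
        project=project_id, firewall=rule_name, firewall_resource=patch
    )
    operation.result()  # block until the change is applied


update_allowed_source("my-project", "allow-client", "203.0.113.10")
```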

Secure GCP serverless with IaaS appliance

Is it possible to access GCP PaaS (App Engine, Cloud Functions, Cloud Run) internally (through the VPC)?
I see in this doc : https://cloud.google.com/vpc/docs/configure-serverless-vpc-access
"Serverless VPC Access only allows requests to be initiated by the serverless environment. Requests initiated by a VM must use the external address of your serverless service—see Private Google Access for more information."
But I am searching for something like "Serverless VPC Access that allows both in and out requests".
There are two directions to consider: in and out.
Requests TO the serverless app
You can use ingress controls with Cloud Functions and Cloud Run services: you can say that only connections coming from your VPC (or your VPC Service Controls perimeter) may access your serverless app. With App Engine you have firewall rules, but they don't work with private IPs.
Requests FROM the serverless app
Here you want to reach a private resource exposed only on your VPC with a private IP. With Cloud Run, Cloud Functions and App Engine, you can plug in a Serverless VPC Access connector to achieve this.
EDIT 1
With your appliance firewall deployed on Google Cloud, App Engine isn't the ideal product for this. Indeed, with App Engine you can't control the ingress traffic: it always accepts traffic from the internet, even if you already have something (here, your appliance) on the Google Cloud network with a private IP.
The solution here (to be tested; it depends on the appliance's capabilities) is to use Cloud NAT with a reserved static IP and route all the traffic of the subnet on which the appliance is deployed through it.
Then, on App Engine, you can set a firewall rule to accept only traffic from this reserved static IP.
The latency will increase with all these layers...
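If you prefer to manage that App Engine firewall rule from code rather than the console, a rough sketch with the google-cloud-appengine-admin Python client could look like the following (the library surface is assumed from the App Engine Admin API; the project ID, priority and NAT IP are placeholders, and the default rule still has to be switched to DENY separately).

```python
# Hedged sketch: add an App Engine ingress firewall rule that allows only the
# reserved Cloud NAT IP. Values and API surface are assumptions.
from google.cloud import appengine_admin_v1


def allow_nat_ip(project_id: str, nat_ip: str) -> None:
    client = appengine_admin_v1.FirewallClient()
    client.create_ingress_rule(
        request=appengine_admin_v1.CreateIngressRuleRequest(
            parent=f"apps/{project_id}",
            rule=appengine_admin_v1.FirewallRule(
                priority=100,  # evaluated before the default rule
                action=appengine_admin_v1.FirewallRule.Action.ALLOW,
                source_range=f"{nat_ip}/32",  # the reserved static IP of Cloud NAT
                description="Traffic from the appliance via Cloud NAT",
            ),
        )
    )


allow_nat_ip("my-project", "203.0.113.20")
```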

Connecting Google Cloud Run Service to Google Cloud SQL database

I have two Google Cloud services:
Google Cloud Run service (Node.js / Strapi)
Google Cloud SQL service (MySQL)
I have added the Cloud SQL connection to the Cloud Run service from the UI, and the Cloud SQL instance has a public IP. On top of that, I have added the Cloud Run service IP to the authorised networks of the SQL instance.
If I try to connect from another server (external to Google Cloud), I can easily connect to the Cloud SQL instance and execute queries.
But if I try to connect from inside the Cloud Run service with exactly the same settings (IP, database_name, etc.), the connection hangs and I get a timeout error in the logs...
How do I properly allow Cloud SQL to accept connections from Cloud Run?
I looked for other answers here, but they all look very old (around 2015).
You can use three modes to access your database:
Use the built-in feature. In this case, you don't need to specify the IP address; a Unix socket is opened to communicate with the database, as described in the documentation (a minimal connection sketch is shown after this answer).
Use the Cloud SQL private IP. This time there is no need to configure a connection in the Cloud Run service; you won't use it because you will use the IP, not the Unix socket. This solution requires two things:
First, attach your database to your VPC and give it a private IP.
Then, route the private IP traffic of Cloud Run through your VPC. For this you have to create a Serverless VPC Access connector and attach it to the Cloud Run service.
Use the Cloud SQL public IP. Again, there is no need to configure a connection in the Cloud Run service; you will use the IP, not the Unix socket. This requires more steps (and is less secure):
Route all the egress traffic of Cloud Run through your VPC. For this you have to create a Serverless VPC Access connector and attach it to the Cloud Run service.
Deploy your Cloud Run service with the Serverless VPC connector and the egress connectivity parameter set to "all".
Then create a Cloud NAT to route all the VPC connector IP range traffic through a single IP (or set of IPs). (The link is the Cloud Functions documentation, but it works in exactly the same way.)
Finally, authorize the Cloud NAT IP(s) in the Cloud SQL authorized networks.
In your case, you have whitelisted the Cloud Run IP, but it is a shared IP (other services can use the same one, so be careful) and it is not always the same; there is a pool of IP addresses used by Google Cloud.
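For the first option (the built-in connection), a minimal sketch in Python could look like this; the instance connection name, credentials and database are placeholders, and pymysql is just one possible MySQL driver (a Node.js client would use the same /cloudsql/... socket path).

```python
# Hedged sketch: connect from Cloud Run to Cloud SQL (MySQL) over the Unix
# socket exposed by the built-in Cloud SQL connection. Values are placeholders.
import pymysql

connection = pymysql.connect(
    unix_socket="/cloudsql/my-project:europe-west1:my-instance",  # placeholder
    user="strapi",
    password="change-me",
    database="strapi_db",
)
with connection.cursor() as cursor:
    cursor.execute("SELECT NOW()")
    print(cursor.fetchone())
connection.close()
```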

Google Cloud Platform - Pub/Sub push to private (VPN) on-premise listeners?

The official documentation for the Pub/Sub service states that push is available to listeners that are reachable on the public network:
An HTTPS server with non-self-signed certificate accessible on the public web.
That sounds pretty clear - but I wonder if I haven't missed something. Is it in any way possible to have the Pub/Sub service push messages to on-premise machines that are not on the public internet?
You should be able to achieve this with Cloud NAT:
Reserve a static IP
Link your DNS to this IP
Create a subnet
Create a route from this subnet to your VPN
Create a NAT with your external IP which forwards requests to your subnet
Deploy an on-premises web server (Apache, Nginx) with a valid certificate for your DNS name (a minimal push-handling endpoint is sketched after this list)
Update your on-premises routes to reach your web server, and don't forget to route the return traffic back!
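For the web server step, the endpoint only has to accept the Pub/Sub push format: a JSON body with a base64-encoded message.data field, acknowledged by returning any 2xx status. A minimal sketch follows (Python/Flask; the route and port are illustrative), which in practice would sit behind the server holding the valid certificate.

```python
# Minimal on-prem push endpoint sketch. Route and port are illustrative.
import base64

from flask import Flask, request

app = Flask(__name__)


@app.route("/pubsub/push", methods=["POST"])
def pubsub_push():
    envelope = request.get_json()
    message = envelope["message"]
    payload = base64.b64decode(message.get("data", "")).decode("utf-8")
    print(f"Received message {message.get('messageId')}: {payload}")
    # Any 2xx response acknowledges the message.
    return ("", 204)


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```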
Is it in any way possible to have Pub/Sub service push messages to on-premise machines, that are not on public internet?
Not easily, if at all. You might be able to use a reverse proxy, but that introduces several layers to manage: proxy configuration, a proxy compute instance, SSL certificates, VPC routing, an on-prem router, etc. See guillaume blaquiere's answer.
An on-prem resource can reach Pub/Sub via the public Internet or via VPN to private.googleapis.com, but Pub/Sub cannot connect to on-prem or VPC resources configured with private IP addresses.
Cloud Pub/Sub push subscriptions require a publicly accessible HTTPS endpoint. If you want to reach on-premise machines, that would have to be done via a proxy/router accessible via the public internet (as others have mentioned). Cloud Pub/Sub does not currently support VPC for push subscriptions.
Please see the note section under https://cloud.google.com/pubsub/docs/push
Previous answers are outdated. You can use restricted Virtual IP with Private Google Access to provide a private network route for requests to Google Cloud services without exposing the requests to the internet.