How do I connect to Google Cloud SQL from Google Cloud Run via TCP? - google-cloud-platform

Based on my current understanding, when I enable a service connection to my Cloud SQL instance in one of my revisions, the path /cloudsql/[instance name]/.s.PGSQL.5432 becomes populated. This is a UNIX socket connection.
Unfortunately, a 3rd party application I'm using doesn't support UNIX socket connections and as such I'm required to connect via TCP.
Does the Google Cloud SQL Proxy also configure any way I can connect to Cloud SQL via something like localhost:5432, or other equivalent? Some of the documentation I'm reading suggests that I have to do elaborate networking configuration with private IPs just to enable TCP based Cloud SQL for my Cloud Run revisions, but I feel like the Cloud Proxy is already capable of giving me a TCP connection instead of a UNIX socket.
What is the right and most minimal way forward here, assuming I do not have the ability to modify the code I'm running?
I've also cross posted this question to the Google Cloud SQL Proxy repo.

The easiest and most secure way is to use a private IP. It isn't complicated; there are three steps:
Create a Serverless VPC Access connector in the same region as your Cloud Run service. Note the VPC network it uses (by default it's "default").
Attach the connector to your Cloud Run service and route only private IP ranges through it.
Enable a private connection on your Cloud SQL instance, attached to the same VPC network as your connector.
That completes the cloud configuration. Now get the private IP of your Cloud SQL instance and pass it to your Cloud Run service (for example as an environment variable) so the application can open a TCP connection to that IP, as in the sketch below.
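A minimal sketch of those steps with the gcloud CLI, assuming placeholder names (my-connector, my-service, my-instance) on the default network in us-central1; adjust to your project:

# Step 1: create a Serverless VPC Access connector in the service's region
gcloud compute networks vpc-access connectors create my-connector \
    --region=us-central1 --network=default --range=10.8.0.0/28

# Step 2: attach it to the Cloud Run service, routing only private ranges through it
gcloud run services update my-service --region=us-central1 \
    --vpc-connector=my-connector --vpc-egress=private-ranges-only

# Step 3: give the instance a private IP on the same network (this requires a private
# services access connection on that network; it can also be done in the console)
gcloud sql instances patch my-instance --network=default

# Look up the instance's IPs, pick the PRIVATE one, and hand it to the service
gcloud sql instances describe my-instance --format="value(ipAddresses)"
gcloud run services update my-service --region=us-central1 \
    --update-env-vars=DB_HOST=<PRIVATE_IP>

The unmodifiable application can then connect over plain TCP to DB_HOST:5432; the traffic never leaves the private network.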

Related

How to connect to Cloud SQL from Cloud Run instance while also using Serverless-VPC-Connector?

I'm running a cloud run service with a working Cloud-SQL connection using the proxy to connect to the Cloud-SQL instance. The Cloud-SQL instance does not have a private IP configured.
Now there is a new requirement that this service needs to connect to a DB outside of GCP, for which it needs a static egress-IP that can be whitelisted. I attempted to achieve this via a serverless-VPC-connector (https://cloud.google.com/run/docs/configuring/static-outbound-ip).
Problem: When I add the VPC-connector to the service, and configure it to route all traffic through the vpc-connector, the service fails to deploy because it cannot connect to Cloud-SQL via the proxy anymore:
CloudSQL connection failed. Please see https://cloud.google.com/sql/docs/mysql/connect-run for additional details: Post "https://sqladmin.googleapis.com/sql/v1beta4/projects/<>/instances/<>/createEphemeral?alt=json&prettyPrint=false": context deadline exceeded
I was able to get this exact setup to work for a Cloud Function (identical external DB, Cloud SQL instance, and VPC connector), and I'm at a loss as to why it wouldn't work for Cloud Run. Is there additional configuration required that I'm missing?
Is it possible to connect to Cloud-SQL with the proxy, while at the same time using a VPC-connector to achieve a static egress IP?
If you need a static public egress IP, you have only done half of the path. You also need a Cloud NAT that covers the VPC connector's range and NATs its traffic to the internet. On that NAT you can assign a static public IP, which will then be used for all outgoing communications.
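A rough sketch of that NAT setup with gcloud, assuming the connector lives on the default network in us-central1 and using placeholder names for the reserved address, router, and NAT gateway:

# Reserve the static public IP that will become the egress address
gcloud compute addresses create my-egress-ip --region=us-central1

# Create a Cloud Router on the connector's network, then a NAT that uses the reserved IP
gcloud compute routers create my-router --network=default --region=us-central1
gcloud compute routers nats create my-nat --router=my-router --region=us-central1 \
    --nat-all-subnet-ip-ranges --nat-external-ip-pool=my-egress-ip

With the NAT in place, traffic routed through the connector (including the proxy's calls to sqladmin.googleapis.com) can reach the internet again, and it leaves GCP from the reserved static IP.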
In any case, if you want to reach Cloud SQL, it's better to use the Cloud SQL Proxy, because the connection is secured (encrypted channel). If you connect directly to the public IP, add an SSL certificate to encrypt the communication over the public internet (the same applies if your instance is located outside GCP).
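If you do go the public-IP route, here is a hedged sketch of enforcing TLS and creating a client certificate with gcloud (the instance and certificate names are placeholders):

# Reject unencrypted connections to the instance
gcloud sql instances patch my-instance --require-ssl

# Create a client certificate; the private key is written to client-key.pem
gcloud sql ssl client-certs create my-client-cert client-key.pem --instance=my-instance

The client then connects using that key, the client certificate, and the instance's server CA certificate.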

GCP dataflow and On-premDB

Can we connect an on-prem SQL database to Cloud Dataflow (GCP) without an API? Our databases do not provide APIs for data extraction.
Please help us with this; we have been stuck on it for quite some time.
Yes, you can. If you look at the Beam documentation, there are several built-in database connectors, such as the JDBC IO connector, so you can connect to any database that exposes an IP:PORT, given the correct driver.
Now, a note on security: you can choose to give your on-prem database a public IP so Dataflow can reach it. If you do, secure it first, and note that your Dataflow worker nodes will then need public IPs (or a Cloud NAT) to reach the internet.
A better solution is to create a VPN (or an Interconnect) from the VPC where the Dataflow workers run to your on-prem network. That way you can reach the on-prem database on its private IP address, which is more secure!
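As a sketch of the VPN option, assuming a Python Beam pipeline (my_pipeline.py, my-vpc, and my-subnet are placeholders) and a subnet that already has VPN connectivity to the on-prem network, the Dataflow workers can be kept on private IPs with the standard pipeline options:

# Run the workers inside the VPN-connected VPC, without public IPs
# (the subnet needs Private Google Access enabled so workers can reach Google APIs)
python my_pipeline.py \
    --runner=DataflowRunner \
    --project=my-project \
    --region=us-central1 \
    --network=my-vpc \
    --subnetwork=regions/us-central1/subnetworks/my-subnet \
    --no_use_public_ips

The JDBC connection string inside the pipeline then points at the database's on-prem private IP and port.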

proxy(?) server for connecting to cloud sql instance (GCP)

I have a postgresql database on the google cloud platform (cloud SQL). I'm currently managing this database through pgadmin, installed on my laptop. I've added the IP address of my laptop to the whitelist on the cloud sql settings page. This all works.
The problem is: when I go somewhere else and I connect to a different network, the IP address changes and I cannot connect to the postgresql database (through pgadmin) from my laptop.
Does anyone know a (secure) solution, involving a proxy server (or something else), that lets me connect from my laptop (and only my laptop) to my PostgreSQL database even when I'm not on a whitelisted network (IP address)? Maybe I can set up a VM instance with a proxy server and use that? But I have no clue where to start (or what to search for).
You have many options for connecting to a Cloud SQL instance from an external application, such as a public IP address with SSL, a public IP address without SSL, the Cloud SQL Proxy, etc. You can see all of them here.
Among those options is the Cloud SQL Proxy, which provides secure access to your instances without requiring authorized networks or SSL configuration on your part.
You only need to follow the steps listed here to connect to your Cloud SQL instance through the proxy:
Enable Cloud SQL Admin API on your console.
Install the proxy client on your local machine (Linux):
wget https://dl.google.com/cloudsql/cloud_sql_proxy.linux.amd64 -O cloud_sql_proxy
chmod +x cloud_sql_proxy
Determine how you will authenticate the proxy. You can use a service account or let the Cloud SDK take care of the authentication.
However, if required by your authentication method, create a service account.
Determine how you will specify your instances for the proxy. Your options for instance specification depend on your operating system and environment.
Start the proxy using either TCP sockets or Unix sockets.
Take note that as of this writing, Cloud SQL Proxy does not support Unix sockets on Windows.
Update your application to connect to Cloud SQL using the proxy.
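For the pgAdmin use case in the question, a minimal sketch of the last two steps, assuming gcloud is already authenticated and using a placeholder instance connection name:

# Start the proxy listening on a local TCP port
./cloud_sql_proxy -instances=my-project:us-central1:my-instance=tcp:5432

Then point pgAdmin at host 127.0.0.1, port 5432, with your normal database user and password; the proxy tunnels the connection to the instance over an encrypted channel, so no IP whitelisting is needed.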

GCP Firewall allow connection from cloud build to compute engine instance

We have a GCE VM running a MySQL server. Firewall rules deny incoming connections from external IPs. Our Cloud Build process needs to perform DB migrations, so it has to connect to MySQL from Cloud Build. I want to add a firewall rule that allows only Cloud Build to connect to port 3306 from its external IP address.
Cloud Build does not run on our internal network, so there is no way to connect from an internal IP.
I tried adding a rule scoped to a service account, but I can't see the Cloud Build service account in the list.
Currently, as mentioned by @guillaume blaquiere, there is a feature request for this. I recommend following (starring) it to receive updates. It seems the FR has been sent to the Cloud Build engineering team for evaluation; note that there is no ETA for an implementation yet.

Google Cloud Composer and Google Cloud SQL Proxy

I have a project with Cloud Composer and Cloud SQL.
I am able to connect to Cloud SQL because I edited the YAML of airflow-sqlproxy-service and added my Cloud SQL instance to the Cloud SQL Proxy used for the Airflow DB, mapping it to port 3307.
The workers can connect to airflow-sqlproxy-service on port 3307, but I think the webserver can't.
Do I need to add a firewall rule for port 3307 so the webserver/UI can connect to airflow-sqlproxy-service?
Screenshots: https://i.stack.imgur.com/LwKQK.png https://i.stack.imgur.com/CJf7Q.png https://i.stack.imgur.com/oC2dJ.png
Composer does not currently support configuring additional SQL proxies for the webserver. One workaround for cases like this is to have a separate DAG that loads Airflow Variables with the information needed from the other database (via the workers, which do have access), and then to generate a DAG based on those Variables, which the webserver can access.
https://github.com/apache/incubator-airflow/pull/4170, which defines a Cloud SQL connection type, was recently merged (not yet available in Composer). This might address these use cases in the future.