How to Connect to Cloud SQL using Python? - google-cloud-platform

Connecting to Cloud SQL using Python is not always straightforward.
Depending on the context, sometimes you have to connect to a Unix domain socket, allow-list IP addresses for TCP connections, or run the Cloud SQL Auth Proxy locally. Making these connections secure is yet another challenge: you might have to manage SSL certificates, firewall rules, IP addresses, etc.
Is there a recommended way to connect to Cloud SQL securely and easily using Python?

Yes, there indeed is: the Cloud SQL Python Connector, a Python package that makes connecting to Cloud SQL both easy and secure for all three supported database engines (Postgres, MySQL, and SQL Server), from anywhere (local machine, Cloud Run, App Engine, Cloud Functions, etc.).
The Python Connector is one of the Cloud SQL connector libraries (also available in Java and Go).
How is a connector different from the other methods?
The Cloud SQL connector libraries provide the following benefits:
IAM Authorization: the connectors use IAM permissions to control who and what can connect to your Cloud SQL instances.
Improved Security: the connectors use robust, updated TLS 1.3 encryption and identity verification between the client connector and the server-side proxy, independent of the database protocol.
Convenience: the connectors remove the requirement to use and distribute SSL certificates, or to manage firewall rules and source/destination IP addresses.
IAM Database Authentication (optional): the connectors provide support for Cloud SQL’s automatic IAM database authentication feature.
How do I use the Python Connector ... what does the code look like?
Basic Usage (using SQLAlchemy)
from google.cloud.sql.connector import Connector, IPTypes
import sqlalchemy

# initialize the Cloud SQL Python Connector
connector = Connector()

# Python Connector database connection creator function
def getconn():
    conn = connector.connect(
        "project:region:instance-name",  # Cloud SQL instance connection name
        "pg8000",
        user="my-user",
        password="my-password",
        db="my-db-name",
        ip_type=IPTypes.PUBLIC,  # IPTypes.PRIVATE for private IP
    )
    return conn

# create SQLAlchemy connection pool
pool = sqlalchemy.create_engine(
    "postgresql+pg8000://",
    creator=getconn,
)

# interact with Cloud SQL database using connection pool
with pool.connect() as db_conn:
    # query database
    result = db_conn.execute(sqlalchemy.text("SELECT * FROM my_table")).fetchall()

    # do something with the results
    for row in result:
        print(row)
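For reference, the connector is distributed on PyPI; installing it together with the pg8000 driver used above would look something like this (swap the extra for your database engine):

pip install "cloud-sql-python-connector[pg8000]" SQLAlchemy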
There are interactive "Getting Started" Colab notebooks that show you how to use the Cloud SQL Python Connector – all without needing to write a single line of code yourself! The notebooks will automatically use a supported database driver based on the database engine you are using with Cloud SQL.
PostgreSQL Notebook, using pg8000
MySQL Notebook, using pymysql
SQL Server Notebook, using pytds
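For example, switching the snippet above from Postgres to MySQL is mostly a matter of swapping the driver and the SQLAlchemy dialect; a minimal sketch (instance and credential names are placeholders):

from google.cloud.sql.connector import Connector
import sqlalchemy

connector = Connector()

def getconn():
    # same pattern as above, but using the pymysql driver
    return connector.connect(
        "project:region:instance-name",  # placeholder instance connection name
        "pymysql",
        user="my-user",
        password="my-password",
        db="my-db-name",
    )

# note the mysql+pymysql dialect in the SQLAlchemy URL
pool = sqlalchemy.create_engine("mysql+pymysql://", creator=getconn)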
Does it work with popular web frameworks?
Yes, the Python Connector can easily be used in web frameworks such as Flask-SQLAlchemy (and Flask), FastAPI, etc.
Flask-SQLAlchemy code
FastAPI code
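As a rough illustration of the Flask-SQLAlchemy case (a sketch, not the official sample linked above; instance and credential names are placeholders), the connector's creator function can be passed through SQLALCHEMY_ENGINE_OPTIONS:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from google.cloud.sql.connector import Connector

connector = Connector()

def getconn():
    return connector.connect(
        "project:region:instance-name",  # placeholder
        "pg8000",
        user="my-user",
        password="my-password",
        db="my-db-name",
    )

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "postgresql+pg8000://"
# Flask-SQLAlchemy forwards these options to sqlalchemy.create_engine()
app.config["SQLALCHEMY_ENGINE_OPTIONS"] = {"creator": getconn}

db = SQLAlchemy(app)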

Related

Is there an On-Prem agent to connect SQL Server DB from Google Dialogflow to store/fetch data

I am exploring Google Dialogflow and building a chatbot for learning purposes. I would like to store/fetch details taken from the Dialogflow chat session in a SQL Server DB on a local machine. In tools like Workato & Celonis, there is an On-Prem agent that you install on the respective machine, which creates a tunnel to access it without touching the machine's firewall.
I tried looking through the Google documentation but could not find a proper answer. It would be great to get guidance/support on how to connect to a SQL Server DB hosted on a local machine from the Dialogflow Inline Editor using an On-Prem agent.
Please let me know if I need to add any other details about this scenario.
NOTE: Based on my Google search, I learned that this can be achieved by writing Node.js code and creating a webhook call hosted with ngrok, or by storing the data in GCP Cloud SQL instances. But I wanted to know how to save/fetch data in the local machine's SQL Server from Dialogflow.
Thanks in advance.

Connecting to Cloud SQL from Cloud Run via cloud-sql-proxy with IAM login enabled

I would like to connect to a Cloud SQL instance from Cloud Run, using a service account. The connection used to be created within the VPC and we would just provide a connection string with a user and a password to our PostgreSQL client. But now we want the authentication to be managed by Google Cloud IAM, with the service account associated with the Cloud Run service.
On my machine, I can use the enable_iam_login argument to use my own service account. The command to run the Cloud SQL proxy would look like this:
./cloud_sql_proxy -dir=/cloudsql -instances=[PROJECT-ID]:[REGION]:[INSTANCE] \
-enable_iam_login -credential_file=${HOME}/.config/gcloud/application_default_credentials.json
The problem is that I can't seem to find a way to use the IAM authentication method to run the Cloud SQL Proxy from Cloud Run; I can just provide an instance name. Has anyone faced this problem before?
Unfortunately, there isn't a way to configure Cloud Run's use of the Cloud SQL proxy to do this for you.
If you are using Java, Python, or Go, there are language-specific connectors you can use from Cloud Run. These all have the option to use IAM DB AuthN as part of them.
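For example, with the Cloud SQL Python Connector, IAM database authentication is enabled by passing enable_iam_auth=True and connecting as the IAM database user; a minimal sketch (the instance name and service-account user are placeholders):

from google.cloud.sql.connector import Connector
import sqlalchemy

connector = Connector()

def getconn():
    return connector.connect(
        "project:region:instance-name",  # placeholder instance connection name
        "pg8000",
        user="my-sa@my-project.iam",  # IAM database user: service account email without ".gserviceaccount.com"
        db="my-db-name",
        enable_iam_auth=True,  # automatic IAM database authentication, no password needed
    )

pool = sqlalchemy.create_engine("postgresql+pg8000://", creator=getconn)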
For anyone using NodeJS on Cloud Run:
you can't enable the -enable_iam_login option in Cloud Run (cf. the accepted answer), which means you have to use a connector.
AFAIK, there is no NodeJS connector right now.
Which means you have 2 options:
write a connector yourself (good luck) or wait for Google to do it
use built-in database authentication for now.

flask-sqlalchemy read/write endpoints

I'm using flask/flask-sqlalchemy/flask-restx to build small micro-services; I'm using AWS Aurora with MySQL as database tech. Aurora conveniently provides separate read/write endpoints for the db and routes traffic/connections to either the master or to the read-replicas depending on what endpoint is used.
Is there a way to discriminate the session URL used by flask-sqlalchemy by configuring two database URLs in the app.config and using one or the other endpoint depending on whether the GET method is used versus the POST, PUT, and DELETE ones?
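One possible direction (a sketch, not a tested answer): Flask-SQLAlchemy can hold several engine URLs via SQLALCHEMY_BINDS, but picking the reader versus the writer engine per HTTP method remains application logic, for example:

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy import text

app = Flask(__name__)
# writer (master) endpoint as the default engine
app.config["SQLALCHEMY_DATABASE_URI"] = "mysql+pymysql://user:pass@writer-endpoint/mydb"
# reader (replica) endpoint as an extra bind
app.config["SQLALCHEMY_BINDS"] = {
    "reader": "mysql+pymysql://user:pass@reader-endpoint/mydb",
}
db = SQLAlchemy(app)

# e.g. inside a GET view, explicitly target the reader engine
# (db.engines is Flask-SQLAlchemy 3.x; older versions use db.get_engine(bind="reader"))
with app.app_context():
    with db.engines["reader"].connect() as conn:
        rows = conn.execute(text("SELECT * FROM my_table")).fetchall()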

Google Cloud SQL - How to create Windows authentication

I have a requirement to migrate an existing on-prem SQL Server to Google Cloud SQL, and along with this I also want to migrate the existing jobs (which internally execute SSIS packages); these jobs run under a Windows authentication account.
Is there any way we can create the same Windows authentication on a Cloud SQL instance?

How Google Cloud Run supports connecting different microservices and different databases

Assume I want to deploy multiple microservices using Google Cloud Run, and those microservices will be connected to each other. My questions are as follows:
Does each microservice deploy separately by creating a Google Cloud Run service?
Then how does each microservice call the others (by using public IPs)?
How do we connect different microservices to different DBs such as MongoDB, CassandraDB? Is there a way we can create a NoSQL DB in Compute Engine and access it through Google Cloud Run?
Does each micro-service deploy separately by creating google cloud run service
Yes, each microservice is individual and has its own HTTP/S endpoint if you need it.
If you need to deploy more in bulk, you can always use a CI/CD tool.
Then how does each microservice call the others (by using public IPs)?
When you deploy your service for the first time with an HTTP trigger you are provided with a unique URL (similar to what happens with Cloud Functions). You can then invoke your service via HTTP as usual.
Of course if you have many services, calling them blindly is not the best option; I advise you to use a service mesh (Istio) and/or an API gateway (Cloud Endpoints) in order to have better control and flexibility over your APIs.
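To make that concrete, a hedged sketch of one Cloud Run service calling another by its generated URL, attaching an ID token in case the target service is not publicly invokable (the URL below is a placeholder):

import requests
import google.auth.transport.requests
import google.oauth2.id_token

# placeholder URL of the target Cloud Run service
target_url = "https://other-service-abc123-uc.a.run.app/api/items"

# fetch an ID token for the target service (only needed if it requires authentication)
auth_req = google.auth.transport.requests.Request()
token = google.oauth2.id_token.fetch_id_token(auth_req, target_url)

resp = requests.get(target_url, headers={"Authorization": f"Bearer {token}"}, timeout=10)
print(resp.status_code, resp.text)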
How do we connect different microservices to different DBs such as MongoDB, CassandraDB? Is there a way we can create a NoSQL DB in Compute Engine and access it through Google Cloud Run?
I don't see why not, but please consider the list of known limitations of Cloud Run (managed): here
Basically it doesn't support a VPC connector, so you can't do it over a private IP. Also consider the many managed DBs GCP offers; maybe Datastore is good enough for your use case?