I am very new to Google Cloud Platform, hence this basic question.
I am looking for an API that will be hosted in GCP. An external application will call the API to read data from BigQuery.
Can anyone help me out with an example code/approach?
I am looking for an end-to-end, cloud-based solution in Python.
I can't provide you with a complete code example, but:
You can set up your Python API using Flask, for example.
You can then use the Python client library to connect to BigQuery: https://cloud.google.com/bigquery/docs/reference/libraries
Deploy your Python API on Google App Engine, Cloud Run, Kubernetes, Compute Engine, etc.
Do not forget to set up CORS and potentially auth.
That's it.
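As an illustration only, here is a minimal sketch of that approach, assuming a Flask app that queries a public dataset; the /data route and the query itself are placeholders to adapt to your own project:
# Minimal sketch: a Flask endpoint that runs a BigQuery query and returns
# the rows as JSON. Route name, query and dataset are placeholders.
from flask import Flask, jsonify
from google.cloud import bigquery

app = Flask(__name__)
bq_client = bigquery.Client()  # uses Application Default Credentials

@app.route("/data", methods=["GET"])
def get_data():
    query = """
        SELECT name, SUM(number) AS total_people
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        WHERE state = 'TX'
        GROUP BY name
        ORDER BY total_people DESC
        LIMIT 20
    """
    rows = bq_client.query(query).result()
    return jsonify([dict(row) for row in rows])

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
Deployed on App Engine, Cloud Run, etc., the same app only needs the service account it runs as to have BigQuery access.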
You can create a Python program using the BigQuery client, then deploy this program as an HTTP Cloud Function or Cloud Run service:
import json

from google.cloud import bigquery
import functions_framework

@functions_framework.http
def your_http_function(request):
    """HTTP Cloud Function that queries BigQuery."""
    request_json = request.get_json(silent=True)
    request_args = request.args

    # Example: retrieve the "name" argument passed in the HTTP call.
    name = None
    if request_json and 'name' in request_json:
        name = request_json['name']
    elif request_args and 'name' in request_args:
        name = request_args['name']

    # Construct a BigQuery client object.
    client = bigquery.Client()

    query = """
        SELECT name, SUM(number) as total_people
        FROM `bigquery-public-data.usa_names.usa_1910_2013`
        WHERE state = 'TX'
        GROUP BY name, state
        ORDER BY total_people DESC
        LIMIT 20
    """

    query_job = client.query(query)  # Make an API request.
    results = [dict(row) for row in query_job.result()]  # Waits for the query to finish.

    for row in results:
        print(row['name'])

    # Return something JSON-serializable instead of the raw RowIterator.
    return json.dumps({"rows": results})
In this example you have to deploy your Python code as a Cloud Function.
Your function can then be invoked with an HTTP call that passes a name parameter:
https://GCP_REGION-PROJECT_ID.cloudfunctions.net/your_http_function?name=NAME
You can also use Cloud Run, which gives more flexibility because you deploy a Docker image.
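For example, a quick way to test the deployed function from Python (the URL is the placeholder from above; replace it with your function's actual trigger URL):
# Hypothetical client-side call to the deployed function.
import requests

url = "https://GCP_REGION-PROJECT_ID.cloudfunctions.net/your_http_function"
response = requests.get(url, params={"name": "NAME"})
print(response.status_code)
print(response.json())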
I have built a machine learning model using Google's AutoML Tables interface. Once the model was trained, I exported it in a docker container to my local machine by following the steps detailed on Google's official documentation page: https://cloud.google.com/automl-tables/docs/model-export. Now, on my machine, it exists inside a Docker container, and I am able to run it successfully using the following command:
docker run -v exported_model:/models/default/0000001 -p 8080:8080 -it gcr.io/cloud-automl-tables-public/model_server
Once it is running as a local host via Docker, I am able to make predictions using the following Python code:
import requests
import json
vector = [1, 1, 1, 1, 1, 2, 1]
input = {"instances": [{"column_1": vector[0],
"column_2": vector[1],
"column_3": vector[2],
"column_4": vector[3],
"column_5": vector[4],
"column_6": vector[5],
"column_7": vector[6]}]}
jsonData = json.dumps(input)
response = requests.post("http://localhost:8080/predict", jsonData)
print(response.json())
I need to publish this model as an API to be used by my client. I have considered AWS EC2 and Azure Functions; however, I have not had any success so far. Ideally, I plan to use the FastAPI interface, but I do not know how to do this in a dockerized context.
I have since solved the problem; however, it comes at a cost. It is possible to deploy an AutoML model from GCP, but it will incur a small charge over time. I don't think there is a way around this, otherwise Google would lose revenue.
Once the model is up and running, the following Python code can be used to make predictions:
from google.cloud import automl_v1beta1 as automl

project_id = 'chatbot-286a1'
compute_region = 'us-central1'
model_display_name = 'DNALC_4p_global_20201112093636'
inputs = {'sessionDuration': 60.0, 'createdStartDifference': 1206116.042162,
          'confirmedStartDifference': 1206116.20255, 'createdConfirmedDifference': -0.160388}

client = automl.TablesClient(project=project_id, region=compute_region)

feature_importance = False
if feature_importance:
    response = client.predict(
        model_display_name=model_display_name,
        inputs=inputs,
        feature_importance=True,
    )
else:
    response = client.predict(
        model_display_name=model_display_name, inputs=inputs
    )

print("Prediction results:")
for result in response.payload:
    print("Predicted class name: {}".format(result.tables.value))
    print("Predicted class score: {}".format(result.tables.score))
    break
In order for the code to work, the following resources may be helpful:
Installing AutoML in python:
How to install google.cloud automl_v1beta1 for python using anaconda?
Authenticating AutoML in python:
https://cloud.google.com/docs/authentication/production
(Remember to put the path to the JSON service account key file in the GOOGLE_APPLICATION_CREDENTIALS environment variable - this is for security purposes.)
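For example (a hypothetical snippet, with a placeholder path), you can point the client libraries at the key file before creating any client; setting the variable in the shell before starting the program works just as well:
import os

# Placeholder path to the downloaded service account key file.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"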
I'm new to GCP and Python. I have been given a task to import a JSON file into Google Firestore using Google Cloud Functions with Python.
Kindly assist, please.
I was able to achieve this setup using the code below. Posting it for your reference:
CLOUD FUNCTIONS CODE
REQUIREMENTS.TXT (Dependencies)
google-api-python-client==1.7.11
google-auth==1.6.3
google-auth-httplib2==0.0.3
google-cloud-storage==1.19.1
google-cloud-firestore==1.6.2
MAIN.PY
from google.cloud import storage
from google.cloud import firestore
import json

client = storage.Client()

def hello_gcs_generic(data, context):
    print('Bucket: {}'.format(data['bucket']))
    print('File: {}'.format(data['name']))
    bucketValue = data['bucket']
    filename = data['name']
    print('bucketValue : ', bucketValue)
    print('filename : ', filename)
    # Download the uploaded JSON file from the bucket and parse it.
    testFile = client.get_bucket(data['bucket']).blob(data['name'])
    dataValue = json.loads(testFile.download_as_string(client=None))
    print(dataValue)
    # Write the parsed JSON into Firestore.
    db = firestore.Client()
    doc_ref = db.collection(u'collectionName').document(u'documentName')
    doc_ref.set(dataValue)
Cloud Functions are serverless functions provided by Google. The beauty of a Cloud Function is that it is invoked whenever its trigger fires and is destroyed once execution completes. Cloud Functions are single-purpose functions. Besides Python, you can also write Cloud Functions in Node.js and Go. You can create a Cloud Function very easily by following the Cloud Functions quickstart (https://cloud.google.com/functions/docs/quickstart-console).
Your task is to import a JSON file into Google Firestore. You can do this with the Firestore Python client just like any normal Python program, and then add the code in the Cloud Functions console or upload it via gcloud. The trigger part is still missing here, though. As I mentioned, a Cloud Function is serverless: it executes when an event fires on the attached trigger. You haven't mentioned any trigger (i.e., when you want the function to run). Once you give information about the trigger, I can give more insight on a resolution.
Ishank Aggarwal, you can add the above code snippet as part of a Cloud Function by following these steps:
https://console.cloud.google.com/functions/
Create a function with your function name and requirements, choose Python as the runtime, and choose your GCS bucket as the trigger.
Once you create it, any change in your bucket will trigger the function and execute your code.
We have a model currently serving on Cloud ML. As a modification, we added connections to Datastore, which return 403 Insufficient privileges errors.
The mock code generating the error is:
from google.cloud import datastore
import datetime

# create & upload task
client = datastore.Client()
key = client.key('Task')
task = datastore.Entity(
    key, exclude_from_indexes=['description'])
task.update({
    'created': datetime.datetime.utcnow(),
    'description': 'description',
    'done': False
})
client.put(task)

# now list tasks
query = client.query(kind='Task')
query.order = ['created']
return list(query.fetch())
The next step would be adding credentials (a service account) and exporting the path to the key file in the GOOGLE_APPLICATION_CREDENTIALS environment variable. However, since getting this account is difficult (company layering), I'd like to save time by asking the question.
Is communicating with a NoSQL DB via Cloud Functions the only way? Is that the common approach?
When you create your model version, you need a custom prediction routine, and you need to define a service account that has access to your resources:
gcloud components install beta
gcloud beta ai-platform versions create your-version-name \
--service-account your-service-account-name@your-project-id.iam.gserviceaccount.com
...
There is a great, but still limited, set of SDK access to GCP APIs within the Cloud Functions runtimes, e.g. Node.
I want to call the gcloud CLI within a Cloud Function. Is this possible? e.g.
gcloud sql instances patch my-database --activation-policy=NEVER
The goal is a nightly shutdown of a Cloud SQL instance.
I believe you should use the Cloud SQL Admin API. If you're using the Python runtime, for example, you'd add google-api-python-client==1.7.8 (for example) to your requirements file, and with the respective client library you would use the instances.patch method with the appropriate parameters.
Hope this helps.
Also, here you have a working example with the Python runtime; just be sure to edit the projid and instance variables accordingly.
from googleapiclient.discovery import build
service = build('sqladmin', 'v1beta4')
projid = '' #project id where Cloud SQL instance is
instance = '' #Cloud SQL instance
patch = {'settings': {'activationPolicy':'NEVER'}}
req = service.instances().patch(project=projid, instance=instance, body=patch)
x = req.execute()
print(x)
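Since the goal is a nightly shutdown, one possible approach (a sketch, not the only way) is to wrap the same logic in an HTTP-triggered Cloud Function and invoke it every night from Cloud Scheduler; the function name and the projid/instance values below are placeholders:
# Hypothetical HTTP-triggered Cloud Function that stops a Cloud SQL instance
# by patching its activation policy. projid and instance are placeholders.
from googleapiclient.discovery import build

def stop_sql_instance(request):
    service = build('sqladmin', 'v1beta4')
    projid = ''    # placeholder: project id where the Cloud SQL instance is
    instance = ''  # placeholder: Cloud SQL instance name
    patch = {'settings': {'activationPolicy': 'NEVER'}}
    req = service.instances().patch(project=projid, instance=instance, body=patch)
    result = req.execute()
    return str(result)
A Cloud Scheduler job with a nightly cron schedule pointed at the function's trigger URL then takes care of the scheduling.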
I need to automate a process to extract data from Google BigQuery and export it to a CSV on an external server outside of GCP.
While researching how to do that, I found some commands to run from my external server, but I would prefer to do everything in GCP to avoid possible problems.
To export the data to a CSV in Google Storage:
bq --location=US extract --compression GZIP 'dataset.table' gs://example-bucket/myfile.csv
To download the CSV from Google Storage:
gsutil cp gs://[BUCKET_NAME]/[OBJECT_NAME] [OBJECT_DESTINATION]
But I would like to hear your suggestions.
If you want to fully automate this process, I would do the following:
Create a Cloud Function to handle the export:
This is the most lightweight solution, as Cloud Functions are serverless and provide flexibility to implement code with the client libraries. See the quickstart; I recommend you use the console to create the function to start with.
In this example I recommend triggering the Cloud Function from an HTTP request, i.e. when the function's URL is called, it will run the code inside of it.
Example Cloud Function code in Python that creates the export when an HTTP request is made:
main.py
from google.cloud import bigquery

def hello_world(request):
    project_name = "MY_PROJECT"
    bucket_name = "MY_BUCKET"
    dataset_name = "MY_DATASET"
    table_name = "MY_TABLE"

    destination_uri = "gs://{}/{}".format(bucket_name, "bq_export.csv.gz")

    bq_client = bigquery.Client(project=project_name)

    dataset = bq_client.dataset(dataset_name, project=project_name)
    table_to_export = dataset.table(table_name)

    job_config = bigquery.job.ExtractJobConfig()
    job_config.compression = bigquery.Compression.GZIP

    extract_job = bq_client.extract_table(
        table_to_export,
        destination_uri,
        # Location must match that of the source table.
        location="US",
        job_config=job_config,
    )

    return "Job with ID {} started exporting data from {}.{} to {}".format(
        extract_job.job_id, dataset_name, table_name, destination_uri)
requirements.txt
google-cloud-bigquery
Note that the job will run asynchronously in the background; you will receive a response with the job ID, which you can use to check the state of the export job in Cloud Shell by running:
bq show -j <job_id>
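Alternatively, a small sketch of checking the same job state from Python with the BigQuery client (the project and job ID below are placeholders; the job ID is the one returned by the function):
# Sketch: check the state of the export job from Python instead of Cloud Shell.
from google.cloud import bigquery

bq_client = bigquery.Client(project="MY_PROJECT")
job = bq_client.get_job("JOB_ID", location="US")  # ID returned by the function
print(job.state)  # e.g. RUNNING or DONE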
Create a Cloud Scheduler scheduled job:
Follow this documentation to get started. You can set the Frequency with the standard cron format, for example 0 0 * * * will run the job every day at midnight.
As a target, choose HTTP, in the URL put the Cloud Function HTTP URL (you can find it in the console, inside the Cloud Function details, under the Trigger tab), and as HTTP method choose GET.
Create it, and you can test it in the Cloud Scheduler by pressing the Run now button in the Console.
Synchronize your external server and the bucket:
Up until now you only have scheduled exports running every 24 hours; to synchronize the bucket contents with your external server, you can use the gsutil rsync command. If you want to save the exports, let's say to the my_exports folder, you can run, on your external server:
gsutil rsync gs://BUCKET_WITH_EXPORTS /local-path-to/my_exports
To run this command periodically, you could create a standard cron job in the crontab of your external server, to run each day as well, just a few hours later than the BigQuery export, to ensure that the export has completed.
Extra:
I have hard-coded most of the variables in the Cloud Function to always be the same. However, you can send parameters to the function if you do a POST request instead of a GET request, and send the parameters as data in the body.
You will have to change the Cloud Scheduler job to send a POST request to the Cloud Function HTTP URL, and in the same place you can set the body to send the parameters regarding the table, dataset and bucket, for example. This will allow you to run exports from different tables at different hours, and to different buckets.
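As a rough sketch of that idea (the parameter names project, dataset, table and bucket are just an assumption about what the Cloud Scheduler job would send), the function above could read those values from the POST body instead of hard-coding them:
# Sketch: a parameterized variant of the export function. It assumes the
# Cloud Scheduler job POSTs a JSON body such as
# {"project": "...", "dataset": "...", "table": "...", "bucket": "..."}.
from google.cloud import bigquery

def export_table(request):
    params = request.get_json(silent=True) or {}
    project_name = params["project"]
    dataset_name = params["dataset"]
    table_name = params["table"]
    bucket_name = params["bucket"]

    destination_uri = "gs://{}/{}_export.csv.gz".format(bucket_name, table_name)

    bq_client = bigquery.Client(project=project_name)
    table_to_export = bq_client.dataset(dataset_name, project=project_name).table(table_name)

    job_config = bigquery.job.ExtractJobConfig()
    job_config.compression = bigquery.Compression.GZIP

    extract_job = bq_client.extract_table(
        table_to_export,
        destination_uri,
        location="US",  # must match the source table's location
        job_config=job_config,
    )

    return "Job with ID {} started exporting data from {}.{} to {}".format(
        extract_job.job_id, dataset_name, table_name, destination_uri)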