Azure Notebooks web service model lifecycle

As stated in the title, what is the object lifecycle of the trained model instance in a web service deployed from an Azure Notebook? How long does the trained model persist in memory? An Azure web app goes into sleep mode when no traffic is hitting it; does that mean the trained model can become null?
An example of an Azure Notebook web service is below:
https://gallery.cortanaintelligence.com/Notebook/Deployment-of-AzureML-Web-Services-from-Python-Notebooks-4
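Broadly speaking, in the usual pattern for this kind of deployment the trained model is deserialized once when the scoring process starts and then lives in that process's memory for as long as the process is alive; if the hosting plan sleeps or recycles the process, the object goes with it and is simply re-loaded on the next cold start, so requests never see a null model, only a slow first call. A generic sketch of that pattern (not the exact API from the linked notebook, and assuming a scikit-learn-style model saved with joblib):

    import joblib  # assumption: the model was serialized with joblib; pickle works the same way

    model = None  # module-level reference that lives as long as the host process

    def init():
        # Runs once when the web service process starts (or restarts after sleep);
        # the trained model is deserialized here and kept in memory.
        global model
        model = joblib.load("model.pkl")

    def run(raw_input):
        # Runs per request, reusing the in-memory model loaded by init().
        return model.predict([raw_input]).tolist()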

Related

Connecting to Amazon Redshift from Azure Data Factory

We are attempting to connect to an Amazon Redshift Instance from Azure Data Factory as a linked service.
Steps Taken:
Provisioned a Self-Hosted Integration Runtime (Azure)
Created user access to the database within Redshift (AWS)
Whitelisted the IP addresses of the SHIR within the security group (AWS)
Built a linked service to Redshift using the login, server address, and database name (Azure)
From testing, we know that this login works with this database from other sources, and in general the process has worked for other technologies.
A screenshot of the error message received can be seen here
Any suggestions would be greatly appreciated :)
To connect to Amazon Redshift from Azure, look at using the AWS SDK for .NET, specifically its Redshift Data API service client. You can use this client to write logic that performs CRUD operations against a Redshift cluster.
You can create a service client in .NET with this code:
var dataClient = new AmazonRedshiftDataAPIServiceClient(RegionEndpoint.USWest2);
Ref docs here:
https://docs.aws.amazon.com/sdkfornet/v3/apidocs/items/RedshiftDataAPIService/TRedshiftDataAPIServiceClient.html
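If it helps to prototype the same calls from Python instead of .NET, the Redshift Data API is also exposed through boto3; a rough sketch, with placeholder cluster, database, and user values:

    import time
    import boto3

    # Client for the Redshift Data API (the same API the .NET client above wraps).
    client = boto3.client("redshift-data", region_name="us-west-2")

    # Submit a SQL statement asynchronously against the cluster.
    response = client.execute_statement(
        ClusterIdentifier="my-redshift-cluster",  # placeholder
        Database="dev",                           # placeholder
        DbUser="awsuser",                         # placeholder
        Sql="SELECT current_date;",
    )

    # Poll until the statement finishes, then fetch the result set.
    statement_id = response["Id"]
    while client.describe_statement(Id=statement_id)["Status"] not in ("FINISHED", "FAILED", "ABORTED"):
        time.sleep(1)

    result = client.get_statement_result(Id=statement_id)
    print(result["Records"])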

How to deploy an AutoML model as a web app on GCP using Streamlit

I plan to build an application that takes user input in the form of a news article and predicts whether it is fake or true.
I have trained the model using AutoML on the Google Cloud Platform (GCP) Vertex AI platform, and I have created the endpoint.
How do I proceed further? How can I build the app without downloading anything to my local system? (All of this should be built on GCP.)
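One way to keep everything on GCP is a small Streamlit front end that calls the Vertex AI endpoint, written in Cloud Shell and deployed to something like Cloud Run so nothing touches the local machine. A rough sketch, with placeholder project, region, and endpoint IDs (the exact instance format depends on how the AutoML text model was trained):

    # app.py - minimal Streamlit front end calling a Vertex AI endpoint
    import streamlit as st
    from google.cloud import aiplatform

    PROJECT = "my-project"      # placeholder
    REGION = "us-central1"      # placeholder
    ENDPOINT_ID = "1234567890"  # placeholder: the endpoint you created

    aiplatform.init(project=PROJECT, location=REGION)
    endpoint = aiplatform.Endpoint(ENDPOINT_ID)

    st.title("Fake news detector")
    text = st.text_area("Paste a news article")

    if st.button("Predict") and text:
        # Assumption: AutoML text models commonly accept {"content": ...} instances.
        prediction = endpoint.predict(instances=[{"content": text}])
        st.write(prediction.predictions)

From Cloud Shell, gcloud run deploy with a simple Dockerfile (or Cloud Build) can containerize and host this app entirely on GCP.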

How to deploy a model using SageMaker when there are no model artifacts

I want to deploy a model that has no artifacts (model zip file) using SageMaker and use its endpoint in my application to get results. Can someone help me do this?
The steps to deploy a model in SageMaker are:
Create a SageMaker Model
Create an endpoint configuration
Create the endpoint
SageMaker needs access to a model.tar.gz file stored in S3 that contains the model and inference code; a rough boto3 sketch of the three calls above is included after the docs link below. You can also deploy a model by training an Estimator and deploying that to an endpoint.
You cannot deploy a model that has no model artifacts to a SageMaker-hosted endpoint.
https://docs.aws.amazon.com/sagemaker/latest/dg/how-it-works-deployment.html
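A rough boto3 sketch of those three calls, assuming the model.tar.gz is already in S3 and using placeholder names, image URI, and role ARN:

    import boto3

    sm = boto3.client("sagemaker")

    # 1. Create a SageMaker Model that points at the artifacts in S3.
    sm.create_model(
        ModelName="my-model",  # placeholder
        ExecutionRoleArn="arn:aws:iam::123456789012:role/MySageMakerRole",  # placeholder
        PrimaryContainer={
            "Image": "<inference-container-image-uri>",           # placeholder
            "ModelDataUrl": "s3://my-bucket/model/model.tar.gz",  # placeholder
        },
    )

    # 2. Create an endpoint configuration for that model.
    sm.create_endpoint_config(
        EndpointConfigName="my-endpoint-config",
        ProductionVariants=[{
            "VariantName": "AllTraffic",
            "ModelName": "my-model",
            "InstanceType": "ml.m5.large",
            "InitialInstanceCount": 1,
        }],
    )

    # 3. Create the endpoint itself.
    sm.create_endpoint(
        EndpointName="my-endpoint",
        EndpointConfigName="my-endpoint-config",
    )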
I work for AWS & my opinions are my own

Is it possible to share Google OAuth credentials between different services?

Let's say you have a web application that accesses (via API) a user's Google Drive files at a set time each week and performs some kind of task on them.
The user grants authorization to the web application via its website (which is hosted on App Engine). However, the weekly, scheduled queries to Google Drive will be carried out by Cloud Scheduler.
Is it possible, then, for Cloud Scheduler to use the same credentials (access and refresh tokens) obtained by the web application in the first instance?
Can the credentials, for example, be stored in a Cloud Storage bucket that is accessible to both the application and Cloud Scheduler?
Or is there another means of accomplishing this?
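One common pattern, sketched below with placeholder values, is to persist the refresh token the web application obtained (somewhere private such as Secret Manager or Firestore rather than a publicly readable bucket) and have the job that Cloud Scheduler triggers rebuild a Credentials object from it:

    from google.oauth2.credentials import Credentials
    from google.auth.transport.requests import Request
    from googleapiclient.discovery import build

    # Values the web app stored after the user consented; where they are stored
    # (Secret Manager, Firestore, a non-public bucket, ...) is up to you.
    stored_refresh_token = "..."  # placeholder
    client_id = "..."             # placeholder
    client_secret = "..."         # placeholder

    creds = Credentials(
        token=None,
        refresh_token=stored_refresh_token,
        token_uri="https://oauth2.googleapis.com/token",
        client_id=client_id,
        client_secret=client_secret,
        scopes=["https://www.googleapis.com/auth/drive.readonly"],
    )
    creds.refresh(Request())  # mint a fresh access token from the stored refresh token

    drive = build("drive", "v3", credentials=creds)
    files = drive.files().list(pageSize=10).execute()

Cloud Scheduler itself only fires the trigger (for example an HTTP target on Cloud Functions or Cloud Run); the triggered job is what loads the stored token and talks to Drive.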

Can I customize Odoo apps after deploying to Google Cloud?

I'm new to Google Cloud and Odoo ERP. I need to customize two Odoo apps, Fleet and CRM. Can I customize them after deploying Odoo to Google Cloud via the Bitnami Launchpad, or do I need to customize them in a local virtual machine? And can I upload a local virtual machine to Google Cloud, or can I only deploy Odoo to Google Cloud via the Bitnami Launchpad?
Can Google Cloud Endpoints access the PostgreSQL database of the Odoo server?
Any help is welcome. Thank you.
Google Cloud Endpoints sits in front of your app. You can get a general idea of the architecture from these links, depending on which platform you choose to use:
https://cloud.google.com/endpoints/docs/openapi/architecture-overview
https://cloud.google.com/endpoints/docs/frameworks/about-cloud-endpoints-frameworks
Since Google Cloud Endpoints sits in front of your app and forwards requests to it, your app is still responsible for handling those requests, including things like accessing PostgreSQL. Therefore, as long as your app can access PostgreSQL, using Google Cloud Endpoints should not restrict your usage.
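As a minimal illustration of that split (hostnames, credentials, and the query are placeholders), the backend service that Endpoints forwards to is the piece that opens the PostgreSQL connection to the Odoo database:

    # Backend service that Cloud Endpoints forwards requests to.
    from flask import Flask, jsonify
    import psycopg2

    app = Flask(__name__)

    @app.route("/partners")
    def partners():
        # The app, not Endpoints, connects to Odoo's PostgreSQL database.
        conn = psycopg2.connect(
            host="10.0.0.5",   # placeholder: Odoo database host
            dbname="odoo",     # placeholder
            user="odoo",       # placeholder
            password="secret"  # placeholder
        )
        with conn, conn.cursor() as cur:
            cur.execute("SELECT id, name FROM res_partner LIMIT 10;")  # illustrative query
            rows = cur.fetchall()
        conn.close()
        return jsonify(rows)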