Missing Google Cloud Container Image - google-cloud-platform

I'm trying to follow a tutorial from Google Cloud Platform, "Detecting anomalies in financial transactions by using AI Platform, Dataflow, and BigQuery".
However, the Docker image required to run the data pipeline is not available in Google Container Registry.
The link to the image was this one, as provided in the tutorial's GitHub repo.
When triggering the pipeline, it shows the following error.


What is the difference between Cloud Build and Cloud Deploy?

They both seem to be recommended CI/CD tools within Google Cloud, but with similar functionality. When would I use one over the other? Or maybe use them together?
Cloud Build seems to be the de facto tool, while Cloud Deploy says it can do "pipeline and promotion management."
Both are designed as serverless services, meaning you don't have to manage the underlying infrastructure of your builds, and you define delivery pipelines in a YAML configuration file. However, Cloud Deploy also requires a Skaffold configuration, which it uses to perform render and deploy operations.
According to this documentation:
Google Cloud Deploy is a service that automates delivery of your applications to a series of target environments in a defined sequence.
Cloud Deploy is an opinionated, continuous delivery system currently supporting Kubernetes clusters and Anthos. It picks up after the CI process has completed (i.e. the artifact/images are built) and is responsible for delivering the software to production via a progression sequence defined in a delivery pipeline.
Google Cloud Build, on the other hand, is a service that executes your builds on Google Cloud.
Cloud Build (GCB) is Google's cloud Continuous Integration/Continuous Delivery (CI/CD) solution. It takes users' code stored in Cloud Source Repositories, GitHub, Bitbucket, or other solutions; builds it; runs tests; and saves the results to an artifact repository like Google Container Registry, Artifactory, or a Google Cloud Storage bucket. It also supports complex builds with multiple steps, for example testing and deployments. If you want to extend your CI pipeline, it's as easy as adding an additional step to it. You can take your artifacts, either built locally or stored at your destination, and easily deploy them to many services with a deployment strategy of your choice.
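To make the "multiple steps" idea concrete, here is a minimal sketch of submitting a two-step build programmatically with the google-cloud-build Python client; the project ID, service name, region, and image URI are placeholders, and the same steps could equally be declared in a cloudbuild.yaml file.

    # Minimal sketch: submit a two-step Cloud Build (test, then deploy) via the
    # google-cloud-build Python client. Project, service, and image names are placeholders.
    from google.cloud.devtools import cloudbuild_v1

    client = cloudbuild_v1.CloudBuildClient()

    build = cloudbuild_v1.Build(
        steps=[
            # Step 1: run a (placeholder) test command in a stock Python container.
            cloudbuild_v1.BuildStep(
                name="python:3.11",
                entrypoint="python",
                args=["-c", "print('tests would run here')"],
            ),
            # Step 2: deploy an already-built image to Cloud Run using the Cloud SDK image.
            cloudbuild_v1.BuildStep(
                name="gcr.io/google.com/cloudsdktool/cloud-sdk",
                entrypoint="gcloud",
                args=[
                    "run", "deploy", "my-service",
                    "--image", "gcr.io/my-project/my-app:latest",
                    "--region", "us-central1",
                ],
            ),
        ],
    )

    # create_build returns a long-running operation; result() waits for the build to finish.
    operation = client.create_build(project_id="my-project", build=build)
    print(operation.result().status)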
Choosing between the two services ultimately depends on your use case, but their stated missions might help make the choice easier:
Cloud Build's mission is to help GCP users build better software faster, more securely by providing a CI/CD workflow automation product for developer teams and other GCP services.
Cloud Deploy's mission is to make it easier to set up and run continuous software delivery to a Google Kubernetes Engine environment.
In addition, for pricing information refer to this documentation: Cloud Build pricing and Cloud Deploy pricing.

AWS SageMaker - Upload our own docker image

I am new to AWS SageMaker and I am using it for building and training machine learning models. I have now developed a Docker image that contains our custom TensorFlow code. I would like to upload this custom Docker image to AWS SageMaker and make use of it.
I have searched various links but could not find proper information on how to upload our own custom Docker image.
Can you please suggest recommended links describing the process of uploading our own Docker image to AWS SageMaker?
In order to work with SageMaker, you have to push your container to ECR. The most important thing is that the container must be "adapted" to be compliant with what SageMaker requires, but everything is described here. In addition, if you want to take a look at an example, here is mine, where I use my own container with the TF Object Detection API in AWS SageMaker.
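Once the image is in ECR, here is a minimal sketch, using the SageMaker Python SDK, of starting a training job with that custom container; the image URI, IAM role ARN, and S3 paths below are placeholders.

    # Minimal sketch: train with a custom container already pushed to ECR,
    # using the SageMaker Python SDK. Image URI, role, and S3 paths are placeholders.
    import sagemaker
    from sagemaker.estimator import Estimator

    estimator = Estimator(
        image_uri="123456789012.dkr.ecr.eu-west-1.amazonaws.com/my-tf-training:latest",
        role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",
        instance_count=1,
        instance_type="ml.p3.2xlarge",
        output_path="s3://my-bucket/model-output",
        sagemaker_session=sagemaker.Session(),
    )

    # The "training" channel is mounted at /opt/ml/input/data/training inside the container.
    estimator.fit({"training": "s3://my-bucket/training-data"})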

Google Cloud Code's Cloud Run extension stores the Docker image in Cloud Storage instead of Artifact Registry

Why, when I use the Visual Studio Code extension "Cloud Code" to deploy a Cloud Run service, does it seem to store the image contents in Cloud Storage (via Container Registry)?
Can I make it store the image in the Google Cloud Artifact Registry instead?
I just tried the scenario and it worked for me! Following these steps should get you going.
Create an Artifact Registry repo at https://console.cloud.google.com/artifacts and set up Docker auth on your client to use gcloud to authenticate to the repo. You can find detailed steps to do this here.
When deploying to Cloud Run in Cloud Code, you'll find that it defaults to a Container Registry repo as the "Container image URL", but you can easily use an Artifact Registry repo here instead: paste the repo name you created in the previous step and append an image name. Here's a screenshot of the example I just tested.
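For reference, the value pasted into the "Container image URL" field follows the Artifact Registry naming scheme; a minimal sketch of building that URL, with placeholder region, project, repo, and image names:

    # Minimal sketch: composing an Artifact Registry image URL for Cloud Code / Cloud Run.
    # All names below are placeholders for your own region, project, repo, and image.
    region = "us-central1"
    project = "my-project"
    repo = "my-repo"
    image = "my-service"
    image_url = f"{region}-docker.pkg.dev/{project}/{repo}/{image}:latest"
    print(image_url)  # us-central1-docker.pkg.dev/my-project/my-repo/my-service:latest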

Triggering a training task on Cloud ML when a file arrives in Cloud Storage

I am trying to build an app where the user is able to upload a file to Cloud Storage. This would then trigger a model training process (and prediction later on). Initially I thought I could do this with Cloud Functions/Pub/Sub and Cloud ML, but it seems that Cloud Functions are not able to run gsutil commands, which are needed for Cloud ML.
Is my only option to enable Cloud Composer, attach GPUs to a Kubernetes node, and create a Cloud Function that triggers a DAG to boot up a pod on the GPU node and mount the bucket with the data? It seems a bit excessive, but I can't think of another way currently.
You're correct. As of now, there's no way to execute a gsutil command from a Google Cloud Function:
Cloud Functions can be written in Node.js, Python, Go, and Java, and are executed in language-specific runtimes.
I really like your second approach with triggering the DAG.
Another idea that comes to mind is to interact with GCP virtual machines from Cloud Composer through the Python operator, using the Compute Engine Python API. You can find more information on automating infrastructure, and a deep technical dive into the core features of Cloud Composer, here.
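As a rough illustration of that idea, here is a minimal sketch of a Composer DAG whose Python operator starts a preconfigured training VM through the Compute Engine API; the project, zone, and instance names are placeholders, and the import path for PythonOperator differs slightly on Airflow 1.x.

    # Minimal sketch: an Airflow/Composer DAG that starts a preconfigured training VM
    # via the Compute Engine API. Project, zone, and instance names are placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator  # airflow.operators.python_operator on Airflow 1.x
    from googleapiclient import discovery


    def start_training_vm(**_):
        # Uses the Composer environment's default service account for authentication.
        compute = discovery.build("compute", "v1")
        compute.instances().start(
            project="my-project",        # placeholder project ID
            zone="us-central1-a",        # placeholder zone
            instance="gpu-training-vm",  # placeholder VM whose startup script runs the training job
        ).execute()


    with DAG("trigger_training", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        PythonOperator(task_id="start_training_vm", python_callable=start_training_vm)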
Another solution you can consider is Kubeflow, which aims to make running ML workloads on Kubernetes simple. Kubeflow adds some resources to your cluster to assist with a variety of tasks, including training and serving models and running Jupyter Notebooks. Please have a look at the Codelabs tutorial.
I hope you find the above pieces of information useful.

Deploying a custom build of Datalab to Google Cloud platform

For a project, we are trying to extend Google Cloud Datalab and deploy the modified version to the Google Cloud Platform. As I understand it, the deployment process normally consists of the following steps:
Build the Docker image
Push it to the Container Registry
Use the container parameter with the Google Cloud deployer to specify the correct Docker image, as explained here.
Since the default container registry, i.e. gcr.io/cloud_datalab/datalab:<tag> is off-limits for non-Datalab contributors, we pushed the Docker image to our own container registry, i.e. to gcr.io/<project_id>/datalab:<tag>.
However, the Google Cloud deployer only pulls directly from gcr.io/cloud_datalab/datalab:<tag> (with the tag specified by the container parameter) and does not seem to allow specifying the source container registry. The deployer does not appear to be open source, leaving us with no way to deploy our image to Google Cloud.
We have looked into creating a custom deployment similar to the example listed here but this never starts Datalab, so we suspect the start script is more complicated.
Question: How can we deploy a Datalab image from our own container registry to Google Cloud?
Many thanks in advance.
The deployment parameters can be guessed, but it is easier to get the Google Cloud Datalab deployment script by SSHing into the temporary compute node that is responsible for deployment and browsing the /datalab folder. This folder contains a runtime configuration file for use with the App Engine Flexible Environment. Using this configuration file, the google preview app deploy command (which accepts an --image parameter for Docker images) will deploy the image to App Engine correctly.