I'd like to review some startup scripts that were used with terminated and deleted instances in Google Cloud. Is there a way to get these via gcloud or the GCP console?
For terminated and deleted instances, reviewing startup scripts is not possible unless you have a machine image of the VM. A machine image offers a quick backup of the VM as a whole, including its metadata (which is where the startup script is stored), in contrast to other features like PD snapshots and custom images, which capture only disk contents.
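If a machine image does exist for the VM, you can read the captured instance metadata (and with it the startup script) back out of it. A minimal sketch, assuming a hypothetical machine image name:

# Show the instance properties captured in the machine image,
# including the startup-script metadata entry
gcloud compute machine-images describe my-machine-image \
    --format="yaml(sourceInstanceProperties.metadata)"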
I just set everything up in Google Cloud Shell (the terminal that activates when you press "Activate Cloud Shell" at the top right). However, I just now realised that it is not the same thing as the Instance-1 I created, since connecting to the instance by pressing "SSH" leaves me with none of the software I installed in Google Cloud Shell.
Why is that? Is it possible to add Google Cloud Shell to the Instance-1 virtual machine I just created? (It would greatly ease software setup.)
As @John Hanley explained in the comments, Cloud Shell sessions are transient, and it is not recommended to use Cloud Shell to store or install permanent data and applications.
In your case, you'd like an instance that starts out with the essential software preinstalled. There are three options you can use for this situation: a snapshot, an image, or an instance template.
A snapshot reflects the contents of a persistent disk at a specific instant in time. An image is similar, but it also includes an operating system and boot loader and can be used to boot an instance. A snapshot is cheaper in terms of storage cost.
Images are primarily meant for boot disk creation. They are optimized for multiple downloads of the same data over and over. If the same image is downloaded many times, the downloads subsequent to the first are going to be very fast (even for large images).
Instance templates define the machine type, boot disk image or container image, labels, and other instance properties. An instance template is a global resource that is not bound to a zone or a region.
Any of these options can be used to create an instance from Cloud Shell, and you can get the command-line version by clicking the Equivalent command line button when creating the resource in the console.
After the setup, just run the command in Cloud Shell for whichever option you chose, then connect to your instance via SSH from Cloud Shell, as also suggested by @dany L. A sketch of the equivalent commands is shown below.
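As a rough sketch of what those equivalent commands look like (all resource names and the zone here are hypothetical):

# Capture a reusable image from the boot disk of a configured VM
gcloud compute images create my-base-image \
    --source-disk=my-configured-disk \
    --source-disk-zone=us-central1-a

# Boot a new instance from that image
gcloud compute instances create my-new-vm \
    --image=my-base-image \
    --zone=us-central1-a

# Or define an instance template once, then stamp out instances from it
gcloud compute instance-templates create my-template \
    --machine-type=e2-medium \
    --image=my-base-image
gcloud compute instances create my-new-vm-2 \
    --source-instance-template=my-template \
    --zone=us-central1-a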
I need a Compute Engine instance that imports the exact configuration (IP, services, files, etc...) of the original machine, without impacting the frontend if it is, for example, a web server. While this clone is running, I would shut down the original machine to increase its RAM or vCPUs, then start it again and delete the cloned instance.
The problem is that I want to automate this process, which is why I need a gcloud command. So is there a way to clone an entire GCP instance using gcloud or another tool?
This is not possible with gcloud. It is possible in the Cloud Console, but as you can see in this documentation:
Restrictions
You can only use this feature in the Cloud Console; this feature is not supported in the gcloud tool or the API.
What you could do is create similar (though not completely identical) instances from a custom image; with the image in place, all you have to do is run the following command:
gcloud compute instances create INSTANCE_NAME --image=IMAGE
More details on that command can be found here
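To automate something closer to a clone (keeping in mind it copies only the disk, not the IP or other attached resources), you could script the image creation from the original instance's boot disk first. A sketch with hypothetical names:

# Create a custom image from the original VM's boot disk
# (--force allows imaging a disk that is attached to a running instance)
gcloud compute images create web-server-image \
    --source-disk=web-server-disk \
    --source-disk-zone=europe-west1-b \
    --force

# Create the stand-in instance from that image
gcloud compute instances create web-server-clone \
    --image=web-server-image \
    --zone=europe-west1-b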
I am trying to build an app where the user is able to upload a file to Cloud Storage. This would then trigger a model training process (and prediction later on). Initially I thought I could do this with Cloud Functions/Pub/Sub and Cloud ML, but it seems that Cloud Functions are not able to run gsutil commands, which is needed for Cloud ML.
Is my only option to enable Cloud Composer, attach GPUs to a Kubernetes node, and create a Cloud Function that triggers a DAG to boot up a pod on the GPU node and mount the bucket with the data? It seems a bit excessive, but I can't think of another way currently.
You're correct. As of now, there's no way to execute a gsutil command from a Google Cloud Function:
Cloud Functions can be written in Node.js, Python, Go, and Java, and are executed in language-specific runtimes.
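For what it's worth, wiring up the upload trigger itself is straightforward; the limitation is only on what runs inside the function. A minimal deployment sketch, with a hypothetical bucket, function name, and entry point:

# Deploy a function that fires whenever an object is uploaded to the bucket
gcloud functions deploy start-training \
    --runtime=python39 \
    --trigger-resource=my-training-data-bucket \
    --trigger-event=google.storage.object.finalize \
    --entry-point=handle_upload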
I really like your second approach of triggering the DAG.
Another idea that comes to mind is to interact with GCP virtual machines from Cloud Composer through the Python operator, using the Compute Engine Python API. You can find more information on automating infrastructure, and a deep technical dive into the core features of Cloud Composer, here.
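If you go that route, the DAG task would essentially issue commands like the following (the zone, machine type, accelerator, and names are all assumptions):

# Boot a GPU-backed training VM on demand
gcloud compute instances create gpu-trainer \
    --zone=us-central1-a \
    --machine-type=n1-standard-8 \
    --accelerator=type=nvidia-tesla-t4,count=1 \
    --maintenance-policy=TERMINATE \
    --image-family=common-cu110 \
    --image-project=deeplearning-platform-release

# ...and tear the VM down once training finishes
gcloud compute instances delete gpu-trainer \
    --zone=us-central1-a \
    --quiet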
Another solution you can think of is Kubeflow, which aims to make running ML workloads on Kubernetes simple, portable, and scalable. Kubeflow adds resources to your cluster to assist with a variety of tasks, including training and serving models and running Jupyter Notebooks. Please have a look at the Codelabs tutorial.
I hope you find the above pieces of information useful.
I have been using both Google Colab and GCP VM instances for training some deep learning models.
With Google Colab, I haven't had any issues so far with installing a wide variety of deep learning packages that I explore as part of my work. With GCP VM instances though (even using the pre-configured Deep Learning VMs offered by Google), I frequently run into issues.
I was wondering if there is a way to export the Google Colab VM as an image to Google Cloud Storage, and then launch an instance in GCP using this image.
I tried searching online for this, but I couldn't find anything.
Is this do-able? Or, are there any other alternatives?
Thanks!
Background
In our current Google Cloud project, it seems common for people to create a VM, start it, do some of their work, and then just leave it there without turning it off or deleting it.
I'm trying to write an automated script that pulls the list of VM instances every day and sends an e-mail to the VM owners asking whether they forgot to turn off their VMs.
Question
So is there an easy way to find out who created a GCE VM instance using Google Cloud API?
You can view the information in Stackdriver -> Logging -> Logs. Each log entry is a JSON object; the actor field (protoPayload.authenticationInfo.principalEmail in newer audit log entries) is what you are looking for. You can export the logs for analysis.
See https://cloud.google.com/logging/docs/
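Since you want to automate this, a gcloud query along these lines should surface the creator of each instance from the admin-activity audit log (the filter is a sketch and may need tuning for your project):

# List who created GCE instances in the last day
gcloud logging read \
    'resource.type="gce_instance" AND protoPayload.methodName:"instances.insert"' \
    --freshness=1d \
    --format="table(timestamp, protoPayload.authenticationInfo.principalEmail, protoPayload.resourceName)"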