Cannot transfer files from my Mac to a VM instance on GCP

I have managed to set up a VM instance on Google Cloud Platform using the following instructions:
https://towardsdatascience.com/running-jupyter-notebook-in-google-cloud-platform-in-15-min-61e16da34d52
I am then able to run a Jupyter notebook as per the instructions.
Now I want to use my own data in the notebook... this is where I am really struggling. I downloaded the Cloud SDK onto my Mac and ran this from the terminal (as per https://cloud.google.com/compute/docs/instances/transfer-files):
My-MacBook-Air:~ me$ gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder
where aml-test is the name of my instance and amlfolder is a folder I created on the VM instance. I don't get any error messages and it seems to work (the terminal displays the following after I run it: 100% 66MB 1.0MB/s 01:03).
However when I connect to my VM instance via the SSH button on the google console and type
cd amlfolder
ls
I cannot see any files! (Nor can I see them from the Jupyter notebook homepage.)
I cannot figure out how to use my own data in a Python Jupyter notebook on a GCP VM instance. I have been trying/googling for an entire day. As you might have guessed, I'm a complete newbie to GCP (and cd, ls and mkdir are the extent of my Linux command knowledge!).
I also tried using Google Cloud Storage - I uploaded the data into a storage bucket (as per https://cloud.google.com/compute/docs/instances/transfer-files) but don't know how to complete the last step, '4. On your instance, download files from the bucket.'
If anyone can figure out what I am doing wrong, or knows an easier method than the gcloud scp command to get my own data into a Python Jupyter notebook on GCP, please help!

Definitely try running
pwd
to verify you're in the path you think you are; there's a chance that your scp command and the console SSH session log in as different users.
To copy data from a bucket to the instance, do
gsutil cp gs://bucket-name/your-file .
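For example, a quick sanity check on the VM might look like this (a sketch; the bucket and file names are placeholders):
whoami                                   # which user is this SSH session logged in as?
pwd                                      # which home directory am I in?
gsutil cp gs://my-bucket/my_data.csv .   # copy the uploaded file into the current directory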

As you can see in the gcloud compute docs, gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder uses your local environment's username, so the tilde in your command refers to the home directory of a VM user with the same name as your local one.
But when you SSH from the browser, as you can see in the docs, your Gmail username is used instead.
So you should check the home directory of the user that the gcloud compute scp ... command logged in as.
The easiest way to check: SSH to your VM and run
ls -R /home/
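If the listing is long, find narrows it down (a sketch; my_data.csv is the file from the question, and local_user is a hypothetical name for the account the scp command logged in as):
sudo find /home -name 'my_data.csv'                            # locate the copied file
sudo mv /home/local_user/amlfolder/my_data.csv ~/amlfolder/    # move it into your own folder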

Related

Copy files from Google Cloud Shell to GCP VM

My development environment is within Google Cloud Shell, whereas the production environment is a separate GCP VM instance. After testing I would like to move all files from my dist folder to the GCP VM instance.
As a test I have tried to copy just one file.
Attempt #1: within Cloud Shell terminal
gcloud alpha cloud-shell scp cloudshell:~/README-cloudshell.txt peter@myvm:~/README-cloudshell.txt
Error #1: All sources must be local files when destination is remote.
Attempt #2: within GCP VM terminal
gcloud compute scp cloudshell:~/README-cloudshell.txt myvm:~
Error #2: All sources must be local files when destination is remote.
I appreciate any help.
From Cloud Shell:
gcloud compute scp ~/README-cloudshell.txt user@instanceName:~
Use a correct SSH key, or set the correct permissions on the ~ directory.
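A fuller sketch from Cloud Shell (my-vm and us-central1-a are example values; --recurse handles whole directories such as dist):
gcloud compute scp ~/README-cloudshell.txt my-vm:~ --zone us-central1-a
gcloud compute scp --recurse ~/dist my-vm:~ --zone us-central1-a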

whoami produces different names on gcp terminal and local terminal

I've somehow ended up as two different users depending on where I'm connecting from. I think it's the result of my org creating users for different projects. If I execute whoami from my local terminal I'm foo but if I execute the command from the ssh.cloud terminal I'm foo_foobar.
I have a folder projects on the VM and I can see it from both terminals, but all the subfolders that belong to foo are not visible to foo_foobar. OK, I get it.
The biggest issue is that from my local terminal, as foo I can't pull from or push to a cloud repo.
So my ask is: is there a CLI command that will let me connect as foo_foobar from my local terminal? I've looked at my config with gcloud config list and the email and project ID are correct. Thanks
The solution was to use this ssh command: gcloud compute ssh foo_foobar@foo-vm --zone us-west1-b
The confusion was caused by the command provided in the dashboard's SSH dropdown under "View gcloud command", which is gcloud beta compute ssh with no "foo_foobar@foo-vm". In other words, the provided ssh command does not indicate that username@instance_name should be part of it.
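The same explicit username works for file copies too (a sketch reusing the names from the question; myfile is a placeholder):
gcloud compute scp ./myfile foo_foobar@foo-vm:~/projects/ --zone us-west1-b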

How to run Jupyter notebook on AWS instance

How to run Jupyter notebook on AWS instance, chmod 400 error
I want to run my Jupyter notebooks in the cloud, on an AWS EC2 instance.
I'm following this tutorial:
https://www.codingforentrepreneurs.com/blog/jupyter-notebook-server-aws-ec2-aws-vpc
I have the EC2 instance all set up, as well as nginx.
The problem is that chmod 400 JupyterKey.pem only works on Mac, not in Windows PowerShell:
cd path/to/my/dev/folder/
chmod 400 JupyterKey.pem
ssh ubuntu@34.235.154.196 -i JupyterKey.pem
Error: The term 'chmod' is not recognized as the name of a cmdlet, function, script file, or operable program.
CategoryInfo: ObjectNotFound
FullyQualifiedErrorId: CommandNotFoundException
AWS has a managed Jupyter Notebook service as part of Amazon SageMaker.
SageMaker hosted notebook instances let you spin up a Jupyter notebook with one click, with pay-per-hour pricing (similar to EC2 billing), and upload your existing notebooks directly onto the managed instance through the instance URL and AWS console.
Check out this tutorial for a guide on getting started!
I had the same permission problem and fixed it by running the following command on the Amazon Machine Image Linux:
sudo chown user:user ~/certs/mycert.pem
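For the original PowerShell error: Windows has no chmod, but icacls can apply an equivalent restriction. A rough PowerShell sketch (an assumption on my part, not from the tutorial; it leaves the key readable only by the current user, which is what chmod 400 achieves):
icacls.exe JupyterKey.pem /reset                            # restore default inherited permissions
icacls.exe JupyterKey.pem /grant:r "$($env:USERNAME):(R)"   # grant the current user read-only access
icacls.exe JupyterKey.pem /inheritance:r                    # strip inherited permissions, leaving only the grant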

Connect to particular GCP account

I have been using the GCP console to connect to a cloud instance and want to switch to using SSH through PowerShell, as that seems to maintain a longer persistence. Transferring my public key through Cloud Shell into the authorized_keys file seems to be temporary, since once Cloud Shell disconnects, the file doesn't persist. I've tried using os-login, but that generates a completely different user from what I've been using through Cloud Shell (Cloud Shell creates the user myname, while gcloud creates the user myname_domain_com). Is there a way to continue using the same profile created by Cloud Shell when logging in through gcloud? I am using the same email and account, myname@domain.com, in both the console and gcloud. The alternative is to start all over from gcloud, and that would be a pain.
If you want to SSH to different instances of a google cloud project (from a mac or Linux), do the following:
Step 1. Install SSH keys without password
Use the following command to generate the keys on your Mac:
ssh-keygen -t rsa -f ~/.ssh/private-key-name -C username
For example private-key-name can be bpa-ssh-key. It will create two files with the following names in the ~/.ssh directory
bpa-ssh-key
bpa-ssh-key.pub
Step 2. Update the public key on your GCP project
Go to the Google Cloud Console, choose your project, then
VM Instances -> Metadata -> SSH Keys -> Edit -> Add Item
Cut and paste the contents of bpa-ssh-key.pub (from your Mac) here and then save.
Reset the VM Instance if it is running
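The same upload can be done from the command line (a sketch; ssh-keys.txt is a hypothetical file name, and note that add-metadata replaces the existing ssh-keys entry, so include any keys you want to keep in that file):
echo "username:$(cat ~/.ssh/bpa-ssh-key.pub)" > ssh-keys.txt    # metadata format is username:key-data
gcloud compute project-info add-metadata --metadata-from-file ssh-keys=ssh-keys.txt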
Step 3. Edit the config file under ~/.ssh on your Mac
Edit ~/.ssh/config to add the following lines if they are not present already:
Host *
PubKeyAuthentication yes
IdentityFile ~/.ssh/bpa-ssh-key
Step 4. SSHing to GCP Instance
ssh username@gcloud-externalip
It should open an SSH session on the gcloud instance without asking for a password (since you created the RSA/SSH keys without a passphrase).
Since metadata is common across all instances under the same project, you can seamlessly SSH into any of the instances by choosing the respective external IP of the gcloud instance.
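Putting it together, a host alias in ~/.ssh/config saves retyping (a sketch; the IP, username, and key name are example values):
Host gcp-vm
    HostName 203.0.113.10            # external IP of the instance (example value)
    User myname                      # the username embedded in the uploaded public key
    IdentityFile ~/.ssh/bpa-ssh-key
After which the login shortens to:
ssh gcp-vm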

How can I transfer a file or files from a bucket to the file system of a GCE instance?

The documentation says to download it, but when I try downloading, it downloads to my local workstation. Is there a way to just transfer it within the cloud?
To download a file from GCS bucket to an instance:
SSH to the instance (using the gcloud compute ssh command or clicking the SSH button on the instances page in console.cloud.google.com).
Issue the command gsutil cp gs://$BUCKET/$file .
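For many files, a recursive, parallel copy is faster (a sketch; the instance, zone, bucket, and directory names are placeholders):
gcloud compute ssh my-instance --zone us-central1-a    # step 1: open the SSH session
gsutil -m cp -r gs://my-bucket/data .                  # step 2: parallel (-m), recursive (-r) copy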