whoami produces different names on gcp terminal and local terminal - google-cloud-platform

I've somehow ended up as two different users depending on where I'm connecting from. I think it's the result of my org creating users for different projects. If I execute whoami from my local terminal I'm foo, but if I execute the command from the ssh.cloud terminal I'm foo_foobar.
I have a folder projects on the VM and I can see it from both terminals, but all the subfolders that belong to foo are not visible to foo_foobar. OK, I get it.
The biggest issue is that from my local terminal, as foo I can't pull from or push to a cloud repo.
So my ask is: does there exist a CLI command that will let me connect as foo_foobar from my local terminal? I've looked at my config with gcloud config list and the email and project ID are correct. Thanks

The solution was to use this SSH command: gcloud compute ssh foo_foobar@foo-vm --zone us-west1-b
The confusion was caused by the command provided in the dashboard's SSH dropdown (View gcloud command), which is gcloud beta compute ssh with no "foo_foobar@foo-vm". In other words, the provided SSH command does not indicate that username@instance_name should be part of the command.
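For anyone hitting the same mismatch, a minimal sketch (reusing the instance name foo-vm and zone us-west1-b from above; the project ID is a placeholder) is to pass the username explicitly and then check who you are on the VM:
gcloud compute ssh foo_foobar@foo-vm --zone us-west1-b --project <your-project-id>
whoami
whoami should now print foo_foobar, the same user the browser SSH terminal gives you, so the folders and repos owned by that user are reachable from the local terminal as well.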

Related

Adding multiple scopes to a Compute Engine VM instance on Google Cloud not working

I'm trying to create a Compute Engine VM instance sample in Google Cloud that has an associated startup script startup_script.sh. On startup, I would like to have access to files that I have stored in a Cloud Source Repository. As such, in this script, I clone a repository using
gcloud source repos clone <repo name> --project=<project name>
Additionally, startup_script.sh also runs commands such as
gcloud iam service-accounts keys create key.json --iam-account <account>
which creates .json credentials, and
EXTERNAL_IP=$(gcloud compute instances describe sample --format='get(networkInterfaces[0].accessConfigs[0].natIP)' --zone=us-central1-a)
to get the external IP of the VM within the VM. To run these commands without any errors, I found that I need partial or full access to multiple Cloud API access scopes.
If I manually edit the scopes of the VM after I've already created it to allow for this and restart it, startup_script.sh runs fine, i.e. I can see the results of each command completing successfully. However, I would like to assign these scopes upon creation of the VM and not have to manually edit scopes after the fact. I found in the documentation that in order to do this, I can run
gcloud compute instances create sample --image-family=ubuntu-1804-lts --image-project=ubuntu-os-cloud --metadata-from-file=startup-script=startup_script.sh --zone=us-central1-a --scopes=[cloud-platform, cloud-source-repos, default]
When I run this command in the Cloud Shell, however, I can either only add one scope at a time, i.e. --scopes=cloud-platform, or if I try to enter multiple scopes as shown in the command above, I get
ERROR: (gcloud.compute.instances.create) unrecognized arguments:
cloud-source-repos,
default]
Adding multiple scopes as the documentation suggests doesn't seem to work. I get a similar error when I use the scope's URI instead of its alias.
Any obvious reasons as to why this may be happening? I feel this may have to do with the service account (or lack thereof) associated with the sample VM, but I'm not entirely familiar with this.
BONUS: Ideally I would like to run the VM creation cloud shell command in a cloudbuild.yaml file, which I have as
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: gcloud
  args: ['compute', 'instances', 'create', 'sample', '--image-family=ubuntu-1804-lts', '--image-project=ubuntu-os-cloud', '--metadata-from-file=startup-script=startup_sample.sh', '--zone=us-central1-a', '--scopes=[cloud-platform, cloud-source-repos, default]']
I can submit the build using
gcloud builds submit --config cloudbuild.yaml .
Are there any issues with the way I've set up this cloudbuild.yaml?
"Adding multiple scopes as the documentation suggests doesn't seem to work"
Please use this command with --scopes=cloud-platform,cloud-source-repos and not --scopes=[cloud-platform, cloud-source-repos, default]:
gcloud compute instances create sample --image-family=ubuntu-1804-lts --image-project=ubuntu-os-cloud --zone=us-central1-a --scopes=cloud-platform,cloud-source-repos
Created [https://www.googleapis.com/compute/v1/projects/wave25-vladoi/zones/us-central1-a/instances/sample].
NAME    ZONE           MACHINE_TYPE   PREEMPTIBLE  INTERNAL_IP  EXTERNAL_IP    STATUS
sample  us-central1-a  n1-standard-1               10.128.0.17  35.238.166.75  RUNNING
Also consider @John Hanley's comment.
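For the bonus cloudbuild.yaml, the same fix should apply: pass the scopes as a single comma-separated value with no brackets or spaces, otherwise the extra tokens are parsed as separate, unrecognized arguments. A sketch of the corrected step (keeping the image, startup script, and zone from the question):
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: gcloud
  args: ['compute', 'instances', 'create', 'sample',
         '--image-family=ubuntu-1804-lts',
         '--image-project=ubuntu-os-cloud',
         '--metadata-from-file=startup-script=startup_sample.sh',
         '--zone=us-central1-a',
         '--scopes=cloud-platform,cloud-source-repos']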

How to use autossh with google compute (`gcloud compute ssh`)

I would like to use autossh instead of the ssh command run by google cloud while doing
gcloud compute ssh --zone myzone --project myproject mymachine
What is the command line to replace it?
A replacement for the command line
Here are the two commands that allow you to use autossh:
gcloud compute config-ssh
autossh -M9042 mymachine.myzone.myproject
In my opinion, this is the easiest way to do it. The first command exports the SSH keys and configuration of your machines, and the second just uses the resulting alias.
More on ssh configuration
This also allows you to use the shortcut mymachine.myzone.myproject for scp, ssh, and all SSH-related tools. Indeed, the list of your machines is exported to your ~/.ssh/config.
This is a list of aliases you can also configure by hand.
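For reference, the entries that gcloud compute config-ssh writes to ~/.ssh/config look roughly like this (the alias and IP are illustrative, and the exact set of options can vary with the gcloud version):
Host mymachine.myzone.myproject
    HostName 203.0.113.10
    IdentityFile ~/.ssh/google_compute_engine
    UserKnownHostsFile ~/.ssh/google_compute_known_hosts
    IdentitiesOnly yes
    CheckHostIP no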
For more information on gcloud's config-ssh, you can refer to the official documentation.
Notice that some shell environments will provide you with ssh command autocompletion, which is really helpful when it comes to remembering the names of your machines. Here is a reference for zsh.

Connect to particular GCP account

I have been using the GCP console to connect to a cloud instance and want to switch to using SSH through PowerShell, as that seems to maintain a longer persistence. Transferring my public key through Cloud Shell into the authorized_keys file seems to be temporary, since once Cloud Shell disconnects the file doesn't persist. I've tried using os-login but that generates a completely different user from what I've been using through Cloud Shell (Cloud Shell creates a user myname, while gcloud creates a user myname_domain_com). Is there a way to continue using the same profile created by Cloud Shell when logging in through gcloud? I am using the same email and account, myname@domain.com, in both the console and gcloud. The alternative is to start all over from gcloud and that would be a pain.
If you want to SSH to different instances of a google cloud project (from a mac or Linux), do the following:
Step 1. Install SSH keys without password
Use the following command to generate the keys on your mac
ssh-keygen -t rsa -f ~/.ssh/<private-key-name> -C <your gcloud username>
For example private-key-name can be bpa-ssh-key. It will create two files with the following names in the ~/.ssh directory
bpa-ssh-key
bpa-ssh-key.pub
Step 2. Update the public key on your GCP project
Go to the Google Cloud Console, choose your project, then
VMInstances->Metadata->SSH Keys->Edit->Add Item
Cut and paste the contents of the bpa-ssh-key.pub (from your mac) here and then save
Reset the VM Instance if it is running
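For clarity, the line you paste into the Add Item box is the full contents of bpa-ssh-key.pub; the trailing comment (the username passed with -C in step 1) is what the console reads as the login name, so it looks roughly like this (key material shortened, username illustrative):
ssh-rsa AAAAB3NzaC1yc2EAAA...shortened... bpa_user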
Step 3. Edit config file under ~/.ssh on your mac Edit the ~/.ssh/config to add the following lines if not present already
Host *
PubKeyAuthentication yes
IdentityFile ~/.ssh/bpa-ssh-key
Step 4. SSHing to GCP Instance
ssh username@gcloud-externalip
It should create an SSH shell without asking for a password (since you have created the RSA/SSH keys without a password) on the gcloud instance.
Since metadata is common across all instances under the same project, you can seamlessly SSH into any of the instances by using the respective external IP of the gcloud instance.

GCP VMs + ssh/config file

Hi guys.
GCP offers multiple ways of SSH-ing in: gcloud, Cloud Shell, and the Cloud SDK on a local machine.
While all these options are great and I have been using them, I normally prefer using .ssh/config to shorten the process of logging in to machines.
For example, for EC2, you just add:
Host $name
HostName $hostname
User $username
IdentityFile $pathtoFile
Is there any way to replicate this for GCP VMs?
Thanks
According to this doc:
If you have already connected to an instance through the gcloud tool, your keys are already generated and applied to your project or instance. The key files are available in the following locations:
Linux and macOS
Public key: $HOME/.ssh/google_compute_engine.pub
Private key: $HOME/.ssh/google_compute_engine
Windows
Public key: C:\Users\[USERNAME]\.ssh\google_compute_engine.pub
Private key: C:\Users\[USERNAME]\.ssh\google_compute_engine
You can use the key with the typical -i flag or in your .ssh/config file.
Or simply do
ssh-add ~/.ssh/google_compute_engine
to add the identity to your ssh agent.
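If you want a hand-written entry in the EC2 style from the question, a minimal sketch (the alias, external IP, and username are placeholders to fill in) would be:
Host my-gce-vm
    HostName <EXTERNAL_IP_OF_INSTANCE>
    User <your_gce_username>
    IdentityFile ~/.ssh/google_compute_engine
after which ssh my-gce-vm should work just like the EC2 example.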
PS> I've seen people create an alias for the ssh command, something like
alias gce='gcloud compute ssh'
If you want to SSH to different instances of a google cloud project (from a mac or Linux), do the following:
Step 1. Install SSH keys without password
Use the following command to generate the keys on your mac
ssh-keygen -t rsa -f ~/.ssh/<private-key-name> -C <your gcloud username>
For example private-key-name can be bpa-ssh-key. It will create two files with the following names in the ~/.ssh directory
bpa-ssh-key
bpa-ssh-key.pub
Step 2. Update the public key on your GCP project
Go to the Google Cloud Console, choose your project, then
VMInstances->Metadata->SSH Keys->Edit->Add Item
Cut and paste the contents of the bpa-ssh-key.pub (from your mac) here and then save
Reset the VM Instance if it is running
Step 3. Edit config file under ~/.ssh on your mac
Edit the ~/.ssh/config to add the following lines if not present already
Host *
PubKeyAuthentication yes
IdentityFile ~/.ssh/bpa-ssh-key
Step 4. SSHing to GCP Instance
ssh username@gcloud-externalip
It should create an SSH shell without asking for a password (since you have created the RSA/SSH keys without a password) on the gcloud instance.
Since metadata is common across all instances under the same project, you can seamlessly SSH into any of the instances by using the respective external IP of the gcloud instance.

Cannot Transfer files from my mac to VM instance on GCP

I have managed to set up a VM instance on Google cloud platform using the following instructions:
https://towardsdatascience.com/running-jupyter-notebook-in-google-cloud-platform-in-15-min-61e16da34d52
I am then able to run a Jupyter notebook as per the instructions.
Now I want to be able to use my own data in the notebook....this is where I am really struggling. I downloaded the Cloud SDK onto my mac and ran this from the terminal (as per https://cloud.google.com/compute/docs/instances/transfer-files)
My-MacBook-Air:~ me$ gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder
where aml-test is the name of my instance and amlfolder a folder I created on the VM instance. I don't get any error messages and it seems to work (the terminal displays the following after I run it >> 100% 66MB 1.0MB/s 01:03 )
However when I connect to my VM instance via the SSH button on the google console and type
cd amlfolder
ls
I cannot see any files! (nor can I see them from the jupyter notebook homepage)
I cannot figure out how to use my own data in a python jupyter notebook on a GCP VM instance. I have been trying/googling for an entire day. As you might have guessed I'm a complete newbie to GCP (and cd, ls and mkdir is the extent of my linux command knowledge!)
I also tried using Google Cloud Storage - I uploaded the data into a google storage bucket (as per https://cloud.google.com/compute/docs/instances/transfer-files) but don't know how to complete the last step '4. On your instance, download files from the bucket.'
If anyone can figure out what I am doing wrong, or an easier method than the gcloud scp command to get my own data into a Python Jupyter notebook on GCP, please help!
Definitely try writing
pwd
to verify you're in the path you think you are; there's a chance that your scp command and the console SSH command log in as different users.
To copy data from a bucket to the instance, do
gsutil cp gs://bucket-name/your-file .
As you can see in the gcloud compute docs, gcloud compute scp /Users/me/Desktop/my_data.csv aml-test:~/amlfolder will use your local environment username, so the tilde in your command refers to the home directory of a user with the same name as your local one.
But when you SSH from the browser, as you can see in the docs, your Gmail-based username will be used.
So you should check the home directory of the user used by the gcloud compute scp ... command.
The easiest way to check: SSH to your VM and run
ls /home/ --recursive
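If the files turn out to be under a different home directory, one option (a sketch; replace the username with the one you see in the browser SSH session, and add your zone) is to name the remote user explicitly in the copy command:
gcloud compute scp /Users/me/Desktop/my_data.csv your_browser_ssh_username@aml-test:~/amlfolder --zone=<your-zone>
That way the file lands in the home directory of the same user you log in as from the console.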