Does anyone know how to view the contents of a directory in gcloud?
I ran
gcloud compute ssh --zone=us-west1-b cs231-vm
from PowerShell and connected to my instance.
I am trying to navigate to a directory like this:
cd cs231n/datasets
according to a tutorial here:
http://cs231n.github.io/assignments2018/assignment1/
But it says "no such file or directory", so I want to know what is in the current directory. I tried ls and dir but get nothing.
ls and dir definitely work on a gcloud VM; it seems you might have missed a few of the steps for downloading the folder/data. Please check whether you have completed the first-time setup from http://cs231n.github.io/gce-tutorial/
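A quick sanity check inside the SSH session to see where you are and what actually exists:
pwd #print the current working directory
ls -la ~ #list everything in your home directory, including hidden files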
You can also use 'View gcloud command' by clicking the SSH dropdown on the VM instances page. Additionally, you can pass --project='project-name' to your gcloud ssh command.
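For example, with the zone and instance name from the question (my-project is a placeholder for your own project ID):
gcloud compute ssh --project=my-project --zone=us-west1-b cs231-vm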
Related
I am trying to authenticate to the gcloud SDK using: gcloud init.
I get a URL I'm supposed to access in order to copy a token and return it to the CLI, but instead of a token, I get this error:
Authorization error
Error 400: invalid_request
Missing required parameter: redirect_uri
Is this a bug?
gcloud version info:
Google Cloud SDK 377.0.0
alpha 2022.03.10
beta 2022.03.10
bq 2.0.74
bundled-python3-unix 3.8.11
core 2022.03.10
gsutil 5.8
I am running gcloud init on WSL2 (Ubuntu 18.04). This error occurs right after installing gcloud with sudo apt install google-cloud-sdk.
I had the same problem; gcloud has slightly changed the way its auth flow works.
Run gcloud auth login and then copy the whole output (not just the URL) to a terminal on a computer that has both a web browser and the gcloud CLI installed. The command you should copy looks like
gcloud auth login --remote-bootstrap="https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=****.apps.googleusercontent.com&scope=openid+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Faccounts.reauth&state=****&access_type=offline&code_challenge=****&code_challenge_method=S256&token_usage=remote"
When you run that on your computer that has a web browser, it will open a browser window and prompt you to log in. Once you authorize your app in the web browser you get a new URL in your terminal that looks like
https://localhost:8085/?state=****&code=****&scope=email%20openid%20https://www.googleapis.com/auth/userinfo.email%20https://www.googleapis.com/auth/cloud-platform%20https://www.googleapis.com/auth/appengine.admin%20https://www.googleapis.com/auth/compute%20https://www.googleapis.com/auth/accounts.reauth&authuser=0&hd=****&prompt=consent
Paste this new URL back into the prompt on your headless machine after "Enter the output of the above command:" (in your case, this would be in your WSL2 terminal). Press Enter and you get the output
You are now logged in as [****].
Your current project is [None]. You can change this setting by running:
$ gcloud config set project PROJECT_ID
[8]+ Done code_challenge_method=S256
Try
gcloud init --console-only
Then you will get a URL which will work.
You must log in to continue. Would you like to log in (Y/n)? y
WARNING: The --[no-]launch-browser flags are deprecated and will be removed on June 7th 2022 (Release 389.0.0). Use --no-browser to replace --no-launch-browser.
Go to the following link in your browser:
https://accounts.google.com/o/o....
Update 2022-06-20: the --console-only option was removed in version 389.0.0.
So instead use
gcloud init --no-browser
There are some workarounds and they depend on your particular Windows environment.
In this post and in this one you can check the issues most closely related to gcloud running in WSL.
Here you can find some related Google Groups threads that might be helpful.
Finally, you could check some Windows troubleshooting guides that can help with WSL2 issues in your own environment.
EDIT:
It seems this answer and the one from K.I. give other commands that don't rely on implementation details. I've tested these 3 commands:
gcloud init --console-only
gcloud auth login --no-launch-browser
gcloud init --no-launch-browser
Original answer, another workaround (17/07/2022):
DISPLAY=":0" gcloud auth login
is a workaround mentioned in this issue. Instead of requiring you to install the gcloud CLI outside WSL2, it pretends there is a browser.
A link is printed; click it, log in with your browser, and you're authenticated with the CLI.
Then run gcloud init again.
You can avoid the error by using another method of gcloud installation:
curl https://sdk.cloud.google.com | bash
exec -l $SHELL #restart shell
gcloud init
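Once the shell restarts, you can confirm the CLI is on your PATH before re-running the init flow:
gcloud version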
I am trying to host this static website on Google App Engine and I am stuck on this crucial part of the process:
-bash: gcloud: command not found
I get into Google Cloud Platform, then log in over SSH and look for the files; when I try to deploy, nothing happens. The two main files in this equation are app.yaml and www (www containing the HTML and other files). I am grabbing a file with HTML and making it the index.html. The index.html is what you see when you open the website after deploying the files (with the command "gcloud app deploy"). After a couple of other steps, it becomes available to view as the static website.
I have been trying to find a solution for a few hours now.
Here is what the terminal looks like right now when trying to deploy:
vergil11$ cd Files
vergil11$ ls
websitegc
vergil11$ cd websitegc
vergil11$ ls
app.yaml IMD233 Files README.md www
vergil11$ gcloud app deploy
-bash: gcloud: command not found
vergil11$
Any help is appreciated, thanks.
You need to add gcloud to your %PATH% (Windows) or $PATH (Linux/Mac).
See here for Mac
Or "How To Install Google Cloud GCP Command Line Utility gcloud ?" for Windows.
Here for Linux, modifying your ~/.profile
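For example, a minimal sketch for Linux, assuming the SDK was unpacked to ~/google-cloud-sdk (adjust the path to wherever you installed it); add this line to your ~/.profile and log in again or source the file:
export PATH="$HOME/google-cloud-sdk/bin:$PATH" #make the gcloud binary discoverable
The SDK tarball also ships a helper that does the same thing: source ~/google-cloud-sdk/path.bash.inc.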
I have multiple projects in Google Cloud and I need to find out the unused external IP addresses in all the projects. I have a query which works for one project, but is there a way to run a query against all projects together?
I am trying to avoid the time and effort of switching projects every time.
Command to extract reserved public IPs in a single project: gcloud compute addresses list --filter=status:reserved
For a process like this, it would be better to create a script that runs this for you! One great thing about gcloud commands is that they can be used in shell scripts to make things like this possible!
Open Cloud Shell in GCP, create a file called "script.sh" and write something like this to the file...
#The loop below runs an action for every project in the project list
for project in $(gcloud projects list --format='value(project_id)');
do
#This gcloud command runs once for each project in the list
echo "$(gcloud compute addresses list --project="$project" --filter=status:reserved)"
#all output is appended to output.csv
done >> output.csv
Once this is done, make sure to grant yourself permission to run the script by typing...
chmod 755 script.sh
then run the script...
./script.sh
Let me know if this helps! Comment to this answer if you need more clarification or help!
I would like to know if there is a way to rename an existing gcloud configuration, e.g. I would like to rename 'foo' to 'bar' in the example below.
I couldn't find anything on this in the gcloud reference documents.
Technically, it is not possible to change the name of that configuration using the gcloud command.
However, you can change it with this little workaround:
Use gcloud config configurations activate [YOUR_CONFIG_NAME] to activate the configuration you wish.
Use gcloud info --format='get(config.paths.active_config_path)' to find the directory where your configurations are stored. You will get the path of the file for that specific configuration, looking like this: /tmp/tmp.XAfddVDdg/configurations/[YOUR_CONFIG_NAME]
If you cd into the directory /tmp/tmp.XAfddVDdg/configurations/, you will find all your configurations there. Every configuration file is named like this: config_[YOUR_CONFIG_NAME]. Changing the part of the filename that matches the name of your configuration will successfully rename it. DO NOT delete the config_ prefix.
After this, if you print all the configurations using gcloud config configurations list, you will find your configuration renamed, but none will be active. Just activate it with gcloud config configurations activate [YOUR_CONFIG_NAME], and you will be good to go.
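Putting those steps together, a sketch of the manual rename from 'foo' to 'bar' (the configurations directory is read from gcloud itself, since the path varies per machine):
CONFIG_DIR="$(dirname "$(gcloud info --format='get(config.paths.active_config_path)')")"
mv "$CONFIG_DIR/config_foo" "$CONFIG_DIR/config_bar" #keep the config_ prefix
gcloud config configurations activate bar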
Don't know when this was added, but there is now a rename command for configurations. So there is no more need to jump through hoops by deleting and recreating configurations or directly editing the file.
gcloud config configurations rename CONFIGURATION_NAME --new-name=NEW_NAME
https://cloud.google.com/sdk/gcloud/reference/config/configurations/rename
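For the example in the question, that would be:
gcloud config configurations rename foo --new-name=bar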
I am trying to copy files from my instance to my local directory using the following command:
gcloud compute scp <instance-name>:~/<file-name> ~/Documents/
However, it is showing the error mentioned below:
$USER/Documents/: Is a directory
ERROR: (gcloud.compute.scp) [/usr/bin/scp] exited with return code [1].
Copying from the local directory to GCE works fine.
I have checked Stanford's tutorial and Google's documentation as well.
I have another instance where there is no issue like this.
I somewhat believe it might be an issue with SSH keys.
What might have gone wrong?
Your command is correct if your source and destination paths are correct.
The command as you've posted in your question works for me when copying a file from the Google Compute Engine VM to my local machine.
$ gcloud compute scp vm1:~/.bashrc ~/Documents/
.bashrc 100% 3515 3.4KB/s 00:00
I also tried the copy from the other side (i.e. from my local machine to the GCE VM) and it works:
$ gcloud compute scp ~/Documents/.bashrc vm1:~/temp/
.bashrc 100% 3515 3.4KB/s 00:00
$ gcloud compute scp ~/Documents/.bashrc vm1:~/.bashrc-new
.bashrc 100% 3515 3.4KB/s 00:00
gcloud relies on the scp executable present in your PATH. The arguments you provide to gcloud compute scp are passed through to the scp binary. Assuming your source and destination paths are correct, it should work.
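You can check which binary that is (the error in the question shows gcloud invoking /usr/bin/scp):
which scp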
Recursive copying using scp
Based on your particular error message though, I've seen that variation appear only when the source path you're trying to copy is a directory instead of a file. For that particular case, you can pass the --recurse argument (similar to the -r argument supported by regular scp), which will recursively copy all files and directories under the specified directory.
gcloud compute scp --recurse SRC_PATH DEST_PATH
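For example, to pull a whole remote directory down into ~/Documents (reusing vm1 from the examples above; the directory name is a placeholder):
gcloud compute scp --recurse vm1:~/myFolder ~/Documents/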
To copy files from the VM to your desktop, you can simply SSH into the VM; in the top-right corner there is a settings button where you will find the download-file option. Just enter the path of the file.
If it is a folder, first zip the folder, then download it.
Everything was perfect, except that I was trying to run these commands in the terminal connected to the GCE instance instead of my local terminal.
oyashi@oyashi-torch-instance:~$ gcloud compute scp oyashi-torch-instance:~/spring1617_assignment1.zip ~/Documents/
/home/oyashi/Documents/: Is a directory ERROR: (gcloud.compute.scp)
[/usr/bin/scp] exited with return code [1].
But when I tried it from my local terminal, this happened:
oyashi@oyashi:~/Documents$ gcloud compute scp oyashi-torch-instance:~/spring1617_assignment1.zip ~/Documents/
spring1617_assignment1.zip 100% 42KB 42.0KB/s 00:00
Thank you everyone for your comments and help. I know it's a silly mistake on my end, but I posted this answer so that others might learn from my silliness.
If you need to pass the zone and project name, you can do it as follows (this worked for me);
the instance name is the name you chose in the GCP instances page.
gcloud beta compute scp --project "project_name" --zone "zone_name" instance_name:~jupyter/file_name /home/Downloads
I met the same problem. The point is that you should run the scp command from a local terminal, rather than the cloud terminal.
For copying a file to a local machine from an Ubuntu VM:
For example, you have an instance named bhk.
Run a basic nginx server, copy the files you want into /var/www/html (nginx's serving directory), and then from your local machine simply run wget <vm's IP>/<your file path>
For example, if my VM's IP is 1.2.3.4 and I want to copy /home/me/myFolder/myFile, then I simply copy this file into /var/www/html
and then run wget 1.2.3.4/myFile
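A sketch of those steps end to end, assuming nginx is already installed and serving from /var/www/html:
sudo cp /home/me/myFolder/myFile /var/www/html/ #on the VM
wget 1.2.3.4/myFile #on the local machine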
This works for me:
gcloud compute scp --project "my-project" ./my-file.zip user@instance-1:~
--project - google cloud project name
my-file.zip - local file to send to VM
user - vm linux username
instance-1 - instance name (vm name)
~ - instance destination path
I use the command below to upload a directory from my local machine to a remote directory:
gcloud compute scp --recurse myweb-app/www/* user@instance-name:/var/www/html/sub-sites/myweb-app/www/