I have tried using this command, but it didn't work:
gcloud compute ssh user@ftp.example.com --zone my_zone \
--command='gsutil cp path/to/my_file gs://MY_BUCKET'
I even tried copying the files from the server via an HTTP request, with no results. I made the files publicly available and tried this command:
gsutil cp http://example.com/dir1/ gs://MY_BUCKET
I am getting errors saying that the http scheme is unrecognized.
I also have SSH access to the remote server.
gsutil cp allows you to stream in data it receives from stdin, per the documentation here: https://cloud.google.com/storage/docs/gsutil/commands/cp. Pasted below:
Streaming Transfers
Use '-' in place of src_url or dst_url to perform a streaming transfer. For example:
long_running_computation | gsutil cp - gs://my-bucket/obj
Streaming uploads using the JSON API (see gsutil help apis) are buffered in memory part-way back into the file and can thus retry in the event of network or service problems.
Using this, we can stream the scp copy from your remote server to standard output (if you're on Linux or macOS) and pipe it into gsutil cp like this:
scp <USER>@<YOUR_SERVER>:/<PATH>/<FILE_NAME> /dev/stdout | gsutil cp - gs://<BUCKET_NAME>/<FILE_NAME>
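For instance, with hypothetical placeholder values filled in (the user name and source path below are examples only; the host and bucket come from the question):
scp alice@ftp.example.com:/var/backups/my_file /dev/stdout | gsutil cp - gs://MY_BUCKET/my_file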
What you should be doing is not a regular copy, but an scp, i.e. a secure copy.
gcloud compute scp securely copies files between a virtual machine instance and your local machine using the scp command.
More information on all available switches and their descriptions can be found in the online docs.
Combining the other answers, to copy a file from a Compute Engine instance to Google Cloud Storage, you need the following command:
gcloud compute scp --zone [ZONE] [USER]@[INSTANCE_NAME]:/path/to/file /dev/stdout | gsutil cp - gs://[BUCKET_NAME]/[FILE_NAME]
If the file is located on a remote server other than a Compute Engine instance, you can use scp as Nicholas said.
Related
How to copy files from a remote server to a Google Cloud Storage bucket over SSH? That question outlines how to copy files from a Google Compute Engine instance to Google Cloud Storage, which is not what I need.
The files are too large to download to my local machine before uploading them. Is it possible to do something like this:
ssh -p 2455 skiventist@myremoteserver | gsutil cp -r /path/to/remote_folder gs://mybucket
Error:
CommandException: No URLs matched: /path/to/remote_folder
It doesn't matter if it's GCE or another server; they are both remote servers. You need to:
1- SSH into your remote server.
2- Make sure that you have gsutil installed and that you are authorized to manage the bucket.
3- Copy the files from that remote server to the bucket:
gsutil cp -r /path/to/remote_folder gs://mybucket
You can also use the -m flag, which performs a parallel (multi-threaded/multi-processing) copy and makes the transfer even faster.
Check this doc for more info.
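Putting those steps together from your local machine, a minimal sketch (this assumes gsutil is already installed and authorized on the remote server, and reuses the host, port and placeholder paths from the question):
ssh -p 2455 skiventist@myremoteserver 'gsutil -m cp -r /path/to/remote_folder gs://mybucket'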
I have a system built on top of Google's services; however, AWS seems to have a terrific setup for video utilities (https://aws.amazon.com/elastictranscoder/ and https://aws.amazon.com/mediaconvert/). Is it possible to send my users' videos from GCP to AWS and back again?
You can do it if you use Google Cloud Storage and Amazon S3 to store and exchange data between clouds.
Have a look at the gsutil command line documentation:
The gsutil tool lets you access Cloud Storage from the command line.
It can also be used to access and work with other cloud storage
services that use HMAC authentication, like Amazon S3. For example,
after you add your Amazon S3 credentials to the .boto configuration
file for gsutil, you can start using gsutil to manage objects in your
Amazon S3 buckets.
To do it, follow the Setting Up Credentials to Access Protected Data guide, then go to your ~/.boto file and find these lines:
# To add HMAC aws credentials for "s3://" URIs, edit and uncomment the
# following two lines:
#aws_access_key_id = <your aws access key ID>
#aws_secret_access_key = <your aws secret access key>
Fill in the aws_access_key_id and aws_secret_access_key settings with your S3 credentials and uncomment those lines.
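For illustration, the edited section of ~/.boto would then look something like this (the key values below are AWS's documented example placeholders, not real credentials):
[Credentials]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY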
After that, you'll be able to copy from S3 to GCS or vice versa:
gsutil cp -R s3://my-aws-bucket gs://my-gcp-bucket
If you have a large number of files to transfer you might want to use
the top-level gsutil -m option (see gsutil help options), to perform a
parallel (multi-threaded/multi-processing) copy:
gsutil -m cp -R s3://my-aws-bucket gs://my-gcp-bucket
For more information, check the gsutil cp documentation.
Also, you can use the gsutil rsync command to synchronize data between S3 and GCS:
gsutil rsync -d -r s3://my-aws-bucket gs://my-gcp-bucket
For more information, check the gsutil rsync documentation.
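Note that the -d option deletes objects at the destination that are no longer present at the source, so it can be worth doing a dry run first with the -n flag (a sketch with the same placeholder bucket names):
gsutil -m rsync -d -r -n s3://my-aws-bucket gs://my-gcp-bucket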
I'm trying to download all the data from my bucket (tracking-data) on Google Cloud to my instance (instance-1), which runs a Linux system.
I see some options here:
https://cloud.google.com/compute/docs/instances/transfer-files#transfergcloud
but I'm not sure any of them covers downloading from a bucket to an instance.
I'm accessing my instance through my terminal and have made a few attempts with gsutil, but without success so far.
Any idea how I can download the whole bucket onto my instance? (Preferably into MDNet/data; I don't have such a directory yet, but I should probably store the data there.)
First of all, check the API access rights (scopes) for your Compute Engine instance's service account; for downloading, the Storage scope set to Read Only, for instance, is enough.
Then, just use gsutil cp (doc) or even gsutil rsync (doc):
gsutil -m cp -r gs://<your-bucket>/* <destination_folder>
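Applied to the bucket and target directory from the question, a sketch (assuming you want everything under ~/MDNet/data, which is created first):
mkdir -p ~/MDNet/data
gsutil -m cp -r gs://tracking-data/* ~/MDNet/data/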
Use gsutil cp or gsutil rsync
https://cloud.google.com/storage/docs/gsutil/commands/cp
https://cloud.google.com/storage/docs/gsutil/commands/rsync
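For example, to mirror the whole bucket into the target directory from the question with rsync (a sketch; the local directory must exist first):
mkdir -p ~/MDNet/data
gsutil -m rsync -r gs://tracking-data ~/MDNet/data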
I would like to verify my website in Google Search Console by uploading a file to my website, example.com, which is hosted on Google Cloud Compute Engine (a WordPress site deployed with Google Cloud Launcher). The file has to be placed on the website as example.com/123googlesearchconsoleexample.html, but I cannot upload it via Cloud Shell from a Google Cloud Storage bucket.
Is there a way to do this?
I tried the gcloud compute copy-files command:
gcloud compute copy-files gs://example/123googlesearchconsoleexample.html example-prd-vm:~/
but it seems that it does not work in the browser, and this is how I would like to do it (without installing the SDK).
The error I get is:
(gcloud.compute.copy-files) All sources must be local files when destination is remote.
I don't know if there's a way of doing it in a single step, but it looks like it should be easy to do in two steps:
$ gsutil cp gs://example/123googlesearchconsoleexample.html .
$ gcloud compute copy-files 123googlesearchconsoleexample.html example-prd-vm:~/
The first command copies from Google Cloud Storage to the VM running Cloud Shell. The second command then copies from that VM to example-prd-vm... it's now copying a local file, so the error you're currently getting won't be applicable.
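To actually serve the file at example.com/123googlesearchconsoleexample.html you would then still need to move it into the web server's document root on the VM, for example (a sketch that assumes the common /var/www/html document root; the WordPress image deployed by Cloud Launcher may use a different path):
gcloud compute ssh example-prd-vm --command='sudo mv ~/123googlesearchconsoleexample.html /var/www/html/'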
First, list all the buckets you have using the following command:
gsutil ls
Use the following command to download a file from your Google Cloud Storage bucket to a local location:
gsutil cp gs://bucketname/filename local-location
You can use the -r option to download a folder from GCS:
gsutil cp -r gs://bucketname/folder-name local-location
You can also use the -m option to download a large number of files, which performs a parallel (multi-threaded/multi-processing) copy:
gsutil -m cp -r gs://bucketname/folder-name local-location
In order to copy one file:
gsutil -m cp "gs://yourbucketname/folder/../filename.txt" local-location
In order to copy a whole folder and all its contents:
gsutil -m cp -r "gs://yourbucketname/folder" local-location
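Similarly, gsutil understands wildcards, so you can copy just the objects matching a pattern (a sketch with placeholder names):
gsutil -m cp "gs://yourbucketname/folder/*.txt" local-location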
How would I copy files from a remote server to a google bucket? For example,
gcloud compute scp username@server:/path/to/file gs://my-bucket
This method gives the error: All sources must be local files when destination is remote.
Additionally, gsutil only provides support for cp and not scp.
Thanks in advance!
You can also directly execute the gsutil command on your GCE VM (most VM images have the Cloud SDK preinstalled). For example:
gcloud compute ssh user@server --zone my_zone \
--command='gsutil cp path/to/my_file gs://MY_BUCKET'
Note that for this to work, the service account associated with the VM must have an appropriate access scope for GCS. If you run
gcloud beta compute instances describe my_instance --zone my_zone \
--format="value(serviceAccounts.scopes)"
it will show the list of scopes set for the VM's service account. Make sure you have https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/devstorage.full_control, or https://www.googleapis.com/auth/devstorage.read_write. If not, you can use the set-scopes beta command to reset them, or go to the console and edit the VM in question.
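For instance, a sketch of the beta command (this assumes the set-scopes syntax available at the time; the instance must be stopped before its scopes can be changed, and you should check gcloud beta compute instances set-scopes --help for the exact flags):
gcloud beta compute instances set-scopes my_instance --zone my_zone --scopes=storage-rw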
gsutil supports streaming data, so you can try to pipe the data in.
I'm not 100% certain this works with arbitrary binary data (and I'm on a phone, so I can't test it):
gcloud compute ssh username@server --command='cat /path/to/file' | gsutil cp - gs://my-bucket/file
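One way to address that uncertainty is to compare checksums after the transfer (a sketch; note that gsutil reports the MD5 in base64 while md5sum prints hex, so convert one of them before comparing):
gcloud compute ssh username@server --command='md5sum /path/to/file'
gsutil ls -L gs://my-bucket/file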