Google Cloud - Downloading data from bucket to instance - google-cloud-platform

I'm trying to download all the data from my bucket (tracking-data) on Google Cloud to my instance (instance-1), which runs Linux.
I see some options here:
https://cloud.google.com/compute/docs/instances/transfer-files#transfergcloud
but I'm not sure any of them covers downloading from a bucket to an instance.
I'm accessing my instance through my terminal and have made a few attempts with gsutil, but without success so far.
Any idea how I can download the whole bucket into my instance? (Preferably into MDNet/data; I don't have such a directory yet, but that's probably where I should store the data.)

First of all, check the API access rights of your Compute Engine service account: for downloading, read-only access to Storage is enough.
Then, just use gsutil cp (doc) or even gsutil rsync (doc):
gsutil -m cp -r gs://<your-bucket>/* <destination_folder>
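For the layout mentioned in the question, a minimal sketch (assuming the tracking-data bucket and the MDNet/data target directory from the question) would be:
# Create the target directory on the instance, then copy the whole bucket into it
mkdir -p ~/MDNet/data
gsutil -m cp -r gs://tracking-data/* ~/MDNet/data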

Use gsutil cp or gsutil rsync
https://cloud.google.com/storage/docs/gsutil/commands/cp
https://cloud.google.com/storage/docs/gsutil/commands/rsync
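For example, a sketch of mirroring a bucket into a local directory with rsync (bucket and folder names are placeholders):
# -m parallelizes, -r recurses into subdirectories; add -d to also delete local files missing from the bucket
gsutil -m rsync -r gs://<your-bucket> <destination_folder>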

Adding a full stop after a space at the end of the command helped me, i.e. ending it with " ." so the current directory is used as the destination.

Related

Problems deleting GCP storage bucket

I have two buckets on GCP that I can't delete; I've tried through the UI. I can't find any support info from GCP and can't delete them. They are relatively large and I want to avoid the cost as well. Do I need to delete the project?
Thanks for any help!
As previously posted here, this process may actually take a while, so perhaps just wait a little bit.
But if it still doesn't work, then perhaps you can follow what was suggested.
As a workaround to the UI being unclear, you can use gsutil to remove all files in a bucket, followed by the bucket itself, using gsutil rm -r gs://bucket.
Or, for a parallel deletion, gsutil -m rm -r gs://bucket (note that -m is a top-level gsutil option, so it goes before rm).
Try to delete the GCP bucket using the UI (check that you have access to delete GCP buckets).
Otherwise, you can delete the GCP bucket using gsutil:
# Set the project name
gcloud config set project <Project_Name>
# Delete the bucket and all objects in it, in parallel
gsutil -m rm -r gs://<Bucket-Name>
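If you want a sanity check around the deletion, something along these lines should work (bucket name is a placeholder):
# See how much data the bucket holds before deleting it
gsutil du -sh gs://<Bucket-Name>
# Confirm afterwards that the bucket no longer appears
gsutil ls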

Fetch content in AWS S3 public bucket from GCP Data Storage

I am trying to fetch the content of the bucket s3://open-images-dataset into GCP Cloud Storage, through either gsutil or the transfer service. In the case of the command-line alternative, I am using the following command:
gsutil -m -o GSUtil:parallel_composite_upload_threshold=150M cp -r --no-sign-request s3://open-images-dataset gs://<bucket-name>
The problem here is that s3://open-images-dataset is public, and one would usually pass --no-sign-request when downloading it to a local directory. However, as far as I have been able to see, GCP doesn't offer any option for this. Any idea about that problem?
I cannot download it to my local machine first because the content of the bucket is too big.
It is not possible at the moment, but a PR has been submitted to the Boto library.
gsutil uses the Boto library to handle communicating with S3. After a bit of digging through the code, it seems Boto allows specifying that an individual connection should be anonymous... but it looks like it would require patching the Boto library to make all S3 connections for a given session be anonymous (i.e. setting a Boto config option like "no_sign_request = True" under the [s3] section).
When I try to list that bucket with AWS credentials set, via gsutil ls s3://open-images-dataset, the signed request succeeds. Given that it works, is there any particular reason you don't want the request to be signed?
Edit
I submitted this pull request to add support for no_sign_request in Boto:
https://github.com/boto/boto/pull/3833
It will be in the next version of Boto, whenever they decide to release it. At that point, gsutil can grab the new version and include it in a subsequent release.
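Once that lands, the Boto configuration the answer describes would presumably look something like this in your ~/.boto file (option name as per the pull request above):
[s3]
# Send all S3 requests anonymously, without signing
no_sign_request = True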

GCP - Google Cloud Platform: Is there any way to push code from Eclipse to the cloud without any version control system?

I'm new to GCP, was exploring some features, and need to find a way to push local files to a GCP path without any version control.
I know the following way to push code to GCP:
1) Using the gsutil cp command, copy local file changes to a GCP bucket.
Eg: gsutil -m cp -r sourcepath gs://bucket-id
2) Copy code from the GCP bucket to the desired path.
Eg: gsutil cp -r gs://bucket-id/xyz /path
Is there any alternative way to achieve this using Eclipse, i.e. to transfer files directly from the local system to a GCP path? Or is there an Eclipse plugin to achieve the same?
Any help is highly appreciated.
Why would you need to copy the files directly to the gcloud shell directory?
If it is for a backup, the bucket is already the best tool for that. If it is for deployment, you can already use the App Engine Standard plugin for Eclipse.
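If you'd rather skip the plugin entirely, the command-line equivalent of that deployment (a sketch, assuming an App Engine Standard project with an app.yaml in the current directory) would be roughly:
# Deploy the app described by app.yaml to App Engine
gcloud app deploy app.yaml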

How to transfer a file from Google Cloud Storage to Compute Engine instance using gcloud terminal?

I would like to verify my website in Google Search Console by uploading a file to my website, example.com, which is hosted on Google Cloud Compute Engine (a WordPress site deployed with Google Cloud Launcher). It has to be placed on the website as example.com/123googlesearchconsoleexample.html, but I cannot upload it via Cloud Shell from a Google Cloud Storage bucket.
Is there a way to do this?
I tried the gcloud compute copy-files command:
gcloud compute copy-files gs://example/123googlesearchconsoleexample.html example-prd-vm:~/
but it seems that it does not work in the browser, and this is how I would like to do it (without installing the SDK).
The error I get is:
(gcloud.compute.copy-files) All sources must be local files when destination is remote.
I don't know if there's a way of doing it in a single step, but it looks like it should be easy to do in two steps:
$ gsutil cp gs://example/123googlesearchconsoleexample.html .
$ gcloud compute copy-files 123googlesearchconsoleexample.html example-prd-vm:~/
The first command copies from Google Cloud Storage to the VM running Cloud Shell. The second command then copies from that VM to example-prd-vm... it's now copying a local file, so the error you're currently getting won't be applicable.
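If a genuine single step is preferred, one alternative sketch (assuming gsutil is available on the target VM, as it is on standard Compute Engine images) is to run gsutil on the instance itself over SSH:
$ gcloud compute ssh example-prd-vm --command='gsutil cp gs://example/123googlesearchconsoleexample.html ~/'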
First list all the buckets you have using the following command
gsutil ls
Use the following command to download a file from your Google Cloud Storage bucket to your local location
gsutil cp gs://bucketname/filename local-location
You can use the -r option to download a folder from GCS
gsutil cp -r gs://bucketname/folder-name local-location
You can also use the -m option to download a large number of files, as it performs a parallel (multi-threaded/multi-processing) copy
gsutil -m cp -r gs://bucketname/folder-name local-location
In order to copy one file:
gsutil -m cp "gs://yourbucketname/folder/../filename.txt" local-location
In order to copy a whole folder and all its contents:
gsutil -m cp -r "gs://yourbucketname/folder" local-location

How to download files from a Google Cloud bucket to a local Windows machine? Please give me an example

I tried to run the command below, but it did not work for me.
gsutil -m cp gs://bkname/. c:/Download/
gsutil cp gs://bucketname/filename c:/Download/ works for me.
And make sure your account is authorized to access the bucket.
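For what it's worth, gs://bkname/. in the failing command points at an object literally named ".", which is probably why it failed; a recursive copy of the bucket's contents (bucket name as in the question) should work instead:
gsutil -m cp -r gs://bkname/* c:/Download/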