gsutil upload specific extension file to gcp gcs - google-cloud-platform

Can't upload all files with the extension .css in the directory or sub-directories to the GCS bucket:
gsutil -h "Cache-Control:public,max-age=2628000" -h "Content-Encoding:gzip" cp plugins/**.css gs://cdn.test.example.io/wp-content/plugins
Response:
CommandException: No URLs matched: plugins/**.css
There are some CSS files deep in the directory tree.
I want to upload all CSS files inside the plugins folder, or any of its sub-folders, to the GCS bucket.

The documentation describes this feature, and it works as expected.
For my test, I used Cloud Shell with the latest version of gsutil:
> gsutil -v
gsutil version: 4.52
Check your version, and update it to see if that solves your issue.
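If gsutil was installed as part of the Google Cloud SDK (an assumption about your setup), one common way to update it is:
gcloud components update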
However, you can also do this to upload only the files of the plugins directory:
cd plugins
gsutil -h "Cache-Control:public,max-age=2628000" -h "Content-Encoding:gzip" cp **.css gs://cdn.test.example.io/wp-content/plugins
If you want to scan any directory tree and upload the CSS files it contains, you can use this command:
gsutil cp ./**/*.css gs://cdn.test.example.io/wp-content/plugins
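Putting the two together, a sketch of uploading every CSS file under plugins/ while keeping the original cache and encoding headers might look like this (bucket name and headers are reused from the question; quoting the wildcard keeps the shell from expanding it before gsutil sees it):
gsutil -h "Cache-Control:public,max-age=2628000" -h "Content-Encoding:gzip" \
  cp './plugins/**/*.css' gs://cdn.test.example.io/wp-content/plugins
One caveat: a wildcard copy like this generally flattens the matched files into the destination rather than recreating the sub-folder layout, so gsutil rsync -r may be worth a look if the structure matters.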

Related

How exactly do I use gsutil to download a Google Cloud Storage bucket to a local disk?

I am trying to download a full bucket from my Google Cloud Storage. I am using gsutil and the Cloud Shell terminal.
My current piece of code receives an error: "CommandException: Destination URL must name a directory, bucket, or bucket subdirectory for the multiple source form of the cp command."
The code is:
gsutil -m cp -r gs://googleBucket D:\GOOGLE BACKUP
where googleBucket is the bucket and D:\GOOGLE BACKUP is the path to my desired download location. Am I missing something here?
Any help is appreciated.
P.S. I am in no way tech savvy, and most of this is new to me.
Download it this way first:
gsutil -m cp -r gs://googleBucket .
The . downloads it to the current directory. Do an ls and you will see the download.
Then go to the three-dots menu and download the files locally. The three-dots menu is to the right of Open Editor.
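A minimal sketch of the whole sequence inside Cloud Shell, reusing googleBucket from the question and using google-backup as a placeholder folder name:
mkdir -p ~/google-backup
gsutil -m cp -r gs://googleBucket ~/google-backup
ls ~/google-backup
Then use the three-dots Download option to fetch the files to your own machine.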

How to exclude the particular folder in gcs bucket in google cloud while copying to local machine?

I am trying to copy the files and folders from Google Cloud Storage to a VM using the gsutil command, but I need to exclude a few of the folders in the GCS bucket while copying to the VM. I tried searching for the options but couldn't find them. Please help if anyone knows the command for this.
Thanks in advance.
For this you can use a command like:
gsutil -m rsync -r -x '^dir3/*' gs://bucket /path/to/local
This should copy all objects in the bucket to /path/to/local (a placeholder for the destination directory on your VM), except objects whose names begin with dir3, i.e. the files inside the dir3 directory in your example are skipped.
Here you can find more details about the rsync command
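If you need to skip more than one folder, the -x option takes a single Python regular expression, so several prefixes can be combined with alternation. A sketch, with dir3, dir4 and /path/to/local as placeholders:
gsutil -m rsync -r -x '^dir3/|^dir4/' gs://bucket /path/to/local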

How to download an entire bucket in GCP?

I have a problem downloading an entire bucket in GCP. How should I download the whole bucket? I run this code in the GCP Cloud Shell environment:
gsutil -m cp -R gs://my-uniquename-bucket ./C:\Users\Myname\Desktop\Bucket
and I get an error message: "CommandException: Destination URL must name a directory, bucket, or bucket subdirectory for the multiple source form of the cp command. CommandException: 7 files/objects could not be transferred."
Could someone please point out the mistake in the code line?
To download an entire bucket, you must first install the Google Cloud SDK,
then run this command:
gsutil -m cp -R gs://project-bucket-name path/to/local
where path/to/local is the path to local storage on your machine.
The error lies within the destination URL as specified by the error message.
I run this code in GCP Shell Environment
Remember that you are running the command from Cloud Shell and not in a local terminal or Windows command line. Thus, it is throwing that error because it cannot find the path you specified. If you inspect Cloud Shell's file system/structure, it resembles that of a Unix environment, in which you could specify the destination like this instead: ~/bucketfiles/. Even a simple gsutil -m cp -R gs://bucket-name.appspot.com ./ will work, since Cloud Shell can identify the ./ directory, which is the current directory.
A workaround to this issue is to perform the command on your Windows Command Line. You would have to install Google Cloud SDK beforehand.
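For example, from a local Windows Command Prompt with the Cloud SDK installed, something like the following should work (assuming the Desktop\Bucket folder already exists; a path containing spaces would also need to be quoted like this):
gsutil -m cp -R gs://my-uniquename-bucket "C:\Users\Myname\Desktop\Bucket"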
Alternatively, this can also be done in Cloud Shell, albeit with an extra step:
Download the bucket objects by running gsutil -m cp -R gs://bucket-name ~/, which will download them into the home directory in Cloud Shell
Transfer the files downloaded into the ~/ (home) directory from Cloud Shell to the local machine, either through the user interface or by running gcloud alpha cloud-shell scp, as sketched below
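A sketch of that second option. It assumes the Cloud SDK is installed on the local machine and that the bucket was downloaded into ~/bucketfiles inside Cloud Shell; the archive name is a placeholder:
# in Cloud Shell: bundle the downloaded files into a single archive
tar -czf ~/bucket-files.tar.gz -C ~/ bucketfiles
# on the local machine: pull the archive down from Cloud Shell
gcloud alpha cloud-shell scp cloudshell:~/bucket-files.tar.gz localhost:~/bucket-files.tar.gz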
Your destination path is invalid:
./C:\Users\Myname\Desktop\Bucket
Change to:
/Users/Myname/Desktop/Bucket
C: is a reserved device name. You cannot specify reserved device names in a relative path. ./C: is not valid.
There is not a one-button solution for downloading a full bucket to your local machine through the Cloud Shell.
The best option for an environment like yours (only using the Cloud Shell interface, without gcloud installed on your local system), is to follow a series of steps:
1. Download the whole bucket into the Cloud Shell environment
2. Zip the contents of the bucket
3. Upload the zipped file back to the bucket
4. Download the file through the browser
5. Clean up: delete the local files (local in the context of Cloud Shell) and delete the zipped bucket file
6. Unzip the bucket locally
This has the advantage of only having to download a single file on your local machine.
This might seem a lot of steps for a non-developer, but it's actually pretty simple:
First, run this on the Cloud Shell:
mkdir /tmp/bucket-contents/
gsutil -m cp -R gs://my-uniquename-bucket /tmp/bucket-contents/
pushd /tmp/bucket-contents/
zip -r /tmp/zipped-bucket.zip .
popd
gsutil cp /tmp/zipped-bucket.zip gs://my-uniquename-bucket/zipped-bucket.zip
Then, download the zipped file through this link: https://storage.cloud.google.com/my-uniquename-bucket/zipped-bucket.zip
Finally, clean up:
rm -rf /tmp/bucket-contents
rm /tmp/zipped-bucket.zip
gsutil rm gs://my-uniquename-bucket/zipped-bucket.zip
After these steps, you'll have a zipped-bucket.zip file in your local system that you can unzip with the tool of your choice.
Note that this might not work if you have too much data in your bucket and the Cloud Shell environment can't store it all, but you could repeat the same steps on individual folders instead of the whole bucket to keep the size manageable.
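For example, the first copy step could target a single folder (some-folder is a placeholder) instead of the whole bucket, and the remaining steps stay the same:
gsutil -m cp -R gs://my-uniquename-bucket/some-folder /tmp/bucket-contents/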

Downloading folder from GCS to local directory

I'm trying to download a folder from my Cloud Storage bucket to a local directory using the command gsutil cp -r gs://bucket/my_folder . (note the trailing dot), but it is showing OSError: Access is denied. Any idea how to get around this problem?
I can reproduce this error if I do not have permissions to create LOCAL_DEST_DIR on my local machine.
$ gsutil cp -r gs://BUCKET_NAME/DIR_IN_BUCKET LOCAL_DEST_DIR
Copying gs://BUCKET_NAME/DIR_IN_BUCKET/FILE...
OSError: Permission denied.
Please check you have permissions to create a file/directory in your current working directory.
You can run touch test-file.text to verify if you're able to create files in the current directory.
If you're on Linux/*nix/macOS, you will usually have full permissions to create files and directories in your $HOME directory, so you can try running the gsutil command there.
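A quick way to test this, reusing the placeholder names from above:
cd ~                                  # a directory you normally own
touch test-file.text                  # verify you can create files here
gsutil cp -r gs://BUCKET_NAME/DIR_IN_BUCKET .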

How to download all bucket files using gsutil with django-gae app

I want to download all bucket files using gsutil in my django non-rel gae app.
gsutil -m cp -R gs://"BUCKET NAME" .
Replace "BUCKET NAME" with the name of your bucket(no quotes) and don't forget the period at the end. If you want to specify a folder replace the period with the destination folder.
-m to perform a parallel (multi-threaded/multi-processing) copy
-R to copy an entire directory tree
More details can be found in the gsutil cp documentation.
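For example, to download everything into a local folder named backup instead of the current directory (the bucket and folder names here are placeholders):
mkdir -p backup
gsutil -m cp -R gs://my-bucket-name backup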