firebase storage: changing a folder (file path) name with gsutil - google-cloud-platform

I have folders of media files inside gamelists/ and want to rename that folder to games/. I ran the following command and it did not change anything:
gsutil mv gs://bucket/gamelists/ gs://bucket/games/
Is there any way that I can just rename the main folder?

As Doug Stevenson suggested in the comment above, please follow the recommendations below to address the issue.
The gsutil mv command allows you to move data between your local file system and the cloud, move data within the cloud, and move data between cloud storage providers.
You can use the gsutil mv command to rename all objects with a given prefix to have a new prefix.
gsutil mv gs://my_bucket/oldprefix gs://my_bucket/newprefix
If you have a large number of files to move, you might want to use the gsutil -m option to perform a multi-threaded/multi-processing move:
gsutil -m mv gs://my_bucket/oldprefix gs://my_bucket/newprefix
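Applied to the rename in the question (assuming the bucket really is named bucket, as in the original command), this would be:
gsutil -m mv gs://bucket/gamelists gs://bucket/games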
Reference Document

Related

copy multiple folders of similar names from google cloud to google colab

I have a google cloud bucket which has 8 subfolders named subset0 to subset7. I want to copy all of them to google colab. Right now I am using code like
!gsutil -m cp -r gs://mybucket/datafolder/subset0 datafolder/
to copy each folder separately. I am not sure how I can write a for loop to copy all folders without repeating the same line 7 times. Thanks a lot!!
As @FerreginaPelona mentioned in the comments, you can use gsutil -m cp -r gs://mybucket/datafolder/subset* datafolder/ if your gs://mybucket/datafolder/ only contains subset0 to subset7 and no other subfolders.
However, if your source bucket path has other subfolders and you only want to specify your needed subfolders, you may put your subfolders in a list and use a for loop as shown below.
from google.colab import auth
auth.authenticate_user()

# Copy each needed subfolder from the Google Cloud Storage bucket.
subfolder_list = ["subset0","subset1","subset2","subset3","subset4","subset5","subset6","subset7"]
for subfolder in subfolder_list:
  !gsutil -m cp -r gs://mybucket/datafolder/{subfolder} /datafolder
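Alternatively, since the subfolder names follow a simple numeric pattern, a plain shell loop in a single Colab cell avoids listing them by hand (a sketch using the same bucket path and destination as above):
!for i in 0 1 2 3 4 5 6 7; do gsutil -m cp -r gs://mybucket/datafolder/subset$i /datafolder; done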

How do I change the extension of multiple files in Cloud Storage?

How do I change the extension of multiple files in GCP's Cloud Storage?
For example, in the following directory:
gs://[bucket-name]/filesdirectory/
I want to change the files with extension .ipynb to .py.
You can do it using gsutil, which you can install on your local machine or use directly from Cloud Shell.
gsutil commands are very similar to the Linux CLI. You can use the gsutil mv command to achieve this, but since you can't do a wildcard-to-wildcard rename with it, you have to use something similar to this:
IFS=$'\n'
gsutil ls "gs://your-bucket/*.ipynb" | while read x; do gsutil mv "$x" "$(echo "$x" | sed "s/\.ipynb$/.py/")"; done
I'm not a shell expert so probably this can be improved, but here's an explanation:
gsutil ls uses a wildcard to return the files you want to rename
the while loop reads each listed object path into a variable
gsutil mv + sed replace the file extension and rewrite the object under the new name
This is like "rewriting" the files entirely since gcs objects are immutable, so there are probably a few considerations that you should keep in mind, although this might not be your case:
if you have ACLs rules specified for those files, you have to use the
-p flag to pass them on to the new files
these are operations for GCS, implying costs based on your storage class. (since mv is actually copy + delete, if you are on
nearline or coldline, you could have additional early deletion fees)
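For example, a single rename that preserves the object's ACLs would look like this (the object names here are placeholders):
gsutil mv -p gs://your-bucket/notebook.ipynb gs://your-bucket/notebook.py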
hope this helps :)
As mentioned in the documentation, you can rename the files in your GCS buckets using the Console, the gsutil command, client library code, or the REST APIs.
The gsutil command you should use is the following:
gsutil mv gs://[BUCKET_NAME]/[OLD_OBJECT_NAME] gs://[BUCKET_NAME]/[NEW_OBJECT_NAME]
Furthermore, if you want to change more than one file, I would suggest using a script so the rename runs for each file you need to change.
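A minimal sketch of such a script, assuming the directory from the question and that every object to rename ends in .ipynb:
gsutil ls "gs://[bucket-name]/filesdirectory/*.ipynb" | while read obj; do gsutil mv "$obj" "${obj%.ipynb}.py"; done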

How to download an entire bucket in GCP?

I have a problem downloading an entire folder in GCP. How should I download the whole bucket? I run this code in the GCP Shell Environment:
gsutil -m cp -R gs://my-uniquename-bucket ./C:\Users\Myname\Desktop\Bucket
and I get an error message: "CommandException: Destination URL must name a directory, bucket, or bucket subdirectory for the multiple source form of the cp command. CommandException: 7 files/objects could not be transferred."
Could someone please point out the mistake in the code line?
To download an entire bucket, you must install the Google Cloud SDK and then run this command:
gsutil -m cp -R gs://project-bucket-name path/to/local
where path/to/local is the path to local storage on your machine
The error lies within the destination URL as specified by the error message.
I run this code in GCP Shell Environment
Remember that you are running the command from the Cloud Shell and not in a local terminal or Windows Command Line. Thus, it is throwing that error because it cannot find the path you specified. If you inspect the Cloud Shell's file system/structure, it resembles that of a Unix environment, in which you would specify the destination like this instead: ~/bucketfiles/. Even a simple gsutil -m cp -R gs://bucket-name.appspot.com ./ will work, since Cloud Shell can identify ./ as the current directory.
A workaround to this issue is to perform the command on your Windows Command Line. You would have to install Google Cloud SDK beforehand.
Alternatively, this can also be done in Cloud Shell, albeit with an extra step:
Download the bucket objects by running gsutil -m cp -R gs://bucket-name ~/ which will download them into the home directory in Cloud Shell
Transfer the files from the ~/ (home) directory of Cloud Shell to the local machine, either through the User Interface or by running gcloud alpha cloud-shell scp (see the sketch below)
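A rough sketch of that second step for a single file (the paths are placeholders, and the command is in alpha, so check gcloud alpha cloud-shell scp --help for the exact syntax):
gcloud alpha cloud-shell scp cloudshell:~/bucket-name/some-file.txt localhost:~/Downloads/some-file.txt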
Your destination path is invalid:
./C:\Users\Myname\Desktop\Bucket
Change to:
/Users/Myname/Desktop/Bucket
C: is a reserved device name. You cannot specify reserved device names in a relative path. ./C: is not valid.
There is not a one-button solution for downloading a full bucket to your local machine through the Cloud Shell.
The best option for an environment like yours (only using the Cloud Shell interface, without gcloud installed on your local system), is to follow a series of steps:
Downloading the whole bucket on the Cloud Shell environment
Zip the contents of the bucket
Upload the zipped file
Download the file through the browser
Clean up:
Delete the local files (local in the context of the Cloud Shell)
Delete the zipped bucket file
Unzip the bucket locally
This has the advantage of only having to download a single file on your local machine.
This might seem a lot of steps for a non-developer, but it's actually pretty simple:
First, run this on the Cloud Shell:
mkdir /tmp/bucket-contents/
gsutil -m cp -R gs://my-uniquename-bucket /tmp/bucket-contents/
pushd /tmp/bucket-contents/
zip -r /tmp/zipped-bucket.zip .
popd
gsutil cp /tmp/zipped-bucket.zip gs://my-uniquename-bucket/zipped-bucket.zip
Then, download the zipped file through this link: https://storage.cloud.google.com/my-uniquename-bucket/zipped-bucket.zip
Finally, clean up:
rm -rf /tmp/bucket-contents
rm /tmp/zipped-bucket.zip
gsutil rm gs://my-uniquename-bucket/zipped-bucket.zip
After these steps, you'll have a zipped-bucket.zip file in your local system that you can unzip with the tool of your choice.
Note that this might not work if you have too much data in your bucket and the Cloud Shell environment can't store all the data, but you could repeat the same steps on folders instead of buckets to have a manageable size.

How to transfer all storage from google cloud to local storage

I would like to export all of the images, videos, and data that I have in my Google storage to my local directory, since I am canceling my subscription. But there is no proper documentation on doing that; I found how to transfer from one service provider to another, but not how to export.
https://cloud.google.com/storage-transfer/docs/how-to?authuser=4
That's the only documentation I found, but it doesn't mention how to transfer locally.
If you install gsutil, you can use the cp command like this:
gsutil -m cp -r gs://YOUR_BUCKET_NAME/*.* .
Use -m to perform a parallel copy in case of a large number of files. Use -r to also copy the contents of subdirectories. Then, *.* is a wildcard for "any filename with an extension", and the . at the end will download everything into the directory where you are running gsutil. You can find help about these flags here
Repeat this for all the buckets you may have, and you are set.
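If you have several buckets, a quick sketch of looping over all of them (gsutil ls with no arguments lists the buckets your account can see; the local directory name is just an example):
mkdir -p local-backup
for b in $(gsutil ls); do gsutil -m cp -r "$b" local-backup/; done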
Another alternative would be using either the libraries or the API, but that would require more set up from you, while gsutil is 'easier' to do.

Google GSutil create folder

How can you create a new folder inside a bucket in Google Cloud Storage using the gsutil command?
I tried using the same command in creating bucket but still got an error
gsutil mb -l us-east1 gs://my-awesome-bucket/new_folder/
Thanks!
The concept of directory is abstract in Google Cloud Storage. From the docs (How Subdirectories Work) :
gsutil provides the illusion of a hierarchical file tree atop the "flat" name space supported by the Google Cloud Storage service. To the service, the object gs://your-bucket/abc/def.txt is just an object that happens to have "/" characters in its name. There is no "abc" directory; just a single object with the given name.
So you cannot "create" a directory like in a traditional File System.
If you're clear about what folders and objects already exist in the bucket, then you can create a new 'folder' with gsutil by copying an object into the folder.
>mkdir test
>touch test/file1
>gsutil cp -r test gs://my-bucket
Copying file://test\file1 [Content-Type=application/octet-stream]...
/ [1 files][ 0.0 B/ 0.0 B]
Operation completed over 1 objects.
>gsutil ls gs://my-bucket
gs://my-bucket/test/
>gsutil ls gs://my-bucket/test
gs://my-bucket/test/file1
It won't work if the local directory is empty.
More simply:
>touch file2
>gsutil cp file2 gs://my-bucket/new-folder/
Copying file://test\file2 [Content- ...
>gsutil ls gs://my-bucket/new-folder
gs://my-bucket/new-folder/file2
Be aware of the potential for Surprising Destination Subdirectory Naming, e.g. if the target directory already exists as an object. For an automated process, a more robust approach would be to use rsync.
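A short sketch of the rsync form, reusing the local test directory and bucket from the example above:
gsutil rsync -r test gs://my-bucket/test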
I don't know if it's possible to create an empty folder with gsutil. For that, use the console's Create Folder button.
You cannot create folders with gsutil as gsutil does not support it (workaround see below).
However, it is supported via:
UI in browser
write your own GCS client (we have written our own custom client which can create folders)
So even though Google Cloud Storage has a flat namespace, as the other answer correctly points out, it is still possible to create a single folder as an individual object. Unfortunately gsutil does not expose this.
(Ugly) workaround with gsutil: add a dummy file to the folder and upload that dummy file - but the folder will be gone once you delete the file, unless other files in that folder are present.
Copied from Google cloud help:
Copy the object to a folder in the bucket
Use the gsutil cp command to create a folder and copy the image into it:
gsutil cp gs://my-awesome-bucket/kitten.png gs://my-awesome-bucket/just-a-folder/kitten3.png
This works.
You cannot create a folder with gsutil on GCS.
But you can copy an existing folder with gsutil to GCS.
To copy an existing folder with gsutil to GCS, the folder must not be empty and the "-r" flag is needed, as shown below; otherwise you will get an error if the folder is empty or you forget the -r flag:
gsutil cp -r <non-empty-folder> gs://your-bucket
# the -r flag is needed for folders
You cannot create an empty folder with mb, since mb only creates buckets.