Uploading data to a Google Cloud Storage bucket using gsutil - google-cloud-platform

I am uploading a locally stored 1.8 KB .mat file to Google Cloud Storage from MATLAB using the following gsutil command:
gsutil cp -f data.mat gs://mytestbucket/
Uploading data takes more than 2 seconds.
Copying file://data.mat [Content-Type=application/octet-stream]...
/ [0 files][ 0.0 B/ 1.8 KiB]
/ [1 files][ 1.8 KiB/ 1.8 KiB]
Operation completed over 1 objects/1.8 KiB.
Elapsed time is 2.686053 seconds.
I want the upload to complete in milliseconds. How can I reduce the upload time?
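For a file this small, the transfer itself is nearly instant; most of the elapsed time is likely gsutil's own startup overhead rather than the network. A quick way to check (a sketch, assuming a bash shell with gsutil installed):
# Time an invocation that transfers nothing; the startup cost alone
# is typically a second or more.
time gsutil version
If startup dominates, launching gsutil once per file from MATLAB will never reach millisecond latency; keeping a single long-lived uploader process avoids paying that cost on every call.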

Related

gsutil compose fails composing file larger than 250mb

To move a medium-sized blob (>250 MB) from one place to another in GCS (gsutil cp my_blob my_new_blob),
gsutil wants me to compose it:
so I run gsutil compose my_blob my_blob to compose it and get past this error, but then it just retries again and again until I finally get a
503 - We encountered an internal error - Try again
error.
Why is this happening? Is there a limit on the size of a file that can be composed, and why would that limit be only 250 MB?
Tried it on my end using these docs: Cloud Storage cp options.
$ gsutil -o "GSUtil:max_upload_compression_buffer_size=8G" -m cp -J filetest filtest_new
Copying file://filetest...
/ [1/1 files][300.0 MiB/300.0 MiB] 100% Done
Operation completed over 1 objects/300.0 MiB.
Tried to simplify it; the output is much the same, with slight changes:
gsutil -m cp filetest filtest_new
XXXXX@cloudshell:~ (XXXXX)$ gsutil -m cp filetest filtest_new2
Copying file://filetest...
/ [1/1 files][300.0 MiB/300.0 MiB] 100% Done
Operation completed over 1 objects/300.0 MiB.
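Two documented points of GCS behavior are worth noting here (background, not stated in the thread): a single compose request accepts at most 32 source objects, and composing is normally something gsutil performs automatically during parallel composite uploads once a file crosses a configurable threshold. A minimal sketch of both, with hypothetical object names part1 and part2 in a bucket my-bucket:
# Explicit compose: concatenate up to 32 existing objects (same bucket).
gsutil compose gs://my-bucket/part1 gs://my-bucket/part2 gs://my-bucket/combined
# Or let gsutil split and compose automatically above a chosen size.
gsutil -o "GSUtil:parallel_composite_upload_threshold=150M" cp my_blob gs://my-bucket/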

Gsutil Terminal Command not actually updating files in the Cloud Console

Linux Distro: Pop!_OS 19.10 Ubuntu
For context, I'm hosting a static website from a Google Cloud Storage bucket.
So I tried executing the command
gsutil -m cp -r ejsout gs://evaapp.xyz
to push to my storage bucket.
The command copies all the files successfully, returning
/ [333/333 files][ 19.7 MiB/ 19.7 MiB] 100% Done 998.4 KiB/s ETA 00:00:00
Operation completed over 333 objects/19.7 MiB.
but when I go and look at the bucket online, the files aren't overwritten.
I've waited hours, even days, and nothing changes in the Google Cloud Console. Only new files that didn't exist before show up in the bucket; existing files aren't updated. I have to update the files manually through the Cloud Console for them to change.
Any way to fix this? Feel free to close this question because I'm bad at this and couldn't find any helpful documentation. It's just annoying; I want to do the push from the terminal. Thanks!
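One likely explanation (an assumption here; the thread doesn't confirm it, but the rsync question below hit the same symptom): the files are being overwritten, but publicly readable objects are served with a default Cache-Control of public, max-age=3600, so browsers and the website endpoint keep showing the old content for up to an hour. Setting the header at upload time sidesteps this (a sketch reusing the directory and bucket from the question):
# Upload with caching disabled so fresh content is visible immediately.
gsutil -m -h "Cache-Control:no-cache" cp -r ejsout gs://evaapp.xyz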

Using gsutil rsync to update contents of a static website served from a Google Bucket

I have a personal static website, www.kurtpeek.com, that I'm serving from a Google Bucket (cf. https://cloud.google.com/storage/docs/hosting-static-website). In order to quickly make updates on my local directory visible on the website, I'd like to use gsutil's rsync command.
For example, I've made some changes to index.html and then run this command:
~/Google Drive/webpage$ gsutil rsync -d . gs://www.kurtpeek.com
Building synchronization state...
Starting synchronization
Copying file://./.DS_Store [Content-Type=application/octet-stream]...
Copying mtime from src to dst for gs://www.kurtpeek.com/CV_Kurt_Peek_September_2017.pdf
Copying mtime from src to dst for gs://www.kurtpeek.com/confirmation.html
Copying file://./index.html [Content-Type=text/html]...
| [2 files][ 19.1 KiB/ 19.1 KiB]
Operation completed over 2 objects/19.1 KiB.
However, if I then navigate to www.kurtpeek.com and do 'view source', I see that the changes have not appeared, even if I refresh the page.
Can someone explain why this is not working?
As pointed out by Robert Lacok, the content was being served from the cache. Clearing my browser history, or simply viewing the site in a different browser, showed the changes.
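To avoid the stale cache on future syncs, the Cache-Control metadata can be shortened on objects that change often; the rsync exclude flag also keeps macOS artifacts like the .DS_Store file copied above out of the bucket. A sketch against the same bucket:
# Exclude .DS_Store during the sync (-x takes a Python regex).
gsutil rsync -d -x '\.DS_Store$' . gs://www.kurtpeek.com
# Disable caching on the page that changes most often.
gsutil setmeta -h "Cache-Control:no-cache" gs://www.kurtpeek.com/index.html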

How do I put a file on an Amazon S3 bucket using their s3cmd tools?

Does anyone have familiarity with Amazon's command-line tools? I'm using them on Amazon Linux. I have verified that my ~/.s3cfg credentials are correct, but for some reason when I try to put a file onto the remote bucket, it never appears. This is my output:
[myuser@myprojectgate ~]$ s3cmd put test s3://myprojectasset.myco.com/myproject-exchange-test
upload: 'test' -> 's3://myprojectasset.myco.com/myproject-exchange-test' [1 of 1]
5 of 5 100% in 0s 55.73 B/s done
upload: 'test' -> 's3://myprojectasset.myco.com/myproject-exchange-test' [1 of 1]
5 of 5 100% in 0s 54.86 B/s done
Even if I type an invalid bucket name, I see the same output above. What is the correct way to place a file on an s3 bucket using Amazon's command line tools?
Edit: Tried with a trailing "/", as suggested by an answer, but got the same output (there is still no file in my S3 bucket, incidentally):
[myuser@myprojectgate ~]$ s3cmd put test s3://myprojectasset.myco.com/myproject-exchange-test/
upload: 'test' -> 's3://myprojectasset.myco.com/myproject-exchange-test/test' [1 of 1]
5 of 5 100% in 0s 46.15 B/s done
upload: 'test' -> 's3://myprojectasset.myco.com/myproject-exchange-test/test' [1 of 1]
5 of 5 100% in 0s 56.01 B/s done
Rather than using the 3rd-party s3cmd tool, you should use the official AWS Command-Line Interface (CLI).
It has an aws s3 cp command to copy files to/from S3.
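A minimal sketch of the same upload with the AWS CLI, reusing the bucket and prefix from the question:
# Copy the local file up, then list the prefix to confirm it landed.
aws s3 cp test s3://myprojectasset.myco.com/myproject-exchange-test/
aws s3 ls s3://myprojectasset.myco.com/myproject-exchange-test/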
Have you checked whether the bucket exists? If the bucket is not available, a new one will be created and the content put there. You have to include the directory name and a trailing /, like:
s3://myprojectasset.myco.com/myproject-exchange-test/
Then it will put the test object into the myproject-exchange-test directory of the myprojectasset.myco.com bucket.
For future Googlers, the answer is:
s3cmd put my_file s3://my_bucket/
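Whichever tool you use, a listing is the quickest way to confirm the object actually arrived, for example:
s3cmd ls s3://my_bucket/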

How to install Google Cloud SDK on Travis?

I have tried to install the Google Cloud SDK on Travis with the following .travis.yml:
sudo: required
language: go
install:
  - curl https://sdk.cloud.google.com | bash;
My attempt is inspired by this guide from Google: https://cloud.google.com/solutions/continuous-delivery-with-travis-ci
Unfortunately, I get this output on Travis:
$ curl https://sdk.cloud.google.com | bash;
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 421 0 421 0 0 17820 0 --:--:-- --:--:-- --:--:-- 60142
Downloading Google Cloud SDK install script: https://dl.google.com/dl/cloudsdk/channels/rapid/install_google_cloud_sdk.bash
######################################################################## 100.0%
Running install script from: /tmp/tmp.uz8jP70e56/install_google_cloud_sdk.bash
which curl
curl -# -f https://dl.google.com/dl/cloudsdk/channels/rapid/google-cloud-sdk.tar.gz
######################################################################## 100.0%
Installation directory (this will create a google-cloud-sdk subdirectory) (/home/travis):
Travis waits for 10 minutes and then terminates the build. It seems like it is waiting for an installation directory.
How do I install Google Cloud SDK on Travis?
You are running into this issue because there is no interaction possible on Travis CI. Hence, the installation script is blocked waiting for input and Travis CI kills the build after 10 minutes.
The trick is to disable the prompts when installing the Google Cloud SDK. This can be done by setting the CLOUDSDK_CORE_DISABLE_PROMPTS environment variable to 1.
Here's a sample recipe to put in your .travis.yml file (including caching it for faster subsequent builds):
cache:
  directories:
    - "$HOME/google-cloud-sdk/"

script:
  - gcloud version || true
  - if [ ! -d "$HOME/google-cloud-sdk/bin" ]; then rm -rf $HOME/google-cloud-sdk; export CLOUDSDK_CORE_DISABLE_PROMPTS=1; curl https://sdk.cloud.google.com | bash; fi
  # Add gcloud to $PATH
  - source /home/travis/google-cloud-sdk/path.bash.inc
  - gcloud version
Hope this helps!