I would like to upload an image to a Google Cloud Storage bucket. For that, I generated a signed URL to pass to the client for the upload. I noticed that the bucket name is exposed in the signed URL:
https://storage.googleapis.com/myproject-images/test.PNG?X-Goog-Algorithm=GOOG4-RSA-SHA256&X-Goog-Credential=3242342308700-compute%40developer.gserviceaccount.com%2F20200430%2Fauto%2Fstorage%2Fgoog4_request&X-Goog-Date=20200430T044803Z&X-Goog-Expires=900&X-Goog-SignedHeaders=host&X-Goog-Signature=*********************
My question: is it possible to encrypt, map, or otherwise hide the Google Cloud bucket name in a signed URL? I do not want to expose my bucket name to end users.
It's not possible if you want the client to access that data directly. You could obfuscate it with a URL shortener, but all that would do is hide it from view temporarily.
Once you choose to allow clients to access your project directly, your project id is no longer private information. That ID is absolutely required in order to identify resources within your project (and not just Cloud Storage). The same is true for all Firebase-related client access that goes directly to Google Cloud and Firebase products.
If you don't want anyone to see the name of your project, you will need to either:
Disallow all direct client access, or
Route all requests through some middleware service, identified by another DNS name, that hides all the implementation details of the interaction with Google Cloud products.
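For context, a V4 signed upload URL like the one in the question is typically generated server-side along these lines (a sketch using the google-cloud-storage Python client; bucket and object names are placeholders). The bucket name necessarily appears in the URL path, because that path is part of what gets signed:

```python
def generate_upload_url(bucket_name: str, object_name: str) -> str:
    """Generate a V4 signed URL that lets a client PUT one object."""
    # Requires the google-cloud-storage package and service account credentials.
    from datetime import timedelta
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),  # matches X-Goog-Expires=900 above
        method="PUT",
    )
```

Since the signature covers the full `storage.googleapis.com/<bucket>/<object>` path, rewriting or hiding the bucket name would invalidate the signature, which is why a proxy in front is the only real option.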
Related
I have a bucket bucket1 containing a file abc.pdf. I want to make this file accessible to all users in my organization, irrespective of whether they have a GCP account, i.e. make it public specifically to my intranet users, even those who are not part of GCP.
For example: if a user with the id abc@xyz.com, who is not part of GCP and doesn't have a Google account, clicks the URL, they should be able to access it.
I need guidance and help on this.
I would suggest using signed URLs, as they give users time-limited access to a specific Cloud Storage resource even without a Google account.
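A sketch of what that could look like with the google-cloud-storage Python client, using the bucket1/abc.pdf names from the question (the signing service account needs read access to the object):

```python
def share_pdf_for_a_day(bucket_name: str = "bucket1",
                        object_name: str = "abc.pdf") -> str:
    """Return a V4 signed URL granting read access for 24 hours."""
    # Requires the google-cloud-storage package and service account credentials.
    from datetime import timedelta
    from google.cloud import storage

    blob = storage.Client().bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(hours=24),  # V4 signed URLs max out at 7 days
        method="GET",
    )
```

Anyone with the resulting link, Google account or not, can fetch the file until the URL expires, which matches the "no GCP account" requirement.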
By requirement, the Google Cloud Storage bucket I use for file storage must be private, but I need to make the files in the bucket accessible over Google Cloud CDN.
Most of the documents I've found describe best practices involving signed URLs, but I need to make it work without signed URLs or cookies. Does anyone know how to achieve that?
I've successfully configured access via signed URLs, including all the bucket/CDN permission settings, but that's not what I need.
At this moment, Cloud CDN still requires tokenized access in order to reach a private origin. One solution is to deploy a proxy that dynamically signs your request with an ephemeral token and accesses the private storage bucket:
https://github.com/GoogleCloudPlatform/cdn-auth-proxy
There is work underway for Cloud CDN that will allow you to dynamically generate an access token without having to deploy a proxy, but a definitive release date has not been set.
The new Google Cloud Media CDN service allows you to access a private storage bucket via IAM permissions: https://cloud.google.com/media-cdn/docs/origins?hl=en#private-storage-buckets
You can register service account credentials in a JSON key file on the web server that is supposed to serve the files. Just make sure the service account has the proper permissions to access the desired resources. The Google Cloud client libraries have full support for making requests to protected resources with a service account, provided its permissions are sufficient.
This way you can map requests dynamically to the web service and have the service take care of accessing the protected resources with those credentials behind the scenes.
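A sketch of that pattern with the google-cloud-storage Python client (the key file path and names here are placeholders, not from the original thread):

```python
def fetch_protected_object(bucket_name: str, object_name: str) -> bytes:
    """Read a private object using explicit service account credentials."""
    # Requires the google-cloud-storage package; the JSON key path is an assumption.
    from google.cloud import storage

    client = storage.Client.from_service_account_json("service-account.json")
    return client.bucket(bucket_name).blob(object_name).download_as_bytes()
```

The web service would call something like this per request and stream the bytes back, so the bucket itself stays private.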
I have a Google Cloud Storage bucket with images that I want to serve to users of my website. The public URL is something like this:
https://storage.googleapis.com/example-bucket/filename.jpg
So my website can easily access it, and any random internet user can enter the URL in a browser directly to access it.
Is it possible, via Google Cloud, to restrict this so that if my website tries to access the file, it succeeds, but if a random user tries to enter the URL into a browser window, they get denied?
Cloud Storage lets us set CORS policies, but they only apply to the XML API: https://cloud.google.com/storage/docs/cross-origin#server-side-support
Is it possible to restrict this via a Load Balancer, Cloud Armor, or Cloud CDN?
FYI, let's say my website is accessing it from the DOM directly, like this:
<html>
<body>
<img src="https://storage.googleapis.com/example-bucket/filename.jpg" />
</body>
</html>
Based on your use-case: Google Cloud Storage currently has no mechanism to allow viewing but restrict downloading. Once an image/file has been made public, any random user or website with the URL can read or download it.
I would suggest:
Use a 3rd-party app to render the documents as graphics/images inside your app, which prevents users from downloading them.
Change your use-case: have users sign in to your website with their Google accounts, and configure IAM permissions and ACLs so that the Cloud Storage object is accessible only to users who are authenticated and allowed to read it.
You could also check this blog on how to control access to Google Cloud Storage.
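If you go the sign-in route, the per-object grant from the second option might look like this with the google-cloud-storage Python client (the email address is a placeholder; the user then accesses the object while authenticated, e.g. via https://storage.cloud.google.com):

```python
def grant_user_read(bucket_name: str, object_name: str, email: str) -> None:
    """Grant one Google account read access to one object via its ACL."""
    # Requires the google-cloud-storage package. Note: per-object ACLs only
    # work on buckets with fine-grained (not uniform) access control.
    from google.cloud import storage

    blob = storage.Client().bucket(bucket_name).blob(object_name)
    blob.acl.user(email).grant_read()
    blob.acl.save()
```

This keeps the object off the public internet while still letting the browser of a signed-in, authorized user load it.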
GCP seems to allow you to delegate Cloud Storage authentication via IAM, and that's great, but you're only ever able to get a single file at a time using that method.
What I mean is: if I gave a user the 'Storage Object Viewer' role on a folder in a bucket, that user could browse to a single file (say, an .html file) at https://storage.cloud.google.com/bucket-name/folder-name/filename and display it, but if that .html file references .css or other files the browser needs to download, those all return 404 Not Found errors.
It seems that whatever token is obtained upon authentication is only valid for the retrieval of the single file that was requested before the token was created.
How does one host a static website, with some form of authentication in GCP Cloud Storage?
I did see a similar question asked over 5 years ago, but GCP has changed considerably since then, so I'm re-asking.
Edit: Ok, let's assume I'm okay with public read-only access to bucket contents, and instead I'm going to focus on securing the GCP Cloud Functions which make the changes.
Now, I've enabled authentication on the Cloud Functions using an OAuth ID token. The issue now is CORS: any calls made to the functions need a CORS Access-Control-Allow-Origin header, but that header does not get returned until AFTER authentication.
Anybody know how to enable CORS on GCP Cloud Functions before any authentication takes place?
Thanks!
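Regarding the edit: the ID-token check that Cloud Functions enforces happens before your code ever runs, and the browser's preflight OPTIONS request carries no Authorization header, so it can never receive the CORS headers. The usual workaround is to allow unauthenticated invocations and handle both CORS and token verification inside the function yourself. A minimal, framework-agnostic sketch of that request flow (ALLOWED_ORIGIN and the handler shape are assumptions):

```python
ALLOWED_ORIGIN = "https://example.com"  # placeholder for your site's origin

def cors_precheck(method: str):
    """Return (body, status, headers); status 204 means preflight handled.

    Call this first, before any ID-token verification, so OPTIONS
    requests succeed without credentials.
    """
    headers = {"Access-Control-Allow-Origin": ALLOWED_ORIGIN}
    if method == "OPTIONS":
        headers.update({
            "Access-Control-Allow-Methods": "GET, POST",
            "Access-Control-Allow-Headers": "Authorization, Content-Type",
            "Access-Control-Max-Age": "3600",
        })
        return ("", 204, headers)
    # Non-preflight request: the caller should now verify the ID token,
    # then attach these headers to the real response.
    return (None, None, headers)
```

The trade-off is that the function is publicly invokable, so the token check inside the function becomes your only gate.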
You can host your static files on App Engine. The content is served for free!
In front of App Engine, you can activate IAP (Identity-Aware Proxy).
Finally, grant your users (or groups, or Google Workspace domains) the role IAP-Secured Web App User.
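A minimal app.yaml for that setup might look like this (the runtime and directory names are assumptions; IAP itself is enabled in the console, not in this file):

```yaml
runtime: python312

handlers:
- url: /
  static_files: www/index.html
  upload: www/index.html

- url: /(.*)
  static_files: www/\1
  upload: www/(.*)
```

Because IAP sits in front of the whole app, every static file (including the .css and other assets that 404 with the storage.cloud.google.com approach) is served only to authenticated, authorized users.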
I was trying to understand Google Cloud Platform storage but couldn't really comprehend the language used in the documentation. I wanted to ask: can you use the storage service and its APIs to store photos users take within your application, and also get the images back if provided with a URL? And if so, would that be a safe and reasonable method?
Yes, you can use a storage bucket to store pretty much any kind of data.
In terms of transferring images from an application to a storage bucket, the application must be authorized to write to the bucket.
One option is to use a service account key within the application. A service account is a special account that an application can use to authenticate to various Google APIs, including the Storage API.
There is more information about service accounts here, and about using service account keys here. These keys can be used within your application and allow it to inherit the permissions/scopes assigned to that service account.
In terms of retrieving images using a URL, one possible option would be to use signed URLs which would allow you to give users read or write access to an object (in your case images) in a bucket for a given amount of time.
Access to bucket objects can also be controlled with ACLs (Access Control Lists). If you're happy for your images to be available publicly (i.e. accessible to everybody), you can set an ACL granting 'Reader' access to allUsers.
More information on this can be found here.
Should you decide to make the images publicly available, the URL format to retrieve an object/image from the bucket is:
https://storage.googleapis.com/[BUCKET_NAME]/[OBJECT_NAME]
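For completeness, building that public URL is just string assembly plus percent-encoding of the object name (a small helper; storage.googleapis.com is the standard public endpoint, and the bucket/object names here are examples):

```python
from urllib.parse import quote

def public_object_url(bucket_name: str, object_name: str) -> str:
    """Return the public storage.googleapis.com URL for an object."""
    # Object names may contain slashes (treated as folders), so keep "/" unescaped.
    return (f"https://storage.googleapis.com/"
            f"{bucket_name}/{quote(object_name, safe='/')}")

print(public_object_url("example-bucket", "photos/cat 1.jpg"))
# → https://storage.googleapis.com/example-bucket/photos/cat%201.jpg
```

This is the same URL shape shown above, just with the object name made safe for a browser.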
EDIT:
In relation to using an interface to upload the files before they land in the bucket, one option would be an instance with an external IP address (or multiple instances behind a load balancer) where the images are initially uploaded. You could mount the Cloud Storage bucket on this instance using Cloud Storage FUSE, so that uploaded files are easily transferred to the bucket. In terms of databases, you have the option of manually installing a database on a Compute Engine instance, or using a fully managed database service such as Cloud SQL.