Streaming data to web from google cloud storage - google-cloud-platform

I am trying to publish the data from my Google Cloud Storage bucket to a website. I have created a script which dynamically uploads the data from my device to Google Cloud. The issue I am facing is how to publish the data from Google Cloud to a website. I just need to display the data which I have on Google Cloud.

As suggested in the comments, you can make your bucket public and have your application fetch the object via an HTTP request, or even just post the link on your website, depending on what you are trying to do. If you don't want to make your bucket public, or you want a more personalized approach, you can create your own application that retrieves data from the bucket using the GCS client library. Here you can find code samples showing how to download objects from a GCS bucket from within your own application.
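To make the two options above concrete, here is a rough Python sketch. The bucket and object names are placeholders, and the client-library path requires the `google-cloud-storage` package plus application credentials; the public-URL path needs neither:

```python
def download_object(bucket_name, object_name):
    """Private-bucket route: fetch object bytes via the GCS client library.

    Requires `pip install google-cloud-storage` and configured
    application-default credentials; bucket/object names are placeholders.
    """
    from google.cloud import storage
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    return blob.download_as_bytes()

def public_url(bucket_name, object_name):
    """Public-bucket route: build the plain HTTPS link a website can embed."""
    return f"https://storage.googleapis.com/{bucket_name}/{object_name}"

print(public_url("my-bucket", "data.json"))
```

With a public bucket, the string returned by `public_url` can be dropped straight into an `<img>` tag or fetched with any HTTP client; no credentials are involved.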

I have found a way: when you upload some data (let's say some images), make the storage bucket public. After making it public you will get a link for each object you upload. So when you upload images to GCP storage, save them with sequential names (e.g. 1.jpg, 2.jpg, 3.jpg, ...). You will then get a link for each object in the format https://storage.googleapis.com/bucket_name/1.jpg.
So when you work on the front-end you just need to set up a loop, and all the data will be streamed to the web.
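The sequential-name idea above can be sketched in a few lines; the bucket name and image count here are placeholders:

```python
def image_urls(bucket_name, count):
    """Build the public URL for each image named 1.jpg .. <count>.jpg."""
    return [
        f"https://storage.googleapis.com/{bucket_name}/{i}.jpg"
        for i in range(1, count + 1)
    ]

# The front-end loop then just walks this list and renders each URL.
for url in image_urls("my-bucket", 3):
    print(url)
```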
If there's a good way then please suggest.
Thank you

Related

How to upload images to AWS Amplify

I'm trying to upload images to AWS Amplify. First of all, I'm not even sure if that's how it is done, since I'm fairly new to AWS, but I have a model with an array of strings that holds image link addresses, which I use for display in the app. Now I'm trying to do the opposite: upload images from the React Native app to AWS Amplify, knowing a link needs to be created after uploading. What I need to know is how to proceed.
Amplify Storage actually utilizes S3, so you'd be creating an S3 storage bucket and uploading your images to that. The Amplify libraries provide an easy way to do all of this.
An overview of the upload process: https://docs.aws.amazon.com/AmazonS3/latest/API/browser-based-uploads-aws-amplify.html
Amplify JS documentation for the put method: https://docs.amplify.aws/lib/storage/upload/q/platform/js/
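Since Amplify Storage is S3 under the hood, the upload amounts to an S3 PUT followed by storing the object's URL in your model. In a React Native app you would use Amplify's JS `Storage.put` (see the docs above); purely as a hedged illustration of the underlying S3 calls, here is a Python/boto3 sketch where the bucket, region, and key are placeholders:

```python
def upload_image(local_path, bucket, key):
    """Upload one image to S3.

    Requires `pip install boto3` and configured AWS credentials;
    all names here are placeholders.
    """
    import boto3
    boto3.client("s3").upload_file(local_path, bucket, key)

def object_url(bucket, region, key):
    """Build the virtual-hosted-style URL you would store in your model."""
    return f"https://{bucket}.s3.{region}.amazonaws.com/{key}"

print(object_url("my-app-images", "us-east-1", "profile/42.jpg"))
```

Note that this URL only works directly if the object is publicly readable; otherwise the app needs Amplify's `Storage.get` (or a presigned URL) to fetch it.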

Transcription problem with Google Cloud services

I am trying to transcribe with Google Cloud using this method:
https://cloud.google.com/speech-to-text/docs/transcribe-console
I got stuck at this point
https://cloud.google.com/speech-to-text/docs/transcribe-console#model_adaptation_optional
It does not let me submit to create the transcript.
Screenshot attached.
To submit it, you have to select a Workspace from the dropdown option, as can be seen in the image. A workspace is a Cloud Storage bucket that stores your Speech-to-Text assets, like configurations, uploaded audio files, and transcriptions. You can use an existing Cloud Storage bucket or create a new one.

Monitoring downloads of Google Cloud Storage objects that have public URLs

I have some objects in a Google Cloud Storage bucket that are publicly downloadable on URLs like https://storage.googleapis.com/blahblahblah. I want to set up a monitoring rule that lets me see how often one of these objects is being downloaded. I have turned on the Data Read audit log as mentioned here, but I don't see any logs when I download the object from the storage.googleapis.com link. I have another bucket where downloads are performed through the Node Google Cloud Storage client library, and I can see download logs from that bucket, so it seems like downloads from the public URL don't get logged.
I also don't see a way to specify the object in a particular bucket when setting up an alert in Google Cloud. Is creating a new bucket solely for this object the best way to try to set up monitoring for the number of downloads, or is there something I'm missing here?
Google Cloud Audit logs do not track objects that are public (allUsers or allAuthenticatedUsers).
Enable usage logs to track access to public objects.
Should you use usage logs or Cloud Audit Logs?
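Usage logs are delivered as CSV files to a log bucket you designate, so counting downloads of one object is a matter of filtering log rows. A rough sketch, assuming the standard usage-log header includes the `cs_method`, `sc_status`, and `cs_object` fields (real logs carry many more columns, which `DictReader` ignores):

```python
import csv
import io

def count_downloads(usage_log_csv, object_name):
    """Count successful GET requests for one object in a usage-log CSV."""
    reader = csv.DictReader(io.StringIO(usage_log_csv))
    return sum(
        1
        for row in reader
        if row["cs_method"] == "GET"
        and row["sc_status"] == "200"
        and row["cs_object"] == object_name
    )

sample = (
    '"cs_method","sc_status","cs_object"\n'
    '"GET","200","photo.jpg"\n'
    '"GET","404","photo.jpg"\n'
    '"GET","200","other.jpg"\n'
)
print(count_downloads(sample, "photo.jpg"))  # 1
```

In practice you would list the log objects in the log bucket for the period of interest, download each CSV, and sum the counts.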

How can I display cloud storage hosted images in Google Data Studio?

I wish to produce a Data Studio report with a grid of images and their metadata. These images are stored in a Google Cloud Storage bucket. This bucket must be secure, i.e. not open to allUsers.
Is there any way to hook this bucket into Data Studio?
The Google Cloud Storage connector only seems to let me access CSV files from there, and the Image control requires a URL, which I do not know how to get from the bucket and which surely wouldn't pass the security anyway.
I have a secure Cloud MySQL DB and that works in the report.
According to the official documentation, the Google Cloud Storage connector can only handle files of tabular data in CSV format. This means that it does not support what you are intending to do. As for the URL, you must use the IMAGE function, but the permissions must be set to allUsers, so unfortunately, this would not help you either.
I would advise to ask in the official community help forum to see if anyone might have a workaround for your use case.

Accessing private S3 content only from my application

I have an application that stores images in AWS S3. It is like a profile picture upload case. After uploading the profile picture, the image will be stored in AWS S3 and the S3 link will be stored in a database. The application will then show the profile picture using that link in the database.
Right now, as the bucket is private the profile picture is not visible in my application. How can I use this link to show the image without making the bucket public?
I don't think I can use AWS signed URLs, because this link can't be time-limited: the link needs to be available all the time for showing the image in the application.
Is there any method to do so? Or is there any other industry-standard method for making this feature possible?
Regarding images, the best way is to serve them via a CDN (you can link it with S3). Their long, hard-to-guess URLs should be enough (make a dedicated, public S3 bucket). Check a photo from a friend's Facebook account: it will show even if you are not logged in. How to set up the CDN: https://learnetto.com/blog/cloudfront-s3
If you are really concerned about security, you can fetch the images as Base64 (https://stackoverflow.com/a/2429959/290036). Make your bucket private and allow access only to your internal services. That way you have better control, but lose all the benefits of a CDN.