How to wait for S3 Image to load?

In my React Native app with AWS Amplify, I use S3Image to show images from my S3 bucket. How can I display an animation (a GIF) until an S3Image loads, and show the image only after it has loaded?
<S3Image imgKey={info.Imageurl} style={styles.image}/>

The documentation describes a handleOnLoad prop that is called when the image loads.
This should work:
<S3Image imgKey={info.Imageurl} style={styles.image} handleOnLoad={() => { console.log('da-ta!'); }}/>
Note that it appears you need to render the S3Image tag to start the image loading. I don't know what its appearance is while the image is loading, but if you want to show/hide it, do that via the style attribute and not by including or excluding the component from render with conditional logic like {isImageLoaded && <S3Image.../>}.
EDIT
The above is wrong; per #sama, that prop is not available for React Native. You could instead get a link to the image and use that link as an image source. Probably make your own component that takes a component to display as the loading state and the key of the object in S3.
The default config for Storage.get has download = false, so you'll get a signed URL that points to the image object in the S3 bucket. Your component can show the loading image while it awaits the real image's URL, then plug that URL into an image tag (in React Native, an <Image source={{ uri: signedURL }} />). You'll still need to wait for the image component to actually fetch the image, so keep it hidden until you get its onLoad event, then set the image to visible and hide your spinner.
const signedURL = await Storage.get(key)
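As a rough, untested sketch (the component name, placeholder prop, and zero-size style trick are mine, not from the Amplify docs), such a wrapper could look like this:
import React, { useEffect, useState } from 'react';
import { ActivityIndicator, Image } from 'react-native';
import { Storage } from 'aws-amplify';

// Shows a spinner until the signed URL arrives and the image has loaded.
const LoadingS3Image = ({ imgKey, style, placeholder }) => {
  const [url, setUrl] = useState(null);
  const [loaded, setLoaded] = useState(false);

  useEffect(() => {
    // download: false is the default, so this resolves to a signed URL
    Storage.get(imgKey).then(setUrl);
  }, [imgKey]);

  return (
    <>
      {!loaded && (placeholder || <ActivityIndicator />)}
      {url && (
        <Image
          source={{ uri: url }}
          // keep the Image mounted but invisible until onLoad fires
          style={[style, !loaded && { width: 0, height: 0 }]}
          onLoad={() => setLoaded(true)}
        />
      )}
    </>
  );
};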

Related

Photo Spheres & Google Cloud Storage

Is it possible to render photo spheres stored in Google Cloud Storage? I have uploaded a photo sphere that works without issue when hosted from a local server, but not when stored in a bucket on the Google Cloud Platform (GCP). I am using the image as a sky element in an A-Frame scene (built in a Google Apps Script web app for testing), but it doesn't render when the source is the GCP URL. I also tested the sky element using a photo sphere from Flickr as the source, and it had no problems. Does the metadata not get read properly when serving from GCP? Any help would be greatly appreciated!
<a-assets>
<!-- Works: image served from Flickr. -->
<img id="skyTexture" src="https://farm5.staticflickr.com/4734/24508950177_b7b09a1f30_k.jpg">
</a-assets>
<a-sky src="#skyTexture"></a-sky>
<a-assets>
<!-- Doesn't render: the same markup with a Google Cloud Storage source. -->
<img id="skyTexture" src="https://storage.googleapis.com/pano-images/cwm-vcfacility/PANO_20171019_130509_0.jpg">
</a-assets>
<a-sky src="#skyTexture"></a-sky>
If you work with images from a resource other than your own app, make sure to include crossorigin="anonymous" in your img tag, and the error should disappear.
<img id="skyTexture" crossorigin="anonymous" src="https://storage.googleapis.com/pano-images/cwm-vcfacility/PANO_20171019_130509_0.jpg">
It still won't work as you intend, though. I don't know much about Google storage, but I'd read the docs. If it works similarly to Amazon S3, then you'll have to grant your app access to the resource; in S3 this is done with CORS rules defined in XML.
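For Google Cloud Storage specifically, CORS is configured per bucket from a JSON file applied with the gsutil CLI (a sketch; the origin below is a placeholder, check the GCS CORS docs for your case):
[
  {
    "origin": ["https://your-app.example.com"],
    "method": ["GET"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]
applied with: gsutil cors set cors.json gs://pano-images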
Last tips for working with images:
make sure they don't exceed 4096 × 2048 pixels
make sure each dimension is a power of two
If you don't follow this, the texture will be resized for you on every page load; that takes time, so why not do it once.
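A quick way to sanity-check dimensions before uploading (plain JavaScript; a throwaway helper, not part of A-Frame):
const isPowerOfTwo = (n) => n > 0 && (n & (n - 1)) === 0;
// e.g. a 4096 x 2048 equirectangular texture passes both checks:
console.log(isPowerOfTwo(4096) && isPowerOfTwo(2048)); // true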

Where can I find the S3 image fetch limit settings?

Any idea where I can remove the image fetch limit? I have images on a Magento site that are hosted in Amazon S3. If I change the image URL to S3, it fetches the images including all the thumbnails, but eventually blocks the thumbnails and only fetches the main image.
But if I host the images on my other server (not Amazon S3), there is no such limit: it fetches all the images again and again, regardless of how many times I refresh.
Here are examples:
www.shoptv.com.ph/active-posture.html - Image hosted in S3
dev.shoptv.com.ph/active-posture.html - Image hosted in Dreamhost
As you can see, the thumbnails are all present on DH, but on S3 they don't show up. But if you use the direct permalinks of the images, they actually show. For example:
Amazon S3:
http://s3.shoptv.com.ph/images/601938/601938-1.jpg
http://s3.shoptv.com.ph/images/601938/601938-2.jpg
http://s3.shoptv.com.ph/images/601938/601938-3.jpg
http://s3.shoptv.com.ph/images/601938/601938-4.jpg
Dreamhost:
http://dostscholars.org/images/601938/601938-1.jpg
http://dostscholars.org/images/601938/601938-2.jpg
http://dostscholars.org/images/601938/601938-3.jpg
http://dostscholars.org/images/601938/601938-4.jpg
All the images are present. But if you host them in S3 and reference them in your media.phtml in Magento, they just won't show.
I suspect it has something to do with my Amazon S3 settings, maybe a limit somewhere in the S3 dashboard that I can't find.
There is no image limit in Amazon S3.
Your problem is caused by the fact that the www.shoptv.com.ph/active-posture.html page is missing this HTML code (which I got from dev.shoptv.com.ph/active-posture.html):
<div class="more-views">
<h2>More Views</h2>
<ul class="product-image-thumbs">
It isn't displaying the images because there is no HTML telling the web browser to display the images!

Amazon S3: Do not allow client to modify already uploaded images?

We are using S3 for our image upload process, and we approve all the images that are uploaded to our website. The process is:
Clients upload images to S3 from JavaScript at a given path (using a token).
Once we get back the URL from S3, we save the S3 path in our database with an isApproved flag of false in the photos table.
Once the image is approved by our executive, it starts displaying on our website.
The problem is that the user may change the image (to some obscene image) after the approval process, using the generated token. Can we somehow stop users from modifying images like this?
One temporary fix is to shorten the token lifetime, e.g. to 5 minutes, and approve images only after that interval has passed.
I saw this but it didn't help, as versioning also replaces the already-uploaded image and moves the previously uploaded image to a new versioned path.
Any better solutions?
You should create a workflow around the uploaded images. The process would be:
The client uploads the image
This triggers an Amazon S3 event notification to you/your system
If you approve the image, move it to the public bucket that is serving your content
If you do not approve the image, delete it
This could be an automated process using an AWS Lambda function to update your database and flag photos for approval, or it could be done manually after receiving an email notification via Amazon SNS. The choice is up to you.
The benefit of this method is that nothing can be substituted once approved.
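As a minimal sketch of the approval step (Node.js with AWS SDK v3; the bucket names are placeholders, and the trigger/notification wiring around it is up to you):
const { S3Client, CopyObjectCommand, DeleteObjectCommand } =
  require('@aws-sdk/client-s3');

const s3 = new S3Client({});

// Called once a human or automated check approves the upload:
// copy the object into the public bucket, then delete the original
// so the client's upload token can no longer touch what is served.
async function approveImage(key) {
  await s3.send(new CopyObjectCommand({
    Bucket: 'my-public-bucket',              // placeholder
    CopySource: `my-uploads-bucket/${key}`,  // placeholder
    Key: key,
  }));
  await s3.send(new DeleteObjectCommand({
    Bucket: 'my-uploads-bucket',             // placeholder
    Key: key,
  }));
}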

Cloudinary image is not refreshing with Rails4 and CarrierWave

I'm using Cloudinary and CarrierWave to upload images from my Rails application, and it works fine.
My requirement is that a user can have one image, so if a user already has an image and uploads a new one, the previous image should be overwritten by the new one.
My problem is that when I upload the new image to Cloudinary, it does not invalidate the previous image, so the old image is still shown as the user's image.
Then I found an option called invalidate and tried to use it, but no luck.
This is my Cloudinary class
class PictureUploader < CarrierWave::Uploader::Base
  include Cloudinary::CarrierWave

  version :show do
    process :invalidate => true
  end
end
and in my view
recipe.picture_url(:show)
but this shows the old image and not the new one. What am I missing?
When a Cloudinary image is accessed for the first time, it gets cached by the CDN.
You can indeed update the image by re-uploading while keeping the public ID, but if you access the same URL, you might still be served the CDN-cached version of the image.
You can tell Cloudinary to invalidate the image through the CDN; however, note that the invalidate parameter must be passed at upload time, not inside the versions block, since invalidation is applied upon re-uploading and not on delivery. Also note that it might take up to an hour for the invalidation to fully propagate through the CDN.
It is therefore recommended to use the version component instead. Adding the version component to the URL tells Cloudinary to force delivery of the latest version of the image, bypassing CDN-cached copies. The updated version value is returned with every upload call. For more information:
http://cloudinary.com/documentation/rails_image_manipulation#image_versions
While invalidation takes a while to propagate, the version component takes effect immediately.
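For illustration, here is the same idea in Cloudinary's Node SDK (a sketch; the public ID and file path are placeholders, and the Rails gem accepts the same invalidate option at upload time):
const cloudinary = require('cloudinary').v2;

async function replaceUserImage(filePath) {
  // Re-upload under the same public ID and ask the CDN to invalidate
  // cached copies (propagation can take up to an hour).
  const result = await cloudinary.uploader.upload(filePath, {
    public_id: 'users/42/avatar', // placeholder
    overwrite: true,
    invalidate: true,
  });

  // Embedding the version returned by the upload in the delivery URL
  // bypasses stale CDN caches immediately.
  return cloudinary.url('users/42/avatar', { version: result.version });
}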

Upload image to Django admin, crop and scale, and send it to Amazon S3 without saving the file locally?

I want to allow users to upload an image through the Django admin, crop and scale that image in memory (probably using PIL), and save it to Amazon S3 without writing the image to the local filesystem. I'll save the image path in my database, but that is the only aspect of the image stored locally. I'd like to integrate this special image upload widget into the normal model form on the admin edit page.
This question is similar, except the solution is not using the admin interface.
Is there a way that I can intercept the save action, do manipulations and saving of the image to S3, and then save the image path and the rest of the model data like normal? I have a pretty good idea of how I would crop and scale and save the image to S3 if I can just get access to the image data.
See https://docs.djangoproject.com/en/dev/topics/http/file-uploads/#changing-upload-handler-behavior
If images are smaller than a particular size, they will already be stored only in memory, so you can likely tune the FILE_UPLOAD_MAX_MEMORY_SIZE setting to suit your needs. Additionally, make sure you don't access the .path attribute of these uploaded files, because that will cause them to be written out to a file; instead, use (for example) the .read() method. I haven't tested this, but I believe this will work:
from PIL import Image
image = Image.open(request.FILES['my_file'])
Well, if you don't want to touch the admin part of Django, you can define the scaling in the model's save() method.
But when using Django's ImageField, Django can actually do the saving for you; it has height_field and width_field options available.
https://docs.djangoproject.com/en/dev/ref/models/fields/#imagefield
For uploading to S3 I really suggest using the django-storages backends from:
https://bitbucket.org/david/django-storages/src (preferably the S3-boto version)
That way you basically won't have to write any code yourself; you can just use available libraries and solutions that people have tested.