I am running a cityscape and nature photography website that contains a lot of images ranging from 50 KB to 2 MB in size. I have already shrunk them down with a batch photo editor, so I can't lose any more quality without the images getting too grainy.
Google PageSpeed Insights recommends lossless compression, and I am trying to figure out how to solve this. These specific images are in S3 buckets and are being served by AWS CloudFront:
Losslessly compressing https://d339oe4gm47j4m.cloudfront.net/bw107.jpg could save 57.6KiB (38% reduction).
Losslessly compressing https://luminoto-misc.s3-us-west-2.amazonaws.com/bob_horsch.jpg could save 40.6KiB (42% reduction). ...... and a hundred more of the same.
Can CloudFront do the compression before the image is served to the client? Or do I have to do some other type of compression and then re-upload each file to a new S3 bucket? I am looking for a solution where CloudFront does it.
I have searched around but haven't found a definitive answer.
Thanks,
Jeff
Update
As implicitly pointed out by Ryan Parman (+1), there are two different layers at play when it comes to compression (and/or optimization), which seem to have gotten mixed up a bit in this discussion so far:
My initial answer below has addressed lossless compression using Cloudfront as per your question title, which is concerned with the HTTP compression layer:
HTTP compression is a capability that can be built into web servers and web clients to make better use of available bandwidth, and provide greater transmission speeds between both.
[...] data is compressed before it is sent from the server: compliant browsers will announce what methods are supported to the server before downloading the correct format; browsers that do not support compliant compression method will download uncompressed data. [...]
That is, the compress/decompress operation is usually handled automatically by the server and the client to optimize bandwidth usage and transmission performance. The difference with CloudFront is that its server implementation does not handle compression automatically like most web servers do, which is why you need to prepare a compressed representation yourself if desired.
This kind of compression works best with text files like HTML, CSS and JavaScript, but isn't useful (or is even detrimental) with binary data formats that are already compressed in themselves, like ZIP and other prepacked archives, and especially image formats like PNG and JPEG.
Now, your question body talks about a different compression/optimization layer altogether, namely lossy JPEG compression and, specifically, lossless editing as well as optimization via jpegoptim. This has nothing to do with how files are handled by HTTP servers and clients; rather, it is about compressing/optimizing the files themselves to better match the performance constraints of specific use cases like web or mobile browsing, where transmitting a digital photo at its original size makes no sense when it is simply going to be viewed on a web page, for example.
This kind of compression/optimization is still rarely offered by web servers themselves, even though notable efforts like Google's mod_pagespeed are available these days. Usually it is the responsibility of the web designer to prepare appropriate assets, ideally optimized for, and selectively delivered to, the expected target audience via CSS media queries.
Initial Answer
AWS CloudFront is capable of Serving Compressed Files; however, this is to be taken literally:
Amazon CloudFront can serve both compressed and uncompressed files from an origin server. CloudFront relies on the origin server either to compress the files or to have compressed and uncompressed versions of files available; CloudFront does not perform the compression on behalf of the origin server. With some qualifications, CloudFront can also serve compressed content from Amazon S3. For more information, see Choosing the File Types to Compress. [emphasis mine]
That is, you'll need to provide compressed versions yourself, but once you've set this up, it is transparent for clients. Please note that the content must be compressed using gzip; other compression algorithms are not supported:
[...] If the request header includes additional content encodings, for example, deflate or sdch, CloudFront removes them before forwarding the request to the origin server. If gzip is missing from the Accept-Encoding field, CloudFront serves only the uncompressed version of the file. [...]
Details regarding the requirements and process are outlined in How CloudFront Serves Compressed Content from a Custom Origin and Serving Compressed Files from Amazon S3.
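For illustration only, here is a minimal sketch of what preparing such a compressed representation could look like with the AWS SDK for JavaScript (v2) and Node's built-in zlib. The bucket and file names are placeholder assumptions, and this is only worthwhile for text assets (CSS/JS/HTML), not for the JPEGs themselves:

```js
// Sketch: gzip a text asset and upload it to S3 with the metadata
// that lets CloudFront pass it through as compressed content.
// Bucket/key names are placeholders; assumes the aws-sdk v2 package.
const AWS = require('aws-sdk');
const fs = require('fs');
const zlib = require('zlib');

const s3 = new AWS.S3({ region: 'us-west-2' });

async function uploadGzipped(localPath, bucket, key) {
  const gzipped = zlib.gzipSync(fs.readFileSync(localPath));
  await s3.putObject({
    Bucket: bucket,
    Key: key,
    Body: gzipped,
    ContentType: 'text/css',   // set per file type
    ContentEncoding: 'gzip',   // tells clients the body is gzip-compressed
  }).promise();
}

uploadGzipped('./styles.css', 'my-example-bucket', 'assets/styles.css')
  .then(() => console.log('uploaded'))
  .catch(console.error);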
JPEGOptim doesn't do any compression -- it does optimization.
The short answer is, yes, you should always use JPEGOptim on your .jpg files to optimize them before uploading them to S3 (or whatever your source storage is). This has been a good idea since forever.
If you're talking about files which are plain text-based (e.g., CSS, JavaScript, HTML), then gzip-compression is the appropriate solution, and Steffen Opel would have had the 100% correct answer.
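As a rough sketch of the jpegoptim step (the folder layout and flags are assumptions; it requires jpegoptim to be installed and on your PATH), one way to batch-optimize images from Node before uploading:

```js
// Sketch: run jpegoptim over every .jpg in a folder before uploading to S3.
// --strip-all removes metadata; without a --max quality flag, the
// optimization stays lossless.
const { execFileSync } = require('child_process');
const fs = require('fs');
const path = require('path');

const dir = './images';  // placeholder folder
for (const name of fs.readdirSync(dir)) {
  if (path.extname(name).toLowerCase() === '.jpg') {
    execFileSync('jpegoptim', ['--strip-all', path.join(dir, name)], { stdio: 'inherit' });
  }
}
```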
The only compression Amazon really supports is zip or gzip. You are able to load those compressed files into S3 and then do things like load them directly into resources like Redshift. So, in short, no, Amazon does not provide the service you are looking for. This is something you would have to handle yourself...
Related
I'm hosting videos on AWS S3 at the moment. I can place the S3 URL into the src attribute of my video tags, and everything works correctly and plays as though the video is being streamed to my site. These are not small videos, either; some are 1 GB in size.
I can also immediately jump to the end of the video as though the entire file wasn't downloaded, but just the part I need.
Whenever I Google for info on streaming on-demand video from AWS, I get answers saying that I need a service in front of S3 to do something like this. Is AWS automatically doing this for me?
S3 supports partial GET requests. This allows clients to request only a specific part of a file. Most modern players (including HTML5 ones) are able to use this feature to provide the experience you describe to your users.
Quoting from here:
HTTP range requests allow to send only a portion of an HTTP message from a server to a client. Partial requests are useful for large media or downloading files with pause and resume functions, for example.
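If you want to see the mechanism for yourself, here is a small sketch of a ranged GET. The URL is a placeholder, and it assumes Node 18+ for the built-in fetch:

```js
// Sketch: ask S3/CloudFront for just the first kilobyte of a large object
// via an HTTP Range request. The URL is a placeholder.
const url = 'https://example-bucket.s3.amazonaws.com/videos/clip.mp4';

(async () => {
  const res = await fetch(url, { headers: { Range: 'bytes=0-1023' } });
  console.log(res.status);                        // 206 Partial Content if ranges are honored
  console.log(res.headers.get('content-range'));  // e.g. "bytes 0-1023/1073741824"
  const chunk = Buffer.from(await res.arrayBuffer());
  console.log(`received ${chunk.length} bytes`);
})();
```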
Can anyone please help me before I go crazy?
I have been searching for documentation/sample code (in JavaScript) for uploading files to S3 via CloudFront, but I can't find a proper guide.
I know I could use the Transfer Acceleration feature for faster uploads, and yes, Transfer Acceleration essentially does the job through CloudFront edge locations, but as far as I have searched it should also be possible to make the POST/PUT request via AWS.CloudFront...
I also read an article posted in 2013 saying that AWS had just added functionality for making POST/PUT requests, but it says not a single thing about how to do it!?
The CloudFront documentation for JavaScript is poor; it does not even show any sample code. It just assumes we already know everything about the subject. If I did, why would I dive into the documentation in the first place?
I believe there is some confusion here about what adding these requests means. This feature was added simply so that POST/PUT requests are supported for your origin, meaning functionality in your application such as form submissions or API requests now works through CloudFront.
The recommended approach, as you pointed out, is to make use of S3 Transfer Acceleration, which itself makes use of the CloudFront edge locations.
Transfer Acceleration takes advantage of Amazon CloudFront’s globally distributed edge locations. As the data arrives at an edge location, data is routed to Amazon S3 over an optimized network path.
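As an illustration only, a minimal sketch of uploading through the acceleration endpoint with the AWS SDK for JavaScript (v2). The bucket and key names are placeholders, and Transfer Acceleration must already be enabled on the bucket:

```js
// Sketch: upload through the bucket's Transfer Acceleration endpoint.
// Assumes the aws-sdk v2 package and that acceleration is enabled on the bucket.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3({ useAccelerateEndpoint: true });  // routes requests via the s3-accelerate endpoint

s3.upload({
  Bucket: 'my-example-bucket',
  Key: 'uploads/photo.jpg',
  Body: fs.createReadStream('./photo.jpg'),
  ContentType: 'image/jpeg',
}).promise()
  .then(data => console.log('uploaded to', data.Location))
  .catch(console.error);
```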
I am about to embark on creating a REST API that accepts two Base64-encoded images from an external party.
I have heard from people that it would be prudent to think about protecting against a malicious file being sent to me via some sort of attack.
My initial thoughts are that I need to think about virus scanning, but also about incorrect or dodgy images (for example, an image from a porn site).
We are using AWS as our cloud platform. Can anyone help me with some ideas on best practices / how I can protect against this?
In general (there are obviously exceptions), if you are just receiving and storing images in something such as S3, you do not need to worry about viruses and the like from the image upload, as these files are not going to be executed and should only ever be rendered as JPG/PNG/whatever file format you use.
If you wish to check for unsafe or inappropriate images, you could use Amazon Rekognition's unsafe content detection feature. In addition, you could also use Rekognition to perform image tagging and generate metadata for each image.
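For example, a minimal sketch of calling Rekognition's unsafe content detection on an object already stored in S3 (aws-sdk v2; the bucket, key, and confidence threshold are placeholder assumptions):

```js
// Sketch: flag potentially unsafe images already stored in S3 using
// Amazon Rekognition's unsafe content detection.
const AWS = require('aws-sdk');
const rekognition = new AWS.Rekognition({ region: 'us-west-2' });

async function checkImage(bucket, key) {
  const result = await rekognition.detectModerationLabels({
    Image: { S3Object: { Bucket: bucket, Name: key } },
    MinConfidence: 80,  // only return labels with at least 80% confidence
  }).promise();

  if (result.ModerationLabels.length > 0) {
    console.log(`${key} was flagged:`, result.ModerationLabels.map(l => l.Name));
  } else {
    console.log(`${key} looks fine`);
  }
}

checkImage('my-example-bucket', 'uploads/photo.jpg').catch(console.error);
```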
AWS CloudFront has the option to compress files (see here).
What are the pros and cons of using AWS CloudFront gzip compression vs using the compression-webpack-plugin?
CloudFront compresses content on the fly, and it has certain limitations:
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/ServingCompressedFiles.html
The most common problem is when CloudFront is busy:
In rare cases, when a CloudFront edge location is unusually busy, some files might not be compressed.
This is the most significant problem, because if it happens, CloudFront caches the uncompressed copy and serves it until you invalidate the cache.
compression-webpack-plugin, on the other hand, compresses the assets on disk at build time, which avoids that problem.
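For reference, a minimal webpack.config.js sketch using compression-webpack-plugin; the entry point and thresholds are placeholder assumptions:

```js
// webpack.config.js sketch: pre-compress emitted JS/CSS at build time so the
// .gz files exist on disk instead of relying on CloudFront's on-the-fly
// compression. Assumes compression-webpack-plugin is installed.
const CompressionPlugin = require('compression-webpack-plugin');

module.exports = {
  mode: 'production',
  entry: './src/index.js',
  plugins: [
    new CompressionPlugin({
      algorithm: 'gzip',             // zlib gzip; 'brotliCompress' is also possible
      test: /\.(js|css|html|svg)$/,  // only compress text-like assets
      threshold: 1024,               // skip files smaller than 1 KiB
    }),
  ],
};
```

You would then upload the generated .gz files to your origin (with the appropriate Content-Encoding metadata) so a compressed copy is always available regardless of how busy the edge location is.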
My website is on the Shopify platform. Google's PageSpeed test shows the following message:
Compressing resources with gzip or deflate can reduce the number of bytes sent over the network.
Enable compression for the following resources to reduce their transfer size by 1.6MiB (78% reduction).
Compressing https://sdk.azureedge.net/js/1.b... could save 1.6MiB (78% reduction).
How can I compress these files in shopify?
You can improve your Google PageSpeed results by compressing (minifying) your website's .js files.
Google itself provides the Closure Compiler for this purpose; refer to Closure Compiler.
P.S. Take a backup of your JS files before updating them in your store, because the compiled code cannot be rolled back to its original state after being compiled.
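If you want to script this rather than use the web UI, here is a rough sketch using the google-closure-compiler npm package; the file names are placeholders, and SIMPLE optimizations are assumed since they are safer for third-party code:

```js
// Sketch: minify a theme script with Google's Closure Compiler via the
// google-closure-compiler npm package (assumed installed). Keep the original
// file as a backup before uploading the output to your store.
const ClosureCompiler = require('google-closure-compiler').compiler;

const compiler = new ClosureCompiler({
  js: 'theme.js',                // placeholder input file
  js_output_file: 'theme.min.js',
  compilation_level: 'SIMPLE',   // SIMPLE is safer than ADVANCED for code you don't control
});

compiler.run((exitCode, stdOut, stdErr) => {
  if (exitCode !== 0) console.error(stdErr);
  else console.log('wrote theme.min.js');
});
```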
If you're on Shopify, most likely those PageSpeed warnings are coming from 3rd party apps.
In that case you can just reach out to them and ask if they will compress the file.
If you don't know which app it is, you can check the domain name of the file or try searching for "what is [paste file path here]".
For Shopify specifically, you can get more details and a template from this post.