Google has developed and open-sourced a new JPEG encoder that reduces file size by about 35 percent; alternatively, image quality can be significantly improved while keeping file size constant. Importantly, and unlike some of its other efforts in image compression (WebP, WebM), Google's new JPEGs are completely compatible with existing browsers, devices, photo editing apps, and the JPEG standard.
The new JPEG encoder is called Guetzli, which is Swiss German for cookie (the project was led by Google Research's Zurich office). Don't pay too much attention to the name: after extensive analysis, I can't find anything in the GitHub repository related to cookies or indeed any other baked good.
There are numerous ways of tweaking JPEG image quality and file size, but Guetzli focuses on the quantization stage of compression. Put simply, quantization turns a large amount of disordered data, which is hard to compress, into more ordered data, which compresses very easily. In JPEG encoding, this process usually reduces gentle colour gradients to single blocks of colour and often obliterates small details entirely.
The difficult bit is striking a balance between preserving detail and keeping file size down. Every lossy encoder (libjpeg, x264, LAME) does it differently.
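The quantization step itself is simple to sketch. The snippet below uses the example luminance table from the JPEG specification (real encoders scale such tables by a quality factor, and Guetzli's contribution is, roughly, a perceptually guided search over these choices); it is an illustration of the principle, not Guetzli's actual code.

```python
import numpy as np

# Example luminance quantization table from the JPEG spec (ITU-T T.81,
# Annex K). Larger divisors in the bottom-right discard more of the
# high-frequency detail the eye is least sensitive to.
QUANT_TABLE = np.array([
    [16, 11, 10, 16,  24,  40,  51,  61],
    [12, 12, 14, 19,  26,  58,  60,  55],
    [14, 13, 16, 24,  40,  57,  69,  56],
    [14, 17, 22, 29,  51,  87,  80,  62],
    [18, 22, 37, 56,  68, 109, 103,  77],
    [24, 35, 55, 64,  81, 104, 113,  92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103,  99],
])

def quantize(dct_block):
    """Divide each 8x8 DCT coefficient by its table entry and round.
    Small high-frequency coefficients collapse to zero, producing the
    long runs of zeros that the entropy coder compresses so well."""
    return np.round(dct_block / QUANT_TABLE).astype(int)

def dequantize(q_block):
    """Multiply back by the table; the rounding loss is permanent."""
    return q_block * QUANT_TABLE
```

A block of faint, uniform detail (coefficients smaller than the divisors) quantizes to all zeros, which is exactly how gentle gradients become flat blocks of colour: the information is rounded away and cannot be recovered on decode.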
(Score: 2) by Wootery on Monday March 20 2017, @12:03PM (1 child)
Is there a good reason no one uses modern image compression algorithms and JavaScript decoders? It's not like it's impossible -- even video decoding can be done in JavaScript. [brionv.com]
I imagine sites like Flickr and Imgur could save countless terabytes of bandwidth if they did.
(Score: 0) by Anonymous Coward on Monday March 20 2017, @02:02PM
JavaScript decoders? You like maxing out your CPU, don't you? Note that on your phone, video is typically not even decoded by machine code running on the CPU, but by specialized hardware included for just that purpose. Guess why.
Not to mention that there are already too many sites which require JavaScript …