Meet Guetzli, the image compression algorithm that might change the shape of the Internet


Google has developed Guetzli, a new JPEG encoder. What’s special about it? The algorithm, which by the way is open source, can cut a JPEG file’s size by around 35% while keeping its visual quality essentially unchanged. Strictly speaking, Guetzli is not lossless: the encoding still discards information, but the loss is designed to be imperceptible to the human eye.

Guetzli is a JPEG encoder that aims for excellent compression density at high visual quality. Guetzli-generated images are typically 20–30% smaller than images of equivalent quality generated by libjpeg. Guetzli produces only sequential (non-progressive) JPEGs, because sequential JPEGs decompress faster.

From the Google Research Blog:

Guetzli [guɛtsli] — cookie in Swiss German — is a JPEG encoder for digital images and web graphics that can enable faster online experiences by producing smaller JPEG files while still maintaining compatibility with existing browsers, image processing applications and the JPEG standard. From the practical viewpoint this is very similar to our Zopfli algorithm, which produces smaller PNG and gzip files without needing to introduce a new format, and different than the techniques used in RNN-based image compression, RAISR, and WebP, which all need client changes for compression gains at internet scale.

The visual quality of JPEG images is directly correlated to its multi-stage compression process: color space transform, discrete cosine transform, and quantization. Guetzli specifically targets the quantization stage in which the more visual quality loss is introduced, the smaller the resulting file. Guetzli strikes a balance between minimal loss and file size by employing a search algorithm that tries to overcome the difference between the psychovisual modeling of JPEG’s format, and Guetzli’s psychovisual model, which approximates color perception and visual masking in a more thorough and detailed way than what is achievable by simpler color transforms and the discrete cosine transform. However, while Guetzli creates smaller image file sizes, the tradeoff is that these search algorithms take significantly longer to create compressed images than currently available methods.
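The quantization stage described above is where JPEG trades quality for file size. The toy sketch below (plain Python, not Guetzli’s actual code) shows the mechanics on a single 8-sample row: DCT coefficients are divided by a quantization table and rounded, and that rounding is the irreversible, lossy step that Guetzli’s search tunes per image.

```python
import math

def dct_1d(signal):
    """Naive DCT-II of a 1-D signal (the transform JPEG applies per row/column)."""
    n = len(signal)
    return [
        sum(signal[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n)) for i in range(n))
        for k in range(n)
    ]

def quantize(coeffs, table):
    """Divide each coefficient by its quantization divisor and round.
    The rounding is where information is lost: it cannot be undone."""
    return [round(c / d) for c, d in zip(coeffs, table)]

def dequantize(values, table):
    """Multiply back by the divisors; only an approximation of the originals."""
    return [v * d for v, d in zip(values, table)]

# One 8-sample row of pixel values (like one row of an 8x8 JPEG block).
block = [52, 55, 61, 66, 70, 61, 64, 73]
coeffs = dct_1d(block)

# Hypothetical divisors modeled on JPEG's luminance table: larger divisors
# for higher frequencies crush more detail and shrink the file.
table = [16, 11, 10, 16, 24, 40, 51, 61]
quantized = quantize(coeffs, table)
restored = dequantize(quantized, table)

print(quantized)   # small integers, many near zero -> cheap to store
print(coeffs[0], restored[0])  # the round trip does NOT recover the original
```

An encoder like Guetzli effectively searches over choices at this stage, asking a psychovisual model which coefficient losses a viewer will actually notice; that search is also why it is much slower than libjpeg.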

It’s easy to see how big an impact this could have: smaller JPEGs mean less data transferred on every page load. The algorithm may literally reduce the size of the Internet, and that’s no joke.


20×24 pixel zoomed areas from a picture of a cat’s eye. Uncompressed original on the left. Guetzli (on the right) shows fewer ringing artefacts than libjpeg (middle) without requiring a larger file size (Image © Google)

You can try out Guetzli yourself; it’s free and available on GitHub. Its output is compatible with all browsers and image processing applications, since it conforms to the existing JPEG standard.

What about the name? As noted above, it’s Swiss German for “cookie”: the project was born in Google Research’s Zurich office.

[via Google Research Blog]
  • Thomas E.

    I am thinking there is some confusion on what lossless means. The images on the left and right are not perfectly equal. To me, that is not lossless???

    • iThoughtS0

      You are correct on that.
      That is not lossless, for sure. At least not in the example they are showing.
      It clearly shows signs of degradation.