Google has unveiled a new data compression algorithm that it says will make the internet faster by speeding up data transfer and cutting page load times, producing output up to 8% smaller than that of the zlib library. Dubbed Zopfli, the open-source algorithm is named after a Swiss bread recipe and is an implementation of the Deflate compression algorithm, which is used in the popular ZIP archive format as well as in gzip file compression.

In a related post on the Google Developers Blog, Lode Vandevenne, a software engineer on Google's compression team, writes:

The smaller compressed size allows for better space utilization, faster data transmission, and lower web page load latencies. Furthermore, the smaller compressed size has additional benefits in mobile use, such as lower data transfer fees and reduced battery use.

The higher data density is achieved by using more exhaustive compression techniques, which make the compression a lot slower, but do not affect the decompression speed. The exhaustive method is based on iterating entropy modeling and a shortest path search algorithm to find a low bit cost path through the graph of all possible deflate representations.
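That iteration count is exposed as a tunable in Zopfli's C API. The sketch below is a minimal illustration, assuming the ZopfliCompress and ZopfliInitOptions entry points declared in zopfli.h and linking against libzopfli (-lzopfli); the value of 50 iterations here is an arbitrary example of trading CPU time for density:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zopfli.h>  /* ZopfliCompress, ZopfliInitOptions, ZopfliOptions */

int main(void) {
    const char *text = "the quick brown fox jumps over the lazy dog";
    unsigned char *out = NULL;  /* ZopfliCompress allocates the output */
    size_t outsize = 0;

    ZopfliOptions options;
    ZopfliInitOptions(&options);
    /* More passes of the entropy-modeling / shortest-path search:
       slower compression, usually a slightly smaller result. */
    options.numiterations = 50;

    ZopfliCompress(&options, ZOPFLI_FORMAT_GZIP,
                   (const unsigned char *)text, strlen(text),
                   &out, &outsize);

    printf("compressed %zu bytes down to %zu\n", strlen(text), outsize);
    free(out);
    return 0;
}
```

A string this short will not actually shrink much, if at all; the exhaustive search pays off on realistically sized inputs.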

Output generated by Zopfli is typically 3 to 8 percent smaller than zlib's at maximum compression. Zopfli is written in C and is a compression-only library; because it is bit-stream compatible with the compression used in gzip, ZIP, PNG, HTTP requests, and others, its output can be unpacked by any existing Deflate decompressor.
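That compatibility is straightforward to check: Zopfli's zlib-format output can be handed directly to stock zlib for decompression. A round-trip sketch, assuming both libzopfli and zlib are installed (link with -lzopfli -lz):

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <zlib.h>    /* uncompress() from stock zlib */
#include <zopfli.h>  /* ZopfliCompress */

int main(void) {
    const char *text = "compressed by Zopfli, decompressed by zlib";
    size_t insize = strlen(text);

    /* Compress with Zopfli, using the zlib container format. */
    unsigned char *packed = NULL;
    size_t packedsize = 0;
    ZopfliOptions options;
    ZopfliInitOptions(&options);
    ZopfliCompress(&options, ZOPFLI_FORMAT_ZLIB,
                   (const unsigned char *)text, insize,
                   &packed, &packedsize);

    /* Decompress with plain zlib: no Zopfli code involved. */
    unsigned char roundtrip[256];
    uLongf roundtripsize = sizeof roundtrip;
    int rc = uncompress(roundtrip, &roundtripsize, packed, packedsize);

    assert(rc == Z_OK);
    assert(roundtripsize == insize);
    assert(memcmp(roundtrip, text, insize) == 0);
    printf("zlib unpacked Zopfli's output byte for byte\n");

    free(packed);
    return 0;
}
```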

Due to the amount of CPU time required (2 to 3 orders of magnitude more than zlib at maximum quality), Zopfli is best suited for applications where data is compressed once and then sent over a network many times, such as static content for the web. "By open sourcing Zopfli, thus allowing webmasters to better optimize the size of frequently accessed static content, we hope to make the Internet a bit faster for all of us," Vandevenne writes.
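A sketch of that compress-once workflow, again using the C API: the program below writes a .gz sibling next to a static asset so a web server can serve the precompressed copy on every request. The styles.css file name is purely illustrative, and error handling is kept minimal:

```c
#include <stdio.h>
#include <stdlib.h>
#include <zopfli.h>

/* Compress one static asset at high effort, once; the resulting
   .gz can then be served many times with no compression cost. */
int main(void) {
    const char *path = "styles.css";  /* hypothetical asset */
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); return 1; }

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    rewind(f);
    unsigned char *in = malloc((size_t)size);
    if (!in || fread(in, 1, (size_t)size, f) != (size_t)size) return 1;
    fclose(f);

    ZopfliOptions options;
    ZopfliInitOptions(&options);

    unsigned char *out = NULL;
    size_t outsize = 0;
    ZopfliCompress(&options, ZOPFLI_FORMAT_GZIP,
                   in, (size_t)size, &out, &outsize);

    char gzpath[4096];
    snprintf(gzpath, sizeof gzpath, "%s.gz", path);
    FILE *g = fopen(gzpath, "wb");
    if (!g) { perror(gzpath); return 1; }
    fwrite(out, 1, outsize, g);
    fclose(g);

    printf("%s: %ld -> %zu bytes\n", path, size, outsize);
    free(in);
    free(out);
    return 0;
}
```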