
Introducing the ‘mozjpeg’ Project

Today I’d like to announce a new Mozilla project called ‘mozjpeg’. The goal is to provide a production-quality JPEG encoder that improves compression while maintaining compatibility with the vast majority of deployed decoders.

Why are we doing this?

JPEG has been in use since around 1992. It's the most popular lossy compressed image format on the Web, and has been for a long time. Nearly every photograph on the Web is served up as a JPEG. It's the only lossy compressed image format that has achieved nearly universal compatibility, not just with Web browsers but with all software that can display images.

The number of photos displayed by the average Web site has grown over the years, as has the size of those photos. HTML, JS, and CSS files are relatively small in comparison, which means photos can easily make up the bulk of the network traffic for a page load. Reducing the size of these files is an obvious goal for optimization.

Production JPEG encoders have largely been stagnant in terms of compression efficiency, so replacing JPEG with something better has been a frequent topic of discussion. The major downside to moving away from JPEG is that it would require going through a multi-year period of relatively poor compatibility with the world’s deployed software. We (at Mozilla) don’t doubt that algorithmic improvements will make this worthwhile at some point, possibly soon. Even after a transition begins in earnest though, JPEG will continue to be used widely.

Given this situation, we wondered if JPEG encoders have really reached their full compression potential after 20+ years. We talked to a number of engineers, and concluded that the answer is “no,” even within the constraints of strong compatibility requirements. With feedback on promising avenues for exploration in hand, we started the ‘mozjpeg’ project.

What we’re releasing today, as version 1.0, is a fork of libjpeg-turbo with ‘jpgcrush’ functionality added. We noticed that people have been reducing JPEG file sizes using a Perl script written by Loren Merritt called ‘jpgcrush’, references to which can be found on various forums around the Web. It losslessly reduces file sizes, typically by 2-6% for PNGs encoded to JPEG by IJG libjpeg, and 10% on average for a sample of 1,500 JPEG files from Wikimedia. It does this by figuring out which progressive coding configuration uses the fewest bits. As far as we know, no production encoder has this functionality built in, so we added it as the first feature in ‘mozjpeg’.
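The search described above is conceptually simple: re-encode the same image under several candidate progressive scan configurations and keep whichever output is smallest. The result is lossless because every candidate decodes to identical pixels; only the entropy-coded layout differs. Here is a minimal sketch of that loop in Python, where `encode` stands in for a real tool such as `jpegtran -scans <script>` and the function names are illustrative, not part of mozjpeg's actual API:

```python
def crush(image_bytes, encode, scan_scripts):
    """Return the smallest lossless re-encoding over all candidate scan scripts.

    'encode' is a placeholder for an external JPEG transcoder invocation
    (e.g. jpegtran with a given scan script); each call must produce a
    bit-identical image, so picking the shortest output loses nothing.
    """
    return min((encode(image_bytes, script) for script in scan_scripts), key=len)
```

In practice the candidate set matters as much as the loop: jpgcrush's gains come from trying progressive scan scripts that split DCT coefficients across scans in ways the default encoder configuration never considers.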

Our next goal is to improve encoding by making use of trellis quantization. If you want to help out or just learn more about our plans, the following resources are available:

* github
* mailing list
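To give a flavor of the trellis quantization goal mentioned above: instead of always rounding each DCT coefficient to the nearest quantizer level, the encoder can weigh several candidate levels per coefficient and pick the set minimizing distortion plus λ times the estimated bit cost. Real trellis quantization jointly optimizes run-length and end-of-block decisions with dynamic programming; the toy sketch below treats coefficients independently and uses a crude, made-up bit-cost estimate (`est_bits`) purely for illustration:

```python
def est_bits(level):
    # Crude stand-in for the entropy coder's cost of a quantized level;
    # a real encoder would consult Huffman tables and run-length context.
    return 1 if level == 0 else abs(level).bit_length() + 2

def quantize_rd(coeffs, qstep, lam):
    """Rate-distortion quantization of DCT coefficients (independent, toy).

    For each coefficient, consider nearest rounding, truncation toward
    zero, and forcing to zero; keep whichever minimizes
    squared-error distortion + lam * estimated bits.
    """
    out = []
    for c in coeffs:
        candidates = {round(c / qstep), int(c / qstep), 0}
        best = min(candidates,
                   key=lambda lvl: (c - lvl * qstep) ** 2 + lam * est_bits(lvl))
        out.append(best)
    return out
```

With a large λ, small coefficients get zeroed out (cheap to code, little visual cost); with λ = 0 the scheme degenerates to plain nearest rounding. The full trellis version extends this by making the cost of each choice depend on the run of zeros before it.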
