What About mozjpeg?

On March 5, 2014, Mozilla announced the mozjpeg project, a JPEG encoder that is designed to provide better compression for web images, at the expense of performance. Since mozjpeg builds upon the libjpeg-turbo source code, quite a few people are asking the question, "why didn't Mozilla just integrate their changes into libjpeg-turbo?" The words "at the expense of performance" provide the simplest answer to that question, but I felt that a more detailed explanation was in order.

For starters, Mozilla did approach me regarding the possibility of integrating their code into libjpeg-turbo, but we both basically agreed that the project goals are incompatible. As the README file for mozjpeg says:

'mozjpeg' is not intended to be a general JPEG library replacement.
It makes tradeoffs that are intended to benefit Web use cases and
focuses solely on improving encoding.  It is best used as part of
a Web encoding workflow.  For a general JPEG library (e.g. your
system libjpeg), especially if you care about decoding, we recommend
libjpeg-turbo.

mozjpeg's sole purpose is to losslessly reduce the size of JPEG files that are served up on the web. Thus, Mozilla wants their solution to compress as tightly as possible out of the box. We, on the other hand, want our solution to compress as quickly as possible out of the box. There is really no way to reconcile those two goals.

mozjpeg relies on progressive JPEG encoding, and thus it enables progressive mode by default. libjpeg-turbo's optimizations primarily benefit baseline JPEGs, so while progressive mode works, the speedup it gives relative to libjpeg is only about 25-40%, not the 200-400% that can be achieved when producing baseline JPEGs. To put this another way, progressive mode in libjpeg-turbo is about 1/10 to 1/8 as fast as baseline. Decompression is a bit better but is still not stellar. libjpeg-turbo can decompress progressive JPEGs about 40-65% faster than libjpeg, but this is still in the neighborhood of 1/3 as fast as baseline. In short, progressive mode is not a "turbo" feature.
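
For readers who haven't used the libjpeg API directly, here is a minimal sketch (mine, not from the article; error handling omitted) of what "progressive mode" means at the code level. A single call switches the encoder from baseline to progressive, and enabling the equivalent of that call by default is essentially what mozjpeg does:

    #include <stdio.h>
    #include <jpeglib.h>

    /* Compress a packed RGB buffer to a progressive JPEG. */
    void write_progressive_jpeg(FILE *outfile, unsigned char *rgb,
                                int width, int height, int quality)
    {
      struct jpeg_compress_struct cinfo;
      struct jpeg_error_mgr jerr;

      cinfo.err = jpeg_std_error(&jerr);
      jpeg_create_compress(&cinfo);
      jpeg_stdio_dest(&cinfo, outfile);

      cinfo.image_width = width;
      cinfo.image_height = height;
      cinfo.input_components = 3;
      cinfo.in_color_space = JCS_RGB;
      jpeg_set_defaults(&cinfo);
      jpeg_set_quality(&cinfo, quality, TRUE);

      /* This one call requests progressive rather than baseline encoding. */
      jpeg_simple_progression(&cinfo);

      jpeg_start_compress(&cinfo, TRUE);
      while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = &rgb[cinfo.next_scanline * (size_t)width * 3];
        jpeg_write_scanlines(&cinfo, &row, 1);
      }
      jpeg_finish_compress(&cinfo);
      jpeg_destroy_compress(&cinfo);
    }

(Users of the cjpeg command-line utility get the same behavior with the -progressive switch.)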

While progressive mode is already slow, enabling jpgcrush (which mozjpeg also does by default) makes things even slower. jpgcrush compresses at only about 15-30% of the speed of "plain" progressive mode and 3-4% (!!) of the speed of baseline mode. On the decompression side, "crushed" JPEGs generally decompress at about 30-45% of the speed of baseline JPEGs. Some images (particularly photographs and other "smooth" content) decompress more quickly when encoded using jpgcrush rather than "plain" progressive JPEG, but other images decompress more slowly.
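
For context (the article does not spell this out), the mechanism that jpgcrush-style optimization builds on is the libjpeg scan-script interface: instead of calling jpeg_simple_progression(), the caller supplies its own progressive scan script, and the optimizer tries many candidate scripts per image and keeps whichever produces the smallest file. A rough sketch of supplying one such script (the script below is just an example, not the one jpgcrush would pick):

    #include <stdio.h>
    #include <jpeglib.h>

    /* A simple spectral-selection scan script for a 3-component image:
       comps_in_scan, component_index[], Ss, Se, Ah, Al */
    static const jpeg_scan_info example_scans[] = {
      { 3, { 0, 1, 2 },  0,  0, 0, 0 },   /* DC coefficients, all components */
      { 1, { 0 },        1, 63, 0, 0 },   /* AC coefficients, luma           */
      { 1, { 1 },        1, 63, 0, 0 },   /* AC coefficients, first chroma   */
      { 1, { 2 },        1, 63, 0, 0 },   /* AC coefficients, second chroma  */
    };

    void use_custom_scan_script(j_compress_ptr cinfo)
    {
      /* Set after jpeg_set_defaults() (which clears any scan script) and
         before jpeg_start_compress(). */
      cinfo->scan_info = example_scans;
      cinfo->num_scans = sizeof(example_scans) / sizeof(example_scans[0]);
    }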

In short, enabling this mode by default is simply a non-starter. We'd have to change our name from "libjpeg-turbo" to "libjpeg-turtle."

In addition to the conflicting project goals, the other show-stopping issue is that the jpgcrush feature breaks ABI compatibility. This code extends the exposed libjpeg compress structure (jpeg_compress_struct), and thus programs that link against a jpgcrush-enabled version of libjpeg-turbo would not be able to use a non-jpgcrush-enabled version of libjpeg-turbo at run time (and vice versa.) This is exactly the same problem that was introduced by jpeg-7 and later (jpeg-7, jpeg-8, and jpeg-9 also extended the exposed libjpeg structures in order to support the SmartScale and forward DCT scaling features, and in doing so, every one of those releases broke ABI compatibility with the previous release.)
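
To make the mechanics of the breakage concrete, here is a brief illustration (mine, not from the article). The application, not the library, allocates jpeg_compress_struct, so the structure's size and layout are baked into every compiled application, and the library checks that size at run time:

    #include <stdio.h>
    #include <jpeglib.h>

    void init_compressor(struct jpeg_compress_struct *cinfo,
                         struct jpeg_error_mgr *jerr)
    {
      cinfo->err = jpeg_std_error(jerr);

      /* jpeg_create_compress() is a macro that expands to (roughly)
         jpeg_CreateCompress(cinfo, JPEG_LIB_VERSION,
                             sizeof(struct jpeg_compress_struct)),
         i.e. it hands the library the structure size that the application
         was compiled against.  If a jpgcrush-enabled build appended new
         members to the struct, that size would no longer match the
         library's own, and jpeg_CreateCompress() would abort with a
         structure-size error, which is why the two builds cannot be mixed
         at run time. */
      jpeg_create_compress(cinfo);
    }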

Before we start adding more stuff to the libjpeg API, a framework needs to be developed for extending that API without breaking ABI compatibility. Most likely, get/set functions would need to be introduced, and any new features would need to use those functions rather than directly modifying the compress/decompress structures. Those structures would not be extended, but rather, any state variables needed by the new features would be stored in one of the opaque structures that are allocated "behind the scenes." The existing exposed structure members would also be made available using the new get/set functions, thus allowing applications to transition to the "new" way of doing things. Perhaps if enough applications made the transition, then we could eventually consider making all of the libjpeg structures opaque, similarly to what libpng did. Given the long history of libjpeg, I don't have much confidence that such a transition would ever fully happen, but at least adding get/set functions would provide a less painful path forward for new features.
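
To illustrate the kind of framework I have in mind, here is a rough sketch. The type and function names below are hypothetical; nothing like them exists in the current libjpeg API:

    #include <stdio.h>
    #include <jpeglib.h>

    /* Hypothetical extension parameters: any new feature gets an ID here
       rather than a new member in the exposed structures. */
    typedef enum {
      JCEXT_OPTIMIZE_SCANS,     /* e.g. jpgcrush-style scan optimization */
      JCEXT_TRELLIS_QUANT       /* e.g. trellis quantization */
    } jpeg_c_ext_param;

    /* Hypothetical accessors.  The values they manage would live in one of
       the opaque, library-allocated structures behind jpeg_compress_struct,
       so sizeof(struct jpeg_compress_struct), and therefore the ABI, never
       changes when features are added. */
    boolean jpeg_c_ext_set_int(j_compress_ptr cinfo, jpeg_c_ext_param param,
                               int value);
    int jpeg_c_ext_get_int(j_compress_ptr cinfo, jpeg_c_ext_param param);

An application would then opt into a new feature with something like jpeg_c_ext_set_int(&cinfo, JCEXT_OPTIMIZE_SCANS, 1), and the structure layout underneath it would never change.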

Without the addition of get/set methods as described above, or some other framework for adding new features without breaking ABI compatibility, the only way that jpgcrush could be integrated into libjpeg-turbo would be through #ifdef statements. Basically, the feature would have to be made compile-time optional, and anyone who wanted to use it would have to build libjpeg-turbo from source.

The attached spreadsheet compares the performance of baseline and progressive modes in libjpeg and libjpeg-turbo with the default settings in mozjpeg 1.0.0, using the same images and benchmarks that were used in the libjpeg-turbo Performance Study. Mozilla claims 10% better compression, on average, relative to baseline JPEGs. This is easily reproducible. However, it is important to point out that most of that increase in compression ratio comes simply from the use of progressive mode. I don't claim that our set of test images is in any way canonical, but these images do at least run the gamut from things that are "easy" for JPEG to compress (photographs) to things that are more difficult (artificially-generated content.) In my testing, simply using progressive mode by itself improved the compression ratio by 5-20% (average 11%), whereas the addition of jpgcrush improved the compression ratio by only an additional 0.4% to 4% (average 2%) relative to "plain" progressive JPEGs. As explained above, you pay a severe performance penalty for that 2% reduction in size.

jpgcrush is meant to be a "starter" feature, and other features will follow (including trellis quantization.) These new features will likely provide additional improvements, but they will also likely be even more disruptive to the code base, and it is unclear what effect they may have on performance. Thus, it makes sense for these features to be developed in isolation.

When Mozilla first approached me about this, my first reaction was to ask why they went to the trouble of switching to libjpeg-turbo in Firefox if they were going to turn right around and encourage the use of a JPEG format that doesn't benefit from libjpeg-turbo's performance enhancements. Their answer was that CPU time is not the primary bottleneck when loading web pages. Perhaps they're right. Personally, though, I have my doubts as to whether a 10% average reduction in bandwidth relative to baseline, or a 2% average reduction in bandwidth relative to progressive, is going to matter much. According to Akamai, the global average Internet connection speed increased 10% in the third quarter of 2013 alone. So why would anyone want to go to the trouble of re-encoding all of their JPEG files when they could just wait 3 months? Mozilla is assuming that SysAdmins will run some sort of script to automatically optimize all of the JPEGs on their web server and that they will re-run this script every time the content changes. Personally, having been both a web developer and a SysAdmin in a former life, I have my doubts as to whether things will shake out like that.

If a company is finding that their web site takes too long to load, then they would be much better served by re-encoding the JPEG images at the source (Photoshop, for instance) using a lower JPEG quality. That's really how JPEG is supposed to work. You reduce bandwidth by dialing down the quality (dialing up the loss) until it is no longer visually acceptable. Rather than playing a game of inches, it seems like it would be much more fruitful to develop an algorithm that figures out, based on perceptual metrics (DSSIM, for instance), what the "acceptable" amount of loss is for a given image (roughly similar in concept to what DCTune does.) That would almost certainly reduce bandwidth by a lot more than 10%.
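
To sketch what I mean (purely illustrative, and assuming that perceptual error decreases more or less monotonically as quality rises): search for the lowest quality setting whose perceptual error stays under a target threshold. The TurboJPEG calls below are real, but perceptual_error() is a placeholder for an actual DSSIM-style metric, which this sketch does not implement:

    #include <stdlib.h>
    #include <turbojpeg.h>

    /* Placeholder for a real perceptual metric (lower = more similar). */
    extern double perceptual_error(const unsigned char *orig,
                                   const unsigned char *decoded,
                                   int width, int height);

    /* Returns the lowest quality in [5, 95] whose error is <= max_error
       (falls back to 95 if even that fails.)  Error handling omitted. */
    int find_acceptable_quality(unsigned char *rgb, int width, int height,
                                double max_error)
    {
      tjhandle cjpeg = tjInitCompress(), djpeg = tjInitDecompress();
      unsigned char *decoded = malloc((size_t)width * height * 3);
      int lo = 5, hi = 95, best = 95;

      while (lo <= hi) {
        int q = (lo + hi) / 2;
        unsigned char *jpegBuf = NULL;
        unsigned long jpegSize = 0;

        tjCompress2(cjpeg, rgb, width, 0, height, TJPF_RGB,
                    &jpegBuf, &jpegSize, TJSAMP_420, q, 0);
        tjDecompress2(djpeg, jpegBuf, jpegSize, decoded, width, 0, height,
                      TJPF_RGB, 0);
        tjFree(jpegBuf);

        if (perceptual_error(rgb, decoded, width, height) <= max_error) {
          best = q;        /* still visually acceptable; try a lower quality */
          hi = q - 1;
        } else {
          lo = q + 1;      /* too lossy; raise the quality */
        }
      }
      free(decoded);
      tjDestroy(cjpeg);
      tjDestroy(djpeg);
      return best;
    }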

In a broader sense, as libjpeg-turbo gains more popularity as a project, I have to field more and more requests from the community to integrate new features, many of which are not very "project-friendly" (that is, they break ABI compatibility or create some sort of maintenance burden, or they simply aren't written very elegantly.) I want to support the community as much as I can, but I also have to be a bit of a mercenary about this stuff. I am an independent developer, so I do not earn a salary for my work on open source projects. I do things this way because it allows me to more easily respond to the needs of a variety of different organizations rather than being confined to the narrow agenda of just one company. The open source projects I maintain are ultimately healthier because of this. However, being independent also means that, unless an organization is specifically paying for me to work on a project, any work I do on that project is pro bono. In the case of libjpeg-turbo, I only get paid when I'm writing code for a paying customer. I don't generally get paid to answer questions on the mailing lists or to integrate patches from the community. I also don't get paid when I take lunch breaks or vacations or sick days. I am also not getting paid to write this. I contribute a significant amount of pro bono labor to libjpeg-turbo already, and it has reached the point at which trying to make everyone happy is cutting into my ability to pay rent. At some point, I have to step back and acknowledge that libjpeg-turbo cannot be all things to all people. At the end of the day, 100% of the income I have made from my work on this project has come from organizations who are interested in its high performance, either directly or indirectly (via VirtualGL and TurboVNC, my other pet projects.) Thus, whereas it's pretty easy to get me excited about a feature that significantly improves the performance of libjpeg-turbo, or a feature that improves the compression ratio of JPEG images without affecting performance, it is very hard to build a business case for working on other types of features, unless someone is specifically paying for them.

Despite the fact that my personal interest in libjpeg-turbo is 100% confined to baseline encoding at the moment, I maintain libjpeg-turbo as a general-purpose project, including donating a lot of free labor to it, because I believe in what it can do for the community. I am proud of what it has done already. I admit that I take a conservative approach when it comes to maintaining this project, and I understand that that makes it more difficult for people to contribute experimental features to the project. That's kind of the point, though. I want libjpeg-turbo to always be stable and performant, not experimental. Thus, whenever I add features to the code, I try my hardest to always maintain at least a beta level of quality, even in the Subversion trunk. I take special care to ensure that neither stability nor performance regresses when significant new features are added. I take special care to ensure that ABI compatibility is maintained and that any code that is accepted into the project is code that will not create maintenance problems for me or others down the road. I stand by my anal retentiveness, but I totally understand why a research project such as mozjpeg would find it to be an impediment to their progress. They're trying to build an isolated, fit-for-purpose toolkit, so things like ABI compatibility aren't really a concern for them.

I think a lot of the confusion about mozjpeg stems from the misuse of the word "fork." mozjpeg is not really a fork per se, as that would imply that they are seeking to supplant libjpeg-turbo. Rather, they are a special-purpose JPEG encoder that just happens to build upon the libjpeg-turbo source code. Their goals are 180 degrees different from ours, not to mention being much narrower in scope. We can debate whether or not some of those goals could be achieved almost as well using the unmodified libjpeg-turbo code, and we can debate whether the end result of mozjpeg will induce cheering or yawning, but no one is claiming that mozjpeg is a replacement for libjpeg-turbo. In fact, Mozilla's own product (Firefox) will continue to use libjpeg-turbo.

In short, even though I have my doubts about whether mozjpeg will achieve what it is setting out to achieve, it seems "mostly harmless" from my point of view, as long as people understand what that project is and what it isn't. Ultimately, the community will decide whether it's important, not me.

