Mozilla's new Mozjpeg 2.0 image encoder improves JPEG compression

Himanshu Arora


Mozilla yesterday announced the launch of an updated version of its JPEG compression tool Mozjpeg. The latest version (2.0) snips down file sizes by 5 percent on average compared to the widely used libjpeg-turbo.

Although that might look like a small number, it's significant for image-heavy sites like Facebook, helping them save bandwidth and render pages faster. Also, depending on the image, the savings can vary from slightly less than 5 percent to as high as 15 percent.

"The end goal is to reduce page load times and ultimately create an enhanced user experience for sites hosting images", said Josh Aas, senior technology strategist for Mozilla.

He also added that unlike Mozjpeg 1.0, which only focused on progressive JPEGs, the new version also improves images saved in the baseline format.

The project is being backed by Facebook, which has already begun testing the tool. The social network has also donated $60,000 to Mozilla to continue its work on this technology, including the next iteration, Mozjpeg 3.0.

"Facebook supports the work Mozilla has done in building a JPEG encoder that can create smaller JPEGs without compromising the visual quality of photos", Facebook software engineer Stacy Kerkela said. "We look forward to seeing the potential benefits mozjpeg 2.0 might bring in optimizing images".

The libjpeg-turbo library, on which Mozjpeg is based, uses fewer CPU cycles than Mozilla's tool, and powers JPEG decoding in Firefox. "We recommend using libjpeg-turbo for a standard JPEG library and any decoding tasks. Use mozjpeg when creating JPEGs for the Web", Aas wrote.
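Because Mozjpeg keeps the standard libjpeg API, switching encoders is largely a build-time change rather than a code change. Below is a minimal sketch of encoding with that API; the dimensions, the blank placeholder buffer and the output filename are illustrative assumptions, and the same code would be expected to produce smaller files when linked against mozjpeg instead of libjpeg-turbo.

```c
/* Minimal JPEG encode sketch using the standard libjpeg API.
 * mozjpeg is API-compatible, so linking against it instead of
 * libjpeg-turbo is the only change needed to get its smaller output.
 * Image dimensions, buffer contents and filename are placeholders. */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

int main(void) {
    const int width = 640, height = 480;
    /* Placeholder image: an all-black RGB buffer. */
    unsigned char *rgb = calloc((size_t)width * height * 3, 1);

    struct jpeg_compress_struct cinfo;
    struct jpeg_error_mgr jerr;
    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_compress(&cinfo);

    FILE *out = fopen("out.jpg", "wb");
    if (!out) return 1;
    jpeg_stdio_dest(&cinfo, out);

    cinfo.image_width = width;
    cinfo.image_height = height;
    cinfo.input_components = 3;
    cinfo.in_color_space = JCS_RGB;
    jpeg_set_defaults(&cinfo);        /* mozjpeg's defaults add its extra optimizations here */
    jpeg_set_quality(&cinfo, 85, TRUE);
    jpeg_simple_progression(&cinfo);  /* request a progressive scan script */

    jpeg_start_compress(&cinfo, TRUE);
    while (cinfo.next_scanline < cinfo.image_height) {
        JSAMPROW row = rgb + (size_t)cinfo.next_scanline * width * 3;
        jpeg_write_scanlines(&cinfo, &row, 1);
    }
    jpeg_finish_compress(&cinfo);
    jpeg_destroy_compress(&cinfo);
    fclose(out);
    free(rgb);
    return 0;
}
```

With stock libjpeg-turbo, the jpeg_simple_progression() call is what requests progressive output; mozjpeg applies its additional tuning, such as trellis quantization, behind these same calls, which is why no application changes are required.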

The JPEG format, which has been in use for more than 20 years, is one of the most widely used image formats on the Internet. It's a lossy format, which means that you can remove some data to reduce the file size without significantly affecting the original image’s integrity.

Google has been promoting the use of its WebP image format, a derivative of the video format VP8, but Mozilla has long resisted the call to adopt it. A Mozilla study from 2013 concluded that newer formats such as WebP or JPEG XR were not significantly better than JPEG.


 
"It's a lossy format, which means that you can remove some data to reduce the file size without significantly affecting the original image’s integrity"

That's lossless. Well, it depends on what you consider "significantly". JPEGs fall under what I would consider "significantly" degrading picture quality (compared to lossless, anyway).

Basically, I think you have them backwards.
 
WebP is a far superior format! - but ALL major browser vendors really need to get together and decide on one standard format! It makes no sense for Google to be developing WebP and Mozilla to be developing a separate "Mozjpeg" format! - it's not good for end-users, and web developers aren't going to support a format that isn't widely supported in the vast majority of browsers! WebP is the way forward!
 
@ikesmasher
A lossless format would be bit-for-bit the same as the original. Any loss at all is lossy.

It does not matter what your definition of "significant" is.
 
WebP is a far superior format! - but ALL major browser vendors really need to get together and decide on one standard format! It makes no sense for Google to be developing WebP and Mozilla to be developing a separate "Mozjpeg" format! - it's not good for end-users, and web developers aren't going to support a format that isn't widely supported in the vast majority of browsers! WebP is the way forward!

Wrong. It's still a standard JPEG decoder that does the job. Only the ENCODING part is different. The JPEG standard only controls the decoding process. As long as it decodes the same way (Which it does), it will work in all existing browsers. So all Mozilla has to do is spread the word about their encoder, and nothing new has to be done on the browser side.
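For what it's worth, here's a minimal decode sketch against the stock libjpeg/libjpeg-turbo API (the input filename is just a placeholder): a mozjpeg-encoded file is an ordinary JPEG bitstream, so a decoder like this reads it without any changes.

```c
/* Minimal JPEG decode sketch using the stock libjpeg/libjpeg-turbo API.
 * A file produced by mozjpeg is a standard JPEG bitstream, so this
 * decoder needs no modification to read it. "in.jpg" is a placeholder. */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

int main(void) {
    FILE *in = fopen("in.jpg", "rb");
    if (!in) return 1;

    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, in);

    jpeg_read_header(&cinfo, TRUE);
    jpeg_start_decompress(&cinfo);

    /* One scanline buffer; decoded pixels land here row by row. */
    size_t stride = (size_t)cinfo.output_width * cinfo.output_components;
    unsigned char *row = malloc(stride);
    while (cinfo.output_scanline < cinfo.output_height) {
        JSAMPROW rp = row;
        jpeg_read_scanlines(&cinfo, &rp, 1);
    }

    jpeg_finish_decompress(&cinfo);
    jpeg_destroy_decompress(&cinfo);
    free(row);
    fclose(in);
    return 0;
}
```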
 
"The end goal is to reduce page load times and ultimately create an enhanced user experience for sites hosting images", said Josh Aas, senior technology strategist for Mozilla.
After reading the previous quote, I don't understand how the following can be said.
Wrong. It's still a standard JPEG decoder that does the job. Only the ENCODING part is different.
How can decoding (which would make pages render faster) be faster if there is no change in the decoder?

Am I looking at this out of context?
 
How can decoding (which would make pages render faster) be faster if there is no change in the decoder?

Am I looking at this out of context?

By making file sizes smaller, which the encoder also does.
 
Samfisher

JPEG decoders are like fast battery chargers that stop at 80% after a few minutes, because the remaining 20% of the charge would take hours. JPEG decoding likewise stops at around 80%, because decoding the rest with no artifacts and no big difference from the original quality takes 30-60 seconds even on a strong CPU.
There are true 100% decoders, but nobody uses them, because users don't want to wait for slow web pages.
And don't confuse this with changing the JPEG quality to 80, 90 or 95%; that's a different thing, that is encoding, while this is decoding.
 