Mozilla's new Mozjpeg 2.0 image encoder improves JPEG compression

By Himanshu Arora · 9 replies
Jul 16, 2014
  1. Mozilla yesterday announced the launch of an updated version of its JPEG compression tool Mozjpeg. The latest version (2.0) snips down file sizes by 5 percent on average compared to the widely used libjpeg-turbo.

  2. ikesmasher

    ikesmasher TS Evangelist Posts: 2,996   +1,317

    "It's a lossy format, which means that you can remove some data to reduce the file size without significantly affecting the original image’s integrity"

    That's lossless. Well, it depends on what you consider "significantly". JPEGs fall under what I would consider "significantly" degraded picture quality (compared to lossless, anyway).

    Basically, I think you have them backwards.
  3. MID.AS

    MID.AS TS Member

    WebP is a far superior format, but all the major browser vendors really need to get together and decide on one standard format! It makes no sense for Google to be developing WebP while Mozilla develops a separate "Mozjpeg". It's not good for end users, and web developers aren't going to support a format that isn't available in the vast majority of browsers. WebP is the way forward!
  4. Wasted.
  5. @ikesmasher
    A lossless format would be bit-for-bit identical to the original. Any loss at all is lossy.

    It does not matter what your definition of "significant" is.
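The lossless/lossy distinction in this comment can be illustrated without any image libraries. This is a minimal sketch using Python's standard zlib module, not JPEG itself: a lossless round trip returns the exact original bytes, while a toy "lossy" step (quantizing the bytes first, loosely analogous to JPEG discarding fine detail) never can.

```python
import zlib

data = bytes(range(256)) * 4  # stand-in for raw pixel data

# Lossless: decompressing returns every bit of the original.
lossless = zlib.decompress(zlib.compress(data))
print(lossless == data)  # True

# Toy "lossy" step: quantize the bytes before compressing. Data is
# discarded up front, so no round trip can ever restore it exactly.
quantized = bytes((b // 16) * 16 for b in data)
lossy = zlib.decompress(zlib.compress(quantized))
print(lossy == data)  # False
```

By this definition, any loss at all, however visually insignificant, makes the scheme lossy.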
  6. samfisher

    samfisher TS Rookie

    Wrong. It's still a standard JPEG decoder that does the job. Only the ENCODING part is different. The JPEG standard only controls the decoding process. As long as it decodes the same way (Which it does), it will work in all existing browsers. So all Mozilla has to do is spread the word about their encoder, and nothing new has to be done on the browser side.
  7. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,715   +3,694

    After reading the previous quote, I don't understand how the following can be said.
    How can decoding (which would make pages render faster) be faster, if there is no change in the decoder?

    Am I looking at this out of context?
  8. samfisher

    samfisher TS Rookie

    "How can decoding (which would make pages render faster) be faster, if there is no change in the decoder? Am I looking at this out of context?"

    By making file sizes smaller, which the encoder also does.
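The size-to-speed link is simple arithmetic: at a fixed connection speed, transfer time scales with file size, so a 5 percent smaller JPEG arrives about 5 percent sooner. The numbers below are illustrative assumptions, not measurements.

```python
# Illustrative back-of-envelope: a 200 KB JPEG over a 1 MB/s link.
size_kb = 200.0
bandwidth_kb_per_s = 1024.0

before_ms = size_kb / bandwidth_kb_per_s * 1000
after_ms = size_kb * 0.95 / bandwidth_kb_per_s * 1000  # ~5% average saving

print(f"{before_ms:.1f} ms -> {after_ms:.1f} ms")  # 195.3 ms -> 185.5 ms
```

Across the dozens of images on a typical page, those savings add up.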
  9. samfisher

    samfisher TS Rookie

    Won't let me post links, but the article on ArsTechnica explains this in further detail.
  10. Samfisher

    JPEG decoders are like a fast battery charger that stops at 80% after a few minutes, because the remaining 20% would take hours. JPEG decoding likewise stops at roughly 80%, because finishing the rest would take 30-60 seconds even on a strong CPU, just to remove artifacts that barely differ from the original quality.
    There are true 100% decoders, but nobody uses them because users don't want to wait for slow web pages.
    Don't confuse this with changing the JPEG quality to 80/90/95%; that's a different thing: that is encoding, but this is decoding.
