
Thread: [News] Google's new algorithm shrinks JPEG files by 35 percent

  1. #1
StyM
    Join Date
    Mar 2006
    Location
    Tropics
    Posts
    9,468

    [News] Google's new algorithm shrinks JPEG files by 35 percent

    https://www.engadget.com/2017/03/17/...by-35-percent/

    For obvious reasons, Google has a vested interest in reducing the time it takes to load websites and services. One method is reducing the file size of images on the internet, which it previously pulled off with the WebP format back in 2014, shrinking photos by 10 percent. Its latest development in this vein is Guetzli, an open-source algorithm that encodes JPEGs that are 35 percent smaller than currently produced images.

    As Google points out in its blog post, this reduction method is similar to its Zopfli algorithm, which shrinks PNG and gzip files without needing to create a new format. RNN-based image compression like WebP, on the other hand, requires both the client and the ecosystem to change to see gains at internet scale.

    If you want to get technical, Guetzli (Swiss German for "cookie") targets the quantization stage of image compression, wherein it trades visual quality for a smaller file size. Its particular psychovisual model (yes, that's a thing) "approximates color perception and visual masking in a more thorough and detailed way than what is achievable" in current methods. The only tradeoff: Guetzli takes a little longer to run than compression options like libjpeg. Despite the increased time, Google's post assures that human raters preferred the images churned out by Guetzli. Per the example below, the uncompressed image is on the left, libjpeg-shrunk in the center and Guetzli-treated on the right.
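    The quantization stage the article mentions is the lossy heart of JPEG: each DCT coefficient is divided by a quantization-table entry and rounded, and bigger divisors zero out more coefficients, which the entropy coder then stores almost for free. A minimal sketch of that trade-off (illustrative values only, not Guetzli's actual tables or heuristics):

    ```python
    # Sketch of the JPEG quantization stage that Guetzli tunes (toy values,
    # not Guetzli's real tables): each DCT coefficient is divided by a table
    # entry and rounded. Coarser (larger) divisors discard more precision,
    # zeroing more coefficients and shrinking the entropy-coded file.

    # First row of the standard JPEG luminance quantization table (Annex K).
    QTABLE_ROW = [16, 11, 10, 16, 24, 40, 51, 61]

    # Made-up DCT coefficients for one row of an 8x8 block.
    dct_row = [-415, 30, -61, 27, 56, -20, -2, 0]

    def quantize(coeffs, table, scale=1.0):
        """Divide each coefficient by its (scaled) table entry and round."""
        return [round(c / (q * scale)) for c, q in zip(coeffs, table)]

    def dequantize(quants, table, scale=1.0):
        """Invert quantization; the rounding loss is the quality traded away."""
        return [v * q * scale for v, q in zip(quants, table)]

    fine = quantize(dct_row, QTABLE_ROW, scale=1.0)
    coarse = quantize(dct_row, QTABLE_ROW, scale=4.0)

    # Coarser quantization produces more zeros, hence better compression.
    print(fine, sum(1 for v in fine if v == 0))
    print(coarse, sum(1 for v in coarse if v == 0))
    ```

    Guetzli's contribution is in *choosing* those divisors: its psychovisual model searches for quantization choices whose error humans are least likely to notice, rather than using a fixed table scaled by a quality knob.
    
    
    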

  2. #2
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    More image quality decrease... no thank you!
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  3. #3
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    Quote Originally Posted by BenchZowner View Post
    More image quality decrease... no thank you!
    Well, it looks like the decrease is mostly in the hue and saturation (chroma) domains, which we're not as sensitive to, as opposed to luminance.

    Basically, the data that gets distorted is the subset humans are not as good at gauging, while the data we're sensitive to is better preserved.

    This is literally just using an optimization metric that better matches human vision.

    Imagine better image quality at the same file size.
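    The luma/chroma split described above is explicit in the JPEG pipeline: the encoder converts RGB to YCbCr and is then free to degrade the chroma planes (Cb, Cr) far more aggressively than luma (Y). A minimal sketch of the standard full-range BT.601 conversion used by JFIF:

    ```python
    # JPEG (JFIF) separates brightness from color before compressing: RGB is
    # converted to YCbCr, and the chroma planes (Cb, Cr) are typically
    # subsampled and quantized harder than luma (Y), because human vision is
    # much less sensitive to color detail than to brightness detail.

    def rgb_to_ycbcr(r, g, b):
        """Full-range BT.601 conversion as used by JFIF (values 0..255)."""
        y  =       0.299    * r + 0.587    * g + 0.114    * b
        cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
        cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
        return y, cb, cr

    # Pure gray carries no chroma: Cb and Cr sit at the neutral midpoint
    # (128), so all the information is in Y, the channel encoders protect.
    print(rgb_to_ycbcr(128, 128, 128))

    # A saturated green pushes Cb and Cr far from neutral -- energy in
    # exactly the channels where JPEG is allowed to be lossy.
    print(rgb_to_ycbcr(0, 255, 0))
    ```

    This is also why a de-saturated green (as noted later in the thread) is the kind of artifact chroma-heavy compression produces first.
    
    
    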

  4. #4
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955


    Shoutout to those who get it
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  5. #5
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by xlink View Post
    Well, it looks like the decrease is mostly in the hue and saturation (chroma) domains, which we're not as sensitive to, as opposed to luminance.

    Basically, the data that gets distorted is the subset humans are not as good at gauging, while the data we're sensitive to is better preserved.

    This is literally just using an optimization metric that better matches human vision.

    Imagine better image quality at the same file size.
    I take such claims with a buttload of pinches of salt.
    Google is "reinventing" the internet just as the average internet access speed globally is on the rise.
    Most websites already have image size limits well under 500kB per image, and we have something like 1MB/s average speed (counting all the slow countries in the average, not excluding the really slow ones).
    Facebook gives us terrible image quality with file sizes under 100kB per image in most cases... we don't need lower file sizes than that crap.
    I highly doubt that at the same file size Google's algorithm does anything useful (for example, avoiding compression halos).

  6. #6
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    Quote Originally Posted by BenchZowner View Post
    I take such claims with a buttload of pinches of salt.
    Google is "reinventing" the internet just as the average internet access speed globally is on the rise.
    Most websites already have image size limits well under 500kB per image, and we have something like 1MB/s average speed (counting all the slow countries in the average, not excluding the really slow ones).
    Facebook gives us terrible image quality with file sizes under 100kB per image in most cases... we don't need lower file sizes than that crap.
    I highly doubt that at the same file size Google's algorithm does anything useful (for example, avoiding compression halos).
    I'm just imagining facebook with non- quality.

  7. #7
    Xtreme Addict
    Join Date
    Feb 2005
    Location
    OZtralia
    Posts
    2,051
    I can just see all the photo sites adopting this reduction in size at the cost of quality and clarity... NOT
    lots and lots of cores and lots and lots of tuners,HTPC's boards,cases,HDD's,vga's,DDR1&2&3 etc etc all powered by Corsair PSU's

  8. #8
    Xtreme Member
    Join Date
    Jun 2008
    Location
    New Jersey
    Posts
    208
    Looks apparent. You can see the green getting markedly de-saturated.

  9. #9
    Xtreme Addict
    Join Date
    Sep 2010
    Location
    US, MI
    Posts
    1,680
    Seems OK to me, better than standard libjpeg it seems.
    This should bring higher-quality JPEGs to the net, as long as they're kept the same size as before.

    So I'm all for this.
