A new (slow) open source JPEG algorithm makes images 35% smaller, and they look better than the output of older compression systems

Originally published at: http://boingboing.net/2017/03/22/improved-psychovisual-models.html

2 Likes

Good thing no one is going to slap a patent on this, right?

5 Likes

They probably aren’t, because the code has already been released under an open source license, which (a) would make it difficult or impossible to later get and enforce a patent, and (b) would create terrible optics if Google tried.

If they come up with a way to improve the performance by multiple orders of magnitude (making it useful in the real world, and for video), it’s the improvement that would be commercially valuable, so that’s the thing they’d want to patent, anyway.

2 Likes

John Gruber says it took eight minutes to compress a single image on his top-of-the-line iMac. He concludes that this new algorithm is going to be deployed mostly on server backends, rather than in image-editing programs.

I might be interested, though, in an app that offers to recompress all my images using the algorithm in the background: check back in a week or two and see how much disk space optimizing your photo library has freed up.
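
Something like this minimal sketch, say, assuming the guetzli binary is on your PATH (the folder and the quality value are placeholders I made up):

```python
# Rough sketch of a background "recompress my photo library" pass.
# Assumes the guetzli binary is on PATH; folder and quality are placeholders.
import subprocess
from pathlib import Path

def recompress(folder: str, quality: int = 90) -> int:
    """Re-encode every JPEG under `folder` with guetzli, keep the result
    only when it's smaller, and return the total bytes saved."""
    saved = 0
    for src in list(Path(folder).expanduser().rglob("*.jpg")):
        tmp = src.with_suffix(".guetzli.jpg")
        subprocess.run(
            ["guetzli", "--quality", str(quality), str(src), str(tmp)],
            check=True,
        )
        if tmp.stat().st_size < src.stat().st_size:
            saved += src.stat().st_size - tmp.stat().st_size
            tmp.replace(src)  # swap the smaller file in
        else:
            tmp.unlink()  # guetzli didn't help here; keep the original
    return saved

print(f"Freed {recompress('~/Pictures')} bytes")
```

(Guetzli is lossy, of course, so you'd only want to point this at images you're OK re-encoding.)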

7 Likes

Is it just me, or do all of these cat’s eyes look identical?

6 Likes

It’s just you. Check out all the noise above the eye in the middle pic. There’s a little in the right one, but it’s not bad.

9 Likes

So it’s extremely slow now. What are the chances it won’t be improved inside of a year?

Now imagine if Microsoft owned it.

2 Likes

It would be as intuitive and productive as Word. :+1:

5 Likes

I just ran this on a whole slew of images. “Slow” is true, and the memory use is kind of drastic as well; I think a couple of 12-megapixel images peaked at over 6 GB. Not the sort of thing you run automatically on uploaded images on your dinky 1 GB-RAM VPS. Still, it does what it claims, and I hope it’s possible to prune the memory usage a bit. (It might not be; I have no idea what sort of structures this builds internally.)

6 Likes

I tried this on my MacBook Air over the weekend (I would have used my Windows desktop, but guetzli requires dark magic to run on Windows right now). It’s interesting, but I doubt you’ll see it even on backends any time soon.

The memory and CPU time involved are intense.

What’s kind of funny to me is that BBS ran the screenshot I uploaded through some other library and increased the file size by about 3 KB. :smiley:

9 Likes

Revolutionary compression algorithm even comes with a customized logo

2 Likes

For the CPU issue, one would think this would be a good place to let a GPU do the heavy lifting instead. I’ve certainly found that to be the case for video transcoding: an AMD FX-6100 (yeah, I know, Bulldozer sucks) can’t transcode 1080i MPEG-2 to H.264 in real time, but NVENC-enabled ffmpeg on the same system, using an Nvidia GTX 960 at the “slow” preset, can do it at over 7x real time.
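
Roughly the invocation I mean, as a sketch; it assumes an ffmpeg build with NVENC support and an Nvidia card, and the file names are made up:

```python
# Sketch of the NVENC transcode described above: 1080i MPEG-2 in, H.264 out.
# Assumes an ffmpeg build with h264_nvenc; file names are placeholders.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "recording-1080i.ts",  # interlaced MPEG-2 source
    "-c:v", "h264_nvenc",        # hand video encoding to the GPU
    "-preset", "slow",           # the "slow" NVENC preset mentioned above
    "-c:a", "copy",              # leave the audio stream untouched
    "out.mp4",
], check=True)
```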

4 Likes

Whether or not a GPU would help depends entirely on how this new algorithm works. If it’s easy to parallelise then a GPU may help, but otherwise it’s not going to make any difference.

1 Like

psychovisualization

Album name!

You mean it would piss me off with ‘features’ like the Ribbon, and alignment that’s fiddly in Word but works flawlessly and smoothly in LibreOffice?

4 Likes

I read an article a couple of days ago (which naturally I can’t now find) saying that Guetzli is basically much the same as MozJPEG, but much slower. I’ve had pretty good results using this online encoder in the past: https://mozjpeg.codelove.de/
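
If you’d rather run mozjpeg locally than through that site, here’s a quick sketch (it assumes mozjpeg’s djpeg and cjpeg binaries are installed; quality 80 is an arbitrary pick):

```python
# Recompress a JPEG with mozjpeg: decode with djpeg, re-encode with cjpeg.
# Assumes mozjpeg's djpeg/cjpeg are on PATH; quality 80 is arbitrary.
import subprocess

with open("in.jpg", "rb") as src, open("out.jpg", "wb") as dst:
    # djpeg reads the JPEG from stdin and emits PPM on stdout
    decoded = subprocess.run(["djpeg"], stdin=src,
                             capture_output=True, check=True)
    # cjpeg reads the PPM from stdin and writes the re-encoded JPEG
    subprocess.run(["cjpeg", "-quality", "80"], input=decoded.stdout,
                   stdout=dst, check=True)
```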

See also: http://www.jpegmini.com/

Is the encoder available for public use? I didn’t notice a link to it in the articles or paper.

It’s available on GitHub.

3 Likes

I’m going to use this as an excuse to link to BPG.

Sometimes I daydream about the bandwidth savings BPG would provide for animated GIFs on this BBS alone.

5 Likes

I’ve read some comparisons between the two. As with most lossy compression at comparable quality, the benefits vary depending on what you’re compressing. In terms of CPU and RAM usage during the compression phase, mozjpeg is currently the clear winner. Whether it’ll stay that way is unclear at the moment: the current implementation of guetzli seems pegged to a single core, so if it can be threaded, the way runtimes like Ruby or Java usually allow, it might get faster. (In the meantime you can at least run one process per image; see the sketch below.)

Which one will produce a smaller, nicer-looking image is going to depend on the original image. If I were building a service right now, I’d go with Mozilla’s library. In six months or a year, who knows?
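
Here’s the process-per-image workaround I mean, as a rough sketch; the folder and worker count are placeholders, and you’d want to keep the worker count low given how much RAM each process can want:

```python
# Workaround for guetzli's single-core ceiling: one process per image.
# Placeholder paths; keep max_workers low, since a single 12-megapixel
# image can push one process past 6 GB of RAM.
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def encode(src: Path) -> None:
    subprocess.run(["guetzli", str(src), str(src.with_suffix(".out.jpg"))],
                   check=True)

images = list(Path("uploads").glob("*.jpg"))
with ThreadPoolExecutor(max_workers=2) as pool:
    list(pool.map(encode, images))  # force evaluation, surface any errors
```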

Jpegmini seems to be involved in a lot of astroturfing right now. I’m not saying they’re not good, but I don’t trust them because of the astroturfing.

3 Likes