I am an extreme moderate

November 20, 2011

Webp lossless – first impressions

Filed under: Uncategorized — niezmierniespokojny @ 7:13 pm

Google released a lossless version of their webp codec recently. Or more likely – a totally new codec under the same brand.
It’s a good development: PNG is very bad and there’s hardly anything that could replace it. Almost all research codecs released in recent years are unsuitable for many uses because their decompression is slow and memory-hungry. The only exception that comes to my mind is BCIF. BCIF compresses fast and usually well, though on some kinds of images it does very poorly; its decompression is very fast.
There are also JPEG 2000 and Microsoft’s JPEG XR. The former is not very good and is quite slow; on weaker machines decompression is slower than the transfer. I can’t tell much about the latter, since nobody uses it.
So on one hand I think that any good evaluation of webpll should take BCIF as the main competitor. OTOH, no large business stands behind it and it has little chance of adoption, so while it’s the state-of-the-art compressor with fast decoding, it’s actually hardly a webpll competitor. JPEG 2000 and JPEG XR both failed and don’t seem to have any future, which leaves poor old PNG as the only format in the field.

I didn’t read the Google performance report; since their original webp announcement turned out to be hugely misleading, I don’t think it’s worth my time. I downloaded the codec (surprise, no sources. Yet?) and wanted to test it on a couple of my files. The first one was 27 MB uncompressed, or 10 MB as a PNG. The strongest codec on this file is BMF, which gets it down to 6.2 MB. I’d heard that webpll was very slow, so to start I decided to try the fastest mode (0).
It took 20 minutes. And compressed to 9.6 MB. Well, I like hugely asymmetric codecs, but after that test I hoped the stronger modes would be much stronger and not much slower. Also, compression needed over 200 MB of RAM. OK, we have a baseline; let’s see the best it can do. The strongest mode, 100.
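(In case anyone wants to reproduce the timings: they all come from a trivial wrapper along these lines. This is only a minimal sketch, and the webpll command line in it is a placeholder, not the tool’s real option names.)

    # Minimal timing wrapper (sketch). The webpll invocation below is a
    # placeholder -- I'm not reproducing the tool's real option names here.
    import subprocess, time

    def time_run(cmd):
        """Run an external compressor and return its wall-clock time in seconds."""
        start = time.time()
        ret = subprocess.call(cmd)
        if ret != 0:
            raise RuntimeError("command failed: %r" % (cmd,))
        return time.time() - start

    # Hypothetical invocation, fastest mode:
    elapsed = time_run(["webpll", "0", "input.ppm", "output.webp"])
    print("compression took %.1f s" % elapsed)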

I waited for 20 minutes.
Then for one hour.
After 5 hours I started to wonder whether it would ever return. Maybe it’s hung? How can I know? I won’t be able to tell unless it returns.
After 6 hours and 7 minutes there was a crash. Out of memory. I don’t know how much was needed, but I’m pretty sure it was over 1 GB. Damn. That was 6 hours and 7 minutes of CPU time, but closer to 7 hours of real time wasted.

I decided to make another trial on the file, this time with the default mode. The bad thing is that the program doesn’t say what the default is. Maybe it’s 100? The program’s own text suggests it’s not, saying that the setting makes files denser. Maybe it’s 0? After 20 minutes I knew it wasn’t. So it could be 99 and crash too. And there are no sources to check.

While waiting I decided to make another trial. I didn’t use any of my standard images; they are all too large. I searched the disk for the smallest image that I had. It wasn’t a good one, because it had clearly been compressed in a lossy way before, but well, it’s better to have some data than none.
The file was 24-bit, 300×300, 264 KB uncompressed or 67.4 KB as PNG. I didn’t try many compressors on it, but BMF shrunk it to 53.3 KB.
webpll 0: 0.9 s., 63.2 KB. Better than on the previous file.
webpll 100: 176 s., 60.3 KB. Halfway from PNG to BMF.
webpll default: 181 s., 60.5 KB. Weirdly, slightly slower than mode 100. I guess it’s just testing variance; the test was not supposed to be accurate anyway.
For comparison, BCIF: 0.2 s., 60.2 KB. Well… testing on a single file, and one that was compressed in a lossy way, on a machine loaded with another compression job, certainly isn’t accurate, but webpll doesn’t look great, does it?
And, BTW, webpll used a little over 40 MB of memory at peak.

~3 minutes per quarter MB = ~12 minutes per MB = ~5 hours 24 minutes for 27 MB. Damn, I was likely very close to finishing the first test before it crashed. And since the default mode takes about as long as 100, I terminated the test that I had running in the background.
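(Spelled out, that back-of-the-envelope estimate is just:)

    # Back-of-the-envelope extrapolation from the small file to the 27 MB one.
    time_small_min = 3.0      # ~3 minutes for the roughly quarter-MB small file
    size_small_mb = 0.25
    rate = time_small_min / size_small_mb      # ~12 minutes per MB
    estimate_min = rate * 27.0                 # ~324 minutes for the 27 MB file
    print("%d min, i.e. %d h %d min" % (estimate_min, estimate_min // 60, estimate_min % 60))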

I also tried decompression. Too bad that webpll can only be decompressed to PNG, which means it actually first decompresses the file to a bitmap and then compresses it to PNG, which skews the timing, possibly by a lot.

Anyway, decompressing the small file took 0.091 s. BCIF needed 0.031 s. Hard to tell which one is better, given the PNG re-compression overhead mentioned above.

To sum up, webpll seems less than thrilling. I searched a bit and it seems there are no independent, reasonably well-done tests yet. One good tester reported that he was unable to run his tests, because there’s only a 32-bit Windows executable and 32 bits are not enough to compress some 100 MB files that he uses.
So overall, huge memory consumption and huge slowness are certain. I don’t have data to assess webpll’s strength, but seeing that it took 6+ hours on the first file, I hoped it had something in common with MRP, which would make it about as strong as, if not stronger than, BMF. I guess it’s weaker.
I’m severely disappointed.

UPDATE:
Alexander Ratushnyak did a test of webpll in mode 0. He failed to run it on his entire test corpus because of the memory problems mentioned above, but you can see the results of other compressors here. It should be noted that the corpus covers only photographic images, and that the PNGs are not optimized.
PNG size: 1 220 278 081
webpll size: 1 100 509 276
BCIF size: 975 294 955
There’s no good timing comparison, because while he shows the sizes of individual files, which let me calculate the BCIF result, he only shows timings for the entire corpus.
Time is in seconds.
webpll compression: 17 561
webpll decompression: 1 286
BCIF compression: 832.30
BCIF decompression: 167.29
If we assume that BCIF performance is the same on all files and adjust the times to reflect the smaller corpus size, the result would be:
BCIF compression: 705.43
BCIF decompression: 141.79
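(For the curious, here’s how those numbers shake out; a small sketch: the savings come straight from the sizes above, and the adjusted BCIF times simply assume time is proportional to the amount of data processed, with the restricted-corpus fraction being an approximation, not an exact figure.)

    # Savings on Ratushnyak's restricted corpus, from the sizes listed above.
    png    = 1220278081.0
    webpll = 1100509276.0
    bcif   = 975294955.0
    print("webpll vs PNG: %.1f%% smaller" % (100.0 * (1.0 - webpll / png)))  # ~9.8%
    print("BCIF   vs PNG: %.1f%% smaller" % (100.0 * (1.0 - bcif / png)))    # ~20.1%

    # BCIF times were measured on the full corpus; the adjustment assumes
    # compression/decompression time is proportional to the data processed.
    subset_fraction = 0.85   # restricted corpus / full corpus, approximate (assumed)
    print("adjusted BCIF compression:   ~%.0f s" % (832.30 * subset_fraction))
    print("adjusted BCIF decompression: ~%.0f s" % (167.29 * subset_fraction))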

Still, it doesn’t look good at all, though the fastest modes are frequently inefficient.

ADDED:
Some fragmentary test results:
http://encode.ru/threads/1136-WebP-(lossy-image-compression)?p=27211&viewfull=1#post27211
http://tech.slashdot.org/comments.pl?sid=2533368&cid=38102662
http://encode.ru/threads/1136-WebP-(lossy-image-compression)?p=27195&viewfull=1#post27195
http://www.heise.de/developer/news/foren/S-Re-Ein-typischer-PNG-Anwendungsfall-fehlt-in-den-Beispielen/forum-216147/msg-21079458/read/
https://groups.google.com/a/webmproject.org/group/webp-discuss/msg/bfa4a880ef68f877
https://groups.google.com/a/webmproject.org/group/webp-discuss/msg/f99d74e73bdd3386
http://encode.ru/threads/1136-WebP-(lossy-image-compression)?p=27254&viewfull=1#post27254
http://encode.ru/threads/1136-WebP-(lossy-image-compression)?p=27261&viewfull=1#post27261
I will probably add more as I spot them. If I missed something, please let me know.

The results are mixed. Two testers report wonderful savings, others quite bad ones, and one very bad (1.7% saved over PNG).

ADDED: Actually the webpll source is available, just not next to the binaries. It’s here.
The default compression level is 95.

ADDED: Alexander Ratushnyak posted some more results. He tested c10 and c30 on a much-restricted (because of crashes) set of files.
I’d like to add BCIF results again:
49 385 276 bytes, 4.6% less than c30, though in a file-by-file comparison there are differences swinging wildly both ways. You can see results for individual files in his post, and BCIF restricted to only those files here.

ADDED: I created a quite good benchmark involving WebP Lossless.
