WebP is so great except it's not (2021) (aurelienpierre.com)
282 points by enz on Dec 15, 2023 | 393 comments



I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)

I think the real problem is that, like many of the commenters here, most people can't tell the difference because desktop monitors have been stuck in a deadzone of zero innovation for the last 10 years. I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.

It's bizarre. Smaller displays (Mobile phones) and larger displays (4k TVs) have fantastic pixel densities now considering their viewing distance. However any panel in the range of 20"-40" is stuck in the mid-2000s.

Also, I think the author would have done us a favor by using example photos with lighter backgrounds (or changing the background color of his post to black). The harshness of the black images on white doesn't allow the eye to adjust enough to see the issue. If you put those images on a dark background it's super easy to tell the difference.


I have no problem seeing the artefacts on both my 2012-era displays. One of them is a 30" 2560x1600 IPS monitor that was rather good at the time, the other is an entry-level 27" TN 1080p TV.

So I don't think display quality really is the problem here. Maybe the drivers, or post-processing filters. Or maybe not everyone has an eye for this. I have an interest in image processing, and that's the kind of detail one tends to notice with experience. The author of the article is undoubtedly more experienced than me, and noticing these details may even be part of his job. He will most likely be able to notice these problems on crappy monitors, as well as tell you in which way that monitor is crap.


Someone else noted the author is sending different images to different monitor types... so no wonder everyone is seeing different things.

Generally though I would expect wide-gamut monitors to make a significant difference for these types of artifacts.


I have an extremely hard time perceiving any difference on a 27" 4K monitor. I am not even sure I really see them.

The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.


This seems to be highly subjective. I had absolutely no problem seeing those artifacts without any pixel peeping, they're that obvious.

WebP image gradients just looked broken (posterized) except the lossless one, which was (obviously) perfect.


It's hard to see in the first set of images, but the second set is much clearer. In the WebP example, look to the right of the subject, about 1/6th of the image's width from the right edge. There's a hard transition between shades of grey. The JPEG version directly above it also has banding but each band is narrower so the difference at the edges is more subtle.


> enlarge it to show the artifacts.

One might argue that if you need to enlarge it to see the artifacts, then the artifacts aren't perceptible enough and the codec is already good enough for the use case.


But we are philistines, not pro photographers.


He was talking about the background, not the foreground.

The color of the background around the edges of the picture changes noticeably even on a non-fullscreen image on my Android 12 device.


> The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.

Yes! Where's the red underlines and diffs? I can see the background banding, but the foreground looks the same at a glance except that some of them look ambiguously "off" in ways that could just be placebo.

You'd think a visual artist would be more interested in visual communication and not just a wall of text with un-annotated photos.


I think he was complaining specifically about the background banding.


I downloaded the images and then compared them via Beyond Compare.

After that it was pretty obvious what the author is talking about.
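
If you don't have Beyond Compare, roughly the same thing can be done from the command line. This is only a sketch, assuming ImageMagick 7 and that the two downloaded files (names made up here) have the same dimensions:

    magick compare -metric AE portrait-q90.jpg portrait-q90.webp diff.png
    # stretch the levels so the faint banding pattern becomes easy to spot
    magick diff.png -auto-level diff-boosted.png

The boosted difference image should make the stepped gradient in the background stand out even on a mediocre screen.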


The article is about the background banding.


Laptop and desktop monitors have been advancing just fine over in the Apple world with high ppi, brightness and color accuracy being standard for nearly a decade... it's just expensive and so one of the first corners cut for PC as most folks simply don't care.


I see the rings easily on my few-years-old AOC 1440p monitor. PC users can have way better monitors, whether for studio colour accuracy or high-refresh-rate gaming.


I could see them, but only after turning my brightness up close to the max. I usually have it very low.


> I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)

Wait... I agree for JPG but if you use lossless WEBP instead of PNG, isn't it simply the same pixels, just with a file about 30% smaller than the corresponding PNG file? (and 15% smaller compared to already heavily optimized PNG files like when using zopfli/optipng/etc.).

Isn't the "lossless" in "lossless WEBP" actually lossless when converting a PNG file to WEBP?

FWIW, when you losslessly convert a PNG to WEBP, then decompress the WEBP back to a PNG file, then convert that PNG back to a WEBP file again, you get the exact same lossless WEBP file. It's also the same WEBP you get when you encode losslessly from either the original PNG or that same PNG "crushed" with a PNG optimizer.
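
If you want to check this yourself, here is a minimal sketch with the reference cwebp/dwebp tools (filenames are placeholders; whether the second encode comes out byte-identical will depend on using the same encoder version and settings):

    cwebp -lossless input.png -o lossless.webp       # PNG -> lossless WebP
    dwebp lossless.webp -o roundtrip.png             # WebP -> PNG again
    cwebp -lossless roundtrip.png -o lossless2.webp  # and back to WebP

    # the decoded pixels should match the original exactly
    magick compare -metric AE input.png roundtrip.png null:
    # and the two WebP files should be identical byte for byte
    cmp lossless.webp lossless2.webp && echo identical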


Yeah but I just don't fw webp and other weird formats. JPEG and PNG are tried and true, also it's nice how the extension indicates lossiness.

On the technical side, webp support still isn't like png. Tried dragging a webp into Google Slides just now, got "unsupported image type," which is ironic. I'll try again in like 10 years.


> On the technical side, webp support still isn't like png.

Oh that's a good point.

I see lossless WEBP mostly as a way to save bandwidth where PNG would have been used. If you've got a pipeline where, anyway, you already "crush" your PNG file, you may as well also generate a lossless WEBP file and serve that: all browsers support it. And you can fall back on the optimized PNG should the browser not support WEBP.

I mean: I use WEBP, but only lossless WEBP, as a replacement for PNG when I'd serve PNG files to browsers.

But for that one use case, showing a PNG file in a webpage, I don't see that many downsides to lossless WEBP. It saves bandwidth.
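
As a sketch of that pipeline (tool choice and paths are just examples, not a recommendation of specific optimizers):

    optipng -o7 hero.png                   # crush the PNG as usual
    cwebp -lossless hero.png -o hero.webp  # emit a lossless WebP alongside it

Then serve the .webp to browsers that accept it and fall back to the crushed PNG for the rest.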


At this point in my life, I just don't have time. I basically use either mp4 or PNG for all web "images/animation" when doing web pages. I don't detect browsers or the like. Unless there is some revolutionary new image/video tech, I'll stick with them for the foreseeable future. I only bother with JPEG when it's straight from the phone/camera and I don't want any reduction in quality from the original high rez.


Only if you can accurately detect browser support and serve the PNG instead, which means added complexity. And you have to store both.

Also, if users download your images and use them elsewhere, webp will still be more annoying for them. Though it's not very common that you want them doing that anyway.


https://caniuse.com/webp

Any updated (modern) browser should be able to see webp just fine, I'd rather just serve it without a backup plan if I'm planning to have webp in my website.


The browser support for webp is fine, problem is everything else. If you only care about displaying the images (not letting people use them elsewhere), you only use lossless webp, and all your backend infra supports it, then sure.


I'm on a 27" 4K IPS screen here and have to squint/zoom in to see the difference the author is writing about. While it's nice some people really care for the best result I think most people aren't going to notice or care about it.


I guess it's also true that HN is definitely the wrong audience for this post. As the author suggests, if you spend all day in VScode/VIM, you're among the segment of computer users who look at images the least as a percentage of time spent on a computer.


Yes, but at least there are a decent amount of font 'connoisseurs' here ;)


It's like the audiophile equivalent of using $500 speaker wire. Nobody normal really cares about the difference, if there's really any difference at all.


I caught it on my Android 12 without full-screening. He's talking about the background, not the foreground. The background's color noticeably changes from shot to shot around the edges.


I have to zoom in to really notice that. But both the jpg and webp have distortion - webp slightly more. Both have difficulty with edges.


I think we're talking about two different things. You're missing the forest for the trees. I'm talking about big, huge macro effects that become more apparent when you zoom out, not less.

There is a difference in the gradients of color. One has the guy looking backlit and one doesn't.


At default zoom the image is 20% of the width of my monitor so it's hard to see artefacts. When zoomed in the posterization is noticeable but jpeg at 85% is about as bad as webp. I don't see any substantial difference in lighting.


>because desktop monitors have been stuck in a deadzone of zero innovation for the last 10 years.

That's a weird thing to say unless the pixel density is your one and only measure. Regardless of that, the posterization should be perfectly visible on a 2012 FullHD monitor, or even a 1366x768 TN screen of a decade-old laptop. Most commenters here are probably viewing the pictures on a scale different from 1:1.


> That's a weird thing to say unless the pixel density is your one and only measure.

Is it though? We now have OLED TVs and OLED smartphones.

Where's our OLED PC monitors?

On every measure, if you care about colors/contrast/black+white levels/resolution/density, the average computer monitor has fallen far behind.

You can't even buy a smartphone that has a panel half as bad as most PC monitors on the market. And, at least in my area, you'd actually have to go to a lot of effort to find a non-4k TV.


> Where's our OLED PC monitors?

https://computers.scorptec.com.au/computer/Oled-Monitor

They've been around for years.

PC monitors have been improving constantly with high refresh rates, local dimming HDR + 10 bit color, adaptive sync, OLED and more.


Only on the unusual high-end gaming monitors.


OLED is overwhelmingly reserved to high-end TVs and phones as well, so I think that point is moot.


My base iPhone 12 mini from years ago has OLED, so do a lot of cheaper Android phones. Gaming displays are far less common than these.


Phones have a smaller display, which makes them easier to manufacture.


Yeah, that would also explain why the iPads don't have OLED yet.


> Where's our OLED PC monitors?

https://www.displayninja.com/oled-monitor-list/

Mainly targeted towards the gaming market at the moment.


Some of those prices are insane. Why are they so much more expensive than OLED TVs of similar size? Frame rate?


I dunno about TVs much since I don't use them, but I have some ideas why it might be:

- Framerate - Response time - Adaptive sync - Burn-in (how prone to burn-in is OLED? Monitors often show way more static images than TVs do)

I assume combining all of these might just make it more expensive than each feature individually.


> - Framerate - Response time - Adaptive sync - Burn-in (how prone to burn-in is OLED? Monitors often show way more static images than TVs do)

The much more complicated electronics, plus supply and demand. Demand for TVs should be way higher than for high-end monitors.


Not true. Monitors now are 1440p or 4k. Even at work for me.

The "issue" is that monitors last a LONG time. And that's good. We don't touch them or fiddle with them. They tend to just work. Phones and shit we keep dropping and breaking, then the battery gets bad.

Also for gaming you may even want 1080p 200hz monitor for high refresh rate and FPS over pixel density.


You also can't write software bad enough that you're forced to upgrade your monitor due to poor performance.


You almost can. The Windows Terminal app has a performance issue on gsync monitors. I think it's being treated like a game but the app only renders at 60 fps or something, maybe lower, which I guess forces the whole screen to refresh at that rate which causes mouse stutter


> They tend to just work

They really don't...


> I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.

I just looked at the first two images of the post.

First on two mid end LCDs: one ASUS IPS from this year and one BenQ TN from 2012, both 24" 1920x1080 (~91 DPI). The difference between the images is clear on both.

And before posting, to make sure, I pulled out a 15" 1024x768 (~85 DPI: basically the same) NEC TN LCD from 2002. And a NEC CRT roughly 15" viewable 1024x768 from 1998. Both on VGA connectors (so there is the typical noise from that, which still doesn't cover up the posterization). The difference between the images is clear on both.

All monitors viewed from 3' away.

People are simply accustomed to poor image quality, including posterization. AAA FPS video games display it on static art backgrounds in the loading menu, and I can never tell if it's intentional. Show them a 240Hz monitor with 30ms input lag, 5 frames of overshoot artifacts, and viewing angles worse than 1998, and they'll be wowed.


It’s quite noticeable on a 2011 MacBook Air, too. The issue is less pronounced if you don’t have a decent display but it’s more that people are not used to it. Like bad kerning, it’s something you’ll notice everywhere if you train your eye to look for it, but otherwise probably don’t notice except that some things feel less appealing.


Also, only a tiny fraction of PC monitors have color gamuts wider than sRGB, proper HDR support, or any kind of calibration.

Recently I’ve been dabbling in HDR video, but I realised that the exercise is futile because I can’t send the results to anyone — unless they’re using an Apple device.


I see the rings easily on my few-years-old AOC 1440p monitor.


Pixel density isn't the issue. 2K-4K computer monitors are pretty common. But they tend to suck in other ways compared to a MacBook screen. And yes I can tell the difference between the images on my MBP.


I opened the first two pictures in separate tabs and switched quickly between them. There is zero difference. Tried it on two different monitors, Chrome and Firefox. Same with the pictures of the guy at the end.

EDIT: The last comparison is webp twice, he linked it wrong. Here is the jpg one, still no difference:

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...


I checked those images on a Macbook 16 M2 Max (standard P3-1600 nits preset), Chrome 120.0.6099.109. All of the WebP images had pretty bad posterization, while JPEG examples did not.

Edit: You have to actually click for a full size image to see the truth. Those inline images had pretty bad compression artefacts, even the supposed lossless versions.

So https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (full size lossless WebP image) looks fine, but inline version of the same image https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... looks terrible.

Edit 2: The difference between...

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... lossy-noise.jpg (216 kB JPEG)

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (150 kB WebP)

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20... (301 kB WebP)

... is pretty obvious. Both of the WebP examples, even that 301 kB version, show clearly visible posterization.

I wonder if there's some issue with the WebP encoder (or the settings) he is using?

Edit 3:

It should be noted that monitor gamma and color profile might affect gradient posterization visibility.


> I wonder if there's some issue with the WebP encoder (or the settings) he is using?

I played around with online optimizers and IrfanView, which I had locally. IrfanView got the same results the author did, no matter what else I tuned: obvious degradation at 90. The online optimizers were not even comparable in how bad they were.

edit: I found Squoosh [0], which has WebP V2 compression marked as unstable. It's far better, half the size of JPEG 90, but it's still degraded in comparison. Also, it saves as a wp2 file, which neither Chrome nor FF supports natively.

[0]: https://squoosh.app/editor


They ceased development on WebP2... I don't think they could've come up with anything better than AVIF or JXL already have anyway.


The first link in your Edit 2 section (the JPEG) one is broken, it should be https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...


Thanks! Unfortunately I can't change it anymore.


> I wonder if there's some issue with the WebP encoder (or the settings) he is using?

He's re-encoding the JPEG compressed images. That is a huge mistake.


From the article:

> It’s not 100 % clean either, but much better. Granted, this is WebP re-encoding of an already lossy compressed JPEG, so we stack 2 steps of destructive compression. But this is what Google Page Speed insights encourage you to do and what a shitload of plugins enable you to do, while pretending it’s completely safe. It’s not.
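
To make the two paths concrete, here is a hedged sketch (filenames and quality settings are illustrative only):

    # encode once, from the master
    cwebp -q 90 master.png -o direct.webp

    # what the PageSpeed-style plugins do: re-encode an already lossy JPEG
    magick master.png -quality 90 master-q90.jpg
    cwebp -q 90 master-q90.jpg -o reencoded.webp   # two generations of lossy artifacts stack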


Addendum:

Tried it with a Windows laptop connected to a Samsung LS32A800 32" 4k display. Laptop has factory default settings. Chrome 120. The monitor is pretty low end for a 4k model.

Monitor's picture settings: Custom, brightness 81, contrast 75, sharpness 60, gamma mode1 and response time fastest.

Switched between those three "Edit 2" images blindly, yet the issues are obvious also on this combination.

The JPEG version looks better compared to WebP ones. (Also, this goes against my prior general assumptions about JPEG vs WebP quality.)


The second and third images are half the resolution of the first. Yeah, some posterization is visible in Shoot-Antoine-0044-_DSC0085-lossless-1200x675.webp, but it's half resolution, and he purposefully added high-frequency noise for his test, then averaged that noise out through resizing. Well, of course it's blurry.


> I opened the first two pictures in separate tabs and switched quickly between them. There is zero difference. Tried it on two different monitors, Chrome and Firefox. Same with the pictures of the guy at the end.

One easy difference to spot is the background in this pair is posterized (https://en.wikipedia.org/wiki/Posterization) in webp but not in jpg:

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...

https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...


For clarity if anyone is still confused: on Wikipedia's example image, look at the snake's shadow - that's what's happening to the background in the blog's image.

I didn't know the word "posterization", so I'd describe this (slightly?) more simply as a stepped gradient instead of a smooth gradient.


> There is zero difference.

There is a clear difference though, I can see it in all my monitors, from desktop to laptop and even mobile. It's especially visible in the top right quarter.

That being said if you're not into photography you might just not care enough to see it


At 50 y/o my eyesight began to fail and yet the differences in the pictures are freaking obvious. As in: it's impossible to not see how huge the differences are.

And many people commented the same. These simply aren't small differences.

People who cannot see the differences or who only see them after taking a close look should realize something: there are many people for whom the differences are going to be immediately obvious.


> People who cannot see the differences or who only see them after taking a close look should realize something: there are many people for whom the differences are going to be immediately obvious.

That's one possible conclusion. Another is that some people are overstating how obvious it is. I don't mean this as an insult - there's plenty of cases where people's stated perceptions and preferences disappear when tested under strict conditions (hello Audiophiles).

So - it's not immediately obvious whether claims such as yours are trustworthy.

(for the record I can see the difference but it's fairly subtle on my screen)


It's definitely an objective phenomenon but there's two factors at play: first is the monitor quality. I have two monitors of the same model number but made in different years with obviously different panels (color reproduction is all over the place between them), and the banding is obvious in one monitor but not the other. I can drag the window between screens and it disappears. On my iPhone, it's very obvious.

Second is how much each person's brain interpolates. I got used to those visual artifacts on the web in the early 90s so my brain started doing its own interpolation. It took reading the entire article and flipping tabs back and forth to compare images before I noticed the difference. Now I can't unsee it in other images that I recently converted to webp for a project.


The first picture is very hard to spot imo. I had to zoom in a bit to spot it initially. You'll see the "blockiness" is slightly worse in the webp version. (Left side of the image, head height)

For the second image, I opened the jpeg 90 [1] and webp 90 [2] versions. Comparing those two, there are clear banding issues to the right of the neck. Slightly less visible are the darker bands circling around the whole image, though still noticeable if you know where to look.

Comparing the jpeg 90 version with either webp lossless, jpeg 100 or jpeg 95, I can spot some very slight banding in the jpeg 90 version just to the right of the neck. Very difficult to spot though without zooming in.

[1] https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...

[2] https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...


I don't see any difference either on Windows on either of my monitors.

I wonder if the author's issue is due to the author using a Mac. Back when I was at Google working on VR images, my work machine was a Macbook and my home machine was a normal Windows desktop. I realized that images looked worse on my laptop's screen because the native resolution of the display hardware was something like 4000 (numbers made up because I don't remember the specs) but the display was set to 3000. So OSX would incorrectly rescale the image using the wrong gamma curves. Since I was trying to calibrate VR headsets, I spent way too much time looking at gamma test images like https://www.epaperpress.com/monitorcal/gamma.html where a high res pure black + pure white grid is shown next to a set of grays. That was how I realized that my Mac was incorrectly resizing the graphics without being properly gamma aware. I also realized that if I set the OS resolution to 2000, it would use nearest neighbor instead of bilinear filtering and the gamma issue would go away. My Windows desktop had the OS running at the native resolution of the display so this wasn't an issue there. This also wasn't an issue if I had an external monitor hooked up to the Mac and set to its native resolution.

Apple users tend to say "it just works" which is true 90% of the time. But there are cases like this where it doesn't "just work" and there was no easy way to force the OS to run at its native resolution on that specific laptop.

Edit: I tested with the second set of images (the upper body shot) and the problems with the gradient are visible there. But I still can't see a difference when quickly flipping through the first pair of images on my properly calibrated native-resolution monitor. I _can_ see some banding on one of my monitors that was intentionally miscalibrated so that I could read text better.
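
For what it's worth, gamma-aware scaling can be approximated with ImageMagick by resizing in linear light; this is only a sketch with a placeholder filename, not what macOS actually does internally:

    # naive resize: operates directly on sRGB-encoded values
    magick gamma-test.png -resize 50% naive.png

    # gamma-aware resize: go to linear light, resize, convert back to sRGB
    magick gamma-test.png -colorspace RGB -resize 50% -colorspace sRGB linear.png

On a gamma test grid like the one linked above, the two outputs should differ visibly in brightness.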


It could also be a browser issue implementing webp. There's a decade-old bug in Chrome, where they're using the wrong color profile for CSS, so colors are brighter than in other browsers. It's extreme enough that one of the designers I worked with spotted it in passing just glancing at my Firefox window, which led down a rabbit hole finding the bug report.

https://bugs.chromium.org/p/chromium/issues/detail?id=44872

Total aside, y'know how people do things like make their smartphones greyscale (or at least mute the colors a bit) to reduce smartphone addiction? It wouldn't surprise me if these over-saturated colors were part of why Chrome got so popular so fast...


> I wonder if the author's issue is due to the author using a Mac.

It is not, since I tested positive on Linux. What post processing would any OS even do on an image when you view it in a new tab as one is meant to do for this tutorial?


I did the same, and it took me a long time to spot it, but in the upper-right corner you see circles in the WebP version. It's outside the centre of attention, so it's not that obvious. Actually, it wasn't until I saw the second picture and knew what to look for that I spotted this in the first picture.

It's not so easy to see if the browser zooms the image, so make sure to open the image and set zoom to 100%. I also need to keep my face fairly close to my screen (12" 1920×1080, so not that large).


I always zoom in on pictures on the web to see if the compression is good or if there are artifacts.


I agree, it's not a good example to lead with.

That said, in the context of showing off your photography I can understand considering these kind of artifacts undesirable, even though they're perfectly fine for a lot of other uses. On my own website I spent quite some time downgrading my mugshot to be as small as possible without too many artifacts – it's now 4.9K in WebP, vs. 9.2K in JPEG before. Maybe that was a tad obsessive though...

I do think the author doesn't quite appreciate that most people are not photographers, and that for most images quality doesn't actually matter all that much.


Here is the diff: https://imgur.com/a/QT8oNqj

>> To the non-educated eye, this might look ok, but for a photographer it’s not, and for several reasons.

webp is a banding nightmare.


I can readily tell the difference on the guy's forehead. The webp version has less dynamic range and looks like a big white spot, while the jpeg has more shades.


The same image rendered with different os/hardware will almost always look different.

Different operating systems and monitors have different default gamma curves for rendering brightness and black levels. Monitors are most likely either uncalibrated, or _can't be calibrated_ to render a greyscale with just 64 brightness levels distinctly.

TFA is calling attention to "posterization" in their portrait backgrounds. They expected the grey background to have a smooth gradient, but, depending on your monitor, you should see visual jagged stair-steps between different grey levels.

When an image uses a color palette that's insufficiently variable to render the original image colors with high fidelity, that's "posterization."

(I paid for my college doing high-end prepress and digital image services, and got to work with a ton of really talented photographers who helped me see what they were seeing)


The gradients in the webp look clearly terrible to me. I'm using a normal 1440p monitor, nothing fancy


I thought it was pretty clear. I'm not even running any special monitor/computer setup. The light behind her is clearly different; it almost looks like a photo with different lighting.

4k Dell monitor, Safari on a Mac.


If I view the full images of the first two in two Chrome tabs, two Firefox tabs, or download them and open then both in Preview on a 27" 5k iMac and flip back and forth between the two I see nothing changing.

There is definitely something changing though, because if I open each in Preview, switch Preview to full screen, set the view to be actual size, and take a full screen screenshot, the screenshot for the WebP image is 14% smaller than the one for the JPEG.

If I use screen zoom to go way in and then flip between the two images I can finally see some changes. The JPEG background has more small scale variation in shade. In the hair there are some white streaks that aren't quite as long in the WebP. Lots of small changes in the shirt, but it is about 50/50 whether or not any given difference there looks better in the JPEG or the WebP.


This whole thread feels like one of those "I can tell the difference between an MP3 encoded at 320 kbit/s and one encoded at 256 kbit/s!" audiophile threads. Yes, there are probably people out there with well-calibrated ears who can, but I am sure not one of them. FWIW I have a 27" 5k iMac and can't even remotely see any difference between the images.


Lots of replies here saying either: "I can't see the difference" or "Wow the difference is stark".

My takeaway as a non-photographer is: "different tools for different uses". If you're posting photography where image quality matters then use JPEG or another format that you think displays the image best. If you're writing a blog post with screenshots or other images where minute quality doesn't matter that much then use WebP.


No, in both cases, use something that is better than JPEG and Webp: JPEG XL.


JPEG XL is great except is has virtually no browser support[1]

[1]: https://caniuse.com/jpegxl


JPEG XL is clearly superior in almost all contexts, but Google killed it, and now Apple is trying to support it. Unless Google reverses its stance, though, it will stay dead.


The thing that I like the best about jxl is how consistent the reference encoder is. If I need to compress an entire directory of images, cjxl -d 1.0 will generate good looking images at a pretty darn small size.

Using mozjpeg (JPEG), openjpeg (JPEG 2000), or cwebp, if I want to get even close (in bpp) to what cjxl does at its default, I have to use different settings for b&w vs color and line-art vs photos.
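
A minimal sketch of that batch use (paths are placeholders):

    for f in photos/*.png; do
      # -d 1.0 targets a butteraugli distance of 1.0, roughly "visually lossless"
      cjxl -d 1.0 "$f" "${f%.png}.jxl"
    done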


The last time I checked, it was not possible to re-encode a JXL image into a JPEG image. Is this now supported?


It's possible to encode any image format to any other; I'm not sure what that has to do with my comment though.


There's a clear difference between the JPEG and WEBP versions. Especially on the background on the right of the man.

There are clear bands of various shades of grey that circle out of the brighter areas behind the face and from the mid-right edge. They appear to join about two thirds from the middle to the right edge. That artifacting is most notable at full size, but is still visible on the smaller size on the web page.


You either have a bad screen or limited eyesight, it's quite funny to me that this is the most upvoted comment.

There's definitely very ugly "banding" going on in the gradients on the WebP versions, I say as someone who's worked extensively with UX and interfaces.

I'm on a M2 Macbook Air.


I'm looking at an LG UltraFine, which as far as I know, is not a bad screen, but I can't really tell.

I've read all the comments, and zoomed way in. I can see it on one of the pairs if I pay attention, but on most of them, I still am not sure how to even look for the difference.

Last time I had a vision check, I got a 20/15, which is supposed to be better than "normal". It may have declined since then.

I don't think it's a monitor or eyesight thing. I think I don't know "how" to look for the effect I'm supposed to be seeing.


I can see a difference in the gradients, but in practical use on the average website does that even matter?

Photography portfolios are the one use case where having gigantic JPEG 90 images might make sense I suppose. Although everyone is going to get annoyed at your loading times.


It's because the author is linking to the wrong images.

See my post lower in this thread.

https://news.ycombinator.com/item?id=38656046


It's your screen. Maybe we found the ultimate image compression method here- we all just need to use the same screen as you.


He also screwed up the 4th and 5th image - one of the ones labeled "85% jpeg lossy" links to the webp.


The author is complaining about the consequences of recompressing images, which are also black and white and have a huge gradient background, and the post is full of flaws. I don't know, Hacker News is better with less of these Hacker Rants.


> which are also black and white and have a huge gradient background

That's the entire point of this article. Rather than picking a dozen different kinds of images at random, it considers the problem within the very specific context of actual photographs, made by actual professional photographers, with specific (yet not uncommon) artistic/stylistic choices.

It's like showing why an audio codec sucks for cellos. Yes, there is going to be a hundred other things you may want to record (like a podcast, a rock band, etc), and most of them will not be cellos, but still that doesn't change the fact that the codec sucks for cellos.


The author just makes a ton of mistakes. Many photographers competently shoot and store RAW, and many know better than to mass convert low quality JPEGs to WebP. It’s HIS work, he can choose to make as few or as many mistakes with presenting it as possible. So I don’t think he’s representative of most photographers. It’s a technical discipline.

I guess the more technically interesting POV would be to suggest a solution. Probably he should use the black and white profile with HEIF and serve the WebP only to search engines, using the modern image tag.

Or, you could put Y information in the unused UV plane for WebP. I guess you could also decompress the original JPEGs better for the purpose of conversion. While not for him, it takes about 100 lines of JavaScript to author a Mobile Safari-compatible image bitstream, which is very little. The MediaCodecs API is great.

Anyway, the rant elevated my knowledge very little. It was more like anti knowledge. Like if you were to integrate the rant into an LLM, it would produce worse recommendations.


> [...] many [photographers] know better than to mass convert low quality JPEGs to WebP.

Correct, but this is the workflow that the engineers behind WebP recommend, so I think it's entirely fair to pick on it.

> Anyway, the rant elevated my knowledge very little. It was more like anti knowledge.

Then perhaps you weren't the target audience. I'm not a photographer, and the rant has offered me a little bit more perspective.


It could be partially placebo effect. It's not like he is doing a blinded test.


It's not, it's just that people who spend thousands of dollars and hours into photography are more susceptible to care. Same with music, most people are fine with $15 earphones while musicians or music enthusiasts will find them disgusting.


Music is probably a bad example of your point, as that field is famous for audiophiles insisting they can hear a difference for various things only for them not being able to tell the difference in a double blind test.


Just because there are some 'extreme' weirdos in the audiophile space, doesn't mean that there is no difference between cheap and expensive equipment.

While people might not be able to tell the difference between $50 and $5000 speaker cables, anybody will be able to hear the difference between $50 and $5000 speakers.


It's more like 64 kbps vs 128 kbps than copper vs gold cables, if you want to keep the analogy.


In my opinion the worst and most distinguishable downside of webp is the forced 4:2:0 chroma subsampling. On many images with bright colors you can clearly see the color and brightness loss without an educated eye.

On comparison [1] you can clearly see that the top right balloon has lost its vibrant red color. On comparison [2] the bright blue neon art on the center has lost its brightness.

[1] https://storage.googleapis.com/demos.webmproject.org/webp/cm...

[2] https://storage.googleapis.com/demos.webmproject.org/webp/cm...
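
For reference, lossy cwebp always produces 4:2:0 output; the -sharp_yuv flag improves the RGB-to-YUV conversion but can't bring back full chroma resolution. A rough sketch with placeholder filenames:

    cwebp -q 90 balloons.png -o plain.webp
    cwebp -q 90 -sharp_yuv balloons.png -o sharp.webp   # less color smearing, still 4:2:0

    # JPEG, by contrast, can keep full chroma resolution
    magick balloons.png -quality 90 -sampling-factor 4:4:4 full-chroma.jpg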


Not to stir yet another debate, but yeah, I'm definitely not able to perceive the difference in either of the examples you linked. It would be helpful if that site let you drag the vertical comparison bar at least. On an iPhone 14 display.


I can see it in the second link, setting webp to small, in the orange reflections above the rightmost outside needle tree. ... oh, you can't drag it? ...


Thank you for that link - it is detectable but in my eyes negligible for website use. What about saturation?

I have to ask, what could be the reason this gives me pale blue (other colors are okayish) when converting jpg > webp:

cwebp -pass 10 -m 6 -nostrong -sharp_yuv -quiet -q 60 -sharpness 2 $1 -o


This article didn't go into the biggest problem with webp for me: the inconvenience of the format outside the browser compared to the small space saving. There are better formats (the video-codec-inspired ones like heif, avif, and whatever might come out of h266, or even jpeg-xl), and webp just seems like a compromise without enough upside.


I feel your pain. Right-click, save as, and ... awww-goddamn it, another WebP >:|


My favorite is the URL ends with jpg but when you save the image you get a fucking WebP. Thanks everyone for breaking the Internet in the name of Google. The best.


I always screenshot them lol


WebP is actually based on a video codec. It's just that VP8 pretty much never caught on with hardware encoders/decoders apparently.


VP8 was never competitive so most of the energy went into VP9, which did beat H264.


It beat H.264 in terms of quality/size but not in terms of hardware support. This is why Google Meet is the laggiest video conference software, they keep trying to make VP9 a thing while the others stuck with H.264. And now there's H.265.


Google marketed it that way but I could never reproduce a meaningful size savings without noticeable quality loss. You need to serve a LOT of video before even the top-end 10% savings was worth it, especially if your traffic was spread across many items so doubling your storage cost cancelled out a fair chunk of the total. I have no doubt that YouTube saw a savings but I don’t know how many other sites did, and I would be curious what the savings was relative to the extra power used by the millions of client devices which could’ve streamed H.264 at 10% CPU versus having the fan on high.
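
If anyone wants to repeat that kind of test, here's a rough ffmpeg sketch (CRF values are illustrative and not directly equivalent across codecs, which is exactly why a subjective check matters):

    ffmpeg -i source.mov -c:v libx264 -crf 21 -preset slow h264.mp4
    ffmpeg -i source.mov -c:v libvpx-vp9 -crf 31 -b:v 0 -row-mt 1 vp9.webm
    # then compare file sizes and, more importantly, eyeball both at 100%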


If users don't have hardware accelerated video decoding, it's so bad that it actually hurts the experience. I can't imagine that being worth the space savings. There doesn't have to be a good reason YouTube does it, it might just be someone wanting to insert their tech, which I'm pretty sure is the reason Meet uses it.


I remember doing bluray re-encodes back in that day. x264 was simply better as an encoder when compared to vp8 and you knew that at least in terms of software everyone had a compatible decoder in their preferred codec-pack.


Oh yes, with uh websites where you download said re-encodes, there'd always be a few uploads with weird encoding and the author screaming in the comments that it's better and you gotta use the bleeding edge VLC before complaining that it doesn't work.


Even worse than the original blog post: because of this you may be dealing with a JPEG image, converted to WEBP, and then back to JPEG. And then maybe someone edited that JPEG and it got converted back to WEBP!

A large chunk of the HN commenters are debating over banding they can or can't see in a best-case-scenario WEBP image. The reality is the bulk of WEBP images look horrible, something I've started to really notice only recently. Of course, you can "clean" the images by using different generative upscaling processes now, which makes it pretty ironic how much electricity we are using because someone wanted to save 45 kB.

Also this reminds me a lot about GIFs being converted to JPEGs. 25~ years ago there was a lot of nice, clean GIF screenshots (256 colors was all you needed) that got destroyed by JPEG.

Google tells developers to use WEBP but has no problem serving petabytes of video ads no one wants to watch!


Now let's talk about HEIF, an inconvenience inside and outside of the browser on desktop.


> To the non-educated eye, this might look ok, but for a photographer it’s not, and for several reasons.

There surely must be better examples to show "non-educated" plebs (to use the tone of the post) why webp is bad and to justify the post and the tone.

I'm on Android, maybe this is why all pic quality look the same?

Also - yeah, if you are making pics for educated eyes: don't use tech that is not suitable for educated eyes? Or don't outsource that decision making to others?


The author's point is that if you are making this tech, you should have educated eyes.

And given all the confident comments in this thread claiming the author is full of shit and there's no difference, I think their frustration is justified? If you can't see the difference in the first images that's fine but you probably shouldn't be confidently claiming to know better than the author, let alone designing an image codec.


There's room for different opinions.

His font choice is terrible for my legibility. Maybe for others it's great. But it made the already difficult article that much harder to read. And I like this topic. I already seriously question his sense of what is reasonable and good and for what purpose. His purposes are so alien to mine that his opinion ends up being pretty irrelevant to mine. I wish him well with his.

I can't see the things he's pointing out in the images, and I tried and tried.

I use webp extensively, there have been zero complaints from users about the images. But I don't make art sites. I make software people use to get stuff done. I don't transfer images above maybe 50-80k. Art, aside from modest marketing, is most definitely not the point.


If you tried and couldn't see, it might be like others say that it's more visible on certain monitors and setups. But then, again - if you are designing codecs or choosing them, you probably want a monitor that makes it easy to see these things. I can see them on my old iPhone screen.

It reminds me of how sometimes you see a huge billboard with hideously strong, 10-foot-wide JPEG compression artifacts. It was someone's job to make those, too.


> But then, again - if you are designing codecs or choosing them, you probably want a monitor that makes it easy to see these things

You keep bringing this up. I don't really care. Someone designing a codec may have put this apparent problem case on the don't-care list as well. I would be in general agreement with the designer's priorities for a reasonable web codec.

I have, with some care, selected webp as a general codec for web use on most of my sites. Nobody is complaining, and my page weights and development speed are improved. I don't have to fret between png+transparency and jpg to minimize asset size while maintaining its usability. I just use webp, and most of the time it's a size/speed win with good enough quality.

Not every codec needs to be artist and photographer approved.


> His font choice is terrible for my legibility.

There may be a connection [1].

If we assume some of the people designing codecs, that he curses in this piece, end up reading it, he may simply have wanted to make sure they do remember. ;)

[1] https://hbr.org/2012/03/hard-to-read-fonts-promote-better-re...


The author's point is deeply stupid. As he admits:

> WebP re-encoding of an already lossy compressed JPEG

So... all this shows nothing. Is webp worse than jpeg? Not addressed. He re-encoded jpeg to webp and it somehow didn't magically cure the compression artifacts he's seeing! Who coulda thunk!

Any comparison starts with taking the originals, encoding to jpeg and webp, and comparing that. Or he could repeatedly encode original -> jpeg -> jpeg and compare that to what he has, which is original -> jpeg -> webp


Most of the comparisons are encoded from source. The one that isn't is because re-encoding is a specific recommendation from the services that they are criticising. They are specifically showing that yes, that's a bad idea.


Still, the author could do more to highlight the differences using zooms and annotations. The banding in the background is particularly strong and would help their point to highlight visually to the reader.


I too am on Android.

I was able to see it without full screening.

Look at the man with his face screwed up. Look at the edges of his shirt near his shoulders.

In the pictures that had bad image quality, there is a sort of glow around his shoulders, as if they are backlit.

In the pictures that had good image quality, the gradient was smooth. There was no backlit glow around his shoulders; it just looked like a smooth gradient background image.

To be clear, I'm not a photographer. I'm a DevOps engineer. The last time I professionally wrote a line of JavaScript was at least 11 years ago.

It's easy enough to see.


See the discussion here [1], you need to view it full size to be able to tell.

[1] https://news.ycombinator.com/item?id=38653224


…so essentially WebP is fine for mobile devices and the vast majority of desktop web cases. I’m fine with WebP not being a suitable format for permanent storage of photography.


A close up section of the same zone in the images would make them visible. I could hardly see the artefacts in the first place as my attention was caught with the highly contrasted parts of the images.


No, I can see it on Android without zooming in. Not well for sure, but it is there towards the corners.


For starters, anyone that has ever worked with a codec will know that you don't compare them with ONE SINGLE IMAGE.

The whole basic idea of the blog post is just to generate more whining and clicks, not to actually make a comparison between formats that's worth a basic smell test.


This cuts against WebP more: all of Google’s marketing was “it’s a third smaller!!!!” and then when you looked, they were comparing it to unoptimized libjpeg output and using computational metrics like SSIM which only crudely approximate what humans notice about image quality.

I did the same comparison the author did when WebP came out but used an optimized JPEG encoder and found the same conclusion: when you produced subjectively equivalent images, the savings were more like -10% to +15% and for web sites which didn’t get Google-scale traffic the performance impact was negative since it made caching less effective and you had to support an entire new toolchain.


In what way does anything "cut" against anything when you do a cherry-picked, single-data-point comparison?

There isn't a codec pair in this world where you can't make a cherry picked comparison where one of them is worse (I've done plenty of those).


Criticism of cherry-picking cuts against WebP because the marketing campaign for that codec relied on cherry-picking both the least optimized JPEG codec and the most favorable metrics for comparison. If you had humans comparing images or enabled JPEG optimization you saw far less exciting numbers for WebP - usually under 10% savings, not uncommonly negative – and there were other formats which consistently outperformed it. You can see the mood around that time here:

https://calendar.perfplanet.com/2014/mozjpeg-3-0/

Even a decade later, however, Google repeats the 25-34% claim and their performance tools tell developers they should use a modern format, which by sheer coincidence means the one they invented rather than the best ones on the market.


Except the problem isn't in a single image; it is a pattern that is frequently there, and the image was only used to demonstrate it. WebP has had this problem going way back, and it is one of the reasons others besides Google were hesitant to support it.


It is basically the same with all On2 Media marketing. From WebP, VP8, VP9 to AV1. And it has been going on for over a decade.


A bit of context: Aurelien Pierre is known to be a major contributor to Darktable (an open source raw developer / catalog; in other words, an open source Adobe Lightroom), and is known to have strong opinions about the correct way to do stuff, to the point of abrasiveness and to the point where he has forked Darktable into his own thing (Ansel; see HN discussion some time ago https://news.ycombinator.com/item?id=38390914 ).


Thanks for the info, going to have to check out Ansel. Do you know if its still compatible with the Darktable formats?


I’m not sure what you mean by formats. It should support all the old raw/jpeg formats, or at minimum it has for me


If I cared about archive image quality (and I do), I wouldn't re-compress older images in a new format unless I could do so from uncompressed originals. Re-encoding from a lossy compressed source will make quality worse. Storage is cheap and getting cheaper.

What would make sense is choosing safe settings for compressing new photos in the new format.


> Re-encoding from a lossy compressed source will make quality worse.

JPEG-XL is supposed to reencode old JPEG files into 20% smaller files without quality loss though. In context, Google has been holding JPEG-XL back by removing support for it from Chrome and refusing to reinstate it, claiming that it did not have good enough "incremental benefits compared to existing formats" such as webp.


Wow, I didn't know that. A top google result says:

> It is possible to losslessly transcode JPEG images into JPEG XL. Transcoding preserves the already-lossy compression data from the original JPEG image without any quality loss caused by re-encoding, while making the file size smaller than the original.

I wonder how it does that and why JPEG didn't notice it could. I would re-encode to JPEG-XL, when supported. So then the situation isn't that WebP is so great but rather Chrome's not so great.


> I wonder how it does that

It's trivial to do: JPEG's last stage is compression via Huffman coding - which is a really ancient, not particularly effective compressor. You simply decompress that stage and compress with something more modern, yielding better savings. Stuffit did it in 2005. PackJPG in 2006. Brunsli (a Google project!) in 2019 - and it was one of the inputs to the JXL draft. Lepton did it in 2016.

> and why JPEG didn't notice it could.

Oh that's the best part - they did, all the way back in 1991. The JPEG standard allows you to choose for the last stage between Huffman and arithmetic coding - which is way more effective. Unfortunately it was patent-encumbered and its support is low. It yielded a 10%ish space saving, which wasn't worth the compatibility headache (it has the same extension and mime-type as a Huffman-encoded JPEG, so a webserver won't know if your browser supports it). If it had only used a different file extension it would probably be the dominant format today.
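
You can still try that ~10% saving today with jpegtran, if your libjpeg build includes arithmetic coding support (filename is a placeholder); most browsers won't decode the result, which is exactly the compatibility problem:

    jpegtran -arithmetic -copy all photo.jpg > photo-arith.jpg
    ls -l photo.jpg photo-arith.jpg   # typically ~10% smaller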


Careful with the JPEG-XL re-compression, though--depending on how you're re-encoding, jxl may use SSIM to evaluate for visual losslessness, and the whole point of TFA is that SSIM is blind to posterization, but (some) humans aren't.

Disk space is cheap. It's most likely not worth the 20% compression to lose your original images (and possibly lose metadata as well--it's quite hard to robustly retain all vendor-specific MakerNotes, for example).


JXL has the Guetzli lossless JPEG compressor integrated into the standard, so it produces reversible and completely standard-compliant JXL images that are 15-20% smaller. Reversible in the sense that you can still convert the image back to the original JPEG, as a bit-exact copy of the input JPEG file (it takes care of all the metadata also - it has to).

Also if you decide to forgo the reversibility you can get a bit more out of it as JXL is actually a superset of JPEG, so it can read the JPEG stream and convert it to JXL without complete recompression - it will just use more efficient structure of JXL and much more efficient (ANS vs. Huffman) entropy encoding. The additional savings compared to the reversible mode aren't big however.
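
A minimal sketch of that round trip with the reference tools (filenames are placeholders):

    cjxl photo.jpg photo.jxl       # lossless JPEG transcode is the default for JPEG input
    djxl photo.jxl restored.jpg    # reconstructs the original JPEG
    cmp photo.jpg restored.jpg && echo bit-exact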


The lossless thingy is Brunsli. In the last meters of the standardization, Brunsli in JPEG XL was replaced with "Brunsli 2.0", the more natural formalism in JPEG XL format, allowing for a smaller spec and decoder as well as parallel decoding.

Guetzli is a slow high quality jpeg encoder. One can use jpegli for that need nowadays, 1000x faster...


We overprovision low frequencies dramatically to avoid posterisation. JPEG XL development was never driven by SSIM, only butteraugli + human viewing. I reviewed manually every quality affecting change during its research and development.


Okay, but that isn't really the point. You can start from a perfect gradient saved as a PNG and you will still see that WebP has visible banding at -q100 while JPEG is visually transparent at -q90.
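
This is easy to test with a synthetic gradient; a sketch assuming ImageMagick and the reference cwebp (the exact gray values don't matter much):

    magick -size 1920x1080 gradient:gray20-gray70 gradient.png
    cwebp -q 100 gradient.png -o gradient-q100.webp
    magick gradient.png -quality 90 gradient-q90.jpg
    # view all three at 100% zoom and look for stepped bands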


I think the author is focusing on the wrong thing. They focused on the difference in format, when they should have focused on the compression. Different image processing programs will have different compression even when set to the same number (eg "80").

I think for a truly meaningful comparison you'd need to test a variety of images, including full color with busy backgrounds as well as these b&w studio portraits on a smooth gradient-type bg, and test a variety of programs like ImageMagick, GraphicsMagick, sharp, Photoshop, whatever cloud offerings, etc.

The other issue I see is use case. If you're a professional photographer trying to upload full size full quality photos, maybe just don't compress at all so you know your creative / editing work is completely preserved. That use case is not the average use case of a website displaying a reasonably sized image of reasonable quality. For many situations a significantly smaller image might be worth having a more compressed image, and for many images the compression won't be as noticeable as it is in a full resolution professional studio photo with a large gradient type background.


I clearly have "non-educated eyes" as I can't see any meaningful differences personally.


It depends greatly on your device. On my work windows machine I can see a bit of banding. On my phone, it's worse. On my macbook, it's atrocious.


Like most folks you were probably simply looking at the foreground. The background around the edges of the shirt and the edges of the picture (depending on the image) noticeably change color from shot to shot without full screening it on my small Android 12 device.

It's artifacts made in the background of the image that this poster is complaining about.


My sight's both poor and uneducated, but looking again after the defects are pointed out, they're pretty stark.


Good for you. Once you noticed the banding issue, you're cursed to see it everywhere.


Very interesting, I could clearly see the difference - even before reading. And I'm using a 9-year-old MacBook Air 11"... not bad, but not exactly high-end stuff.

fascinating how perception is different.


Same here. Especially considering the ones supposedly "look like shit".

The whole thing reads like a not-so-subtle brag about how his mighty photographer's eye can spot details that mere mortals can't.


Your viewing environment will matter a lot. In a dark room with a bright monitor, the banding in the background of the example images is pretty bad (if you are looking for it). But if you have a laptop in a bright sunny room in front of a window causing back lighting, you probably won't be able to see it.


It's there. It's very noticeable once pointed out. It drastically distorts the images' 'softness' because of the harsh steps through the gradients. It does not appear as the artist intended for it to, which is the biggest issue.


The gradients on webp frequently look like video stills. Chroma subsampling reduces the density of available luminance approximations and the more heavily it's applied, the worse gradients look. High contrast high frequency details aren't affected much, but gradients can really suffer.


> Chroma subsampling reduces the density of available luminance approximations

Chroma means color, and color subsampling is used to avoid taking information out of luminance channels because they are more important, so it is actually the opposite of what you are saying here.


https://www.google.com/search?q=gradient+banding+4:2:0

There simply aren't enough bits of precision in the luma encoding for good gradient support most of the time, chroma fills the gaps, and chroma subsampling produces artifacts.

Webp lossy only does 4:2:0

https://groups.google.com/a/webmproject.org/g/webp-discuss/c...

These problems would go away with 10-bit AIUI. AVIF supports 10 bit but WebP does not.


I think you're conflating a few different things. Chroma doesn't fill gaps, low resolution chroma channels introduce artifacts of their own.

Subsampling is about spatial resolution; 10-bit color channels are about the quantization resolution of the values. Everything contributes to banding artifacts, which are just noticeable steps in values that are meant to be perceptually smooth, but the luminance channel is the most important, which is why it isn't subsampled.

These are fundamentals of image and video compression and not unique to webp.


I was going to say, it's not uncommon to see pretty bad banding in dark gradients with WebM/VP9, so this makes some sense.


Like video, WebP uses limited-range YCbCr, as opposed to JPEG, which uses full-range YCbCr. This leads to grayscale JPEG looking perfect on monitors that use full RGB values, whereas WebP will have slight banding issues when displaying grayscale content.


So... why are we still having problems with banding in image compression? If anything, gradients should be the easiest things to compress in these images, because the compression algorithms work entirely in the frequency domain. Whatever is introducing banding here is adding more frequency coefficients and making the image bigger and worse at the same time.

Did Google/On2 just not notice that they were crushing every gradient they encode, or are all the common WebP encoders doing some kind of preprocessing pass that crushes gradients and munges luma?


I would guess the problem is that on a slow gradient, each individual block is very close to a constant. The tiny AC coefficients tend to be quantized away, resulting in a visible transition along block boundaries.

I thought the loop filter was supposed to help with this though.
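
If you want to check that empirically, here's a quick sketch (assuming ImageMagick and the libwebp tools are installed) that round-trips a synthetic dark gradient, roughly the worst case the article describes:

  convert -size 800x450 'gradient:#101010-#303030' gradient.png
  cwebp -q 75 gradient.png -o gradient.webp                  # lossy WebP at default quality
  dwebp gradient.webp -o roundtrip.png                       # decode back to PNG
  compare -metric PSNR gradient.png roundtrip.png diff.png   # then eyeball roundtrip.png/diff.png for bands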


WebP is encoded using limited-range YCbCr values, as opposed to JPEG, which uses full-range YCbCr values. When converting JPEG to WebP, there will be banding. Grayscale limited-range YCbCr, when converted to full-range RGB for display, will also have banding.

WebP really doesn't have a banding issue unless you convert from JPEG or display purely grayscale content.


Snarks at Safari for often not being instantly up to date with every rushed “web standard” from Google, then gripes about “Google monkeys” and the issues with…their rushed “web standard”. Pick your poison.


I don't get it.

The author seems to care highly about image quality, but also wants to squeeze out as many bytes as possible?

Bandwidth is cheap. If we are talking about photography as art, why would you be trying to scrape a few kB off in the first place?


The author is also a web designer who primarily uses WordPress. WordPress site owners these days put their site into PageSpeed Insights, the tool advises that images be converted to WebP, and they then demand that their web guy do it. I imagine the author got tired of seeing images on their sites ruined but couldn't do anything about it, because that's what the clients want in order to tick off a box in PageSpeed Insights.


It's more nuanced than that: the author compares two lossy compressions and gives their opinion about which one is better.

It is not honest to say "use my compression algorithm, it is better" and then, when people point out that it is actually worse, to say "well if you care about quality, you should not compress anyway". It doesn't make the algorithm any better.


The repeated callouts to PageSpeed imply that they're concerned about search placement, which is understandable for the profession. If your site is bumped off the first page because Google doesn't like that you're still using JPEG, that's lost income for you.

It can also be an issue if a client asks for WebP. Do you give in, deliver a lower-quality image, and allow your art to be displayed in a degraded manner, losing future clients who think your photos look bad? Or refuse out of dignity and lose the current client?


Because it's a substantial amount of effort to upgrade to the "new" tech, and he's showing that the "new" tech is actually worse than the "old" tech of reliable old jpeg.

> Bandwidth is cheap.

Labour is not. Just leave your jpegs as-is!


Because not all countries have cheap or unlimited bandwidth


Also, planes don't, so it's not a poor-vs-rich topic, as many seem to make it out to be.


You missed the point he's making: WebP requires 30% more data to achieve the same dynamic range as JPEG, so there's no real use for it.


Did he make that point? The only time he thought they were equivalent was when using lossless mode, which is not a reasonable comparison. He never actually compared webp at 30% more quality than jpeg.


He did, about halfway through:

> WebP [lossy, 96] is actually 39 % heavier than JPEG 85 plus noise for a similar-ish look on this difficult picture, and still not totally as smooth as the JPEG (there is still a tiny bit of ringing). It’s also 30 % heavier than JPEG 90 with simple Floyd-Steinberg dithering.


> "WebP is actually 39 % heavier than JPEG 85 plus noise for a similar-ish look on this difficult picture, and still not totally as smooth as the JPEG (there is still a tiny bit of ringing). It’s also 30 % heavier than JPEG 90 with simple Floyd-Steinberg dithering."


Every time I've used webp, I've been disappointed. And when I'm disappointed, I try jxl for giggles and find much better photo quality (especially fine gradients), at a much better file size.

Let's cut our losses, ditch webp and move to jxl.


> Every time I've used webp, I've been disappointed.

In what way?


Hard to take this seriously with that obnoxious font that draws curlicues connecting letters like s and t.


I did learn from it that there's a CSS property for ligatures, and the blog has set it to discretionary ligatures.

https://developer.mozilla.org/en-US/docs/Web/CSS/font-varian...


There's pretty bad posterization in the background. If you can't see it, kick up your contrast. You don't need HDR levels of contrast to notice it.


So here’s what I don’t get about this post:

> this is WebP re-encoding of an already lossy compressed JPEG

Author is clearly passionate about imagery and quality, so why are they not re-encoding using the original file rather than a lossy copy?


> So, I wondered how bad it was for actual raw photos encoded straight in darktable. Meaning just one step of encoding.


The banding is SUPER monitor-dependent: it's noticeable on my 4K monitor, super apparent on a different monitor with a terrible LCD panel, and not at all visible on my iPad.

I wonder if the author took that into consideration.


Back in the early 2010's I had a cheap Dell laptop with a 6-bit panel and an integrated Intel GPU. Video on that device had incredible banding, almost all the time, because as I understand it, the Linux drivers were relatively immature and did not do any dithering. A few years later a driver update enabled dithering and the bulk of the problem went away.

As a video codec developer I was a little sad about that, actually. I had to start looking closer to see problems.


> not at all visible on my iPad.

That is indeed surprising. Is it iPad or iPad Pro? It is technically possible that your monitors only support 8bpp color depth while your iPad Pro supports 10bpp (via the P3 color space) and the WebP file has a smooth gradient only when viewed with 10bpp or more. But I can't really believe that, as the original JPEG file still looks like 8bpp and doesn't have any further color profile attached.


That wouldn't make any sense unless there's something else going on.

It could simply be an effect of brightness -- do you have your 4K monitor set to bright, while your iPad is much dimmer? (Remember Apple devices have adaptive brightness enabled by default as well.)


>Look at the original JPEG at quality 85 :

<img class="lazyload" decoding="async" src="data:image/gif;base64,R0lGODlhAQABAAAAACH5BAEKAAEALAAAAAABAAEAAAICTAEAOw==" data-orig-src="https://photo.aurelienpierre.com/wp-content/uploads/sites/3/..." alt="" />

Sorry, I can't. That doesn't actually display any image at all in my browser, because you're relying on JavaScript execution to switch the img src to its actual source. You don't need to do this for lazy loading to work anymore. There's browser-native lazy loading. Just put the actual image in the src.
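
For reference, native lazy loading is just an attribute; a minimal sketch (the filename is a placeholder), and browsers that don't recognize it simply load the image eagerly:

  <img src="portrait-800x450.jpg" alt="Studio portrait" loading="lazy" width="800" height="450">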


I came here to say the same thing.

It's bizarre how the author's attitude is that the WebP authors should know better, yet his blog cannot link to images properly without JavaScript. My browser supports lazy loading images and srcset, all the things he would want, and it does that without JavaScript. Yet he tries to implement that in JavaScript and does not have a fallback to the browser's native implementation. It's difficult to take him seriously when he criticizes others' competence while he, in a blog post about image quality, cannot include images without over-complicating things to the point of breakage.

His point on color banding is clear and others have pointed out that the luma in 4:2:0 subsampling is terrible. But Google is not in the photography business. (Overlooking his attempt to convert from lossy compression to another lossy compression.) It is in the content business but only in so far as it furthers its advertising business. It is not in content for the same reason as the author so they don't share the same interests.

Compare https://jpeg.org/jpegxl/ to https://developers.google.com/speed/webp/

> JPEG XL is designed to meet the needs of image delivery on the web and professional photography.

If you search google's documentation on webp they mention photography like four times and never as "professional photography".

It's honestly funny that he is surprised that Google is advocating for a file format that does not suit his needs as a professional photographer. Google is an advertising business; everybody knows this.

Finally, I never see critics of (or anyone commenting on) WebP mention that it supports transparency. What other format is someone to use if they want lossy transparency? It's great for small, low-quality thumbnails of images (like JPG or PNG), animations (like GIF), or video. You can throw just about any input at ffmpeg and ask for a WebP and it will give you something useful that represents one frame of the input. It fills that niche very well.

Once JPEG XL becomes well supported, I'd like to use it; I hear good things. But it isn't well supported yet, so WebP is the only option for images with lossy transparency.


From my own experience, JPEG quality and compression efficiency can differ a lot depending on the encoder implementation. It would make more sense to compare specific encoders rather than formats in general.

In 2014 (WebP was released in 2010) Mozilla claimed that the standard JPEG format is not used to its full potential [1] and introduced the mozjpeg project, which is still being updated [2]. I wonder how it compares today with current WebP implementations.

[1] https://research.mozilla.org/2014/03/05/introducing-the-mozj... [2] https://github.com/mozilla/mozjpeg


Is webp still relevant these days?

You can use picture/source/srcset to provide different image formats depending on browser support. avif for modern browsers, jpg for maximum compatibility. Means people with old browsers will either get lower quality or a few more bytes, but that seems like an okay tradeoff.
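
A minimal sketch of that fallback (filenames are placeholders); browsers that don't know AVIF skip the source and fall through to the JPEG:

  <picture>
    <source type="image/avif" srcset="photo.avif">
    <img src="photo.jpg" alt="" width="800" height="450">
  </picture>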


JXL for modern browsers, JPEG for the rest would be a much better solution, especially if the source is a JPEG.


I can see some banding on the one labeled webp lossless. What gives? Is the banding in the source material? Are we using a different definition of "lossless" than I am used to?

Edit: I think maybe my browser is scaling the photo, which is adding artifacts.

Edit2: maybe the thumbnails are scaled at different quality levels???


> maybe the thumbnails are scaled at different quality levels???

Agreed, the WebP lossless version looks pretty bad when scaled by the browser. And since virtually no website/device shows images at their native resolution these days, that's something to consider.

On the other hand, most people these days view websites on their phones, so those artifacts will be harder to see.


I don't even think it's that. It seems like it was scaled badly by the author of the post, not the web browser, and that he is not actually displaying the lossless version. If you click on it, it goes to the lossless version, but the version displayed on the page is not that version.


It's even worse than what you said: the <img> tag has a srcset attribute with many possible values so different people may see different images depending on their browser's resolution. The one displayed to me was Shoot-Antoine-0044-_DSC0085-lossless-800x450.webp, which shows clear posterization at its native size as well as when it is further scaled down by the browser to 550x309.


Damn, between that and some people having wide-gamut monitors, no wonder everyone is fighting.

This almost feels like a troll post.


You have to open the images in a new tab to get the full res version. Then the webp lossless looks perfect.


Just give me a good ol' JPG. Or a PNG. Not everything is compatible with WebP yet, so when I want to feed an image from Google Images into something else, it doesn't work.


I never gave it much thought until I started posting my 3d renders online. Began to find serious issues, especially around posterized backgrounds as the article mentions. A problem which is exacerbated by the vignettes that renderers offer.


> As a photographer, I care about robustness of the visual output. Which means, as a designer, designing for the worst possible image and taking numerical metrics with a grain of salt.

I think it's kind of silly how the author pooh-poohs averages and demands that whoever works on compression algorithms should focus on the worst possible image. If you know anything about information theory, you know that it is literally mathematically impossible to make a compression algorithm that always performs well in the worst possible case.


You're taking the bare definition of "worst"; he was not talking about compressing random noise.


The type of image shown here is a common use case. There's no arguing that it's a statistically insignificant case.


I now hope more people understand why I have been pushing for JPEG XL, practically before anyone else on HN (apart from its authors).

One thing I want to state is that nothing presented here about WebP is new. These issues have been there since the beginning (2010). The real problem is, quote:

>>So there is a real issue with the design priorities of image algos from tech guys who clearly lack historical and artistic background, and don’t talk to artists

And their marketing.


Adding my vote for how appallingly obvious the banding is to me. A couple of questions over images being mixed up aside, this stuff is important.

Perception is psychological. And image formats are political.

Perhaps some truly do experience zero banding or artifacts.

But to the rest of us... "There are four lights"

https://www.startrek.com/en-un/news/the-four-lights


> Second, I don’t know why all the techies around have a huge kink over sharpness, but the most challenging situations I have faced as a photographer were with smooth gradients. Or more accurately, gradients that should have been smooth and weren’t in the output.

I can tell you why: because it's hard, i.e. it's hard to compress efficiently. So if someone claims a breakthrough, they either did something extremely smart, or cut some corners.


I wish Slack supported WebP. I end up saving an image, having to run `convert image.webp image.jpg`, and then uploading the JPEG.
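
If you have libwebp installed, its bundled decoder also works for that step (the filename is just an example):

  dwebp image.webp -o image.png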


I wish websites didn't serve WebPs, or that the browser could auto-convert when downloading.


Also: Telegram, GitHub, probably more.

(GitHub works if you rename it to a .png or .jpg file, but it's a hack).


Further, with JPEG there is progressive JPEG, allowing an image to show up ASAP on slow connections instead of waiting for the whole thing to load. When I'm on a 2G connection, I absolutely appreciate progressive JPEGs, though they are pretty rare in the wild (and pagetest doesn't even recognize them).
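
Making them is a one-flag affair; a rough sketch assuming ImageMagick (and I believe mozjpeg's cjpeg writes progressive output by default):

  convert input.jpg -interlace JPEG -quality 85 progressive.jpg
  identify -verbose progressive.jpg | grep Interlace   # should report "Interlace: JPEG"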


The author might be right about the gradient shifts in images after conversion, but at the same time, most websites are not using such color-accurate images everywhere. Some are logos and some have an alpha channel. It is a fact that WebPs are lightweight assets to load, which reduces bandwidth consumption for the user and your server. So use WebP where it saves loading time and bandwidth, and use your preferred format where you want to show images as-is.

If you're planning to convert your images to WebP in bulk, I wrote a shell script; here's the link:

https://medium.com/@siddheshgunjal82/bulk-convert-images-to-...
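
For anyone who just wants the one-liner version, here's a minimal sketch using libwebp's cwebp rather than the linked script (quality value and file patterns are arbitrary; it keeps PNGs lossless, as discussed elsewhere in this thread):

  for f in *.jpg; do cwebp -q 85 "$f" -o "${f%.jpg}.webp"; done
  for f in *.png; do cwebp -lossless "$f" -o "${f%.png}.webp"; done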


I first browsed the article on mobile without adjusting my display brightness (generally set to low for eye comfort) and it took significant effort to see the issues.

I then turned my brightness to 50% and immediately saw browser rendering issues the author may not have experienced themselves. The differences in various contexts are massive. It may be useful to take photos of my screen rendering the various artifacts at varied brightness. There are clearly some rendering optimizations (in different contexts) that create some horrible artifacts.


I might be missing something because I never delved into it, but my problem with WebP is I can't save images this way from my browser. Well, I can save them, but they don't show up when I try to view them on my system (Ubuntu Mate 20.04 on RPi4).


The problem is not the format but the software/OS you choose to use. There are OSes that have image format libraries, and once a codec is installed, ALL apps gain the ability to use it. This was first done in the 80s, so if your Ubuntu 20.04 doesn't support such data translations, maybe it's time to switch to something else.


Might be the OS indeed. Luckily I can make screenshots and save as jpg or whatever. No need to ditch Linux for me.


That's pretty weird. I'm on Ubuntu 23 and WebP images work the same as JPGs or PNGs.

Browsers like Chrome like to associate themselves with WebP for some weird reason, but file explorers, image editors, album viewers, and everything else support WebP just fine.

I don't know what you use, but I use Nautilus, Gnome Image Viewer, and Pinta/GIMP. Perhaps the three years of improved software support make the difference?


They don't show up on older Windows versions either. The file explorer needs some sort of library to handle .webp thumbnails correctly. I'm pretty sure you can install something on Ubuntu to make them show. Maybe try a different file manager?


In general I've found that this shift to .webp breaks all the nice interoperability and composability we used to have with audio, video, and image files, since there seems to be zero interest in making sure that simple, familiar features still work.


Yeah same. Huge annoyance. I just want to stick to the same-old universally-compatible file formats I've always enjoyed everywhere.


My issue with webp is that when it's animated, it seems random whether it gets treated as an image file like a gif or a video file. Any webp I save I have to convert to a real image file to ensure I can view/use it outside of a browser.


Webp is like usb-c in a way, multiple different capabilities in one package. Might sound good on paper, but gets annoying.


I guess I don't get the context?

WebP is barely supported. For decades the only choice in lossy compression is JPEG, which notoriously sucks for diagrams and basically anything that isn't a photograph. So the rest of the world finally gets a format they can use, and the photographers are angry that the world doesn't revolve around them anymore?

So what if it is worse for photography? Should we continue chasing our tails for another ten years before we find the perfect format? I'm sick of data visualizations drowning in JPEG artifacts.

I'm not opposed to AVIF or whatever, but I don't care about the author's complaints. JPEG is still there. If you want to use it, go ahead.


Outside of photographers, how many people are looking at super high-resolution images on the web? Even images that might have high-resolution versions are usually converted to a shrunken image 600px wide to fit inside the website's theme scaffolding.

Is that really even worth shaving 15% off the file size? If bandwidth matters, websites should look to reduce the volume of useless stock images littering their templates.

WebP seems like a gift to Cloudflare and the other companies that do the heavy lifting of caching and serving millions of images across multiple sites. For users, it's at best indistinguishable from JPEG, and at worst an obstruction to saving images from the web.


Honestly, I would have agreed wholly with you until I spent a month volunteering in Kiribati. 2G/3G is the norm there and even a few KBs would make a difference. It reminded me a lot of my childhood with 28/56k modems :/

Additionally, I believe countries like India, Pakistan, Bangladesh, ... are in a similar situation infrastructure-wise (please correct me if I am wrong), and so 1-2 billion people would benefit from a slimmer web.


Isn't this like anything else? No one-size solution typically works for everything. If you are a photographer/artist and true, close-to-perfect rendering matters to you, don't use WebP as the format to present your images.


The simple truth is that JPEG is more than good enough and has ubiquitous support. There is no reason to switch to a different format and risk degradation or reduced interoperability for slightly smaller file sizes.


I don't understand fanatically chasing smaller image sizes when JPEG was good enough for the web of the 90's. There must be a different reason to throw some of the highest paid engineers in the world at WebP and it ain't generosity.


Google spent a large amount of money purchasing On2. WebP and WebM were a way to show shareholders that they were seeing benefits from the acquisition, and if you look at Google’s traffic volume you could make an argument that even a modest size reduction would pay for the engineering time.

The problem was that this was basically only true for the largest sites. If you’re YouTube or Netflix, it pays to optimize your video encoding but for most other sites the volume just isn’t there and the performance costs for anyone who uses a CDN cancel it out because you need a lot of traffic for each format before a 10-20% byte size reduction saves more time than the cache misses take.


Images on the web of the 90s were also low-res and generally didn't look very good.


Why aren't the competing images presented side by side? Having to scroll to examine them makes comparison very difficult, especially for those of us not blessed with an experienced photographer's eye.


Comparing with Beyond Compare:

https://imgur.com/a/xatzZt7

--

Hoping the conversion doesn't add extra noise, I converted them (with ImageMagick: `convert image.webp image.png`) and compared them (Beyond Compare doesn't support WEBP).

Of course I have a non-educated eye, as the article puts it, but if even with machine help I cannot see a difference in the light dithering, there must be something off.

The second photo (of a man) is more clear in proving the point. This should probably have been used as the first example in the article.


Wow, had no idea BC did images. I've been using it for years!


imo, the problem isn't that WebP is bad for photos.

The problem is that Google's Pagespeed Insights and consequently a lot of resources push WebP to you as a solution for your JPG problems.

A lot of people have been duped into reencoding their JPEGs into WebPs for no reason.

Also just my personal feeling, but I feel like Google doesn't care about people downloading images or using the internet as a permanent gallery for posterity. They don't care about making each individual image look as good as it can be, so someone can in 10 years visit an almost-defunct website or an abandoned account of some user and just view a photograph as a standalone work. It feels like the use case they're concerned with is the huge 1200px-wide, utterly useless and generally irrelevant stock images they forced everyone to put on their articles when they said AMP articles require an image that big. And of course, the thumbnails automatically generated from such images. That is, WebP's concern seems to be just the load on the web server; it's not thinking about the image as a file (the sort you save on your computer). Then again, this is just my strongly opinionated guess, based on nothing but the fact that JPG was made before the web became what it is today, and WebP was released after mobile internet access surpassed desktop.


The uncompressed WEBP image looks terrible to me with a lot of banding on Safari mobile. Did the author accidentally switch images or is Safari doing some “optimization”?


"See the posterized ring in the background ?"

Nope. I'm looking at this on a 2k 38" ultrawide monitor, comparing the two images at 190% zoom and I have no idea what I am looking at. I literally can't see a point of difference between them at all. I know my eyes aren't great, but is the difference really that noticeable? What am I missing?


Lossless WebP is a good alternative to PNG. Why compare a lossless WebP photo to lossy anything?

I used to use png everywhere in openetg, so webp's a welcome improvement that's greatly reduced asset size

Perhaps the article should be "In defense of JPEG" but that wouldn't get the clicks


Just use mozjpeg and throw away webp.


Unless the OP is using an 8K monitor with professional color grading, I don't understand how he can say that some of these pictures are "looking like shit". They all look exactly the same to me on my regular 27" 1080p, on my 27" 2K, or on my iPhone.


Probably, if you're working a lot with photography, compression artifacts start to become a real eyesore. Especially the first lower-quality WebP image does look like shit to me, but I also realize a lot of other people would not consciously notice.

The banding is just not supposed to be there.


Easily visible on my air M1, 1080p gaming monitor and pixel 3


For what it's worth, the website itself also isn't great. I had to turn off Enhanced Tracking Protection to not get text that scrolled off the screen, and then was met with weird fonts.


It seems I have an uneducated eye by their standards, because I barely see any difference, which I'm happy to admit, but I think the author misses the point of webp completely.

The format is intended to bring down the file size of graphics in general, not high-level photography which accounts for probably 0.5% of the images on the internet.

This is a case of the best daily driver car won't be good enough for a race car driver.


Yeah this article comes off as almost idiotic to me. It is entirely irrelevant unless you're supporting high-quality photography on your site, in which case, yeah obviously you're going to be careful about how you compress your images.

For the vast majority of web images, use webp if it's smaller. Minuscule artifacts and judgy designers aren't going to get in the way.


Is this blog a joke/prank?

The images don't link to the correct filetype stated.

- "JPEG, lossy, 85 : 184 kiB" → links actually to a WebP file (https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...)

- "JPEG, lossy, 85 : 211 KiB" → links actually to a WebP file (https://eng.aurelienpierre.com/wp-content/uploads/sites/8/20...)

etc...

So when the blog tells you that JPEG is so much better quality, the "jpeg" image that's actually being shown is a WebP image.


How does the quality compare at the same file size? It seems like all the comparisons have fairly significant file size differences.


I just finished dealing with a very complicated pipeline for an online media management database. WebP is great except when it's not, and when it's not, it really sucks.

I'm going to go with a technical argument here instead of a subjective one, so there's no room for argument: WebP is billed as a replacement for PNG and JPG, and advertised heavily as being usable in both lossy and lossless modes for either. This is blatantly false. Alpha channel aside, PNG is, effectivelyᵗ, 24 bits per pixel, 8 bits for each of RGB. JPG is notably not; to make good use of compression in the frequency domain possible, the image is usually converted from RGB to YUV/YCbCr. But JPEG lets you customize how this is done, and you can choose to use the default chroma subsampling of 4:2:0, upgrade to 4:2:2, or forgo subsampling altogether and use 4:4:4 directly.

WebP is, experiments aside, always 4:2:0 in default/lossy mode (regardless of the tuning profile chosen). Screenshots, vector graphics, text w/ anti-aliasing applied, etc. look absolutely horrendous to the trained eye if converted from RGB or RGBA to YUV 4:2:0. WebP is unusable for png transcodes at any quality except in lossless mode.

I'm not hating on WebP - PNGs converted to lossless WebP are still a good bit smaller, at least for large sizes. But I absolutely despise how pathetically low and biased Google's benchmarks touting WebP as the be-all, end-all have been. And the toolchain is severely compromised, because you have to manually remember to specify lossless mode when compressing a PNG to WebP and that gets harder when it's an automated toolchain and the export is several steps removed from the input. And this becomes completely Mission Impossible™ when you have a lossless WebP and you want to generate a thumbnail from it because the heuristic is no longer "source extension is png" to determine if the output should be generated in lossless mode. IMO, the WebP toolchain *and all other toolchains like ImageMagick and libvips* should pass through the "lossless" property of WebP by default, because unlike with other formats, it tries too hard to be everything for everyone at once and will fall over on its face otherwise.
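
To make that concrete, here's the footgun as a command-line sketch (assuming cwebp and ImageMagick; filenames are placeholders), where the safe PNG path needs an explicit flag in both tools:

  cwebp screenshot.png -o out.webp                            # default: lossy 4:2:0, wrong for this content
  cwebp -lossless screenshot.png -o out.webp                  # what you actually wanted
  convert screenshot.png -define webp:lossless=true out.webp  # ImageMagick equivalent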

I said I wasn't going to talk about the subjective side, but I just want to say that even for tiny thumbnails, we've found that their WebP versions need to be generated with at least quality 90 to ensure they will all (regardless of source image) be usable on non-mobile devices (hi-dpi ameliorates but does not resolve the situation; it's just the fact that you see the pixels physically larger); the smoothing effect for detailed real-world photos (think warzone photos with smoke and haze in the air, odd lighting, etc.) is way too extreme at lower qualities. Again, the quality:size ratio is still better than JPEG, but not to the extent that Google advertised; more importantly, if you took Google at its word you would find WebP to be altogether unusable to begin with.

(None of this was about converting already lossily compressed content into WebP; this is straight from source (where "source" is a lossless format like SVG, PNG, RAW, or something like a 24MP JPEG@Q95 being shrunk orders of magnitude) to WebP.)

I played around some with AVIF, HEIC, and JPEGXL. AVIF has some severe color management issues that need to be ironed out in the various toolchains, though HEIC is a lot better in that regard but its lack of compatibility now and in the foreseeable future just makes it a dead end; but JPEGXL appears to be a really solidly built image codec with great potential, kneecapped primarily by adoption.

ᵗ palettization can, but does not have to, affect this



Boy that ct ligature is distracting though.


I see the background dithering ring on my 1440p cheap 32" monitor that's a few years old now.


This seems to be in the same spirit as audiophiles claiming they can hear the difference between various speaker cables, or the "hints of dark chocolate" in wine tasting.

Personally I see zero differences in the images on that page and unless the author has some really super-human vision abilities (possible! but unlikely) my guess is he doesn't either. WebP looks perfectly fine to me.


To me the banding in the "lossless" (do words mean nothing anymore !?) webp pictures is super clear and looks like how I'd expect low quality JPEGs to look.

It's the same kind of artifact that makes certain movies look terrible over Netflix, those that have large dark blank spaces. Maybe you shouldn't look too closely, because once you see it, it'll ruin your enjoyment of certain compressed media forever.

And by the way I don't think the comparison with audiophile equipment is fair. In the audiophile case we are talking about using very similar output hardware to output what is effectively the same signal. Here we have huge differences in file size (35% and more between JPEG and WEBP, a lot more than that for true lossless), and taking diffs between them shows very much that the signal isn't the same.

There is a compression limit under which you can see it's compressed, right?

https://vole.wtf/kilogram/

So it makes sense that there is some threshold sensitivity where a picture starts appearing "lossless". That threshold is going to be different from device to device and person to person.


> This seems to be in the same spirit as audiophiles claiming they can hear the difference between various speaker cables, or the "hints of dark chocolate" in wine tasting.

I can see why it would seem like that if you aren't seeing it, but it's not the case. The differences in color banding are pretty big if you are on a screen where you can see the background shading clearly.

The brightness of your monitor and the relative brightness of your room will matter a lot. In a bright room, you might not be able to see the subtle banding in the background of the images. But if you are looking at a bright monitor in a dark room, the difference is very obvious.


> In a bright room, you might not be able to see the subtle banding in the background of the images.

You are right. I just made my room dark to try this out, and now I can see the banding!


It's very easy to see the banding if you have a half-decent monitor. You don't even need to view the images fullscreen - and I say that as someone short-sighted with deuteranomaly.


I think deuteranomaly plays absolutely no role in B&W images. And if anything, it helps you see defects that others don't. I have it.

The artefacts are visible mostly in the background, where frankly I do not care.


This is yet another reason why the WebP format has been deprecated, at least in these parts.


So true. I still have to figure out how to avoid colors getting bleached when converting to WebP.


> It’s not 100 % clean either, but much better. Granted, this is WebP re-encoding of an already lossy compressed JPEG, so we stack 2 steps of destructive compression. But this is what Google Page Speed insights encourage you to do and what a shitload of plugins enable you to do, while pretending it’s completely safe. It’s not.

> I have seen a similar effect in other similar pictures : always pictures with large, smooth, gradients in the background, which happens a lot when some punctual-ish light falls off a wall. That’s not something accidental, smooth fall-off are actively built by photographers to create organic-looking backgrounds with just enough of texture to not get boring, yet discrete enough to not draw attention off the foreground/subject.

I think this rant could have highlighted these paragraphs a lot more, because these are indeed problems. The first paragraph probably refers to [1], where it doesn't say too much about recompression artifacts, and the second paragraph is indeed a well-known issue of the lossy WebP format: it tends to create gradient bands that are particularly noticeable on big and bright screens. It is far-fetched to claim that seeing this requires trained eyes; rather, it is more or less device-specific in my opinion.

[1] https://developer.chrome.com/docs/lighthouse/performance/use...


Independently of that article, I've experimented with webp to find out when I would use it, and concluded approximately the following (of course, somebody else can have different preferences and conclusions):

- If you know what stills from MP4 videos or similar look like (when observed so that the compression artifacts are visible) -- that's more-or-less lossy WebP. Not something you'd expect to achieve the best picture quality.

- Probably because of its origins, that's also how lossy webp handles scanned or printed images: not good.

I've concluded that I will use webp, but

1) to save pictures whose quality I don't care about, when I want to use fewer bytes: specifically, if I want to keep some visual information from some JPEG from somewhere just to have a record of it, not to preserve it in its full quality.

2) when serving the pictures, in scenarios where I want to reduce the amount of data delivered to others, when the artifacts I'm aware of aren't the issue.

Everything else: still no.


On mobile Safari there is no visible difference.

Could there be some default optimization going on?


Clearly, from reading the comments here, most people don't see any difference. However, the argument still stands, and perhaps - precisely because of the comments here - it becomes even stronger: there is no point in using WebP.


The article is talking specifically about portfolio pictures for photographers. In that case, it doesn't matter what most people see; it matters what the person hiring you sees. And if you are doing commercial product photography, the person hiring you is probably going to be an art director who has spent many days messing about with pictures to get smooth backgrounds on websites and in print.


On my 14in Macbook Pro I CANNOT TELL THE DIFFERENCE AT ALL


The images inline in the blog are heavily compressed and look about the same. Click through to the actual demo files and the difference becomes obvious.

I can see the difference on my LCD monitor from at least six years ago. WebP really struggles with gradients. I wouldn't use lossy WebPs for photography websites. AVIF does a lot better (-25% at no perceivable quality loss), but completely messes up the brightness on my PC for some reason; I think that's a Firefox bug.

That's not to say WebP is necessarily a bad format. There are tons of images where it easily beats JPEG without quality degradation, but these images clearly show cases where it isn't.

Personally, I use lossless WebP to replace PNGs on websites, thereby maintaining lossless quality without the PNG overhead. Lossy WebPs (and JPEGs) need to be hand-checked, though.


I’m all in .avif. Smaller files and excellent image quality. But I always have a fallback to .png or .jpg. We’re not there yet — looking at you, Edge, the only major browser that doesn’t support .avif.


AVIF > webp. (too bad once again Safari lags behind)


Can I just say how happy I am to see the "ct" and "st" ligatures in the article text? I know that took the author extra effort to provide.


I hate it, my brain wants to interpret it as a https://en.wikipedia.org/wiki/Inverted_breve


I see:

  {
    font-family: 'Linux Libertine';
    font-variant-numeric: oldstyle-nums;
    font-variant-ligatures: common-ligatures discretionary-ligatures contextual historical-ligatures;
    text-rendering: geometricprecision;
    font-kerning: normal;
  }
I guess those are "historical ligatures". I personally persuaded the creator of the Linux Libertine face used in the page to add those to it.


webp should have been skipped entirely.

Let's focus on AVIF.


> Let's focus on AVIF.

That's a weird way to write JPEG XL.


>JPEG XL

What's the legal/licensing status of that?

How does it compare technically to AVIF?


Honestly, for these cases focus on JXL. It supports lossless re-packaging of existing JPEGs with compression benefits, and it more or less matches AVIF while offering much better compression times.

But if JXL isn't an option, definitely AVIF.


All the images look fine to me.


It's such a shame Google decided to block adoption of JPEG XL: it's a strict improvement over classic JPEG (you can losslessly reencode JPEG to JXL and reduce the size, due to a better entropy coder in JXL!) and JXL has various other upgrades compared to 'classic' JPEG.

In the meantime, let's hope AVIF or whatever manages to pick up the slack, and/or other browsers decide en masse to support JPEG XL anyway; that would be a bad look for Google, especially if even Apple decides to join in on the JXL party.
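
For anyone who hasn't tried it, the lossless recompression is a one-command affair with the reference tools; a sketch assuming libjxl's cjxl/djxl are installed (as far as I know, JPEG input is transcoded losslessly by default):

  cjxl photo.jpg photo.jxl        # stores the JPEG's DCT data, typically ~20% smaller
  djxl photo.jxl restored.jpg     # reconstructs the original JPEG
  cmp photo.jpg restored.jpg      # no output means the files are byte-identical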


I must admit, I'm not sure why JPEG XL is viewed so favourably on HN, it's not something I know a ton about, but my understanding is that the big advantage of AVIF is that you can reuse hardware decoders built into devices for AV1 for the images.

It being a strict improvement over JPEG is nice for the developers not having to go back to the source image for an upgrade, but that seems like a pretty small benefit that only matters during the transitional period.

Meanwhile, if you are getting better battery life every time someone views an AVIF image, that's a huge benefit for the entire lifetime of the format, it seems to massively outweigh any advantage JXL has, to me.


AVIF kinda needs hardware decoding, because otherwise it’s considerably more expensive than the traditional codecs. Even with hardware decoding, I’m not sure if AVIF is actually faster/chaper—compared in https://jpegxl.io/articles/faq/#%E2%8F%A9speedfeatures, “AVIF” takes 7× as long as libjpeg-turbo to decode, and I don’t believe hardware encoders tend to bring that big a performance difference over software, but I’m really not sure.

AVIF reduces the amount of traffic required, but will tend to consume more power. This is the general compression tradeoff.

(Other formats often have hardware decoding support too, incidentally. But a lot of the time they’re ignored as too much effort to integrate, or buggy, or something.)


> AVIF reduces the amount of traffic required, but will tend to consume more power. This is the general compression tradeoff.

Mobile devices on battery are connected wirelessly, so traffic consumes a lot of power. The faster the radio can power back down the better, so CPU time is usually a worthwhile trade.


You kind of ignore the case where almost every device is going to do AV1 hardware decoding (which very much appears to be the trend). If that is significantly faster/cheaper battery-wise, then AV1 still has a big advantage. Comparing single-core software decoding speed seems like a benchmark designed to make JXL look good, not something that actually matters.

> AVIF reduces the amount of traffic required, but will tend to consume more power. This is the general compression tradeoff.

Again, you seem to be ignoring hardware decoding. Dedicated silicon can be many magnitudes more efficient than doing something in software. To take an extreme example with a ton of effort into the efficiency: look at mining bitcoin on a CPU vs an ASIC. I'm not saying the difference will be that big, but it may well be worthwhile.

As to buggy/too much effort/cost of hardware, that's precisely why it makes sense to piggy-back on AV1, a format that already has a lot of incentive to implement in hardware, and the work already done to make it work well. You need that kind of compression for video, and people are putting in the effort to make it work well, so AVIF gets that effectively for free.


Part of the reason is that it's a technically superior codec; check out Jon Sneyers' series of blog posts on comparisons, e.g., https://cloudinary.com/blog/time_for_next_gen_codecs_to_deth...

Video codecs as used for images also have big disadvantages since they weren't designed for many picture-focused workflows

> only matters during the transitional period.

which can be decades, so this matters a lot

> it seems to massively outweigh any advantage JXL has

You haven't listed any other advantages beyond downplaying the compatibility benefit during the transition, so that's hard to weigh. Also, if we're talking about the whole lifetime, it's not like hardware couldn't add support.


The advantage is increased battery life and performance, which is way more important to most end users than any of the advantages I've seen for JXL. People are not pixel-peeping different images to compare quality, they are annoyed when their battery dies.

As to hardware adding support for JXL, that seems extremely unlikely: image decoding is less impactful than video decoding, and the cost of adding custom decoding silicon to a chip is very high, as is adding support to software for that hardware. Being able to piggy-back on the work already done for video meaning you get that stuff for free makes it way more viable. AV1 decoding is already out there in virtually every new device, and rolling out hardware support is very slow, it's massively ahead in that respect.


JPEG XL is light enough not to require hardware support. Did you try to transcode a PNG to AVIF? It's painful. Not the case with JPEG XL. Meanwhile, I urge you to read this article; JPEG XL has way more features than AVIF.

https://cloudinary.com/blog/the-case-for-jpeg-xl


Hardware decoding can mean less battery usage, which is very big for end users.

I just don't think any of those features matter as much a battery life, most of them are about encoding speed which just seems wildly unimportant to me: encoding may be more work, but generally you view images far more than you make them, and admins and creators are in a better position to spend the time/effort to encode something, and hardware encoding may well end up making it a non-issue anyway.

People are out there running `zopflipng` and the like to try and get better sizes at the cost of more work at encode time, so it seems like that priority isn't just me.


I have problems w/ AVIF that are like the ones that guy has with WebP. Please don’t post a link to the F1 car sample image because I think that image sucks (e.g. a reflection from a surface near the driver’s head gets replaced with a different but plausible reflection.)


Image decode is a rather tiny fraction of the loading time of a modern web page, or its power budget...

That's why, to my knowledge, nobody even bothers to use hardware JPEG encoders/decoders on phones/laptops, despite many bits of silicon having them.


JPEG may be fast enough for all of that, but is that also true for these newer ones? Decoding the .heic pictures from my old iPhone takes 1-2 seconds on my laptop(!!). As near as I could find out, that's because the iPhones and MacBooks and such all have hardware support for that, and my ThinkPad doesn't.


According to the article I've linked above, JXL has twice as many points (don't remember actual speed comparison numbers) in the decoding speed comparison, and is also more parallelizable


To be honest I find that a little bit too vague to be useful.

I can't find clear numbers on this, but on e.g. [1] I read it's not too fast, but I didn't try to reproduce their results, and according to some comments a number of factors can greatly affect performance.

[1]: https://old.reddit.com/r/jpegxl/comments/zwftn2/libjxl_on_an...


Then doesn't that follow the line of argument for why AV1 isn't being adopted, either? Namely, lack of hardware support?

I can understand why jxl isn't a dominant web format, but I don't see where avif has any place being a web format currently.


AV1 is being adopted? Almost every modern bit of hardware has AV1 decoding baked into it now, which is a huge hurdle to pass.


JXL is an image codec: it can afford to be less efficient (it's not! other way around, rather) and not be hardware accelerated, as its typical use case is not presenting 30-60 images per second like a video codec, so it will not affect the battery life of a device in any meaningful way. Also, AV1 hardware decoding is far, far from ubiquitous, so many users would not benefit at all from it.

But - back to JXL vs WebP:

I think Google had genuinely good intentions with WebP, but the effort was somewhat ruined by their culture: they relied too heavily on metrics, which aren't always a good proxy for image quality, because humans looking at pictures don't scale, and Google does things that scale. We now have a codec with good metrics that looks poor.

It's based on the intraframe coding of the VP8 format - a video codec - and I think it suffers from that. Looks OK in a video, but bad in stills where you have more time to notice its warts

Most importantly, it's almost always produced by recompressing a JPEG and causing a generation loss. I don't know of any phone or camera which produces native WebP (maybe some recent Pixels? Dunno), and any professional device used in RAW mode usually implies the participation of someone who cares about the finished product and will not want WebP (and will resent it when it's used without their consent by webmasters wishing to tick a box in PageSpeed, as the author mentions). JXL has a lossless recompression mode in which it just replaces the Huffman compression stage of an existing JPEG with something more modern, and this results in a pixel-accurate image which is 20% smaller than the original file - this already eats WebP's claimed space saving, and then some, with no generation loss. Based on this fact alone, there shouldn't even be a discussion.

....but let's have a discussion anyway. A JPEG -> JXL lossless recompression isn't conceptually new - Stuffit (remember them?) did it in 2005, with not enough traction sadly (unsurprisingly since there were patents and licensing costs). Basically it's _still_ a JPEG - if you decompress the final stage of a JPEG, and the final stage of a JXL (or a .SIF), you get the exact same bytestream. While yet another amazing testament of JPEG's longevity and relevance, it is also concerning: How could Google do worse than that??? When basically rezipping (with a modern algo) the existing DCT macroblock bytestream of a 30 year old codec beats your new codec, you should just trash it.

Edit: ...but I forgot to answer your question. Why is JXL viewed so favorably on HN? Because it doesn't suck, and we're sad that Google decided to be a roadblock, and pushing for their own thing, which instead sucks. At least AVIF is way better than WebP, even though it's a monster, computationally.


What you're ignoring is that WebP is from the year 2010. JPEG XL is from 2022. Incidentally, JPEG XL is also a Google project, making your ranting about how bad they are at image formats pretty funny.


Hi! I'm aware that JXL partially originates from Google's PIK - and also Brunsli??, but I had indeed forgotten that WebP started in 2010, wow, 13 years old already.

I'll therefore correct my statement: "How could Google do worse than that??? When basically rezipping (with a modern algo) the existing DCT macroblock bytestream of a 18 year old codec beats your new codec, you should just trash it."

Also, Stuffit's SIF format is still 5 years prior to 2010 so that point stands.


I didn't compare with stuffit. If it's better than JXL recompression, perhaps they had put more focus on lossless recompression. Perhaps they had less realtime constraints in decoding speed.


JPEG XL is viewed favorably on HN because it's the underdog to evil Google. Before they wrote their complaint article about Chrome removing support (after a significant time of no one using the format), no one here gave it a thought. It's not like anyone is attacking Firefox for not enabling it either.

This is not a format quality thing, this is "let's have a chance to complain about Google" thing again ;)

I mean, this whole posted blog is doing a comparison on a single image. Anyone with a bit of thought would dismiss this as ridiculous in the first second... but there's the Google name, and the HN haters come out of the woodwork.


Firefox Nightly has support according to https://jpegxl.io/tutorials/firefox/. Of course you're wrong that nobody is attacking FF, but given its tiny niche compared to Chrome it's obviously much less consequential, so the volume of attacks on Chrome would dwarf anything FF-related (Safari was also criticized, and they've recently added support).

> after significant time of noone using the format

That's also false. This is too new a format for any significant period of no use to materialize; besides, requiring flags that the vast majority of users will not enable is a huge factor limiting widespread use.


JPEG XL research, development and maintenance happened/happens mostly at Google Research. Chrome devs removed it from Chrome, but it is still a codec built mostly by Google.

Here are the reasons why Chrome devs made the difficult decision to remove JPEG XL from Chrome: https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...


I don't know, I liked it on its merits before. I'm sure others did too.

Seamless legacy support is very valuable. And it still performs pretty well compared to competitors. I think it's a good default for a lossy network format.


The support was never complete to begin with, so the removal wasn't due to nobody using it. Some rivalry between different teams inside Google is more likely.


What a shit take. JXL did have plenty of favorable responses on HN before Google removed it, for reasons that they never applied to their own formats. And FF did get plenty of complaints for not supporting JXL, but those are often shut down with the opposite variant of your take.


As I work with codecs I've been following the situation quite closely and the attention to XL was pretty much zero until Google decided to not support it.

Moreover, this whole topic is about a comparison over a SINGLE IMAGE. Anyone who ever came close to codecs would immediately dismiss this as ridiculous. Yet here we are.


I will respond to you since you posted about this so called "SINGLE IMAGE" three times in this post already.

Ackchually, the blog post contains a comparison over TWO IMAGES. But since you work with codecs, surely you understand that the blog post is complaining about how WebP interacts with gradients in general and not just about the specific images in the blog post.

JXL was getting plenty of attention before the Chrome debacle. Of course it was less than WebP and AVIF but JXL wasn't getting pushed or championed by anyone (other than Cloudinary I think) so JXL didn't have the marketing powers the others had.


To make a conclusion about how a codec handles image features you need to do a quantitative comparison across a big enough data set to make conclusions about any kind of generalized quality.

This goes triple for modern codecs like JPEG XL, VP8/9, AV1/AVIF, etc. because they deliberately make tradeoffs when compressing based on how the image will SEEM to people, not how pixel correct it is. Note just how many people say they barely notice a problem - this is where WebP made the tradeoff. JPEG did it elsewhere (e.g. text).

Cherry-picking a single image is useful only for fanboy screeching.


The author explains why thinking in terms of averages "across a big enough data set" isn't enough.

>Call me crazy, but I don’t give a shit about averages. For a gaussian "normal" process, probabilities say half of your sample will be above and half will be below the average (which is also the median in a gaussian distribution). If we designed cars for the average load they would have to sustain, it means we would kill about half of the customers. Instead, we design cars for the worst foreseeable scenario, add a safety factor on top, and they still kill a fair amount of them, but a lot fewer than in the past. [...]

>As a photographer, I care about robustness of the visual output. Which means, as a designer, designing for the worst possible image and taking numerical metrics with a grain of salt. And that whole WebP hype is unjustified, in this regard. It surely performs well in well chosen examples, no doubt. The question is : what happens when it doesn’t ? I can’t fine-tune the WebP quality for each individual image on my website, that’s time consuming and WordPress doesn’t even allow that. I can’t have a portfolio of pictures with even 25 % posterized backgrounds either, the whole point of a portfolio is to showcase your skills and results, not to take a wild guess on the compression performance of your image backend. Average won’t do, it’s simply not good enough.


> To make a conclusion about how a codec handles image features you need to do a quantitative comparison across a big enough data set to make conclusions about any kind of generalized quality.
>
> Cherry-picking a single image is useful only for fanboy screeching.

Do you really expect a photographer to prepare a quantitative codec comparison benchmark? All they have is anecdotal evidence, and I think it is fair for them to criticize and make decisions based on their own anecdotal evidence.

> This goes triple for modern codecs like JPEG XL, VP8/9, AV1/AVIF, etc. because they deliberately make tradeoffs when compressing based on how the image will SEEM to people, not how pixel correct it is. Note just how many people say they barely notice a problem - this is where WebP made the tradeoff. JPEG did it elsewhere (e.g. text).

No one is going to sit here and claim that WebP performs better on all images or JPEG performs better on all images. Obviously there is going to be some kind of tradeoff.

TBH, my gripe with WebP is not that it's worse than JPEG. IMO it is in fact better than JPEG in most cases.

My problem is that it is only an incremental improvement over JPEGs. We are breaking compatibility with the universal image formats and we get the following benefits:

- 15-25% better compression

- animation

- transparency

- lossless compression

On the other hand, we could break compatibility, adopt JXL and get the following benefits:

- lossy compression on par with WebP

- animation

- transparency

- lossless compression that is marginally better than WebP

- actually kinda not break backwards compatibility because you can convert JPEG -> JXL losslessly

- enhanced colorspace support

- progressive decoding

- very fast decode speed

- support for ultra-large images

Adopting WebP would be great. But why adopt WebP when instead you can adopt JXL, which is superior in terms of features and on par in terms of compression?


Google haven’t explicitly decided to block adoption of JPEG XL. They removed an incomplete implementation from Chromium which had never been shipped, because it was a maintenance burden and they weren’t ready to commit to supporting it. That’s quite a different thing. It may indicate a broader strategic direction, but it doesn’t necessarily.


I want to believe.

Having an immediate upgrade path to all pictures from the past is too good an opportunity to pass up.

We rarely get a free “compress losslessly” button for our archives.


They called it technically inferior based on opinions. They didn't do a thorough technical review, and why would they? They have WebP. This was absolutely a strategic thing; it's naive to think it isn't.


Yeah, I'm quite hopeful that this is one case where the developer backlash will cause a U-turn. I suspect it was seen as something that most people didn't care about, and now that it's clear that they do, something will likely be done about it. I can't see any reason why Google would be strongly against its inclusion.


That charitable interpretation would have been plausible if the Chrome team (yes, "Google" is not a single entity here) hadn't published a faulty benchmark [1] that was thoroughly criticized [2], criticism which has never been answered so far.

[1] https://storage.googleapis.com/avif-comparison/index.html

[2] https://cloudinary.com/blog/contemplating-codec-comparisons


Apple already has! Safari has JPEG XL enabled by default.


Seems Safari has it enabled by default now, and Apple has support at the OS level. Firefox at least has it under a flag. Chrome team are the odd ones out here.


Agreed. It's especially infuriating that their arguments against JXL would have applied to WebP too (even more so), but for some reason that was pushed through (as were other Google formats).


I know this is not constructive and I'm sorry, but I just can't read the text with those st and ct ligatures. It makes me feel like the author is trolling with them and I shouldn't take the text seriously. I know that's an exaggeration but that's what the design makes me feel.


Agreed, I don't understand why anyone would use ligatures like that on body text. Add them to titles you want to look particularly fancy if you must but please don't mess with the readability of anything longer than a paragraph.


But why draw hairs between all 'ct' and 'st' pairs? As a non-native speaker, is that some English weirdness that I have yet to learn about?


It's how English was written in the "olden times". At that time, little flairs (such as ligatures) were pretty common, and were very fanciful. Some simpler ligatures (like ff) survive today, but embellished ones (like ct) were toned down. It's just a stylistic choice to draw them one way or another, but it's jarring to see the fancier ones in "modern" texts because we're used to the simpler styles.

Fun fact: The German "eszett" (ß; U+00DF) is a ligature for "ss" (specifically the "long s"[0] and a normal "s") that evolved over time into a single "letter".[1]

[0]: https://en.wikipedia.org/wiki/Long_s

[1]: https://en.wikipedia.org/wiki/File:Sz_modern.svg


To me it just feels out of place, like a calligraphy ligature got accidentally mixed with a standard serif font. From what I've seen this kind of ligature is usually applied to many other letter combinations as well, in which case it at least looks consistent.


According to the Wikipedia page for eszett [0] it evolved from "sz", as the name "eszett" suggests. (I only realized the link with "z" when I saw "tz" ligatures on street signs in Berlin.) Given that its typographic origin is sz, and given that its name literally says sz, I wish the spelling reformists had gone for sz rather than ss!

[0] https://en.wikipedia.org/wiki/%C3%9F


> It's how English was written in the "olden times".

Exactly.

(skipping some minor details)

When we started printing instead of writing, we dumbed the letterset down into fewer mechanical pieces. Thus earlier printers in English had to use the letter 'f' for the discarded "long 's'" letter, back when the long 's' was still expected by readers.

And that dumbed-down letterset was the one that then made it to typewriters and then our keyboard today.


The same goes for the ampersand – “et” slowly morphed into “&”.


It's one way it was written, I bet. If my experience from old Norwegian church books is any indication, there were a lot of ligature fads. Some liked to replace all double consonants with a single consonant with a line over, for instance. It had a good couple of decades.

Do we really need to continue this stuff on computer screens?


Ligatures are old fashioned in English but still very common in French. Some ligatures are actually mandatory (like the oe in cœur, heart) while others like st are pretty common in proper typefaces such as those used for novels. The author is probably French (Aurélien).


They also space out question marks, exclamation marks and colons, which is standard in French but not in English.


I've been writing almost exclusively in cursive for my entire life past age 8 and that font looks crazy to me. I learned both D'Nealian and Zaner-Bloser in different schools and have seen a lot of my grandmother's writing, which was semi-Spenserian.

The stroke just doesn't go in the right direction for those ligatures. My guess is that this font is based on a French (or maybe some other Latin) script.


These ligatures are definitely French ligatures. See for example this picture from French wikipedia,

https://commons.m.wikimedia.org/wiki/File:Ligature_typograph...

But also, French typographical ligatures (well beyond the mandatory ones) aren't really related to cursive; they are a typographical convention. For instance, the cursive s doesn't look like an s and wouldn't form a ligature with the top of the t in cursive. (However, at least in French cursive it's common to use a single cross for a double tt, which I guess is a ligature?)

I also only learned cursive in school. In fact, writing in script was forbidden and not taught at all.


Perhaps the website was designed for a French audience, and an alternate theme was not created for the English localisation of this article...


I also found it distracting, which is a shame, because the actual core of the argument the author is making is an important one.


Yeah, that got on my nerves too. Ligatures (like kerning) are supposed to help make reading more fluent and otherwise blend in, not stand out like a sore thumb. Really not sure what those ligatures are supposed to say - "wow, look how cool this font is, it has ligatures!!!"?!


At first I thought I had an unusual amount of very small hair on my display.


Very small, curly hairs... hmm. (I might have thought the same thing while reading this on the toilet... so you can imagine the extra layer of credibility that afforded.)


I wiped my phone screen a few times until I accepted it is part of the text. Is that a toggle in the text editor? And if it is then should reader clients have a counter toggle for it? I'm thinking autocrlf style.


Luckily Reader Mode or disabling CSS takes care of the oddity.


I wonder if that would still work if the ligature were done in Unicode instead of CSS?


Interesting question! The st and ct ligatures used in the article don't seem to be part of the precomposed Latin ligature set, and what is there strikes me as far less obnoxious [1,2]. I expect it's possible to hack something together with combining characters, but also that the visual result would be far too ugly for the tastes of anyone who was desiring ligatures in the first place.

[1] https://en.wikipedia.org/wiki/Ligature_(writing)#Ligatures_i...

[2] https://superuser.com/questions/669130/double-latin-letters-...


U+FB06 (LATIN SMALL LIGATURE ST) does display the same as the obnoxious ligature in the article if you have the right font. I actually used that for the "st" in the word "still" in my comment above.

In the HN comment editor it is the same as in the article, with that stupid curve connecting the s and t.

In the rendered comment in Chrome, Firefox, and Safari on my Mac, regular comment text uses some font where the s and t are joined much less obtrusively. In fact, at first I thought HN was replacing the U+FB06 on output with separate s and t. E.g., these two look very similar for me: still still.

For rendered code blocks on Safari and Chrome it is using the font that has the curve. On Firefox it does not have the curves. Here is a code block example:

  still
  still


I was going to suggest blocking web fonts with uBlock Origin, but I get them even with JS and fonts blocked.

Turns out a CSS rule does this:

    font-variant-ligatures: common-ligatures discretionary-ligatures contextual historical-ligatures;


I didn't really understand what everyone was talking about since I use a combination of uMatrix and NoScript, and NoScript blocks web fonts by default. But after whitelisting his site, eh, it didn't bother me. I guess it's what one is used to.


It really does mess up the typography.


> I'm sorry, but I just can't read the text with those st and ct ligatures.

The whole font is bad. It looks pixelated and blurry, which can only be explained by that being the intended look. It's bizarre.


It doesn't look pixelated on my machine, it looks pretty nice - here's a screenshot at 200% zoom:

https://i.imgur.com/WE1tbIB.png

The ligatures are definitely an unusual artistic choice... seems like the sort of thing you'd finally get used to 4 chapters into a book, but until you're immersed in it, it's quite distracting.


The font looks much less ill-defined and irregular if I tell the browser to enlarge the text to 130%. There are still some issues at that size.

But I'm not going to judge it by how it looks after I do a manual adjustment. The way it's presented is terrible. It somehow manages not to fit into the pixel grid of my 15-inch 3840x2160 screen. These are not large pixels!


HN, where text looking bad on your machine means the designer intended for it to be unreadable. Isn't that more bizarre?


It's like that TikTok voice when used in technical demos. I just can't focus on anything else.


Somehow these ligatures trigger subvocalization in my brain, with the silent "author's voice" speaking with a lisp. Probably not what was intended.


I assumed they were intentional for editorial effect: deliberately designed to disrupt the flow of a reader attempting to follow the text, in the same way that he claims the compression artefacts disrupt the perception of a skilled photographer trying to use the heavily compressed webp and jpeg images.


I couldn't read it because it was a tiny column of text at a tiny font size with the rest of the screen being wasted space. Thank goodness for reader view.


I even went so far as to check the html code of the page to find out what was going on there. It is annoying, almost snobbish.


This site made me add my first-ever global Stylus sheet:

* { font-variant-ligatures:unset!important; }

I'm pretty infuriated that this is necessary. Why would someone abuse readability so much for their own personal satisfaction about how "cool and unique" they are? Usability. Comes. First.


It's their website. They can do with it whatever the hell they want with it. They owe random people on HN exactly nothing. And you can also do whatever the hell you want on your website, because that's your website.

You also need to reset "font-feature-settings" by the way.
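For completeness, here is a minimal user-stylesheet sketch that resets both properties while keeping the ordinary fi/ff ligatures; the !important is only needed when injecting it via an extension such as Stylus:

    /* keep common ligatures, drop the discretionary/historical ones,
       and clear any OpenType feature overrides the site sets */
    * {
      font-variant-ligatures: common-ligatures no-discretionary-ligatures no-historical-ligatures !important;
      font-feature-settings: normal !important;
    }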


and he can globally remove whatever he wants from being displayed in his browser with global style overrides!

what a glorious world we live in


They didn't just add a tip on how to override it; they said this should not be done at all.


I wouldn't turn off all ligatures, some of them are actually useful.


I read this comment. I thought, "wow typical HN, someone has taken the time to write this big blog post and the top comment is about some stupid little typographic detail."

Then I went to read the text and... :(


I find it interesting how many comments here (presumably from "tech guys") confirm what the author wrote:

> So there is a real issue with the design priorities of image algos from tech guys who clearly lack historical and artistic background, and don’t talk to artists, who anyway have largely decided that they were above science, maths and other menial materialistic concerns.

I am a tech guy, and when a photographer tells me that an image looks worse than another one, if I don't see it, my first reaction is more "can you try to explain to me why it is worse?" and less "I don't see a difference, so you must be wrong".

I would be slightly offended if an artist told me that there was nothing wrong with `if (vAluE < 3 ) {return true; } else {{ return false;}}` just because they cannot see the problem.


While I agree with the rational component of the article (WebP may be inappropriate for artistic photos), I had to force myself to read it. The "t" in the font screws me up completely; I tried twice to wipe the screen of my phone, then thought that maybe a background picture was getting in the way.

So overall I find the author's aesthetic sense very questionable, which contrasts with his high-moral-ground tone.


For me it's the horrible layout. For God's sake, stop making narrow columns of text. Having the text take up most of my monitor is much more pleasant to read.


I think the opposite. When text in a webpage takes up all my monitor's width, I go into Developer Tools and manually add a max-width rule so that I can read the text comfortably.

And AFAIK, all HCI literature seems to agree with me.


Long lines of text cause significantly more eye strain than reasonably short ones. Generally, one should try to have ~80 characters per line of text.


Research suggests optimal line length is 50-75 characters

https://baymard.com/blog/line-length-readability
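For the curious, a rough sketch of the kind of max-width rule mentioned above; the selectors are only placeholder guesses for whatever element wraps the body text, and 70ch is simply one value inside the cited 50-75 character range:

    /* cap body text at roughly 70 characters and centre the column
       (selectors are placeholders, adjust to the page in question) */
    article, main {
      max-width: 70ch;
      margin-left: auto;
      margin-right: auto;
    }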


Discretionary ligatures… well, they require discretion, which the author seems to sorely lack.


Haha, I used iPhone's reader mode, which I do most of the time.


The "t" or the "ct" ligature?


Wow, I am reading it at 170% zoom, and in the fourth paragraph the word "distribution", which contains the "st" ligature, is automatically cut and "hyphenated" between the "s" and the "t". But the ligature remains: half of it at the end of one line, and the other half at the beginning of the next line! This looks wrong. CSS has probably missed an edge case here. Or is it the job of some "text renderer" in the browser?


"st", also.


It's the "historical-ligatures" feature of the font used; if you aren't in reader mode already, press F12 and

    document.body.insertAdjacentHTML('beforeend','<style>p { font-variant-ligatures: common-ligatures discretionary-ligatures contextual;}</style>')
should turn it off. (It was "too much" for me too.)

But besides this, I found the typography of that article quite nice; interesting that there are thin spaces before "?" and "!" and wide spaces (not double spaces) after sentences - also "old school" (and often frowned upon). I guess some WP plugin does it, but I admit I don't remember seeing this anywhere else recently. (And I like it.)


Tech guy working in media here. 100x this. I often can't tell the (perceptual) difference between video encoded with codec A and B, but I do have objective metrics such as bitrate, framerate, CPU/GPU power required to encode/decode, device quirks, etc. When in doubt, I always defer my decision until after I can consider the input from my colleagues.


I think it's more "the target audience for WebP has eyes closer to (and arguably less trained than) mine, and I can't see a difference", which is a pretty reasonable take. But tech people aren't typically the best communicators, so I'm not at all surprised it comes off as crass.

mp3 is "worse" than flac but if you say it sounds bad I'll absolutely tell you you're wrong and to get off Hi-Fi forums.


If your goal is to batch convert a wide variety of lossily-compressed source material at a significantly lower bitrate without obvious loss of fidelity, MP3 is not great.

That appears to be the author's specific complaint against WebP, and seems fair.


First of all, the tone of the article invites equally acerbic criticism. Calling devs “image coding douchebags” is not exactly going to win anyone over.

Secondly, there’s a bizarre assumption here that someone can’t be both a tech guy and an artist, which is nonsense.

Thirdly, there’s a likely incorrect assumption here that artists weren’t consulted, or that the authors of the format weren’t aware of the tradeoffs that were being made.


This entire article reminds me of the ones from a few decades ago about the utter indignity of MP3s, and how it was just criminal for us peasants to use them _AT ALL_, or at the very least at any bitrate under 320 kbps.

Then they proceed to play the FLACs in their car. Ok.


This is more like MP3 versus one of those random codecs available for Windows 3.1 that had some big company behind it which one day got bored with codecs and now makes industrial pizza ovens. Given that Google is a major proponent of WebP, that Google is known for dropping projects and services with no notice, that WebP gives very little value on the server side, and that WebP creates objectively worse visual presentation, it would be best to consider WebP deprecated for any new development.

edit: I would also like to note that there is no technical reason to use WebP. The only reason it is used is that Google is literally bribing you with "better rankings" for using it. In other words, it is strictly marketing-driven.


All of these new formats like webp or avif look like shit. They look like screenshots from videos, which is what they literally are.


> here I am, loosing faith in humanity

<sigh> Me, too, buddy. Me, too.


The author may be right but he definitely does not understand the difference between good and good enough.


Is it really unreasonable for a photographer to have a higher standard of "good enough"?

Anyway, his point is that JPEG was already "good enough", and WebP is not actually "good" for his purposes despite claims that it's better than JPEG for all purposes.


His claim is too broad. Why not serve RAW files? For the real enthusiasts.


Because you wouldn't see the difference in quality while the size difference would be huge


Yes, there is some banding, because it's a web format designed for small file size. 10-bit AVIF has smooth gradients at a smaller size, though it's not as well supported yet.


But why should it be worse than JPEG in that respect? It's a much newer format and supposedly much better.


It's just a happy accident that the way JPEG compresses things and smooths them out visually happens to be an advantage in this particular edge case.


I wouldn't call it a happy accident; JPEG was carefully designed to look good for single-frames with the limitations of the human eye taken into account.

WebP is based off of a video format, and tradeoffs there are very different.


I don't get the point of complaining that a compression format loses such small details that an untrained eye can't see them.

That's the whole point of compressing the image, isn't it?

To me, it looks like webp does its job.


OP is a photographer and is pretty clear about that being part of their motivation:

> Stick to JPEG at 90 quality (or at least 85) if images matter to you, e.g. if you are a visual artist. If images are pretty decorations for your textual content, it doesn’t matter.



