
I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)

I think the real problem is that most people, like many of the commenters here, can't tell the difference, because desktop monitors have been stuck in a dead zone of zero innovation for the last 10 years. I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.

It's bizarre. Smaller displays (mobile phones) and larger displays (4K TVs) have fantastic pixel densities now, considering their viewing distance. However, any panel in the 20"-40" range is stuck in the mid-2000s.

Also, I think the author would have done us a favor by using example photos with lighter backgrounds (or changing the background color of his post to black). The harshness of the black images on white doesn't allow the eye to adjust enough to see the issue. If you put those images on a dark background, it's super easy to tell the difference.




I have no problem seeing the artefacts on both my 2012-era displays. One of them is a 30" 2560x1600 IPS monitor that was rather good at the time; the other is an entry-level 27" TN 1080p TV.

So I don't think display quality really is the problem here. Maybe the drivers, or post-processing filters. Or maybe not everyone has an eye for this. I have an interest in image processing, and that's the kind of detail one tends to notice with experience. The author of the article is undoubtedly more experienced than me, and noticing these details may even be part of his job. He will most likely be able to notice these problems on crappy monitors, as well as tell you in which way that monitor is crap.


Someone else noted the author is sending different images to different monitor types... so no wonder everyone is seeing different things.

Generally, though, I would expect wide-gamut monitors to make a significant difference for these types of artifacts.


I have an extremely hard time perceiving any difference on a 27" 4K monitor. I am not even sure I really see them.

The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.


This seems to be highly subjective. I had absolutely no problem seeing those artifacts without any pixel peeping, they're that obvious.

WebP image gradients just looked broken (posterized) except the lossless one, which was (obviously) perfect.


It's hard to see in the first set of images, but the second set is much clearer. In the WebP example, look to the right of the subject, about 1/6th of the image's width from the right edge. There's a hard transition between shades of grey. The JPEG version directly above it also has banding but each band is narrower so the difference at the edges is more subtle.


> enlarge it to show the artifacts.

One might argue that if you need to enlarge it to see the artifacts, then the artifacts aren't perceptible enough and the codec is already good enough for the use case.


But we are philistines, not pro photographers.


He was talking about the background, not the foreground.

The difference is in the colors around the edges of the picture: the background changes noticeably, even on a non-fullscreen image on my Android 12 device.


> The examples are just bad. If you want to show something, screenshot and enlarge it to show the artifacts.

Yes! Where are the red underlines and diffs? I can see the background banding, but the foreground looks the same at a glance, except that some of the images look ambiguously "off" in ways that could just be placebo.

You'd think a visual artist would be more interested in visual communication and not just a wall of text with un-annotated photos.


I think he was complaining specifically about the background banding.


I downloaded the images and then compared them via Beyond Compare.

After that it was pretty obvious what the author is talking about.
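
If you don't have Beyond Compare handy, a quick difference image makes it jump out too. A minimal sketch with Pillow (the file names are placeholders and the amplification factor is arbitrary):

    from PIL import Image, ImageChops

    # Load the reference and the lossy version with the same size/mode
    original = Image.open("original.png").convert("RGB")
    lossy = Image.open("compressed.webp").convert("RGB")

    # Per-pixel absolute difference, amplified so subtle banding
    # edges become obvious instead of near-black
    diff = ImageChops.difference(original, lossy)
    diff = diff.point(lambda v: min(255, v * 8))
    diff.save("diff.png")

In a difference image like this, banding tends to show up as sharp concentric contours where the original had a smooth gradient.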


The article is about the background banding.


Laptop and desktop monitors have been advancing just fine over in the Apple world, with high PPI, brightness, and color accuracy being standard for nearly a decade... it's just expensive, and so it's one of the first corners cut for PCs, as most folks simply don't care.


I see the rings easily on my few-year-old AOC 1440p monitor. PC users can have way better monitors: studio colour accuracy or high-refresh gaming.


I could see them, but only after turning my brightness up close to the max. I usually have it very low.


> I've noticed the same issue with WebP and have gone back to JPG/PNG for most things (jpg for photos, png for UI-type images)

Wait... I agree for JPG, but if you use lossless WEBP instead of PNG, isn't it simply the same pixels, just with a file about 30% smaller than the corresponding PNG file (and about 15% smaller compared to already heavily optimized PNG files, like ones run through zopfli/optipng/etc.)?

Isn't the "lossless" in "lossless WEBP" actually lossless when converting a PNG file to WEBP?

FWIW, when you losslessly convert a PNG to WEBP, decompress the WEBP back to a PNG, then convert that PNG to WEBP again, you get the exact same lossless WEBP file. It's also the same WEBP whether you losslessly encode the original PNG or that same PNG "crushed" with a PNG optimizer.
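
That round trip is easy to check yourself. A minimal sketch with Pillow (file names are placeholders; it assumes a Pillow build with WebP support, and byte-identical output of course depends on using the same encoder and settings both times):

    import filecmp
    from PIL import Image, ImageChops

    # PNG -> lossless WebP
    Image.open("input.png").save("a.webp", lossless=True)

    # lossless WebP -> PNG -> lossless WebP again
    Image.open("a.webp").save("roundtrip.png")
    Image.open("roundtrip.png").save("b.webp", lossless=True)

    # The decoded pixels should be identical to the original PNG...
    pixels_match = ImageChops.difference(
        Image.open("input.png").convert("RGBA"),
        Image.open("a.webp").convert("RGBA"),
    ).getbbox() is None

    # ...and the re-encoded file should come out byte-for-byte the same
    # (given the same encoder and settings), as described above.
    bytes_match = filecmp.cmp("a.webp", "b.webp", shallow=False)

    print(pixels_match, bytes_match)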


Yeah but I just don't fw webp and other weird formats. JPEG and PNG are tried and true, also it's nice how the extension indicates lossiness.

On the technical side, webp support still isn't like png. Tried dragging a webp into Google Slides just now, got "unsupported image type," which is ironic. I'll try again in like 10 years.


> On the technical side, webp support still isn't like png.

Oh that's a good point.

I see lossless WEBP mostly as a way to save bandwidth where PNG would otherwise be used. If you've got a pipeline where you already "crush" your PNG files anyway, you may as well also generate a lossless WEBP file and serve that: all browsers support it. And you can fall back on the optimized PNG should the browser not support WEBP.

I mean: I use WEBP, but only lossless WEBP, as a replacement for PNG when I'd serve PNG files to browsers.

But for that one use case, showing a PNG file in a webpage, I don't see that many downsides to lossless WEBP. It saves bandwidth.
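
For the fallback itself, one option is content negotiation on the server: browsers that can decode WebP advertise it in the Accept header of image requests. A rough sketch with Flask (the route and paths are made up for illustration; a real setup would also set caching headers):

    from flask import Flask, abort, request, send_file

    app = Flask(__name__)

    @app.route("/img/<name>")
    def image(name):
        # Keep this toy example from serving arbitrary paths
        if not name.isalnum():
            abort(404)
        # Browsers that support WebP say so in the Accept header
        if "image/webp" in request.headers.get("Accept", ""):
            return send_file(f"images/{name}.webp", mimetype="image/webp")
        return send_file(f"images/{name}.png", mimetype="image/png")

The same fallback can also be done entirely client-side with the HTML picture element: a source element with type="image/webp" ahead of the PNG img, no server-side detection needed.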


At this point in my life, I just don't have time. I basically use either mp4 or PNG for all web "images/animation" when doing web pages. I don't detect browsers or the like. Unless there is some revolutionary new image/video tech, I'll stick with them for the foreseeable future. I only bother with JPEG when it's straight from the phone/camera and I don't want any reduction in quality from the original high-res shot.


Only if you can accurately detect browser support and serve the PNG instead, which means added complexity. And you have to store both.

Also, if users download your images and use them elsewhere, webp will still be more annoying for them. Though it's not very common that you want them doing that anyway.


https://caniuse.com/webp

Any updated (modern) browser should be able to display WebP just fine; I'd rather just serve it without a backup plan if I'm planning to have WebP on my website.


The browser support for webp is fine, problem is everything else. If you only care about displaying the images (not letting people use them elsewhere), you only use lossless webp, and all your backend infra supports it, then sure.


I'm on a 27" 4K IPS screen here and have to squint/zoom in to see the difference the author is writing about. While it's nice some people really care for the best result I think most people aren't going to notice or care about it.


I guess it's also true that HN is definitely the wrong audience for this post. As the author suggests, if you spend all day in VS Code/Vim, you're among the segment of computer users who looks at images the least as a percentage of time spent on a computer.


Yes, but at least there are a decent amount of font 'connoisseurs' here ;)


It's like the audiophile equivalent of using $500 speaker wire. Nobody normal really cares about the difference, if there's really any difference at all.


I caught it on my Android 12 device without full-screening. He's talking about the background, not the foreground. The background's color noticeably changes from shot to shot around the edges.


I have to zoom in to really notice that. But both the jpg and webp have distortion - webp slightly more. Both have difficulty with edges.


I think we're talking about two different things. You're missing the forest for the trees. I'm talking about big, huge macro effects that become more apparent when you zoom out, not less.

There is a difference in the color gradients. One has the guy looking backlit and one doesn't.


At default zoom the image is 20% of the width of my monitor so it's hard to see artefacts. When zoomed in the posterization is noticeable but jpeg at 85% is about as bad as webp. I don't see any substantial difference in lighting.


> because desktop monitors have been stuck in a dead zone of zero innovation for the last 10 years.

That's a weird thing to say unless the pixel density is your one and only measure. Regardless of that, the posterization should be perfectly visible on a 2012 Full HD monitor, or even on the 1366x768 TN screen of a decade-old laptop. Most commenters here are probably viewing the pictures at a scale other than 1:1.


> That's a weird thing to say unless the pixel density is your one and only measure.

Is it though? We now have OLED TVs and OLED smartphones.

Where's our OLED PC monitors?

On every measure, if you care about colors/contrast/black+white levels/resolution/density, the average computer monitor has fallen far behind.

You can't even buy a smartphone that has a panel half as bad as most PC monitors on the market. And, at least in my area, you'd actually have to go to a lot of effort to find a non-4k TV.


> Where's our OLED PC monitors?

https://computers.scorptec.com.au/computer/Oled-Monitor

They've been around for years.

PC monitors have been improving constantly with high refresh rates, local dimming HDR + 10 bit color, adaptive sync, OLED and more.


Only on the unusual high-end gaming monitors.


OLED is overwhelmingly reserved for high-end TVs and phones as well, so I think that point is moot.


My base iPhone 12 mini from years ago has OLED, so do a lot of cheaper Android phones. Gaming displays are far less common than these.


Phones have smaller displays, which makes them easier to manufacture.


Yeah, that would also explain why iPads don't have OLED yet.


> Where's our OLED PC monitors?

https://www.displayninja.com/oled-monitor-list/

Mainly targeted towards the gaming market at the moment.


Some of those prices are insane. Why are they so much more expensive than OLED TVs of similar size? Frame rate?


I dunno much about TVs since I don't use them, but I have some ideas why it might be:

Framerate, response time, adaptive sync, and burn-in resistance (how prone to burn-in is OLED? Monitors show far more static images than TVs do).

I assume combining all of these might just make it more expensive than each feature would individually.


> Framerate, response time, adaptive sync, and burn-in resistance

The much more complicated electronics, plus supply and demand. Demand for TVs should be way higher than for high-end monitors.


Not true. Monitors now are 1440p or 4K, even at work for me.

The "issue" is that monitors last a LONG time. And that's good. We don't touch them or fiddle with them. They tend to just work. Phones and shit we keep dropping and breaking, and then the battery gets bad.

Also, for gaming you may even want a 1080p 200Hz monitor, favoring high refresh rate and FPS over pixel density.


You also can't write software bad enough that you're forced to upgrade your monitor due to poor performance.


You almost can. The Windows Terminal app has a performance issue on G-Sync monitors. I think it's being treated like a game, but the app only renders at 60 fps or something, maybe lower, which I guess forces the whole screen to refresh at that rate and causes mouse stutter.


> They tend to just work

They really don't...


> I'm sure half the folks here are viewing his example images on a 2012-era HD 1920x1080 LCD, which is definitely part of the problem.

I just looked at the first two images of the post.

First, on two mid-range LCDs: one ASUS IPS from this year and one BenQ TN from 2012, both 24" 1920x1080 (~91 DPI). The difference between the images is clear on both.

And before posting, to make sure, I pulled out a 15" 1024x768 (~85 DPI: basically the same) NEC TN LCD from 2002. And a NEC CRT roughly 15" viewable 1024x768 from 1998. Both on VGA connectors (so there is the typical noise from that, which still doesn't cover up the posterization). The difference between the images is clear on both.

All monitors viewed from 3' away.

People are simply accustomed to poor image quality, including posterization. AAA FPS video games display it on static art backgrounds in the loading menu, and I can never tell whether it's intended. Show them a 240Hz monitor with 30ms of input lag, 5 frames of overshoot artifacts, and viewing angles worse than in 1998, and they'll be wowed.


It’s quite noticeable on a 2011 MacBook Air, too. The issue is less pronounced if you don’t have a decent display, but it’s more that people are not used to it. Like bad kerning, it’s something you’ll notice everywhere if you train your eye to look for it, but otherwise you probably don’t notice it, except that some things feel less appealing.


Also, only a tiny fraction of PC monitors have color gamuts wider than sRGB, proper HDR support, or any kind of calibration.

Recently I’ve been dabbling in HDR video, but I realised that the exercise is futile because I can’t send the results to anyone — unless they’re using an Apple device.


I see the rings easily on my few-year-old AOC 1440p monitor.


Pixel density isn't the issue. 2K-4K computer monitors are pretty common. But they tend to suck in other ways compared to a MacBook screen. And yes I can tell the difference between the images on my MBP.



