Makes sense, but I hope that there’s meaningful progress on microLEDs soon. While OLED burn-in has improved a lot, it’s still a concern for the lengths of time that people keep screens for… my 5 year old VA panel TV still looks identical to the day that it was unboxed after thousands of hours of usage at high brightness, and I’d hope any TV replacing it would be capable of the same or better. Same goes for monitors.
* smaller size (that is, the light comes from a tiny pinpoint in the middle of the pixel rather than a whole rectangle being lit up). This actually helps make the display look a lot sharper.
You don't want small pixels. This causes optical aliasing and associated problems.
Edit: By small pixels I mean a low fill factor (the area of the lit part of a pixel vs. the total area per pixel). If you treat this as a DSP problem (not a time series, but a brightness value over distance or angle), there's effectively no reconstruction low-pass filter. You can think of a small lit area with a larger gap between pixels as something like zero padding. That can make some things appear sharper, but in doing so it looks less like what the image source intended.
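Here's a rough 1-D numpy sketch of the idea (toy numbers and a made-up band-limited test signal, so purely illustrative): reconstruct the same signal with a full-width box "pixel" and with a narrow one, then look at how much spectral energy lands above the pixel Nyquist. The narrow emitter barely attenuates the spectral images, and that extra high-frequency content is the apparent "sharpness".

    import numpy as np

    # Toy 1-D model: pixel pitch = 1, each pixel emits over a centred rect
    # whose width is `fill` * pitch (the fill factor described above).
    fine, n_px = 64, 256                     # fine samples per pitch, pixel count
    x = np.arange(n_px * fine) / fine

    # Band-limited test signal, well below the pixel Nyquist of 0.5 cycles/pixel.
    sig = np.cos(2 * np.pi * 0.11 * x) + 0.5 * np.cos(2 * np.pi * 0.23 * x)

    # Sample at pixel centres and put the samples back as impulses on the fine grid.
    impulses = np.zeros_like(sig)
    impulses[::fine] = sig[::fine]

    def reconstruct(fill):
        """Convolve the impulses with a rect emitter of width fill * pitch."""
        w = max(1, int(round(fill * fine)))
        kernel = np.ones(w) * (fine / w)     # smaller emitter driven brighter
        return np.convolve(impulses, kernel, mode="same")

    for fill in (1.0, 0.25):
        out = reconstruct(fill)
        spec = np.abs(np.fft.rfft(out * np.hanning(out.size)))
        freqs = np.fft.rfftfreq(out.size, d=1 / fine)   # cycles per pixel pitch
        frac = spec[freqs > 0.5].sum() / spec.sum()
        print(f"fill {fill:.2f}: {frac:.1%} of spectral energy above Nyquist")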
No, the perfect pixel is a sinc() function of appropriate bandwidth. A pixel of the form "dot with a dark area around it" is a better approximation of sinc() than a pixel of the form "fill all the available space evenly".
And that doesn't mean the bright part of a pixel should be as small as possible; it should be tuned for bandwidth.
Even eye receptors (cones) approximate sinc() in the same fashion.
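If you want to play with it, here's a quick extension of the numpy sketch above (same toy setup; the windowed sinc is of course hypothetical, since a real pixel can't emit negative light, but it shows what "tuned for bandwidth" means numerically): compare RMS reconstruction error against the original band-limited signal for box pixels of a few fill factors and for a Nyquist-cutoff windowed sinc.

    import numpy as np

    # Same toy setup as above: pitch = 1, band-limited input, samples as impulses.
    fine, n_px = 64, 256
    x = np.arange(n_px * fine) / fine
    sig = np.cos(2 * np.pi * 0.11 * x) + 0.5 * np.cos(2 * np.pi * 0.23 * x)
    impulses = np.zeros_like(sig)
    impulses[::fine] = sig[::fine]

    def rms_error(kernel):
        kernel = kernel / kernel.sum() * fine        # unit DC gain per pixel
        out = np.convolve(impulses, kernel, mode="same")
        core = slice(8 * fine, -8 * fine)            # skip edge effects
        return np.sqrt(np.mean((out[core] - sig[core]) ** 2))

    # Box emitters with different fill factors.
    for fill in (1.0, 0.5, 0.25):
        box = np.ones(int(fill * fine))
        print(f"box, fill {fill:.2f}: RMS error {rms_error(box):.3f}")

    # Windowed sinc with cutoff at the pixel Nyquist (0.5 cycles/pixel) --
    # not realisable as a physical emitter, just the bandwidth-tuned ideal.
    t = (np.arange(8 * fine) - 4 * fine) / fine      # +/- 4 pixel pitches
    print(f"windowed sinc:  RMS error {rms_error(np.sinc(t) * np.hamming(t.size)):.3f}")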
Sure, but at least for the RGB subpixel displays common in LCD TVs, each subpixel is surrounded by dark area not only in the form of the black matrix, but effectively also the subpixels of the other two colors.
I know someone who uses an ancient Samsung OLED phone that they regularly left on overnight playing YouTube for years, and it has only the faintest burn-in, visible only on a pure white screen.
That would depend entirely on how many hours you used it. They don't age like meat, just from the passage of time; they age from having current run through the pixels. So their degradation depends on the kind of content you watch, for how long, and how bright you configure the picture. I have thousands of hours on a 6-year-old OLED and it looks perfect, but I only watch films in a dark room, so I don't think that's very demanding.
The Samsung Galaxy S III is 11 years old and had an AMOLED screen (does that still count as OLED?). Some people are still using them with custom ROMs, and I haven't heard of display issues.
There's also the first-revision PlayStation Vita, which had an OLED screen and is a similar age. I have heard of yellowing on some models, but I'm unsure how prevalent burn-in is. Of the people I know with one or more Vitas, most only ever got a second-gen model, or switched to a second gen, so the original didn't get used as much.
Much newer, there's the Switch OLED, and WULFF DEN on YouTube has one that he's left constantly turned on to see what happens to the screen. I think we're about 1.5 years in, and he's done a handful of videos showing what it looks like now.
If you're using a 12 year old LCD panel, you're probably hobbling yourself with terrible resolution, response time, refresh rate, color gamut, viewing angle, backlight evenness, black level/contrast, etc.
It's absolutely bizarre how luddite HNers can be when it comes to keeping even remotely current on hardware.
If it meets the poster’s needs, what’s the problem? Better that old electronics continue to get used than for them to be junked.
Though my primary monitors are newer, I have monitors that are coming up on a decade old that get used as secondary monitors because I don’t need anything fancy for that use case… it just needs to be a functional screen. Some people have similarly undemanding needs for TVs.