PHOLED Will Transform Displays (ieee.org)
220 points by bookofjoe on Dec 21, 2023 | 191 comments



This video is a great primer on the state of the art of display technology right now and seriously changed how I view each tech. Seems like there is a lot of convergence between the main technologies and they all borrow from each other in different ways in their pursuit of the ideal display.

https://www.youtube.com/watch?v=TyUA1OmXMXA&pp=ygUjZGlzcGxhe...

IIRC it talks about PHOLED as one of the upcoming technologies that could get us to that pinnacle.


I'm still waiting for my affordable MicroLED displays.

But over the last few years they have become noticeably more common.

This year, at an outdoor IT event, I had to go check out the display wall behind the speaker.

It was full color, fast, and bright (we are talking a cloudless, hot summer day, the display was in direct sunlight, and it was LEDs!).

Crazy impressive



I'm still waiting for my Helio Display - https://en.wikipedia.org/wiki/Heliodisplay

It looks like the last time their website had content was 2016 - https://web.archive.org/web/20160726091433/http://www.io2tec...

But it is (was?) essentially a projector that used lasers and a small amount of ultrasound-generated fog (which they really downplayed in favour of a lot of talk about focused air currents) to project a 2D image in space. They termed it a mid-air hologram.

Perhaps not the best tech for viewing anything much, but surely we all want to recreate Princess Leia's projection from Star Wars?


I've got a Looking Glass Portrait and it is spooky how effective it is (although it depends a lot on the quality of your depth map.) The software is terrible though and getting stuff onto it is a chore.


Their upcoming model (currently on Kickstarter) is internet connected, so it will presumably be easier to put content on.

https://www.kickstarter.com/projects/lookingglass/looking-gl...


Is it worth $250 though? A 2 hr battery seems short, but if the software can automatically convert all my old photos into 3D (NeRFs and Gaussian splatting), I would buy one.

I think my friends would love to see our old memories come alive again


I've converted a bunch of old photos to 3D for my Looking Glass portrait -- it's more like 2.5D because the depth map generators tend to slice photos into planes which pop out but if you're artistically inclined, you can tweak the depth maps using Photoshop etc. -- and it is quite startling to see old photos come to (semi-) life.


So much marketing speak and AI hype yet not even the slightest hint at how the thing actually works. I presume it's a high-density LCD with some kind of micro-lens array in front of it?


Apple has been rumored to be working toward MicroLED displays in their products for a while, popular opinion is that the Apple Watch will be the first one to make the jump like they did with OLED.

Now that they have the Apple Watch Ultra at $800 and in comparatively low volume (I assume), I won't be surprised if it shows up in the next version of that, then makes its way to phones and elsewhere.


Makes sense, it has the smallest display so it is likely a good test bed for the manufacturing process.


Unless it's small enough that they just go with the other manufacturing tech where the display is on wafer and you don't have to deal with the pick and place mess. Which it almost certainly is.


Isn't the Vision Pro using microLED?


Micro OLED for those


And yet even decent HiDPI displays (I'm talking 300ppi+) are barely a thing outside of the Apple ecosystem, and rarely affordable when they are. Seems like everyone is okay with 20-year-old display resolutions and keeps pushing instead for higher refresh rates. I for one just don't want pixelated/blurry fonts and care little for refresh rates.


This is driving me insane. There are literally five desktop monitor models on the market today that provide a natively scaled experience for macOS users, all over $1,000. The only monitors that would provide the equivalent screen real estate of my 2005 30" Apple Cinema Display @ 2560x1600 are the $2,800 Dell 6K and the $5,000 XDR.

Apple released "retina" scaling in 2012. It's been more than 10 years.


I had a high dpi sony trinitron during the dot com crash. It’s been 25 years, not ten.

Linux (and all the Unixes) had a huge advantage back then because of X11's superior high-DPI support.

I still don’t understand why they have spent the last ten years trying to switch from pixel perfect rendering with variable dpi to fractional scaling.


> I still don’t understand why they have spent the last ten years trying to switch from pixel perfect rendering with variable dpi to fractional scaling.

Is there even a difference, except for calling it dp instead of virtual pixels? Fractional scaling should be able to render just as pixel-perfectly if you're not using legacy applications.


Back when there were only “legacy applications” they were all pixel perfect and almost everything responded to the X11 DPI setting.

Then someone in Linux land decided that stuff should be blurry by default because that was the stop gap MacOS chose when they backported a DPI setting to MacOS X.


Has changing dpi ever done more than scaling text? I'm pretty sure it hasn't, which makes it strictly inferior because everything just looks weird and not at all like it should, with tiny icons to boot.


You’re 100% correct with the insanity. I ended up buying the last supported 27” 5K iMac with the intent of using it until support ends. Then I’ll replace the innards with a display driver board, making it an external display. This is a significant cost saving vs the equivalent Studio Display.


I'm not quite sure what you say is true.

Apple notebooks have 224-254ppi, and the external displays 218ppi. The only higher-ppi displays from Apple are on the iPhone, and they are not special; most decent (Android) phones from 5 years ago have 400ppi+. Funnily enough, Apple was dragging its feet in this space back then.

Apple is more consistent, but it really isn't hard to get 4k laptop displays now.


> Funnily Apple was dragging their feet in this space back then.

Isn't that in part due to Apple being slow to adopt OLED, along with its unusual subpixel arrangement, which requires higher ppi to look good?


No matter the ppi, UIs still look worse on an OLED compared to an LCD. Simply because they don't have pixels arranged in a grid, and UIs have lots of straight lines in them, those lines will always look fuzzy, even if just a little bit. High-density "retina" LCDs, on the other hand, render straight lines extremely crisply.


Not really. Mainstream Android phones had high density OLED and LCD displays before iPhones.

For example, by the time of the iPhone 8's 326 PPI, LG had the G6 with a 564 PPI LCD.


My LG G3 had a 538ppi IPS display and it is from 2014, 9 years ago.

Now that I think about it, smartphones have stagnated a lot.


806 ppi for my Xperia XZ Premium from 2017


...and Openmoko phones had HiDPI screens around the time of the 1st-gen iPhone.


As I get older I care about bigger fonts and therefore bigger monitors. Higher resolution does nothing for me, really; I need bigger fonts, not better anti-aliasing. My main is a 4K 43" 144Hz, which is basically the same DPI as a 27" 2560x1440, of which I have 4 in a semi-sphere setup. If I ran that 4K at half the resolution to get nice fonts I'd have massive eye strain. Instead I can keep the monitors at a comfortable arm's length so I don't squint up close and still get tons of text on screen.

I also get a lot better frame rates than pushing 5 4k or 8k screens.

I tried out Apple's absurd 6K display and it's just so small that I'm getting half the text on the screen, which is basically throwing money at Apple for no reason.

The 4k43 is glorious for games and for fusion360. Also as a grow light :)


I've been rocking 4k48"120hz for a couple of years now (an LG C1 48").

It's game changing, and the OLED shines when I do color sensitive work. It feels like having a huge canvas in front of me. Initially I tried doing fancy window locking arrangements and I still do that occasionally. But over time I moved towards treating it like a big desk that I can resize all my windows on as needed.

However, I would probably move down in size a little next time, given the choice. 43" sounds sweet. The issue, at least two years ago, was that these smart TVs are massively subsidized. Mine cost about $1000 I think, probably cheaper now, while similar specs on a slightly smaller monitor would be considerably more expensive.

EDIT: I just checked and the 43" 144hz OLED monitors I can quickly find are $1500+, while the updated version of my screen, the LG C2 48", is on sale in a local shop for $600.

This screen is a TV, but it's still by far the best monitor I've ever had. The only minor problem is that the low brightness and reflectivity of the screen mean I had to reposition my desk, and it still becomes an issue for about 30 minutes every day when the sun moves to just the right position to bounce off the wall behind me onto the screen.


Would have done what you did, but it literally won't fit :( My coworker has that at home and it's amazing.


For monitors I just look for screens with 87-94 PPI, to avoid scaling. So the monitor size I want is determined by the combination of inches and resolution, which gives me the screen dimensions in centimeters.

     5:4  1280x1024 19" ( 37.68cm × 30.15cm )  86.27 PPI 
    16:9  1920x1080 25" ( 55.35cm × 31.13cm )  88.12 PPI
    16:10 1920x1200 24" ( 51.69cm × 32.31cm )  94.34 PPI
    16:9  2560x1440 30" ( 66.41cm × 37.36cm )  97.91 PPI
    16:9  2560x1440 32" ( 70.84cm × 39.85cm )  91.79 PPI
    16:10 2560x1600 30" ( 64.62cm × 40.39cm ) 100.63 PPI
    16:9  3840x2160 46" (101.83cm × 57.28cm )  95.78 PPI
People who are used to higher PPI may find these densities have a screen-door effect (visible pixels), but for me it only gets uncomfortable when I view an 81 PPI monitor (16:9 1920x1080 27" [59.77cm × 33.62cm]) from 45cm away. Anyway, I just want the sharpness and uniformity that only comes from not scaling, either in the monitor or in the operating system.

My ideal would be a 16:10, or better a 5:4, with 40cm of height and around 87-89 PPI. Unfortunately, monitors with a 16:10 aspect ratio use older-generation panels, which makes me discard them: IPS and VA ghosting. And TN doesn't exist there (at that size the viewing-angle limitations of TN would be an issue anyway, though for 1280x1024 19" it is perfect). Modern panels only come in 16:9.

Unfortunately for me, I don't like the 16:9 aspect ratio for monitors: if the monitor is small I lose vertical space top and bottom, as if it were letterboxed, and if the monitor is big it forces me to move my head a lot.

The point is, I think the 2K/4K/8K marketing is secondary; what matters first is the panel size and the distance it will be viewed from. So I usually recommend making a table with your PPI preference and your width/height preferences. The one I use when searching for monitors also lists the number of lines of text the panel height gives me at the font size I use most.

> The 4k43 is glorious for games and for fusion360

That gives a reference: 16:9 3840x2160 43" (95.19cm × 53.55cm), 102.46 PPI.

You probably view this from at least 65-90cm away because of the width; otherwise you would have to move your head and even your body, which makes the fonts look tiny.

I guess the panel manufacturers introduced 16:9 monitors (or rather, stopped developing panel aspect ratios specific to computer monitors) to avoid having two panel lines, one for computer monitors and one for TVs. Mere guessing.

PS: https://www.sven.de/dpi/
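The arithmetic behind the table above is simple if anyone wants to build their own; here is a quick sketch (mine, not from the PPI calculator linked in the PS, assuming flat rectangular panels with square pixels, which is what a single PPI figure implies):

    # PPI and physical size in cm from resolution + diagonal (inches).
    import math

    def ppi_and_size_cm(w_px, h_px, diag_in):
        ppi = math.hypot(w_px, h_px) / diag_in   # pixels along the diagonal / inches
        return ppi, w_px / ppi * 2.54, h_px / ppi * 2.54

    for w, h, d in [(1280, 1024, 19), (1920, 1080, 25), (1920, 1200, 24),
                    (2560, 1440, 30), (3840, 2160, 43)]:
        ppi, w_cm, h_cm = ppi_and_size_cm(w, h, d)
        print(f'{w}x{h} {d}"  {w_cm:6.2f}cm x {h_cm:6.2f}cm  {ppi:6.2f} PPI')

Running it reproduces the rows of the table (e.g. 1280x1024 19" comes out at 37.68cm × 30.15cm and 86.27 PPI).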


Woah, do you have a picture of this setup you can share? I'm really curious!



You're talking smartphone displays? You certainly don't need anywhere close to 300ppi on a computer monitor to have crisp fonts.


The point is only valid about desktop monitors, so I assume he meant 200ppi.


Yeah 200+, my bad!


Outside of the Apple ecosystem you still have apps struggling with high DPI, it's a pain to mix with non-HiDPI displays, and you get lower performance / higher power consumption... and for what? A bit more relative crispness? Maybe I'm old school, but native resolution, even with visible pixels, makes OS UI borders look way sharper than anything scaled.


I'm no expert in display manufacturing, but as I understand it, driving a higher refresh rate is a much different challenge than creating more densely packed pixels.

Display overclocking has been a thing for the longest time, which also implies that getting more refreshes is often a product of the display controller and how reliably your display can run at higher voltage.

Getting high yield on larger displays with high PPI is still tricky IIRC, especially when some 1080p displays can still come with dead pixels.

I'm sure companies would push for higher pixel count displays if the economics were rightly aligned.


There are HiDPI displays on the top end models of pretty much every major manufacturer. They do command a premium however.


> Replacing the fluorescent blue with phosphorescent blue will mean a more balanced pixel structure and could enable higher-resolution displays in the future. In the near term, the switch will lead to an approximate 25 percent gain in efficiency

I would have expected a 50% gain. According to the quoted efficiencies, the blue fluorescent subpixel needs 4x more power (at 25% efficiency) than the phosphorescent red and green subpixels (at near 100% efficiency). So making the blue phosphorescent as well should reduce 1+1+4 to 1+1+1 power, a 50% reduction (technically a 100% gain in efficiency). Why is the near-term gain only 25%?
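For what it's worth, that arithmetic as a quick sketch (just restating the quoted 25% vs ~100% efficiencies, and assuming each subpixel has to emit the same amount of light):

    # Relative electrical power per subpixel for equal light output,
    # normalized so a ~100%-efficient emitter draws 1 unit.
    red = green = 1.0              # phosphorescent, ~100% efficient
    blue_fluorescent = 1 / 0.25    # 25% efficient -> 4x the power
    blue_phosphorescent = 1.0      # blue PHOLED, ~100% efficient

    before = red + green + blue_fluorescent     # 6.0
    after = red + green + blue_phosphorescent   # 3.0
    print(f"power: {after / before:.0%} of before")      # 50%
    print(f"efficiency gain: {before / after - 1:.0%}")  # 100%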


My professional experience is that blue subpixels for a 5500K balanced white use about 50% of total power (rather than the 66% you show). My understanding is that this is because even though blue (460nm) is a shorter wavelength (higher photon energy) than green and red (530 & 610nm), it is also significantly less bright (in photons/sec/solid angle).

I've struggled to find a good webpage, but roughly, in subpixel power % it comes to 45% + 35% + 4x(20%) = 160%. By improving blue efficiency it could become 45% + 35% + 20% = 100%, requiring only ~2/3 of the original power and improving total display power efficiency by ~50% (ignoring all the computation, RC losses, comms, etc.).
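The same kind of sketch as above, but with these empirical power shares (my restatement; the 45/35/20 split and the 4x blue penalty are the numbers from this comment):

    # Subpixel shares of emitter power for a balanced white:
    # two efficient subpixels at 45% and 35%, blue at 4 x 20% today.
    efficient_pair = 0.45 + 0.35
    blue_today  = 4 * 0.20    # fluorescent blue, ~25% efficient
    blue_pholed = 1 * 0.20    # phosphorescent blue, ~100% efficient

    before = efficient_pair + blue_today    # 1.60
    after  = efficient_pair + blue_pholed   # 1.00
    print(f"emitter power: {after / before:.1%} of today's")  # 62.5%, roughly the ~2/3 above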

White balanced power is independent of the number of pixels (or pixel arrangement) such as RGB vs RGGB, but RGBW or RGBY or RGBC can improve efficiency (and reduce this relative improvement %).


I don't know the answer, but I think there are usually 2 green subpixels, so 1+1+1+4 to 1+1+1+1 for that facet.


You’re thinking of the Bayer pattern for sensors, which have four equal RGGB squares. When an RGGB pattern is used for displays, the more numerous green subpixels need to be smaller to maintain the correct white balance, resulting in the same 1+1+4 power profile.


Yes, for cameras it is called Bayer. For displays Samsung coined Pentile. It is by far the most common array of mobile OLED subpixel designs (RGGB), used in mobile devices (eg iPhone, Galaxy, Pixel, Huawei, Oppo, Xiaomi).


GP's point was that it's not the number of pixels, but the power draw that matters here (for power consumption) and while there are more greens they are smaller and draw less.

This design pattern happens because humans are more sensitive to greens, but that doesn't mean you need more green output.


Exactly, for a white balanced screen you need the same number of photons whether they are at a higher resolution (like Pentile) or not. Adding a White (or Yellow or Cyan) subpixel for RGBW can improve efficiency for the less saturated part of the color gamut, but obviously not the pure red green or blue colors.


If I understand correctly you don't want the same number of photons. You need more blue photons.


I'm saying photons here because there are so many words with very specific meanings that are not typically understood by the lay person (nits, candela, lumens, luminance, brightness, illuminance, luminous flux, and luminous intensity). However, just looking at the 5500K black body, the power density of any blue is well below that of any green or red, and since the energy per photon is higher at shorter wavelengths (blue), that means the photon flux is significantly lower (about half the number of photons/sec/m² of green).

http://hyperphysics.phy-astr.gsu.edu/hbase/phyopt/coltemp.ht...

Just for the plot.

The reason that cyan (between blue and green) and yellow (between red and green) subpixels added to the full pixel can be more efficient for white (or unsaturated colors) is that the eye detects them better (perceived brightness) for a given amount of power or photons than a primary blue or red.


The "More from Spectrum" link at the bottom shows the timescales of this kind of progress: "almost ready" = 10 years.

https://spectrum.ieee.org/bright-blue-pholeds-almost-ready-f...


Hardware is hard. Especially when it means inventing new physics and chemicals.


That doesn't really explain though why someone would say they were "almost ready" when they very much weren't.


My takeaway from this article is I should change my color scheme to amber instead of white on black.


Isn't this common knowledge by now? OLED-optimized color scheme is green and yellow on deep black, with orange or red highlights. White is avoided, and blue even more so. Helps prevent burn-in, and minimizes energy use. The tech featured in OP is still great for those who require accurate color reproduction, of course.


I love how we're back to amber after all these years of basically never using that color


It's not mentioned in the article, but it seems like VR displays especially could benefit from the higher resolution and better efficiency.


Yep, and computer displays. IMO smartphone displays don't need more resolution.


Retina computer displays are already high enough as well.

MacBooks and iMacs don't really need further pixels either.

VR seems like the only mainstream focus now on increasing density. It seems like the Vision Pro is going to get us halfway there from e.g. the Meta Quest, but there's still going to be another big jump to get to Retina-equivalent.


> Retina computer displays are already high enough as well

The options for larger screens are more limited than I would like, personally. 4K @ 27” is pretty good, but side-by-side with a 5K display, I can see a difference.

If you want to extend that level of sharpness > 30” though, you rapidly find your only options are a small number of incredibly expensive Apple and Dell displays.


> Retina computer displays

What's that? It's not a standard I'm familiar with.


But they do need higher efficiency


Not just VR; think AR wearables as well. Higher efficiency enables smaller power sources without sacrificing capabilities, improving form factors.


I love the OLED displays on my laptop and tablet: when working at night, they're a wonderful complement to e-ink (for working during the day).

I thought 4k was great, but if I can get a 25% increase in dpi or a better efficiency, I'm very interested!


I can’t wait for OLED displays in Macs. Night mode > dark mode ( https://untested.sonnet.io/Heart+of+Dorkness)

I built an OLED friendly reading app (midnight.sonnet.io) and I’m waiting to add night mode to my writing app (enso.sonnet.io) since I occasionally use it in darker environments.

I also made a simple obsidian “night mode” config I use on my OLED screen.


Wild! I am working on exactly the same thing now for Lunar (https://lunar.fyi), and I'm also calling it Night Mode ^_^ what a coincidence

I've been trying to make "white regions in dark backgrounds" less painful for months, but doing that at the system level on macOS is incredibly hard. I see you're doing it with CSS filters, which make sense in the limited scope of an article. But applying something like that on the whole macOS UI would cause confusion.

I already use something similar on the iPhone: I read on the Kindle app which has white text on black background, then I have a full red Color Tint filter on the Triple Back Tap shortcut which I use before reading. Very similar effect to your solution, although I don't have images in my books.


> I've been trying to make "white regions in dark backgrounds" less painful for months, but doing that at the system level on macOS is incredibly hard. I see you're doing it with CSS filters, which make sense in the limited scope of an article. But applying something like that on the whole macOS UI would cause confusion

Can I suggest my "one simple trick" from when I was doing the same on Windows?

Increase contrast, a lot, in the original RGB space, then only keep the R channel, then invert the picture.

It's like doing a "black and white" mode, but as "black and red", and it avoids losing "faint colors".

Also, you remove the color consistency problem (IIRC the perception of colors is not symmetrical on light and dark backgrounds, I think it was pioneered by Ethan Schoonover for Solarized)

BTW the inversion should be optional, to be nice to apps using a dark theme (ex: many terminals by default) and may work best on a window-by-window basis if that's possible on the Mac.

The best results are when using a system light theme + light themed apps.
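Not the parent's actual code, just a minimal sketch of that recipe applied to a single image (assuming Pillow and NumPy are installed; the file names and the contrast factor are placeholders):

    # "Black and red" night mode: boost contrast, keep only the R channel, invert.
    import numpy as np
    from PIL import Image, ImageEnhance

    img = Image.open("screenshot.png").convert("RGB")
    img = ImageEnhance.Contrast(img).enhance(2.5)   # increase contrast, a lot

    arr = np.asarray(img, dtype=np.float32)
    red_only = arr[..., 0]                          # keep only the R channel
    inverted = 255.0 - red_only                     # light background -> black, dark text -> red

    out = np.zeros_like(arr)
    out[..., 0] = inverted                          # all remaining luminance lives in red
    Image.fromarray(out.astype(np.uint8)).save("night_mode.png")

Doing this system-wide is the hard part, as discussed above; per window or per image it is just a color-matrix operation.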


> Night mode > dark mode

Absolutely!

> I built an OLED friendly reading app

Very nice! On Windows, I use a program that runs matrix operations on the color space, so that I can increase the contrast, invert, and then keep only the red channel.

On Wayland I can do that with wl-gammarelay-rs: for an equivalent of your app but at the Wayland level, try: `wl-gammarelay-rs & busctl --user -- set-property rs.wl-gammarelay / rs.wl.gammarelay Temperature q 1000`

> I also made a simple obsidian “night mode” config I use on my OLED screen.

I had similar setups for my editors, but removing syntax coloring and using the "raise contrast + keep only the red channel" approach turned out to be simpler to generalize.


>MacBook Pro With OLED Display Likely Still at Least Three Years Away

https://www.macrumors.com/2023/10/11/macbook-pro-oled-three-...


> Night mode > dark mode

Can you elaborate what that means?

I looked through the article linked but couldn't find any obvious explanation.


Red letters and icons like in submarines so your "day mode" vision does not get activated.

Here is the astronomical software Stellarium: https://rasc.ca/sites/default/files/SMP-red.png


Instead of the dark areas being dark gray, they are pure black. To me it looks worse tbh.


Night mode usually means red on black.

At night, a black background with a faint red foreground is ideal: one extra advantage of this "submarine mode" is the lack of blue light, to avoid disturbing sleep

Dark gray is acceptable as a proxy for the black background only if you don't have an OLED display (or if you're not in a dark room)

Some people with vision problems report that white on black causes visual artefacts for them, but it's often because they use a brightness that's too high.


I hope this leads to a phone / smart watch that lasts multiple days. Does anyone know how energy requirements break down between CPU and display in a typical device?


There are smart watches that last multiple weeks, e.g. most Amazfit watches. The older models also used to have always-on transflective displays, which were vastly better to use than the current OLEDs that need weird wrist gyrations to turn on.


These have existed for decades. The problem is the ever demanding software filling up any of the hardware gains made in the last twenty years.


Wearables are the one area where I'm really not happy with the efficiency of software.

They don't really do anything that couldn't be done on early-2000s handhelds, so why don't we have some kind of app-capable micro-RTOS with an accompanying app store on Android for this stuff? An ESP32 and a microSD card seem to be all any of it really needs.


This describes Pebble to a T. Customers bought the Apple Watch instead.


Pebble (like Garmin, which seems to be the modern equivalent, and which I used before getting a Galaxy Watch) only had a tiny amount of flash storage and couldn't do maps or voice recording very well.

They also used e-ink on one, which was probably slow, and it didn't have a touchscreen.

It seems like they were explicitly trying to keep the features fairly simple, but they could probably do a lot more with similar battery life.


I've just shopped for a running watch, and while the AMOLED screen watches look nice, they claim to have ~50% the battery life in GPS mode compared to transflective LCD display watches.

The article suggests a near-term 25% efficiency gain from this tech, so it seems unlikely to translate to >2-day battery life.


I have a Garmin with an OLED display and I usually have to charge it once a week or so. The GPS is by far the biggest power consumer. During activity tracking with the GPS turned on the battery lasts maybe 10 hours. It doesn't feel like a change in display technology would meaningfully change that situation.


> GPS is by far the biggest power consumer.

There are some cool research papers about delayed processing of GPS data in the cloud. The idea is you turn the GPS on for just a few milliseconds, record the raw radio data (without getting a GPS location fix), and do that every 10 seconds or so.

Then later you upload all the collected data to a big cloud compute cluster which can figure out all the locations (and where battery life doesn't matter).

People are using that technique to have GPS trackers with years of battery life - handy for things like tracking animals.
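Roughly, the device-side loop might look something like this (a purely hypothetical sketch; capture_raw_gnss_snapshot and upload_batch are stand-ins for whatever the radio front end and backend actually provide):

    # Hypothetical duty-cycled "snapshot GNSS" logger: grab a few ms of raw RF
    # every ~10 s, defer the position fix to a server-side batch job later.
    import time

    def capture_raw_gnss_snapshot(duration_ms: int) -> bytes:
        # Placeholder: a real device would power the RF front end briefly
        # and return raw baseband samples, without computing a fix.
        raise NotImplementedError

    def upload_batch(samples: list[tuple[float, bytes]]) -> None:
        # Placeholder: ship timestamped snapshots to the cloud for post-processing.
        raise NotImplementedError

    SNAPSHOT_MS, PERIOD_S, BATCH_SIZE = 5, 10, 360   # roughly one upload per hour
    buffer = []

    while True:
        buffer.append((time.time(), capture_raw_gnss_snapshot(SNAPSHOT_MS)))
        if len(buffer) >= BATCH_SIZE:
            upload_batch(buffer)
            buffer.clear()
        time.sleep(PERIOD_S)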


Tracking animals would seem to be a different set of requirements than, say, turn-by-turn car navigation.

I'd imagine that for a lot of research, longer lifetime would win over real-time-ish data, and possibly you don't care so much about precision and granularity either. You probably want to upload semi-often or risk losing the whole thing, but otherwise minimize battery use.


GPS already isn't good enough for car navigation, which also integrates wheel rotation and steering angle sensors. Even on a bicycle the tracking is noticeably better when you add a wheel rotation sensor to a GPS head unit.


Interesting! That doesn't really change the argument, does it?

"Where am I right now", is a different requirement than "where have I been, roughly, over the last 6 mo"


Yeah I was trying to agree.


Do you have any idea what the storage requirements are for a solution like this?


Pretty high - each location sample is hundreds of kilobytes if I remember correctly, although it was possible to trim that down if you knew there was a strong signal or you were happy to have some probability of an incorrect location.

Annoyingly I can't seem to find the paper now.


Why do these Garmins outlast Apple Watches by 5-10x? Is it because they're bigger? Or does the Apple Watch do more things?


Do you use it with the display always on, or do you have it wake up whenever you look at the display?

I have a Garmin watch with a transflective display, and end up charging it ~once a week with 1-2 hours of activity tracking per day.


Just disable the screen on an Apple Watch and it lasts for 2 and half days.


I have a not-very-fancy Android smartphone that will easily go over a day between charges, if I let it. I can bump that up to 2.5 days between a full charge if I turn off Location and Bluetooth. (Which I usually do since I typically need neither.)


One problem for OLED screens compared to LCDs is their rather low maximum brightness. Unfortunately it doesn't seem like this new blue dye will change much about that.


That doesn’t seem to match what I understood from the article. At one point they say explicitly that it will enable brighter displays.


But it won't bring it close to LCD levels:

> In the near term, the switch will lead to an approximate 25 percent gain in efficiency; manufacturers can take advantage of this to increase battery life, reduce the size of the battery, or enable a brighter display.


The 25 percent gain in efficiency is achieved by reducing the waste heat from ~20% to ~0%, so (if heat is the only limiting factor) they should be able to make them 0.2/~0 times as bright. That number could be much greater than one.


Interesting. When OLED screens age, or burn in, do they get yellow? That is, less blue? If they simply get darker, this means the red and green subpixels deteriorate as well, which means improving just the blue subpixels doesn't solve the problem.

I personally think the relative sizes of the subpixels already reflect how much they age. Larger ones are presumably larger because the dye ages more quickly, and larger pixels don't have to be as bright per area. So improving the blue dye would allow us to make the blue subpixels somewhat smaller and/or brighter. The current size difference is not very large though.


There aren’t any dyes in oleds. They emit photons directly at the correct wavelength.

(Ignoring the topic of the article, which explains why that’s an oversimplification.)

If they yellow, it would be due to different colors dimming at different rates or because there is a plastic protective/anti-glare coating, and it yellowed due to UV exposure.


> There aren’t any dyes in oleds. They emit photons directly at the correct wavelength.

I meant the new blue PHOLED material.

> If they yellow, it would be due to different colors dimming at different rates or because there is a plastic protective/anti-glare coating, and it yellowed due to UV exposure.

No it would be because red+green=yellow.


Somehow I feel like I want to pronounce it monosyllabically, so like foaled, rather than like followed or (even worse) faux lead.


Pee holed


Sounds about right


I was thinking fuh-led, like your guide is a Vietnamese dish.


faux lead is exactly how I pronounced it lol.


This might even rub off on glow-in-the-dark technology. Exciting! Most glow-in-the-dark materials in blue are feeble by comparison to green and must rely on various inefficient tricks.


This seems like well-disguised marketing speak. The biggest weakness for OLEDs isn't brightness or battery life; it's burn-in, or rather burn-out. Will blue PHOLEDs make it so device manufacturers will stop telling me to set the Taskbar to auto-hide? If not, I don't see why anyone outside the industry should give a damn.


> Will blue PHOLEDs make it so device manufacturers will stop telling me to set the Taskbar to auto-hide

It will help, it says as much in the article. OLED burn-in is a function of how hard the pixel is being driven. Greater efficiency means less current required for the same brightness means less heat generated means longer lasting displays.


What's the state of MiniLED for gaming/movies? Isn't that best of both worlds? No burn-in, higher brightness (I know OLED would be pain to use in a room where blinds cannot be pulled down all the time). And image quality can match, or be better than OLED?


MiniLED is just an LCD panel with an array of small white LEDs acting as a locally dimmable backlight.

MicroLED is like OLED but uses a tiny LED for each subpixel. MicroLED is still far from affordable and also can't really reach high resolutions like 4K in a commonly sized TV or monitor; MicroLED sets mostly still need to be 100"+ to reach 4K.


You are thinking of MicroLED. MiniLED is mostly just marketing and not close to OLED.



How are you interpreting these? Contrast, local dimming, and black level are all fairly poor compared to the perfect scores received by OLED monitors.

To be fair, MiniLED displays are significantly cheaper than OLED displays, but their performance isn't really comparable.


Oh, I agree. $1,200 monitors are mostly better than $300 ones.

A more interesting comparison here IMHO is with panels using the same technology (IPS or VA), sold around the same price, and lacking Mini-LED backlighting.

Rtings seems to conclude that Mini-LED backlighting is far better than full-panel backlighting.

Hell, I'm using one right now (KTC M27T20, paid 330 euros) and it's just amazingly good... even if not as good as OLED.

Also, right now, OLEDs suffer from text fringing issues: https://www.flatpanelshd.com/news.php?subaction=showfull&id=...


Burn-in as well, no?


Yeah, that too, but to be fair, I've got a 6-year-old OLED phone without any burn-in, and a 3-year-old OLED laptop without any burn-in.

So, I don't know, I guess unless you purposely display a static white element for hours every day for years, you won't live to see burn-in on an OLED display.


Don’t forget LCD’s garbage response times. “MiniLED” screens smear a lot and lack the crisp smoothness of OLED’s 0.1ms response time


You still have motion blur with OLED due to sample-and-hold. Techniques like backlight scanning or BFI are needed to achieve true motion clarity.


Yeah, I'm actually personally pretty excited about MiniLED. It's not "perfect" in terms of lighting like OLEDs or MicroLEDs are, but they also don't suffer from many of the downsides like burn-in.

The fact that it's not as good as OLED on one performance metric doesn't mean it's smoke and mirrors, it's just a middle ground technology that makes different tradeoffs.


MiniLED is better for displays in bright environments that need better lifetimes.


I was specifically referring to MiniLED, because it is actually available in the market as of today. And it does appear to be close to OLED.


They are capable of displaying brighter whites and darker blacks, but not at the same time. :(

Check the "starfield" test: https://www.youtube.com/watch?v=MVSQTHYZXD0&t=715 - the LG OLED TV shows all the stars on a perfectly black background, the Sony LED TV shows all the stars, but the background is not perfectly black. The TCL MiniLED TV shows near-perfect black background, but is missing most of the stars!


The additional contrast of MiniLED is very low-resolution (2000 pixels or so per dimming zone). It is useless for text, for example.


>What's the state of MiniLED for gaming/movies?

If it is within your budget, take a look at the Sony X95L MiniLED TV. If there were an award for the least-complained-about TV set on AVSForum in recent history (if not the whole history), it would be the X95L.

Although I am eagerly waiting for the 2024 series to see what Sony has to offer in terms of MiniLED.


Or save your money and get an X90L and some bias lighting (i.e. a MediaLight Mk2 Flex).

But at this time of year, I'd probably rather wait for 2024 models.


Lots of OLED TVs out there. I'm not aware of any that need the blinds pulled down to use.


I sometimes need to limit outdoor light when watching a non-OLED TV; I can't imagine I wouldn't need to do that even more with your average OLED.


MiniLED is trash.

MicroLED is the only thing that will dethrone OLED


I was planning on buying a QD-OLED TV next year... maybe I'll wait.


FWIW, I don’t think you should wait. I bought this year’s samsung s95c qd-oled, and it is such a nice looking display that I feel we’re well into the territory of diminishing returns of further improvements. I was struck by how much nicer 4k hdr movies look compared to the last few times I went to a movie theatre.

The only real downside is that now I notice just how much content is not 4k hdr. Improving the upscaler’s software would probably make a bigger real world difference than improving the panel, at least for me.


Microled will be the next game changer. Current oled panels have something like 800-1000 Nits peak brightness?

Dolby Vision can be mastered at up to 10,000 nits and microled displays will apparently have the ability to reach that brightness.

HDR content will be absolutely unbelievable, getting blinded like you're outside.


Not disagreeing, but miniled displays are already out, apparently blow oled brightness out of the water, and are much cheaper than oled. (Black level isn’t quite as good.)

https://www.tomsguide.com/opinion/this-is-the-tv-im-most-exc...

(The 98” is $9999. 85” is $2799, but it was down to $2299 earlier this year.)


MicroLED != MiniLED.

A MiniLED display is a traditional LCD display, but the backlight is divided into addressable sections called dimming zones. A few of Apple's high-end displays use this technology. The downside is that each pixel isn't 1:1 with a dimming zone, so there are "blooming" artefacts where a zone overlaps a region that needs to be lit up.

MicroLED, much like OLED, is where each pixel is self-illuminating. For OLED, organic materials emit the red, green, and blue light. For MicroLED, each individual subpixel is an LED.

Making a MiniLED display isn't too hard, depending on the number of zones. Making a MicroLED display is quite hard, because the LEDs need to be microscopic and there need to be millions of them. There are some MicroLED displays available for sale today, but they're huge (the LEDs don't need to be as small if the display is massive), and they also cost hundreds of thousands of dollars, e.g. https://www.samsung.com/us/televisions-home-theater/tvs/micr....


Thanks. I assumed microled was lcd with ~ one led behind each pixel.


Samsung LED TVs suck. They use a very specific coating, and if there is any significant light source in the room, the blacks end up less deep than on an LCD.


You mean a sunny room?


There’s always a better one a year away in the OLED game. Brighter, faster, better colors, cheaper, whatever.

It’s a bit like waiting for a faster PC in the 90s. At some point you just have to buy.


From my experience these new types of displays show up in small form factors first before the yield is good enough to build larger panels such as for televisions.

I have yet to see this in small panels so you may need to wait for quite a lot longer than a year.


Any modern OLED (last 5 years) will do you very well.

Newer ones have some additional bells and whistles, but there isn't anything on the near horizon worth waiting for.


Which TV do you have right now?


I went back to an LCD phone. So much better for reading - for me. I have no idea if it was because of pwm flicker or something else. I just hope they keep making phones with LCD screens.


>says Michael Hack


Relatively common surname dating back long before computers


We used to hack lots of things. Jungles, reeds, grain, timber, enemy Viking tribes.


also common is "Hacker"


How will we know when to purchase a device with one?


I'm just waiting for a TV that has instant startup times like phone displays; why they can't replicate the instant sleep/wake of mobile phones is a mystery to me.


I don't think it has anything to do with the display.

It's the TV software waking itself up from power-saving sleep mode, possibly combined with some HDMI negotiation, which may involve waking up a second device from sleep like your Apple TV or Xbox.


The actual display has a pretty much zero turn-on-time (probably under 20ms).

The thing that takes ages to boot up is the 'smart' functionality, on screen display, hdmi link training, etc.


My AOC takes a good 15s to switch on (or to switch between sources). I was always wondering what it is doing all that time.


My cheap TCL TV has an instant-on mode, but it does use more power. Phones don't start up instantly from being fully powered off, either.


No, but unlike TVs they consume mere tens of milliwatts when the display is off (most of it going to maintaining the cellular connection), and their batteries would be dead within a couple hours if they used as much power as TVs are allowed to suck when "off"

It's certainly possible for a TV to be ready to be fully on within a second while consuming <1W; monitors do it all the time. But TV makers barely have the software expertise to respond to button presses within a second when on; developing responsive low power states is an order of magnitude harder.


Most large TV makers make smartphones too, so I doubt software expertise is the issue. Probably more about market forces. (Consumers don't generally differentiate TVs based on startup speed.)


My Samsung TV (LCD) is fairly quick to start up. I think slow startup times are generally a case of bad software.


TVs do not need software beyond the minimum of firmware needed to drive the thing.

See: Monitors.


I have a monitor that takes an absurd amount of time to turn on, like 5-10s


Would you please name and shame this monitor?


Mine's Acer V223HQL. It takes ~5 seconds to power on.

This is my disagreeable take, but the reason UIs are always slow is that a slow UI imposes less cognitive load on users, and also on developers. You're doing less, and that's less work for your brain. Only a small, impatient, vocal minority wants quicker responses. I care, but clearly I don't belong to the majority.


NEC MultiSync PA Series PA311D-BK-SV 31.1" UHD

Honestly though, zero shame. I absolutely love the monitor and won't replace it until NEC makes a good OLED. It looks amazing, it's easy on my eyes with zero fatigue and perfect colour, and it made a great replacement for my NEC PA271W, which I've had for going on 15(?) years now and which somehow still looks better than most monitors of similar specs today.

Unless they enshittify, I will continue buying NEC monitors, because while they are expensive they don't make my eyes hurt like others do.


I have an Acer Ultrawide that takes an absurd amount of time to wake or mode switch. Literally enough time for me to say out loud "I hate this monitor, it takes an absurd amount of time to wake"


Technically correct and actually wrong :-).

TVs do need software beyond the minimum to support the price asked. TVs are a cut-throat, low-margin business, and the only way to eke out a bit more margin is to have some "feature" that makes your offering marginally better than your competitor's. That margin can be the difference between a going concern and going out of business.

So from the manufacturer's perspective they do "need" that extra software. Until someone establishes the 'spyware free, dumb tv" market that will continue to be the case I'm afraid.


As a counterpoint, I buy Sony TVs exclusively because they do a much better job of tuning the panel. They don't even make the panel. Sony just slaps Android on it. I'm definitely not the only one there. Sony has been known for their color accuracy for a long time now.

(Sony TVs even have a pretty decent user accessible API)


> (Sony TVs even have a pretty decent user accessible API)

Do you have a link where I can find out more about this? Google is failing me.


The API is the same as what they have for the "professional displays". You can set a key on the TV and then use that to authenticate.

https://pro-bravia.sony.net/develop/integrate/rest-api/spec/


> Until someone establishes the 'spyware free, dumb tv" market that will continue to be the case I'm afraid.

Well, there is Sceptre. Unfortunately they don't seem to be available outside US.


Insignia also makes spyware-free dumb tvs. They're Best Buy's in-house brand, which means you can at least get them in all of N.A. (maybe the same for Sceptre, I don't know). The last time I checked, they only had models up to 43". Probably meets the demands of the kiosk-mode market.


They also don't sell OLEDs.


But as long as consumers keep buying them with the software, because they want to watch Hulu and YouTube and Netflix directly without purchasing an extra device (not unreasonable for the average consumer), TVs will come with the software.

You can argue all you want but the market always wins.


As long as consumers have basically no choice to avoid a smart TV, they’ll keep buying them whether they want the smart part or not.

(I agree you’re generally correct, but at this point we don’t even have a choice)


You're free to buy digital signage if you want, and some people do. You do have a choice.

But it's just not what most people want. Most people really do want their TV to natively run streaming services.


Signage exists, but it takes a lot of effort and research to buy compared to a “normal” TV. Normal stores just don’t have non-smart options, I bet most people don’t even know it exists.

I know several people who love not having to use extra boxes due to their smart TVs, I totally get it. I just wish it hadn’t pushed out all other options, especially on the high end where subsidies from deals are less necessary.


Best Buy sells dumb tvs (their in-house brand, "Insignia"), and their brick-and-mortar stores always have a bunch in stock. Options are limited to 32, 40, and 43" displays, though.


Oh really? I had no idea.


Yep. I've needed a bunch of dumb tvs this year (don't ask) and they're what I landed on. Best Buy overnight delivered a half-dozen ~$100 tvs to my doorstep for free (I bought them one at a time), and I also grabbed some off the shelves at two nearby stores.


Also, as far as I can tell, I just can't get a commercial display equivalent of one of the recent oleds sans smart crap.


Can you point me in the right direction, so that I, as someone who didn't know this existed until now, can learn enough to know what to buy and where?


Digital signage is getting a lot of smart features too, it's just a bit behind the consumer and hospitality markets.


My monitor is very simple, 3 buttons: on/off, and backlight +/-, no OSD, no scaler, one input. And it still takes a few seconds to display something.

I guess the OS, GPU and monitor are doing all sorts of back and forth before the monitor finally gets the signal it needs.


Isn't that also down to legislation mandating devices to be off and have a maximum off power draw? Mind you, phones seem to be doing alright in that regard.


Their software takes awhile to boot. If they had something like VRRoom internally, the user experience would be a lot better, switching inputs faster etc.

https://hdfury.com/product/8k-vrroom-40gbps/


Sure, that's annoying but I cannot understand how syncing up to an HDMI signal can take 5 to 10 seconds. Frankly, I can't understand how it's not measured in milliseconds. WTF are TVs doing? Is the protocol so bad at getting a picture to the screen quickly, or is it the TVs? Just switching from SDR to HDR blacks out the screen for multiple seconds. Come on.


There is a (limited) solution to this in the HDMI 2.1 spec that’s starting to become available in TVs now: https://www.hdmi.org/spec21sub/quickmediaswitching

It doesn’t solve long initial syncs but it does solve the desync when changing framerate, e.g. when you switch from a 60Hz UI to a 24Hz movie.

As for desyncs between SDR and HDR, that is presumably already a solved problem, I have an LG C1 from a couple of years ago and that can switch between range modes without desyncing.


Slow SoC perhaps or badly optimized drivers / OS.


If a UI is chugging on the order of seconds for seemingly trivial operations, I think just saying it's "badly optimized" is like saying it's a bit of a walk from my place to Costa Rica. It could be orders of magnitude faster and the statement would still be true.


How else are they going to make you look at their logo for 15 seconds?


But will they maintain brightness better than current power LEDs? Some of my rooms are noticeably dimmer than when I first put in the LED bulbs they have now.


I'm holding out for Continuously Obstructed regenerative non-alloyed hybrid OLED (CORNHOLED).


P-HOLED is a very unfortunate name.

They will surely come up with a better name before this goes to market.

It's probably supposed to be pronounced PHO-LED, but some people are definitely going to read this as P-HOLED.


I doubt it, we live in a world where screen resolutions are touted in multiples of "K" and cameras are sold by number of megapixels.

Tech jargon is the worst.


There’s also Pi-hole…


But the hole in Pi hole does mean hole...


Curious to know who patented this. LG has strong licensing rights on OLED, hence the Samsung branded QLED. It'll be interesting to see if display manufacturers try to purchase the rights to new display tech.


FWIW: QLED is LCD garbage. QD-OLED (also Samsung) is state of the art.



