Can Dell’s 6K monitor beat their 8K monitor? (stapelberg.ch)
248 points by secure on July 3, 2023 | 453 comments



I think there are three groups of people with opinions regarding 120Hz vs 60Hz refresh rate.

Group 1 can tell the difference between the two and strongly prefers 120Hz.

Group 2 can't tell the difference between the two and prefers the more affordable options.

Group 3 has not done a lot of comparisons between the two refresh rates, or has used 120Hz displays that were actually in 60Hz mode, or compared lower resolution 120Hz displays with higher resolution 60Hz displays.

I think a lot of people mistakenly think they are in Group 2 when they are in Group 3.


I don't think I'm in any of those groups.

I own a variety of monitors and can easily tell the difference between 60hz and 120hz. All things being equal, I of course prefer 120hz (or 165hz as some of my gaming monitors support).

I also own monitors at resolutions from 1440p to 4k.

For doing work (programming, where I'm mostly looking at text), resolution makes a huge difference. I only do coding on high DPI screens and I would upgrade to 5k or 6k or 8k displays if I were confident that my hardware and OS would support them well. (TFA was very helpful in that respect.) In these settings, high refresh rate makes only a marginal difference to my experience.

For gaming, refresh rate makes a much bigger difference, and resolution makes a somewhat smaller difference -- my hardware can't reasonably drive many of the games I play at 4k or higher anyway. So I just use cheaper, lower-resolution monitors that operate at high refresh rates for gaming.

Someday I guess I'll just be able to spend $300 for an 8k monitor at 240hz and then I won't have to make this kind of choice. (In fact, in the several years since I last bought gaming hardware I think the options for high-refresh-rate 4k monitors have gotten much better; I might use 4k for gaming if I were buying today.)

But for now, I'll always pick resolution over refresh rate for doing work, and it's not because I can't tell the difference.


I want the best of both worlds. There exist 144 Hz 4K 27" monitors, such as the LG 27GP950/27GP95R. They're still LCD panels rather than OLED, but I am confident that OLED panels with these dimensions will arrive fairly soon.


I've been using two 27" 4k 144hz monitors for over a year on my personal machine. I'm doubtful if my work setup would be able to drive them to full capability.


I thought I was in Group 2... till I actually got a 120Hz screen. Since I frequently flip between my desktop at 120Hz and my MacBook Air M1 at 60Hz, I've discovered I really do notice.

It's always so much nicer when I leave the Air. Everything is so much smoother.


John Linneman from Digital Foundry made a great point about refresh rates (or framerates, in his context). A lower refresh rate is generally perceived as much worse when you often switch between high and low refresh rates. Case in point: if you go from 60Hz to 120Hz and move your mouse, you will definitely feel that it's way smoother, and 60Hz will feel extremely sluggish. But if you just stick with 60Hz for a while, eventually you perceive it as smooth again. They use the example of 30fps feeling very bad when you keep switching between Quality and Performance mode on console games, even though 30fps is actually okay in many instances. Of course 60fps will feel smoother no matter what, but the point is that the difference is heightened when you switch back and forth.


Same difference between 30 and 60hz I imagine. Feels like you go from mega choppy to smooth as butter.


A variable rate between 30-60 is absolutely worse than consistent 30fps.


If you display those frames on a 60Hz screen, it's an awful effect. If the screen natively supports variable frame times, there's a lot more upside and a lot less downside.


I get used to 60 pretty quickly, but I really notice it when I move back.


If anyone is on a high refresh rate screen and wants to see the difference side by side, https://testufo.com is a nice comparison.

In most browsers it will go up to whatever refresh rate your monitor runs at, including 144 or 165 hz, probably 240 but I've never owned one of those.

I have a hard time believing that anyone with functional eyesight wouldn't be able to see the difference between 60 and 120, but I can definitely see not caring enough to spend the money just to make scrolling and cursor movement smoother.


I switched from 60Hz to 240Hz on my desktop. The difference in smoothness is immediately noticeable, but I realized I don't really care. When I don't look for it I can switch from 240Hz screen to my laptop and not notice anything. Both run smooth enough for a good user experience.

So I personally would always prioritize a good pixel density unless one plays a lot of games that require low input latency.


Even for games where fast reactions aren't important, I still appreciate the smoothness of high frame rates. I've found 1440p / 165 Hz with FreeSync to be a good balance: enough resolution, and I'm willing to turn graphics options down to keep things running as smoothly as I want. 60 feels low, but anything over 90ish I don't mind.

Screen wise I got a pair of Dell S2721DGFs that I’m quite happy with. IPS panel and commonly on sale for $300. Weakest point is backlight inconsistency for dark scenes but it’s a compromise I’m willing to put up with.


What about the other very large group, like me, who can easily tell the difference but it's very much not worth the price of admission?


Exactly. I can easily tell the difference and couldn't care less.

Brightness, pixel density, and contrast/blacks (including glossiness) are what I care about in terms of image quality.

Maybe if I were a gamer it would matter, but I couldn't care less about how smooth scrolling or cursor movement is. It doesn't change my productivity or affect my comfort at all.


I'm in the same boat, except that I recently discovered that in my advanced years (early 40s) I am now one of those people who are "sensitive to certain flashing light patterns or textures". It's not often that it happens, but every blue moon I will be looking at a screen and suddenly begin sweating profusely right before I lose consciousness for a brief bit. Most recently when that happened I passed out on top of my left arm and I was really concerned that it was numb for a good 40 minutes after the event. I called an emergency nurse line and explained the symptoms, and they assured me it wasn't a heart related event, but a reaction to the chaotic light patterns I was subjecting myself to.

Edit: I never got around to my point, which was even though I'm part of the group who doesn't really care about refresh rate, it appears that my body might care.


You probably should get yourself checked for photosensitive epilepsy.


Same here. I don't care about any of those things.

All I want is a large monitor (currently 43" + 27"), so I can get my work done, then turn off the monitors and collect my paycheck.


I feel like the productivity increase stopped at 1080p-1440p@60 for most office work; 4K was the diminishing-returns point for art work. Extra real estate / pixel density hasn't added much to the process, maybe 5K for video people who want to work in 4K. There are some people with sufficient visual acuity that it "feels" better working with more density and refresh rate, but it's hard to rationalize the cost on performance grounds now.


Yeah, we’ve been in an awkward spot for the last few years where the interfaces haven’t quite been there to eliminate the trade off. I’d like 120Hz, but I don’t need it if I have to forgo resolution to get it. Once I can get both for a reasonable cost, then I’m all for it.

I’m still annoyed there are no 5K 27” monitors in the ~$500-800 bracket. I’d rather that kind of pixel density with decent quality in that price bracket first, and then let’s talk higher refresh rates.


I think it must be a visual processing difference in people's brains. I find 30/60/120 equally playable in gaming, but other people here are calling 30fps intolerable.

Similarly I've used a 1080p monitor for years and I'm fine with it. I don't need tiny pixels for coding and messing around with pixel scaling on Linux seems more trouble than it's worth.


> I find 30/60/120 equally playable in gaming, but other people here are calling 30fps intolerable.

I think many people only remember "30FPS" as a stuttery mess between 15 and 40FPS on low-end hardware, which indeed is a terrible experience. In many games a stable framerate is actually more important than a high framerate, but people associate it the other way around because the only number they see in the corner of the screen is the framerate and those two metrics tend to coincide depending on how performant the hardware is.

And in some games low input latency is very important, which requires high framerates.


I think it depends a lot on the way you use it.

30fps might be fine for playing Solitaire, but it is definitely a limiting factor for first-person shooters.

Personally, after a year or two of programming on 30hz, I absolutely want 60hz for office use. Having lag while dragging around windows is not fun. Having a 1080p+ monitor is more about screen real estate than it is about precision: you just can't fit three windows side-by-side on 1080p while still having stuff remain readable.


> I think it depends a lot on the way you use it.

That makes sense. For programming I use a tiled wm so don't drag windows around at all, I disable all animations so haven't felt any need for high framerates personally.


Yes, I went ahead and got a 4K 120hz monitor even though my use-case is coding, because it was so jarring switching between my M1 ProMotion display and my main monitor. Just scrolling/cursor movement is so much smoother, 60hz vs 120hz now feels like what 30hz vs 60hz used to feel like.


Which one did you get? And don't most 4k 120hz monitors "bloom"? Eg the area where your mouse cursor is brighter and when you move it it moves with a bloom around that area?


I can easily tell the difference between 60hz and 120hz+, but it absolutely does not matter to me for web browsing and software development.

FWIW I do spend a lot of time with 120hz and 144hz displays.

For gaming it's awfully hard to go back to 60hz after tasting those high refresh rates. In fact, I've been saying that for 25 years. At the tail end of the CRT era a lot of higher-end CRTs could do like 90 or 100hz at 1280x1024. Most PC games couldn't push that many FPS even at lower resolutions but it really helped reduce flicker and eyestrain, even when staring at text in Windows.


I'm shocked that group 2 exists at all. Just wave your mouse around and watch the cursor. Or open up quake and turn around while keeping your eye focused on a stationary object.


Yep it's very obvious when moving the cursor in a circle since you can see every frame.

I'm sure group 2 exists. It's just that all its members have an undiagnosed vision disorder.


There are also people who can't really focus on moving objects very well. They generally look extremely blurry to me even if moving relatively slow.

A few years ago the Simpsons, for example, began to use these computer-aided 3D pans, which hurt my eyes to watch. Paused, the frames are all at normal resolution, but in motion the screen turns into mush for me and I can just see color blobs.


> Just wave your mouse around and watch the cursor

I just realized that I literally never do this. It is indeed incredibly jarring. even at 60Hz. I think my eyes basically immediately "jump" to where I want to move the cursor, so I don't really "track" it. Do other people track the cursor with their eyes?


My eyes naturally snap to moving objects and track them. And even if I only look at the spot where I want the cursor to be, I can still see it moving and leaving frames behind in my periphery. And that's at 144hz. I'm thinking of getting a 240hz or 390hz soon though.


I used to be in group 3, then I got an iPhone with 120Hz display and fell into group 1. When you turn on Low Power Mode it restricts the display to 60Hz and it’s extremely noticeable for me now. Luckily I don’t notice the difference for monitors because I don’t have any 120Hz displays yet.


It's certainly specific to your use cases. 60Hz is workable for all of them, which I think is why this is a contentious topic at all. But a higher refresh rate, plus content that actually puts out enough frames to utilize it, definitely makes for a smoother experience. It is night and day to me, but I play fast-moving games at 144fps on a 144Hz monitor, so if I am missing over half the frames I do notice.

I guess that's another group, people correctly in high refresh rate mode but not consuming content that reaches that refresh rate.


I find that higher refresh rate seems more impactful on otherwise rather choppy experiences, or games, or phones. I'd go for it if it was free and came with no or extremely minimal compromises to resolution or picture quality, but the underlying display tech is pretty important. Of course, if I were spending $6k on a screen, it better have higher refresh rate and incredible picture quality, because that's not an amount of money I spend on anything except taxes.

I recently had the opportunity to try out an ultrawide 144hz VA panel at approximately the same pixel density as my Dell Ultrasharp 30" that came out a great number of years ago, and it was... fine. Used it for a few months and then sold it. Swapped the 30" back in and am back to my old stance that whatever replaces this screen is going to have to have wildly better picture quality than the ultrawide; I haven't even thought about it since. The picture quality on the VA wasn't necessarily bad enough to notice while I was using it, except for the viewing angles, but comparatively it didn't hold up at all.

Having used another 144hz display right at again the same pixel density at 27" next to my MacBook pro being actively used as a second screen, I think the density and quality of the MacBook display sort of nullified anything I got from refresh rate.

For me to replace my current screen, what I'm hoping for is basically for this 6k to either come down to earth in price, or for competition to come in and build an equivalent. If I could get that for ~$1500CAD eventually (only half of what they are right now on sale), refresh rate would still only be a nice to have. More important to me is a good KVM feature, thunderbolt/one-cable connectivity, and better aesthetics I guess above most other features. A nice webcam would be nice too, but I don't really care much about that being built-in unless it's invisible.


It's a bit complex in my case. On a smartphone, 120Hz vs 60Hz OLED is night and day, especially for scrolling; I'd never go back from 120Hz+. On a PC monitor, 120Hz vs 60Hz LCD is distinguishable but doesn't bring much benefit, maybe because I use Windows and a mouse without smooth scrolling. It's nice for gaming but needs $$ for a GPU.


For gaming you need $$ for CPU as well. It's all fine and dandy if your GPU is capable of rendering/filling the ~1 gigapixel/s of 4K120, the trick is keeping it fed with enough data so it can render all those pixels.
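
For reference, that ~1 gigapixel/s figure is just back-of-the-envelope arithmetic (4K pixel count times refresh rate):

  width, height, hz = 3840, 2160, 120
  print(width * height * hz)  # 995328000 pixels/s, i.e. roughly 1 gigapixel/s for 4K120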


I also find that Group 3 (in this and similar situations) is often proud of its ignorance, which can be frustrating: they haven't had enough experience to properly see the difference, and they actively avoid getting it.


I love 120Hz on my iPad Pro for general responsiveness (though the response time is dreadful) and 120Hz on my LG OLED for gaming but I’m frankly just blind and will take 4K on 27” at 200% scaling any day for desktop work.


I own 360hz e-sports monitors, multiple 120hz 4k monitors, and still work on a 60hz XDR with absolutely 0 qualms?

I think you're missing a Group 4 that realizes no one's actually made a top quality 120hz display yet and isn't about to sacrifice clarity for refresh rate when it comes to productivity. (fwiw, the MBP XDR comes closest but it's not a display)

If you value overall picture quality, then the vast majority of 120hz+ screens are either blotchy messes or have unacceptably low pixel density (or both)


Lots of people can tell the difference and simply don’t care. I care about black levels but I don’t pretend that people who don’t care are clueless.


As someone who has a 240Hz 1920x1080 Dell monitor, the 60Hz is intolerably choppy and 120Hz is the budget/compromise option.

Seriously, getting at least a 120Hz monitor makes for a world of difference because it makes everything smoother. Increases in resolution beyond 1920x1080 meanwhile are going to quickly hit the point of diminishing returns, anything beyond 2160p particularly so.


I have a 240Hz screen as well and decided to go back to 4k with 60Hz. Yes, the difference is very noticeable in direct comparisons, but in reality I think about other things when using my computer and 60Hz still feels smooth enough for desktop use. Meanwhile text on a 1080p screen above 15" feels more like reading low-res bitmap fonts and strains my eyes nowadays.


It's not a resolution issue, it's DPI and viewing distance. I read text on a monitor all day; 4K in a 24-inch frame at desk-chair distance is the minimum for me.


I find it hard to believe anyone wouldn’t notice the difference when comparing side by side.

It may very well not be worth the price, but that’s different.


I have 120Hz on my phone and the difference is remarkable. But I'm still sceptical. I just wonder if rendering at 120Hz then averaging the two frames to display at 60Hz would have the same effect. It seems like a pretty simple demo to set up (if you know how). Does this exist somewhere? To convince me that real 120Hz is actually worth it.
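
As far as I know there's no ready-made demo, but the blending step itself is easy to prototype. A rough sketch (synthetic frames and made-up sizes, nothing from a real renderer) that averages consecutive pairs of 120fps frames into a 60fps sequence with the motion blur baked in:

  import numpy as np

  def blend_pairs(frames_120hz):
      """Average consecutive frame pairs: 120 fps in, 60 fps out."""
      out = []
      for a, b in zip(frames_120hz[0::2], frames_120hz[1::2]):
          out.append(((a.astype(np.uint16) + b) // 2).astype(np.uint8))
      return out

  # Example input: a white square moving 4 px per frame across a black background.
  frames = []
  for i in range(120):  # one second of 120 fps footage
      f = np.zeros((100, 600), dtype=np.uint8)
      f[40:60, 4 * i:4 * i + 20] = 255
      frames.append(f)

  blurred_60 = blend_pairs(frames)  # 60 frames with the motion blur of the dropped frames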


I always thought >60 Hz was basically nonsense, but since I got the iPhone 13 Pro with 120 Hz screen I'm converted.

It's like having a small ache in your back - not really an issue, no problem to live 100 years with it. But then one day...it's gone. Do you really want to go back now?


I can tell the difference but it doesn’t do much for me (except in touch screen phones where I’m scrolling with my finger). I’m not a gamer, and I’m weird enough to turn off smooth scrolling in browsers and word processors because I hate it. So it’s really just a matter of how smooth the mouse cursor’s movement is.


I own 60hz, 90hz and 120hz displays and honestly can't find the difference between them. Everyone loves higher refresh rates but I can't feel ANY difference between them and they all feel the same. But I can see the difference when a game is at 60fps and 120fps on all of them. Not sure why


> But I can see the difference when a game is at 60fps and 120fps on all of them. Not sure why

Because twice the refresh rate means half the latency. And in games like shooters any mouse movement causes the entire screen to change, which makes lower latency very noticeable compared to just a moving cursor on the desktop. This is why VR feels really bad on low framerates, the latency between head movements and screen updates is too long.
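
To put numbers on the display's share of that latency (just the frame period; game loop, OS, and panel response come on top and are ignored here):

  # Frame period alone: the cost of waiting for the next refresh.
  for hz in (60, 120, 240):
      print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
  # 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 240 Hz -> 4.2 ms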


I know I'd notice the difference and would strongly prefer 120Hz, but I still stay away from them for now because if I got a 120Hz monitor, I would also need to buy a new MacBook Pro to match the refresh rate. (Also not sure if the M1 Air could even drive 4K 120Hz.)


If you can't tell the difference between > 60hz and 60hz, you may want to get your vision checked. I'd say once you get to 75-80hz the returns start to diminish, and by 120 things are hard to notice, but 60hz and 120hz differences are definitely noticeable.


No, I have 144hz and 60 hz monitors side by side at home, and 120/60 hz side by side at work. I can certainly tell the difference. It doesn’t bother me much, although it’s enough that I have held off on the recent Apple display mostly because it only does 60hz.


After experiencing 120Hz it ruined 60Hz monitors for me


I can tell the difference between 30 and 60fps and I just don't care. 60fps is smoother sure but I don't find it better in any objective sense, just different. My brain is quite capable of filling in the missing frames itself.

Maybe I have a "vision disorder" but since even modern PS5 games are increasingly going back to 30-40fps (preferring graphical quality over framerate) I think that is actually the majority position. High framerate lovers are the coffee snobs of the computer world.


Is there a Group 4: Can tell the difference between 120Hz and 240Hz?

I use the 32" Odyssey G7 which is 2560x1440@240Hz.


Yep. It really depends what you do with your monitors. For gaming I'll never go back to anything less than 240hz. Even high end video cards struggle to render modern games at 240hz consistently at 4k. 1080p is fine.


Group 4 does video/photo/graphics work so they prioritize color accuracy and high resolution, so they would always trade >60Hz for more resolution, at least up to a point that has yet to be achieved. And that isn't even factoring in cost, which is a big one since Group 4 is a smaller market than Groups 1-3 so the displays targeting them (like this Dell 6K) tend to be the most expensive on the market by a large margin.

Keep in mind that >60Hz makes exactly zero difference for video playback, so the only areas for improvement are cursor movement and scrolling smoothness.


Yes, for many years now my minimum requirements for any monitor have been 4K resolution and 30-bit Display P3 / DCI-P3 color. There are plenty of very affordable monitors like this.

I would very much like to also have a 120 Hz frame rate, but then the price would become much higher.


My 27" MSI Optix MAG274QRF-QD has 188% sRGB, 127% AdobeRGB, 133% DCI-P3, 10-bit@120Hz or 8-bit@165Hz, and cost around $450. I would say that's a solid price for these results, measured with a spectrometer.


60 is not a multiple of 24.


Nor is 120. Even with a 240Hz display, capturing and editing should use 3:2 pulldown for best end user playback. https://en.m.wikipedia.org/wiki/Three-two_pull_down


Check your math, son.


My point is none of these refresh rates are multiples of 23.976.


So set your monitor to 119.88 instead of 120.

There's a reason why monitors give you both x/1000 and x/1001 framerates.


I didn't know this was possible, but in such a case why not set a 60Hz display to 48Hz, 59.94Hz, etc.?


48Hz is supported on a few monitors, though not often (but it can be hit by G-Sync/FreeSync/VRR); 24Hz is sometimes supported (often seen in TVs).

And you can set it to 59.94, nobody is stopping you.

This entire comment chain started with, essentially, what is the LCM of 60 and 24 (or 59.94 and 23.976), and it's 120 (or 119.88).
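
A quick sanity check on that arithmetic (using exact fractions so the 1000/1001 rates don't pick up float noise):

  from fractions import Fraction
  from math import lcm

  print(lcm(24, 60))  # 120: the lowest rate divisible by both 24 and 60

  film = Fraction(24000, 1001)      # 23.976... fps
  monitor = Fraction(120000, 1001)  # "119.88 Hz"
  print(monitor / film)             # 5: each film frame is shown for exactly 5 refreshes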


I think your point is that 1 in 1000 frames would need to be 6:5 at 120Hz?

I'd be more concerned with the automatic black frame insertion most LCDs do to increase contrast.


Theoretically maybe, but 3:2 pulldown is used for playing 23.976 fps video at 29.97Hz. Since this is HN maybe someone with more knowledge about how video editors and modern TVs typically handle this can jump in here. Regardless, I think this would actually have more impact on the end user viewing experience than the job of video editing. The time between frames is tremendous from the standpoint of a video editor, and editing is usually (traditionally) done by feel: press button when the cut should happen, mark it, then arrange the timeline accordingly. Lag aside, frame rate and which frame is actually on the screen at that time matters much less than whether the software knows which frame should be on the screen at that time. Hopefully that makes sense. For this reason, resolution and color accuracy will still take priority when it comes to display hardware.


I worked on display drivers and TCONs, but mostly for mobile/laptop rather than TVs/Monitors. I'd be fairly shocked to see the defects you're describing coming directly from within a device, but going through multiple translations PCIe>eDP>TB/DP/HDMI... especially if they're not well tested or badly negotiated is certainly a possibility. I wouldn't trust most external connections or monitors for video editing, unless they're specifically tested.

Note that 1 in 1000 frames is a glitch roughly every 40 seconds, so it's quite visible to an "eagle eye". I'll ask.


The answer from a Pro was Genlock so you match the 23.97. "It doesn't matter if you drop a frame every once in a while, you're going to see it a dozen times... as long as it's not the same dropped frame!"


The worst part of incorrect refresh rates for me is panning footage, where you get those janky, blocky tears in the image.

>The time between frames is tremendous from the standpoint of a video editor,

This sounds like something I've heard from people with a head full of fun stuff talking about the space between the notes. There have been times where that absolutely makes sense, but I'm at a loss on your time between frames.


> The worst part of incorrect refresh rates for me is on panning footage and you get those janky blocky tears in the image.

That sounds a lot more like rolling shutter artifacts than 3:2 pulldown. What kind of camera are you using? Are you shooting with a CMOS sensor?

https://en.m.wikipedia.org/wiki/Rolling_shutter

> This sounds like something I've heard from people with a head full of fun stuff talking about the space between the notes. There have been times where that absolutely makes sense, but I'm at a loss on your time between frames.

Haha, fair enough. If you ever feel like diving in yourself, I passionately recommend In the Blink of an Eye by Walter Murch.

https://en.m.wikipedia.org/wiki/In_the_Blink_of_an_Eye_(Murc...


It has nothing to do with 3:2 pulldown. It is all about refresh rates of the monitor. I've shot for years on global shutter (specifically Sony F55), so it absolutely 100% was not a rolling shutter issue either. The same footage can be viewed on another monitor and the tearing issue is not present.

Edit to match your edit: "The book suggests editors prioritize emotion over the pure technicalities of editing."

This totally depends on the content and level of production. I've edited content from properly staffed productions with script notes, circle takes, and all that stuff. It's always fun to stack up the various takes to see how the director felt about them on the day of the shoot versus seeing them in edited context. It's also fun to see the actor's variations from take to take.

On shoots with barely enough crew so the camera op is also the boom op, it's basically all feel from the editor.


> The same footage can be viewed on another monitor and the tearing issue is not present.

This is what I was hoping someone would chime in about. I have never looked into whether it would be handled differently, but I would not trade a higher resolution display regardless. Maybe it could potentially influence where I cut in certain rare situations, but sounds unlikely.


Basing edits on how footage looks on a monitor with a non-compatible refresh rate just sounds like one of those problems that strikes me at my core, especially when someone acknowledges it but does it anyway. Does it matter in the end? Probably not, but it still goes against everything. It's one of those things of seeing people "get away" with things in life blissfully unaware while someone who is well versed and well studied can't catch a break.


I hope you get sleep at night. When I worked as a video editor years ago, I unfortunately had a boss who I needed to please and this kind of rabbit hole obsession would have added a significant barrier to doing so. More resolution, on the other hand, made me straightforwardly much more productive.


This doesn’t make any sense. Why would you want to use 3:2 pulldown unless your display is interlaced, which AFAIK will never be the case for any modern display?

And even if you did use it, it doesn’t do anything to help with the extra 1000/1001 factor, so what is the point?


3:2 pull-down works for converting 24fps to 60fps. It doesn't matter whether the target is fields or frames.


Yes, it does. 3:2 pulldown produces interlaced 60 fields/s. On a digital display, it must be deinterlaced, and the only "correct" way to do that is to remove the pulldown, producing 24 fps. If you just deinterlace it as if it were originally 60i, you'll just end up with something similar to 24p converted to 30p by repeating 1 of every 4 frames (with a loss in resolution to boot). So for digital displays, 3:2 pulldown is pointless at best, destructive at worst.
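
For anyone who hasn't seen the cadence written out, here's a tiny sketch of how 3:2 pulldown spreads 4 film frames over 10 fields (field order/dominance ignored), which is exactly the pattern a deinterlacer has to detect and undo:

  # 2:3 pulldown: 4 progressive film frames -> 10 fields -> 5 interlaced frames.
  film = ["A", "B", "C", "D"]
  counts = [2, 3, 2, 3]  # fields emitted per film frame
  fields = [f for frame, n in zip(film, counts) for f in [frame] * n]
  print(fields)  # ['A','A','B','B','B','C','C','D','D','D']

  frames_out = list(zip(fields[0::2], fields[1::2]))
  print(frames_out)  # [('A','A'), ('B','B'), ('B','C'), ('C','D'), ('D','D')]
  # Two of the five interlaced frames mix fields from different film frames,
  # which is why naive deinterlacing of pulled-down material looks wrong.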


The film industry should stop using 24fps, it's a waste of people's time and energy. At least they should move to 25fps which is what most of the world uses as a frame rate, if not 30fps.

For the stupid North American non-integer frame rates, just change the playback speed by a fraction and get on with life. Or drop 1000/1001 frames for live, people won't notice.


hear hear. we finally have a standard starting with UHD that does not include interlacing. finally. hallelujah the chorus of angels are singing.


> Why would you want to use 3:2 pulldown unless your display is interlaced

At this point, the only great reason is that it's an industry standard, but that alone is more than enough reason to still do it, evidenced by the fact that so many people still do it.


who in the world wants to use a 2:3 pulldown pattern on a progressive monitor? the majority of my career has been in properly removing 2:3 pulldown, the other portion was back in the bad-ol-days of putting it in.


> who in the world wants to use a 2:3 pulldown pattern on a progressive monitor?

At least everyone tasked with editing 3:2 pulldown footage for 3:2 pulldown distribution, which is most of the video editors in North America the last time I checked.


Who wants 3:2 content for distribution? No streaming platform wants 3:2, and they all want the footage delivered as progressive scan. Some will say things like "native frame rate", but I find that a bit misleading. There are plenty of television shows shot on film at 24fps, telecined to 30000/1001 with 2:3 introduced, and then overlaid with graphic content rendered at 30p. The term "do least harm" gets used: take the content to 24000/1001 so that the majority of it (the part shot on film) is clean, while leaving the graphics potentially jumpy (unless you do a proper frame-rate conversion with an o-flow type process that nobody really wants to pay for).

Edit: Also, any editor worth their salt will take the telecined content back to progressive for editing. If they then need to deliver like it's 2005 to an interlaced format, they'll export the final edit to 30000/1001 with a continuous 2:3 cadence. Only editors unfamiliar with proper techniques would edit the way you suggest.


Admittedly, I haven't worked as a video editor since 2011 and never edited telecined footage, but my understanding from friends is that little had changed. Specifically I have heard them complaining about it. That streaming platforms specifically want progressive scan makes plenty of sense to me of course, but conflicts with what I've heard for whatever reason.


I can’t say as I fault them as I’ve spoken with teachers that don’t know how to handle telecine content. I also know plenty of editors that have no idea the purpose of a waveform/vectorscope. Again, neither did some of those instructors.

For people never having to work with this kind of content, it makes sense. I'd equate it to modern programmers not knowing assembly but still writing apps that perform adequately. There's plenty of content shot on modern equipment and delivered to non-broadcast platforms, and the people making it will never need to know the hows and whys of what the old-timers did.


We've had the technology for a while now to make an 8K HDR 120 Hz OLED monitor. The panel tech exists, the scalers exist, the ports and cables also both exist.

We'll have this available for purchase any decade now... any decade. Soon, perhaps as early as the 2030s, 2040s tops!

All joking aside, I'm tempted to create a GoFundMe page for this and see if there's other people out there interested in convincing a panel manufacturer to print some 8K monitor-sized OLEDs...


I've got the opposite request. I'd love a native 720p OLED of a decent size for playing Switch (and PS3/360, I've got a bit of a collection) games on.

It's probably a niche market, but most games on the switch run natively at somewhere between 720 and 1080p and look awful scaled up to 4k.


Why not 1440p with integer 2x scaling?


It's very uncommon to find a monitor or TV that actually does non filtered integer scaling, most apply a blurry bilinear scale that looks awful. And afaik there are no external HDMI pixel doublers on the market, I would buy one in a heartbeat.

Native 720p does look slightly better than 2x integer scaling because of subpixel stuff, and because I generally like that LCD look. But integer scaling would be enough, really.


> And afaik there are no external HDMI pixel doublers on the market, I would buy one in a heartbeat.

Sounds like a fun project, you should build one.


I did look into it a little bit, I think the reason none exist is because it's actually pretty difficult.

Taking in a 720p HDMI signal, pixel doubling to 1440p or tripling to 4K, and outputting that as a valid HDMI signal without adding more than a frame or two of lag would require custom chips. Although maybe a software solution involving a GPU might be possible?
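
The scaling arithmetic itself is trivial once a frame is in memory; the hard part is doing it on a live HDMI stream with only a frame or so of latency. A minimal sketch of just the pixel-doubling step with numpy (the frame below is a placeholder, not a capture pipeline):

  import numpy as np

  def integer_scale(frame, k):
      """Nearest-neighbour integer scaling: repeat every pixel k times in x and y."""
      return np.repeat(np.repeat(frame, k, axis=0), k, axis=1)

  frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)  # placeholder 720p frame
  print(integer_scale(frame_720p, 2).shape)  # (1440, 2560, 3) -> 1440p
  print(integer_scale(frame_720p, 3).shape)  # (2160, 3840, 3) -> 4K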


FPGA is probably a safer bet if you want to minimize latency.

Implementation is left as an exercise for the reader.


Take a look at the HDFury products. The X4 should do what you need with those lower resolutions.


Interesting. Did you investigate FPGAs?


Or maybe Nintendo should upgrade the 2014 Android SoC of the Switch so games could run on modern resolutions.

There must be TVs or scalers out there you can buy that are used by retro gamers and let you run 1:1 scaling on modern TVs. Or you can emulate the Switch games and run them natively at higher res.

Either way, I'm sure there are solutions for you out there instead of display manufacturers having to go back 20 years in time to serve an outdated gaming console. Just a thought.


It wasn't a serious suggestion for a product, I'm aware there's no market for it.


Why not buy an older second hand 720p TV?


I have one, but it's a pretty crappy LCD panel, OLED would be nicer.


Look for plasma TVs instead. I have an old 42" 720p Samsung plasma TV at my parents house and it beats all 1080p LCDs in contrast, sharpness and response time. Perfect for vintage gaming. Pretty sure you could get one dirt cheap on the used market.


Since most computer OSs keep some things in the same place all day, wouldn't you get a lot of burn in with an OLED?


While most monitors these days have some protection built in to combat burn in, I believe it's still an issue, yes. It's why Monitors Unboxed (a brilliant YouTube channel, by the way) doesn't recommend[1] using an OLED monitor like the Asus ROG Swift PG42UQ for productivity work.

[1] https://www.youtube.com/watch?v=MNBmFJ68SCw


_Do we_ have the cables and ports? AIUI 8K HDR 120 doesn't fit in DisplayPort 2.1 without DSC.
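
Rough numbers, counting only the uncompressed pixel payload (no blanking) and assuming 10-bit RGB for HDR:

  # 8K @ 120 Hz, 30 bits per pixel, active pixels only.
  pixels = 7680 * 4320
  gbps = pixels * 120 * 30 / 1e9
  print(f"{gbps:.1f} Gbit/s")  # ~119 Gbit/s

  # DisplayPort 2.1 UHBR20 carries roughly 77 Gbit/s of payload after 128b/132b
  # encoding, so 8K120 HDR indeed needs DSC (visually lossless, around 3:1).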


Just buy an 8K TV?


"Is a matte screen the better option compared to the 8K monitor’s glossy finish? <...>

Surprisingly, I found that I don’t like the matte screen better!

It’s hard to describe, but somehow the picture seems more “dull”, or less bright (independent of the actual brightness of the monitor), or more toned down. The colors don’t pop as much."

A useful post. Whilst I've not yet had the luxury of owning a 6K or 8K monitor, his comment about matte and glossy screens is very useful (also, I'm more likely to trust his opinion than many others' because his discerning experience is backed by the fact that he's actually using an 8K monitor, and that takes money and commitment).

Over the years I've made many comparisons between matte and glossy screens and I have to agree with him for the same reasons. Despite being more susceptible to reflections, glossy screens always seem brighter and somehow sharper than matte ones of the same resolution and brightness. I've never investigated the reason in depth but I suspect the matte finish disperses the light from the screen and the net visual effect is a 'rounding' of transients resulting in lower contrast somewhat like offset printing which never looks as sharp as letterpress (where the ink dries thicker and looks blacker at the edges of the imprinting, this increases visual contrast even though the resolution may not be much different).


Matte never made sense at all. I have no idea how matte ever made it into PC monitors. I mean, before LCD screens, all screen surfaces were smooth glass. Then came LCD in plastic. Did early LCD tech necessitate that the outside layer be less smooth for some reason? Maybe it was just a coincidence that was later cargo-culted? Anyway, there's a reason MacBooks and almost every TV use a glossy surface.

Their main supposed advantage, that they mitigate glare, is just false. With a glossy screen you get a small point of reflection. Matte attenuates it just a tiny bit, but never enough that it's not bothersome. You can usually tilt/angle your screen to fight glare, but matte makes it harder because the glaring spot is bigger/more diffuse. Also, higher brightness is one of the things that fights glare the most, and matte reduces it a lot!

Mobile phones, the devices we carry outside the most and should be most susceptible to glare issues never ever come with matte screens! How is matte even still a thing?


I strongly prefer matte, to the point of immediately excluding glossy from my initial selection.

In my experience matte only has issues when the sun is shining directly onto it, whereas glossy will immediately glare in areas which aren't incredibly light. Got something with black on your screen? Guess what, it's suddenly a mirror! Additionally, glossy is way more susceptible to fingerprints, so you have to religiously keep cleaning them.

> There's a reason macbooks, and almost every TV uses a glossy surface.

I always thought it is just because it looks "premium".

> Mobile phones, the devices we carry outside the most and should be most susceptible to glare issues never ever come with matte screens!

Yes, and I often notice myself tilting the screen away from reflections. That's a lot harder with a desktop.


I hate glossy monitors, the mirror is so distracting, it catches my eye all the time. Even if the reflection is faint, it is unexpected movement.


I prefer matte monitors although it is not that important to me.

>You can usually tilt/angle your screen to fight glare, but matte makes it harder because the glaring spot is bigger/more diffuse.

I would need to be able to switch quickly between a matte monitor and a glossy one to be sure, but IIRC the main advantage of matte is a many-fold reduction in how often I have to swivel my attention to the issue of glare. In other words, it is the need to "tilt/angle the screen" that is the main cost of glossy for me because I cannot do that without taking some of my attention away from the task at hand. In contrast, if I am already constantly holding the screen in my hand a la a smartphone then I can adjust the angle with very little conscious attention, so IIUC I have no preference for matte screens on smartphones, which if my preferences are common, would explain the absence of matte screens in smartphones.


> Their main supposed advantage, that they mitigate glare, is just false.

It is very true, contrary to what you claim. The glare on my old trusty Dell U2413 matte display is next to non-existent compared to Apple M1 Air that is hooked to it.


maybe I wasn't very clear in my point. I'm not saying matte doesn't attenuate glare. I'm saying that it does some, mainly by diffusing it and spreading it around and I personally can't stand any amount of it. It's binary for me: If I get any amount of glare I tilt my screen away until there's none.

Another issue is the nature of the reflection. In a glossy screen, the reflection often is a mirror effect that, at least for me, is easier to ignore. Reflections in matte shine up the surface's tiny individual indentations and that blocks more of the screen under it.

And what about screen protectors? You can make a glossy screen matte with one, but not the other way around.


That's just like, your opinion, man. It is clear that you don't like matte screens, but it is a far road from your preferences to universal statements like "Matte never made sense at all."

Also, screen protectors? Last time I saw one was ~1994.


haha, oh of course all of this is implicitly prefaced with "In my opinion".

> Also, screen protectors? Last time I saw one was ~1994.

Ugh, I think you're thinking of those tinted glass things you'd hang on old CRT monitors? Never heard of those being called "screen protectors" though. By screen protector I mean replaceable covers for flat screens:

https://www.aliexpress.us/item/3256805177079468.html?spm=a2g...

My proposal for the industry: ship screens as glossy and offer matte as an optional screen protector. Note that before around 2010 this was already how they shipped displays: the matte was a plastic layer that you could frequently peel off. Nowadays, most matte displays have the surface finish done right on the polarizer layer.


I prefer matte screens by far. Sure, they don't look as shiny, and that's the point!

It allows for a more exact color representation with less eye strain, especially in brightly lit conditions. No lamps glare, sunlight doesn't bother you as much, etc.


I always buy glossy TVs and matted monitors. The TV is only used for movies and I close all the blinds for the full cinematic experience. But I don't want to work in a dark office, it does not feel good and I don't think it's healthy either. Sure, glossy looks a lot better and the reason is simple: opacity. The anti reflection treatment is not 100% transparent. But glossy is unusable in a healthily lighted room.


"But glossy is unusable in a healthily lighted room."

As I said elsewwhere, I don't think matte is a problem in well-lit office environments as the usual office apps usually don't warrant the extra contrast and sharpness. That said, I used to run an IT operation and the matter arose quite often. What was telling is that some people simply didn't care about glare, they either moved their heads or just tilted the monitor a little whereas others found glare intolerable. I suppose that's why both types are available.

BTW, I'm one of those who doesn't find glare a problem, for some reason I 'tune' it out (I do the same with audio, I can listen to the radio through atrocious noise and static that drives others crazy).


I agree. I thought I generally preferred matte for office displays. But ever since purchasing a Studio Display, matte displays just look dull. Also, the Studio Display seems to avoid reflections by auto-adjusting the brightness when the room is well lit.


Yeah, that makes sense. In an office one's more likely to be doing say word-processing in a highly-lit environment where detail and contrast aren't that important. I normally do photo editing in subdued low-light conditions where gloss is not an issue.

Edit: Incidentally, I'm typing this on my Lenovo ThinkPad which has matte screen but I'm using an external monitor with a glossy screen which I prefer (to be fair, it's also slightly larger). No, it wasn't a mistake at purchase time, it was given to me. :-)


Also helping reflections on the Studio Display is its antiglare coating which is unusually good for a glossy display, at least when looking at it head-on. It’s one of the parts of the monitor where Apple went the extra mile.

Previous Apple displays have been decent in this category too, though. The 5k 27” iMac’s coating was also pretty good and what they use on MacBook Pro 14/16” is only a little worse than what’s on the Studio Display.


Absolutely, hell no, if you are talking about the screen itself.

The difference in crispness between the Dell UP3218K (~280ppi at 7680x4320 at 31.5 inches) and this new U3224KBA (~220ppi at 6144x3456 at 32") is dramatic.

The new Dell 6K is about the same as the Apple 6K (the Pro Display XDR). I had that 6K Apple and the 8K Dell side-by-side on my desk for a few months (until I could find a buyer for the dramatically inferior Apple). During that time, I asked anybody who happened to stop by to look at the screens and compare them.

Almost everybody could see the difference immediately.

A few people, admittedly, said they looked pretty much the same. I think there's a certain level of eyesight required to discern the difference. Having said that, I got my first prescription glasses 2 years ago, and I can easily see the difference with glasses on or off.

I cannot see pixels on either one. But what I can see is a slight fuzz at the edges of letters on the lower res display. Moreover, I need my glasses to use the 6K for long stretches and I don't even really need to wear them with the 8K. That's what made me so attached to this monitor.

I was therefore bummed to see Dell release this 6K. I want a modern version of the 8K! Needing two cables sucks, and it's quite a compatibility nightmare to get configured properly on Linux if you want to use cheaper monitors alongside it (because software support for multiple displays at different scaling factors is still pretty bad on most Linux desktops). On Mac it never worked at all, until they released the M2 chip(!).

(It has always worked fine on Windows, just plug it in.)

Having used this for a few years, I don't ever want to go back to 6K, or 5K, or 4K. I just want 8K to get cheaper so that I can rock 3 8K displays, instead of the 4K-8K-4K I suffer today.... also, 120Hz please.


Are you sure you're not just experiencing some other difference in manufacturing, like a difference in the matte finish?


That's a good question, but yes, I am very sure.

I have compared them with magnifying glasses, and really tried to get into why such a (seemingly, at first) small increase in resolution made such a huge difference.

I've compared standard 27" 4K, the LG and Apple 27" 5K, the Apple 6K, and the Dell 8K. It really is just a lot more pixels. The fuzz is definitely small, but there.

Another good test is open VS Code full screen, divide it up into panes, and make the text as small as you can read it. I mean really small. The text on the 6K just looks awful while the 8K still looks good.

Sometimes I do actually do this and put on my glasses (the text is too small for me to read without assistance, but it is still pretty crisp). It feels like a nerd superpower.

There are actually a bunch of us 8K superfans out here on the internet, but it's been a lonely party, because the Dell UP3218K seems to be still the only 8K computer monitor you can buy. (As in a roughly 32" desktop monitor, not a TV.)


>The new Dell 6K is about the same as the Apple 6K (the Pro Display XDR).

I don't think they're in the same class.

The specs show Apple's Pro Display XDR is 2.5x brighter and has a much better contrast ratio, along with 10-bit color and P3.


Sorry, I meant about the same resolution.

But in that case I'd expect the Dell 6K to fare even worse than the Apple 6K in a head-to-head comparison with the Dell 8K.


> I need my glasses to use the 6K for long stretches and I don't even really need to wear them with the 8K

It’s not obvious to me why this would happen. Can you share why this is?


No, I found it really surprising myself.

With my glasses on, though, the 6K (or 27" 5K) looks less crisp but somehow not fuzzy. Like I can see the antialiasing at the edges, but this doesn't look blurry to me. (It just looks more like a 4K display, where the antialiasing isn't great and you can see the pixels.)

With my glasses off, I cannot make out the antialiasing or pixels on the 220ppi screens, but they look fuzzy/blurry. Like it's right at the edge of what my unaided eye can resolve.

After a while (hours), looking at the blurry text has me rubbing my eyes or starting to squint.

The 8K screen doesn't look blurry at the edges of letters. It just looks sharp.

I have wondered if this is something specific to exactly how much my eyesight has deteriorated with age, and maybe when it gets a little bit worse, it won't bother me as much. (?)


Did Apple slip some better scaling in with the M2 chips? Or maybe with the OS update?

Or perhaps you're referring to HDMI 2.1 support?


No, HDMI 2.1 doesn't matter (yet) because there are no 8K desktop monitors (meaning 32-inch size or similar) on the market that use HDMI.

The only 8K monitor has for years been the Dell UP3218K, which uses DisplayPort -- and requires two DisplayPort cables, actually, to get 7680 × 4320 at 60Hz.

Apple has never supported this on any of their machines -- they just couldn't drive the monitor. (It worked, but only in 4K mode.)

They quietly changed this with the M2 machines. I had a MacBook Pro M1 Max that couldn't drive this monitor at 8K. Then I found this GitHub thread[1] where it was revealed that M2 Pro can drive up to one of these 8K displays over Thunderbolt (to DisplayPort). And the M2 Ultra on a Mac Pro or Mac Studio can drive 3 of them.

I don't think it is scaling, per se, but rather that Apple has never supported the full DisplayPort spec. That 8K monitor apparently needs support for something called "dual SST" and Apple never supported that in their software. More details are in the linked GitHub discussion.

So, I don't know why they didn't make this work on the M1 Ultra, too, but Apple gonna Apple. So I went down to the Apple Store and bought a Mac Studio M2 Ultra the day I read that. Now I can plug my Mac into my KVM switch and use this monitor on Mac just like I always could with Linux and Windows.

[1]: https://github.com/waydabber/BetterDisplay/discussions/199#d...


Very awesome. Thanks for sharing these little known M2 updates!


Slightly off topic but I can’t wait until there’s a 120 Hz 8K monitor. It’s the only thing holding me back from upgrading from 4K. I wonder if the current limitation is on the panels, cable bandwidth or absurd price tag…


Absolutely. Mediocre resolution increases have been one of many disappointments for me when it comes to technology over the past 20 years. We had CRT monitors with better resolution than 1080p back in the late '90s and early 2000s, before LCD panels saddled us with 1080p for 15 years. 4K is the bare minimum that should be available right now. I can't wait until I can have a 16K monitor at about 3'x5-6' on my desk. Maybe it will happen in my lifetime, but I'm not holding my breath.


8K is four times the pixels and therefore four times the bandwidth as a 4K monitor.

It took us a long time to go from 1080p to 4K. It has taken even longer for 4K at 120-144Hz to be practical.

It's more likely that you'll see intermediate steps to 5K or 6K than a jump straight to 8K 120Hz.

The other limitation is lack of demand. You need a gigantic monitor for 8K to be worth it, and you need a powerful video card to drive it. The number of people who would buy such a monitor is very, very small.


>You need a gigantic monitor for 8K to be worth it

I have a 4k 24" monitor that I can still see aliasing on with AA disabled.

8K 32" would give me more real estate and should, in theory, completely eliminate the need for AA.


Which makes me wonder what the article's author's point is: 4K vs 6K on a 32-inch panel is already far into diminishing returns, and 8K at 32 inches is just numbers for numbers' sake.


I don’t know if that’s necessarily true. I use a 32” 4k 144hz monitor at 100% scaling just fine. I’d loooove to replace it with an 8k monitor with similar refresh rate to run at 200% scaling and keep the same amount of workspace I have now


I think you're going to be waiting a long time; even the GeForce 4080 and 4090 don't support DisplayPort 2.0.

Additionally, much of the demand for >60Hz is for gaming purposes, and there is nowhere near a powerful enough GPU anyone can afford that would be able to render games in high quality or extreme detail at 8K above 60 FPS. Right now a GPU that costs $1500 can maybe render a 4K game at extreme detail with framerates that vary between 55 and 75 fps.


The 3090 does support displayport 2.0, though. The 40 series are geared towards productivity use cases where people are fine with 60hz.


What is your actual use case apart from technology fetishism?


120 Hz feels a lot smoother for scrolling or anything with movement on the screen.

8K is the point in which at the monitor size I use, individual pixels would be too hard to see. They're already a bit hard to see at 4K, but at 8K it'd be perfect.


Just sit further away or get older :)


I've recently started noticing latter has been happening to me without my consent ;)


I'm trying to avoid either solution


Tell HN when you solve the latter!


Perhaps parent is a gun shrimp or pigeon.

I try not to be overly sapien-centric when making assumptions about my fellow HN readers.


An interesting part of the recent book An Immense World was its coverage of how mantis shrimp likely don't use photoreceptors like humans do.

Marshall now thinks that the mantis shrimp sees colors in a unique way. Rather than discriminating between millions of subtle shades, its eye actually does the opposite, collapsing all the varied hues of the spectrum into just 12.

From Science: "A Different Form of Color Vision in Mantis Shrimp"

The mantis shrimps (stomatopods) can have up to 12 photoreceptors, far more than needed for even extreme color acuity. Thoen et al. conducted paired color discrimination tests with stomatopods and found that their ability to discriminate among colors was surprisingly low. Instead, stomatopods appear to use a color identification approach that results from a temporal scan of an object across the 12 photoreceptor sensitivities. This entirely unique form of vision would allow for extremely rapid color recognition without the need to discriminate between wavelengths within a spectrum.


I can't speak for him, but for me? Straight integer scaling of 4k and 1440p. I loathe fractional scaling, and I cannot wait for the day that I can run an 8k display at > 90hz without compromise


I definitely can't afford an 8k monitor, but this is also the reason I'd want one for gaming. Realistically, you'll probably never run it at 4320p, but depending on how demanding the game is, you can choose between 2160p, 1440p or 1080p without weird scaling. With a 4k monitor your only option is to go all the way down to 1080p.

Manufacturers can also justify making pretty big screens with that many pixels. Samsung's 8k TVs are 80-something inches. Since it's that big, you can sit further back, making up for any loss in pixel density, and still have a huge picture.


> I loathe fractional scaling

Why? Is that an OS problem?


Not the original commenter but, it... has a certain look to it. It's not something that can be totally solved by the OS or any scaling algorithm because fundamentally you have to deal with rendering fractional portions of a pixel to a physical screen where there is no such thing as a fractional pixel. It just always looks different, and IMO worse, than integer scaling.

You have to trade off distortions against blurriness, and you can't avoid both of those artifacts at the same time. The higher your ppi, the less noticeable any scaling artifacts will be, though.
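
A quick, contrived illustration of the distortion half of that trade-off (nearest-neighbour sampling, no filtering): identical 1-pixel lines scaled by 1.5x land on the device grid at different widths, and any filter that evens them out does so by smearing them across pixels instead.

  import numpy as np

  # 1-px-wide lines every 3 logical pixels, resampled to a 1.5x device grid.
  logical = np.zeros(16, dtype=np.uint8)
  logical[::3] = 255

  scale = 1.5
  src = (np.arange(int(len(logical) * scale)) / scale).astype(int)  # nearest-neighbour map
  device = logical[src]

  # How wide each logical line ends up on the device grid:
  widths = [np.count_nonzero(src == i) for i in range(len(logical)) if logical[i]]
  print(widths)  # [2, 1, 2, 1, 2, 1] -> identical lines, uneven on screen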


I don't understand why 'distortion' (things being rendered ± 1 pixel over on some screens vs. on others) is a problem unless your GUI frameworks and/or apps count on pixel-exact layouts for some UI elements. But why would they? Isn't the entire web built on a reflowable format that works pretty well? Shouldn't those tiny 1-pixel differences be like the easiest possible variation for a GUI system's layout engine to cope with?

Do we have lots of scalable UI elements that expect to line up with raster images a certain way on most operating systems?


For me, a 1 pixel asymmetry in a button at medium ppi is noticeable and mildly distracting. I don't mind it much for UI, but I understand why others would want to avoid it. I tend to set my ui and toolbars to auto-hide. Personally, the reasons why I avoid it are text rendering and gaming usage.


This post has more information, as well as tons of other related info: https://tonsky.me/blog/monitors/


This was really helpful, thanks!

It's a shame (for me, who can't afford HiDPI displays to replace his current ones, and would have difficulty pushing all those pixels even if he could) that Apple removed subpixel anti-aliasing. :(


This is the exact same reason I want it.

But with OLED.


In CAD/EDA tools, higher resolution means more productivity (to a point), as you can fit more useful information on the screen at a time - you can "zoom out" more and still keep a useful level of detail. This is especially useful in schematic and PCB design, where dense areas of interest can be spatially disparate. I don't like large screens and currently use 24" 4k screens, which now seem to be either unavailable or expensive; they were ~$350 in 2015 and don't seem to have any equivalent nowadays.

The 120 Hz I don't understand, but then I am not a gamer.


Once you try a high refresh rate monitor, even for work, you just don't want to go back. Every movement and animation is buttery smooth.

Try setting your refresh rate to 30hz for an hour.


People say this, but I’ve had people fail double blind tests for 120hz vs 90hz vs 60hz. I’ve yet to find anyone that can reliably tell 144hz vs 120hz.

What people mostly notice is latency not refresh rate.


I'd agree with you. I think people who use those sorts of displays frequently could tell the difference between 60 and 120hz but 120 vs 144hz seems way too close together.


Took me 2 seconds to notice that a friend's monitor was at 60hz by just moving the mouse. Just look at the distance between each cursor icon update as you move it.


Some people use really slow mice (low resolution and/or configured for low sensitivity), which I suspect is a factor here.

It's noticeable for me on every single mouse movement— no special effort is required to move the cursor 'quickly'.


That’s a rendering artifact.

You can show multiple mouse images at 60 fps which shouldn’t trick people if they can actually see 90 vs 60 vs 120 fps.


Anecdotal, but my friends and I have passed such tests. YMMV


You mean double blind 144hz vs 120hz when latency isn’t an issue?

If you don’t mind me asking how old are your friends?


That's crazy to me, just move the mouse around really quickly and you'll quickly notice it's not 120hz (if you are used to it).

What blind test did you use?


They watched normal desktop use and a video game loop.

I added mouse trails to verify people were actually noticing the FPS not just artifacts from the rendering pipeline.


Oh, pretty sure the mouse trails will kill any ability, that’s by far the most noticeable thing about the higher refresh rate in my experience.


Agreed. I have a number of 4K 144 Hz monitors; I'd like a 6K or 8K monitor, but until they come in high refresh rates I'm not switching. I'm not much of a gamer, but when I do occasionally game it is significantly more fluid as well.


There's a hill people are willing to die on. Resolution matters for me; I don't care about refresh rate.

I'm constantly switching between my M1 Air and work M2 16 Pro and refresh rate has never bothered me.


I have 60, 144 and 165hz displays. I have to say I don't really see much of a difference. Around 30 hz yes. But not over 60. It's probably some sort of genetic vision difference thing.


Even moving the mouse quickly should be a very different experience between 60 and 165


I appreciate that. I can't discern the difference like most people for some reason, unless I am really looking for it.


I guess you could look at the use-cases for 120Hz displays on MacBook Pros.

It's useful for smoother scrolling amongst other things.

I'd like a display that has parity to my laptop but is just bigger so I can fit more on it.


MBP's 120hz display is a lifesaver for me.

Previously, I actually often felt motion sickness when scrolling through code on small 60hz laptop screens. Had no problems with larger (> 23") desktop screens, though.


Smooth scrolling does that to me; scrolling where the screen just jumps a bunch of lines at once doesn't bother me.


Literally a 3'x5-6' monitor that can render text as crisply and cleanly as print. That's all I want.


8k@120Hz is going to need one heck of a video card


It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.


> It's venturing into cryptocurrency-space-heater levels of pointless number crunching to render at that level of detail for anyone who has human eyes.

Depends on your monitor size.

Might be a waste at 27", but if you want to use a 48" display, I can assure you that you'd notice the move from 4k -> 8k.


Yeah, even driving the display at 4K, you start to notice the higher pixel fill factor for 8K displays above 48in. I love my 65in 8K Q900 - even though it mostly lives at 4k (120hz!).


Can you explain this a bit more, I tried googling but I can't quite understand what you mean here by pixel fill factor and how it would differ between the resolutions?


The gaps between the pixels tend to be smaller the higher the resolution - so even if you drive the display at a lower resolution, it can look better than a native lower-res display that has larger spacing between each subpixel.


Interesting, does this give you a noticeable benefit for text, or does this mostly apply to images or video?


User interfaces shouldn't load a GPU very much even at that resolution.

8K in a game will use tremendous amounts of power, but it's not pointless. It's like having antialiasing turned on. And high frame rates are important for motion because normal rendering only gives you a single point in time.


I’d argue that AI driven super resolution like DLSS should be more than sufficient to upscale 4k to 8k with minimal performance loss and acceptable image quality even for gaming.


Not for tmux + firefox


Unless people are far more sensitive than I am, I don't see how >60Hz is needed for a desktop workstation environment. High frame rate is really only noticeable for very fast reaction time gaming.

4K 120Hz may be noticeable if editing ultra high frame rate video on a video editing workstation, but if you are a video production crew with a camera capable of recording at that framerate, you probably already know that.


I can immediately tell just from moving the mouse a little. I wouldn't say it's needed either though.


When I throw my iPhone into low power mode and it drops to a 60FPS cap, it is immediately noticeable.


Many people are more sensitive than you are. I can easily tell the difference between 60 and 120 on both my phone and my desktop.

Though response times also matter a lot for ghosting and such.


Yeah, I'm a gamer and I definitely notice the difference between 60 hz and 120+ hz.

But for desktop productivity? I don't feel I gain anything from it. 60 hz is fine.


60Hz is really noticeable when you’ve been using 120Hz even for a few minutes. 120Hz feels a lot less tiring and work in a terminal, editor and websites is just a lot smoother.


It's immediately noticeable when scrolling in a browser, dragging stuff around or just moving the mouse. If you haven't seen it in person, go to an Apple store and do a quick comparison between the Macbook Pro (120hz) and the Air (60hz), or iPad vs iPad Pro. They're always next to each other.


Not in text mode. Hercules FTW


Not for productivity.


Not if you ignore the (imo) overhyped RTX graphics and return to actual gaming worlds instead of smoke-and-neon-lights-in-mirrors pseudorealism.


I'd likely not be gaming at 8K for a while. But for productivity tools it'd be amazing.


ARM Macs can probably handle that: the M2 can do 10 8k video streams at once (22 simultaneously on the Ultra), and people are running 4k 120hz on the M1 with a couple of hacks.


I would love a 120Hz 8k 40 inch monitor. You could use 8k for productivity and then 4k for gaming.


For some reason all 40 inch monitors have disappeared off the market!

For me 40 inch is the sweet spot for coding: any larger and it gets too pixelated at 4K (you can make out the pixels, but that's OK for coding), the UI scale can be set at 100% so all is well proportioned. Entire classes/methods fit on a screen without scrolling.

I own two Philips 4K 40 inch monitors, they only cost ~$600 at the time, and I dread the day they stop working. I would be first in line for 6K or 8K, or any ≥4K really, at 40 inch.


It's been pretty amazing how stagnant the monitor space is. I too am really craving an 8k@120 monitor, although there's a decent chance I'll balk at the price.


It’s crazy how much of a regression there was in resolution and picture quality when we went from CRT to LCD displays. In the late 90’s you could get a CRT that did 2048x1536 no sweat with great color and decent refresh rate. Then suddenly LCD displays became the standard and they looked awful. Low resolutions, terrible colors and bad viewing angles. The only real advantage they had was size. It took a decade or so to get back to decent resolutions and color reproduction.


LCDs didn't replace CRTs because they offered better quality to consumers. They were worse for all the reasons you mentioned and then some. LCDs were cheaper to make, much lighter and less frail so they cost less to ship, and they took up much less space while in transport, and while sitting in warehouses, or on store shelves. We were sold an inferior product so that other people could save money. Gradually, some of those savings made it to consumers, especially when it became possible to generate profit continuously though TVs by collecting our data and pushing ads, but it was always a shitty deal for consumers who wanted a quality picture.

I imagine that in the future, people will look back at much of the media from recent decades and think that it looks a lot worse than we remember because it was produced on bad screens or made to look good on all of our crappy screens.


While I appreciate a bit of sarcasm, I'm not sure if this is what actually happened. In the CRT era, you either had good monitors which were expensive or a bunch of actually crap monitors. I had the former, but most of the people had latter and using those monitors for any extended period of time would give you headaches and dry eyes because of poor refresh rates, and terrible flicker.

As a personal anecdote: when I was choosing components for my first desktop computer (instead of using dad's work laptops), I picked parts that were affordable. Coincidentally, a local IT magazine had just run a big test of desktop CRT monitors, so I chose an inexpensive one that wasn't terrible and, like every kid, asked my parents for the money. My mum, who was already working with computers at her job, had a look through that magazine and said she would pay for the whole computer only on the condition that we buy the best monitor in that test. So we did (it was a Trinitron Nokia at 100Hz, which was a lot), and I think with that move she saved my eyes long term: I'm in my early 40s and the only healthy thing I still have are my eyes. In any case, what I realized once I got that monitor is that I should never skimp on the stuff I use all day long.

Back to the topic. CRT monitors also were space heaters, and had a large volume which was only fine when being permanently placed on a geek's desk.

When LCDs arrived they actually were considerably better than average CRTs. The picture was rock solid without flicker or refresh-rate artifacts, perfectly rectangular (geometry being a big problem with the average CRT, as a matter of fact), and very sharp and crisp. All for a little bit more money. After two or three years they were actually even cheaper than CRTs. And I forgot to mention, they took up much less space, so you could place one on a POS counter or wherever. It took much longer to replace the top-end CRTs, but I guess that is always the case with any tech product.


It's still not reached a point where you can just choose high resolutions with no drawbacks.

2048x1536 19" (135ppi) at up to 72Hz was common at reasonable prices in the late 90s if my memory is correct. Although OS scaling sucked and text looked weird due to the shadow mask at that size. 1600x1200 (105ppi) was the sweet spot for me. And actually in my first job in 2004 I had two 20" 1600x1200 (100ppi) LCDs that I recall were reasonably priced, and they were nicer overall. This was around the time LCDs became the default choice. Then "HD" became a thing a couple of years later and you are right, for the next ten years virtually all monitors were "widescreen HD", which was 1280x720 if you fell for the marketing of the lower-priced units or 1920x1080 at best. Anything higher was very expensive.

In 2012 the retina macbooks came out and I got a 13(.3)" with 2560x1600 resolution (227ppi). This was the first time for me that LCDs were finally great. But you couldn't get a resolution like that in an external display. So at that time I mostly just didn't use external monitors until 2016 when suddenly 4K 27" (163ppi) became reasonably priced. So I used 2 of those for years and they were good enough but still left me wanting.

Now still to this day, 4K is the max for external monitors at reasonable prices at any size. About 2 years ago I got an M1 macbook and realized it only supported 1 external monitor. I felt like I needed to keep the real estate I was used to and anyway, with the pandemic and WFH, managing multiple monitors with multiple (work and personal) machines sucked. All I could really find at a reasonable price was 32"/4K and 49" ultrawide. I begrudgingly downgraded to a 49" 5120x1440 monitor (109ppi). I will admit that going from 60Hz to 120Hz was nicer than I expected.

So in 2023 my laptop screen is great and has been great for 10+ years but this was my story about how I am still using the same pixel density as I did 25 years ago.


The IBM T221 (2001, over 4K) popped up from the future.


Very cool but not really relevant to what was/is available for reasonable prices.


A second-hand one was very cheap ($600?) in 2010 IIRC, and still futuristic at the time.


How much did a 2048x1536 CRT monitor cost, though? Those were usually high end, and I bet they were priced similarly to what a 6K or 8K monitor costs today.


Also that CRT was probably 21" max, and weighed 20% of the human looking at it.


You are way too optimistic about the weight. The ViewSonic P225f, with a 20" visible display, reportedly capable of 2560x1920 at 63Hz, weighed 30.5 kg!

I am not sure it was worth it with that 0.25mm dot pitch, though.

[1] https://www.backoffice.be/prod_uk/ViewSonic/p225f_viewsonic_...


If anything - you can get it waaay cheaper now.

I'm struggling to remember how much they cost, but with a $2000 price tag for a top-of-the-line machine, the monitors on the low end tended (well, AFAIR, don't take my word for it) to be less than $150, and high-end ones were like $700 outside the ultra-uber-special cases.


Not at all! Your common as milk Philips and ViewSonic 19-21” could do that easily!


> The only real advantage they had was size.

And Moore's Law. LCDs are semiconductors so their price goes down by a factor of 2 every 18 months.

However, even size would be enough. CRTs were ridiculously heavy. My GDM-FW900 was almost 100 pounds. And I used two side by side. I had to shop specifically for a desk that wouldn't collapse when I put them on it.


The power draw is also lower for LCD.


I agree, but we got LG's 16:18 DualUp monitors a year ago. Having a 43'' monitor in the middle and these two on the sides creates a better setup than what was previously possible.


That's basically what I do. 24" 4K in the middle and 2 17" eizos beside it. They're 1280x1024 though so I have 200% scaling in the middle and 100% at the sides. This causes some OS issues in FreeBSD (I mitigate with xrandr) and Windows which is still screwy to this day. On Mac it works perfectly but I don't use Mac much anymore.


I wish some monitors would have good hybrid uses like being able to do say 6K 60Hz and some much lower res (2K) at 120 or something.

I can definitely live with 60Hz on desktop if I have to, and I can't game at 6K anyway, so doing 2K@120Hz gaming and 6K@60Hz or 8K@60Hz desktop work would be ideal, and wouldn't get into the silly bandwidths of 8K@120.
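
For a sense of scale, here's a rough sketch (my own throwaway Python, ignoring blanking intervals and link-encoding overhead, assuming 30-bit color) of the raw uncompressed data rates involved; illustrative numbers only:

    # Raw video data rate = width * height * refresh * bits per pixel.
    # Real link requirements are higher (blanking, encoding overhead).
    modes = {
        "2560x1440 @ 120 Hz": (2560, 1440, 120),
        "6144x3456 @  60 Hz": (6144, 3456, 60),
        "7680x4320 @  60 Hz": (7680, 4320, 60),
        "7680x4320 @ 120 Hz": (7680, 4320, 120),
    }
    for name, (w, h, hz) in modes.items():
        print(f"{name}: {w * h * hz * 30 / 1e9:6.1f} Gbps")
    # 2560x1440 @ 120 Hz:   13.3 Gbps
    # 6144x3456 @  60 Hz:   38.2 Gbps
    # 7680x4320 @  60 Hz:   59.7 Gbps
    # 7680x4320 @ 120 Hz:  119.4 Gbps

So a 2K@120 game mode is cheap next to the 6K/8K desktop modes, and 8K@120 is in a different league entirely.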


I run the DELL G3223Q (144Hz 4K) and mine calibrated at ~98% DCI-P3 for reference. I'm quite happy with it.

https://www.rtings.com/monitor/reviews/dell/g3223q


I have the same monitor but would get an 8k equivalent in a heartbeat. I run it at 100% scaling, and would love that sweet sweet 200% scaling


I also bought one of these, but ended up returning it.

The technical aspects of this review are interesting, especially the wide range of machines tested. I finished writing a review of my own[0] a few days ago, which is less technical but goes into some other aspects of the monitor.

[0]: https://www.wells.dev/dell-ultrasharp-u3224kb-squandered-pot...


> One might surmise that Dell is aware of the issue given the monitor's Mac-inclined target audience however, which makes the lack of mention of it anywhere in marketing materials or documentation feel somewhat dishonest.

Wouldn't be the first time. My previous Dell monitor, the U3818DW, had a bug with its USB-C implementation that only affected Macs (somebody proved via protocol inspection that they were actually implementing part of the protocol wrong; it just didn't manifest on Windows). Their support forums for the issue are full of a single Dell representative whose sole response was to remind all of the Apple users that _technically_, Dell never qualified this monitor for macOS, and it's their fault for not carefully reading the technical documentation before purchasing. (Naturally, many of the reviews on Amazon are of people plugging it into a MacBook Pro.)

Really soured me on the idea of buying a Dell monitor for anything Apple related after that.


Expected apotheosis but got a computer monitor. It's not clear what the alternative monitor would be.


I returned it not because it wasn’t perfect, but because its list of issues was too long. With exception to the antiglare coating grain (it was one of the most bothersome), I could’ve lived with some of its problems if they weren’t so numerous. No monitor comes totally problem free after all.

The other issue is its cost. Expectations raise alongside price, and at an MSRP of $3200 they’re pretty high, and this monitor in my opinion doesn’t meet the bar… with its build quality, problems, and lack of polish it’d maybe make sense at half that price, but even then that antiglare coating would make it a hard sell. They’d need to at least change the coating to make it resemble that of a more traditional matte monitor (which has no grain or sparkle effect).

And you’re right, there isn’t really an alternative except the Pro Display XDR, which is priced far beyond what many of us would find reasonable, uses increasingly dated local dimming tech, and has no extra functions beyond the hub on its back. That’s part of why I was disappointed: everything was lined up for Dell to have been able to knock this one out of the park — it would’ve been easy — but they didn’t.


Just give me a 2K monitor that turns on instantly, doesn’t hunt for sources, consumes less than 20W of power and I’ll be a happy guy. Bonus for having physical buttons to select explicit input sources without delays.


I would say that less than 30W might be a more realistic goal[0]. Maybe lower end monitors consume less power than these... I'm not sure.

Also, it just seems strange that anyone is asking for a 1080p monitor in 2023 outside of Esports. Are you sure you want 1080p (2K) and not 1440p (2.5K)? It bothers me when people use 2K to refer to 1440p. 1080p is 2K[1].

[0]: this chart has a few recent monitors and gives a general idea of how much power monitors consume: https://youtu.be/Wik4DhEaj_8?t=527

[1]: https://en.wikipedia.org/wiki/2K_resolution is very clear on this subject, and I agree completely. 1080p is much closer to 2000 horizontal pixels than 1440p.


I use 16:10 1200p monitors primarily, which are also arguably 2k. I recently used a 1440p monitor and it actually is quite nice. I see zero use for me for a 4k monitor; I tried it and was completely unimpressed (and it makes remote-desktop usage significantly worse).


Yeh, I meant 1440p. 16:10. Was not aware these were referred to as 2.5k, thanks! I actually have a 32” one so no scaling is required and I think it only takes about 30W so it doesn’t feel like I’m sitting in front of a heater in summer. Honestly, anything below 50W is fine. My biggest gripe really is that it takes 7 seconds to turn on.


It is kind of weird that I don't think I've ever seen any monitor reviewers report the monitor's time-to-wake.

Certainly, one of the most impressive things about the M1 laptops when they came out was that they woke instantly from sleep, including the screen.


You ain’t getting a modern monitor with less than 20W of power unless you want something very small and very dim with very low refresh rate.


Nowadays, I think it is doable with modern displays.

My 165Hz 1440p 32" (31.5") LG with 8 bit color and two external speakers consumes ~20W, measured at the wall.

My 14" laptop screen is about 19.8% of the area of that display. I have loosely measured it from ranges of ~<1W at min brightness to ~8W at max brightness. It is 1440p, 60Hz, 14", HDR (DV). I usually run it at ~20% brightness, which seems to be around 3W.

My office has the usual LED lighting at night or a large window + sliding glass door during the day. Not a dark cave by any means!


Why is power a concern?


> To drive that resolution at 60 Hz, about 34 Gbps of data rate is needed.

... But then the table just below states the MacBook does it at 8gbps without DSC. What's going on here?


Copying my reply from Mastodon:

> I honestly don’t know. Maybe this is a bug in the monitor’s firmware and it does use some version of DSC after all? Or you’re right regarding lack of DSC support and it uses more than 8 Gbps but displays it incorrectly? Not sure. But something is definitely different about the monitor’s link when a MacBook is connected…

Then I read https://news.ycombinator.com/item?id=36581855, which points out 24 bit colors vs. 30 bit colors. The MacBook links up with 24-bit colors indeed. Maybe that’s the difference?
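
For what it's worth, a rough back-of-the-envelope calculation (my own numbers, ignoring blanking intervals and DisplayPort link-encoding overhead) shows that neither color depth fits into 8 Gbps uncompressed, so either some form of compression is in play after all or the reported link rate is misleading:

    # Uncompressed data rate for the native 6144x3456 @ 60 Hz mode.
    # Ignores blanking and link-encoding overhead, so the real link
    # requirement is somewhat higher still.
    def raw_gbps(width, height, hz, bits_per_pixel):
        return width * height * hz * bits_per_pixel / 1e9

    for bpp in (24, 30):
        print(f"{bpp}-bit color: {raw_gbps(6144, 3456, 60, bpp):.1f} Gbps")
    # 24-bit color: 30.6 Gbps
    # 30-bit color: 38.2 Gbps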


I feel like a fish out of water - I understand the importance of having an external monitor, maybe even 2 monitors if you're on desktop, but...8K ? 32 inch? curved? 120hz ... wow I feel like I have absolute no problems sitting here with a 27 inch 60 hz monitor alongside my Macbook pro...


Higher refresh rate does make a noticeable difference, if you have 60hz and 120hz or higher side by side you notice it instantly from the mouse cursor movement and scrolling websites etc, it's just buttery in comparison. Definitely not a must, just nice to have.

I don't see the benefit of going higher than 1440p or 1600p though, any higher res just means I need to increase scaling otherwise things are too small to read comfortably for me.


High res with scaling does make text and stuff really crisp though, which is nice.


"but...8K ? 32 inch? curved?"

It depends much on the type of work one is doing. I don't have 8k but I'd surely love to have one when I'm editing high resolution 48-bit TIFF files (≥1GB) which I do quite often.

Normally, for other work, 1920x1080 is more than adequate.


Michael Stapelberg's blog was a source for me when I bought my Dell UP3218Ks specifically for usage with Linux. A great resource!

The article mentions that "it (the Dell UP3218K) needs two (!) DisplayPort cables on a GPU with MST support, meaning that in practice, it only works with nVidia graphics cards."

I would like to tell people here that this is not true in my experience (which might be related to advances in amdgpu and/or linux in general): I'm running one Dell on a Vega (64) Frontier Edition and one on a 6900XT, both on Linux 6.3 with the amdgpu driver. It sets up correctly, using DP TILE extension, so both cables used, 7680x4320@60hz.

Also I would like to add (subjectively) that nothing beats the display quality of these screens. It's literally life/eye-changing: insanely nice and sharp. I could never go back to sub ~200ish DPI displays.


Hey, thanks for letting us know!

I most recently tried the AMD GPU that came with my ThinkStation P620, and it could not drive the monitor at native resolution :(

Maybe I should try a 6900XT. Which particular model do you have?

How much power does it use when driving the 8K monitor?

Do you have PCIe ASPM enabled in the UEFI setup for your PEG slot, and does that run reliably? I was running into problems with ASPM with my nVidia: https://forums.developer.nvidia.com/t/pcie-errors-after-enab...


> it needs two (!) DisplayPort cables on a GPU with MST support, meaning that in practice, it only works with nVidia graphics cards.

This is nonsense, of course AMD graphics cards support MST. https://www.reddit.com/user/hubsdocks/comments/rcf6vz/dell_w... Perhaps some weird Linux driver limitation...?

As for DisplayPort, the only semi-modern consumer-grade video card with four DisplayPorts is the AMD Radeon™ RX 5700 XT Taichi X 8G OC+ RX5700XT TCX 8GP.


Yes, with AMD cards, the 8K monitor only works on Windows.

With Intel ARC cards, the 8K monitor works neither on Windows nor on Linux :(


FYI, in the MacRumors forum [1], users found that when connected via Thunderbolt 4 on Apple Silicon Macs, you don't get the full resolution, because macOS does not enable DSC. In short, you get a maximum of 6016×3384 instead of the monitor's native 6144×3456.

You have to either use the HDMI port on the Mac (if you have an M2 one) or use a USB-C to HDMI converter, which for some reason triggers DSC.

[1] https://forums.macrumors.com/threads/dell-6k-u3224kb-monitor...


Slight correction, the monitor will run at full resolution (6144x3456), but with 24-bit color instead of 30-bit. To get 30-bit over Thunderbolt one has to run it at 6016x3384. So effectively, you get full PPI or full color but not both.


Thanks for the correction :D

The main issue is that users reported that text does not look very nice when the monitor runs in 24-bit color and full resolution, which is a bit concerning when considering the asking price.


Fully agree, at that price the picture needs to be very close to perfect.


I'm surprised no one is talking about the Dell screen bugs with macOS/M1/M2. I have the 27" and 32" curved monitor, and they both suffer from massive flickering issues with some hues.

And yes I have it pegged at 60hz, unlike the "solution" says, removing the variable rate does not solve the problem.

Buyers beware of Dell screens with macOS on Apple Silicon.

And yes I've tried many different cables... :)


I have a LG 38” ultrawide which I would love, except it also has this problem with specific hues. It took me a while to figure out that it was related to certain colors.

I found that if change the color profile I can mitigate the issue, but then it doesn’t look quite as nice.

On the LG, not only is there flickering but a transient burn-in which still appears when the TB cable is disconnected! It fades after a few hours.

Many owners have the same issue with their M1 devices so I assume it’s not something that is fixable.


Is this specific to the curved ones? I've got the Dell G3223Q, a flat 32" 4k 144Hz, no flickering on my M2 MacBook pro or my Framework with PopOS. I run it at 1080p (2x scaling) on both and connect through a CalDigit TS4 w/ DisplayPort.

I like it enough I'm considering a second. Only downside is no usb-c.


It's not specific to curved ones, I have a flat S2721SQ that does it as much as the curved S3221QS.

I believe this is why it's not a problem for you:

    [...] both and connect through a CalDigit TS4


Why is it that pretty much all displays in the PC world have such dismal pixel densities (< 226 ppi)? It would be nice to be able to shop for a new desktop display without having to give up Retina in macOS.


Because they're expensive. It's the same reason that phones have higher DPI than laptops/tablets do -- what's economical on a 6" screen isn't on a 13", and what's economical on 13" isn't on 27."

(Why? Yields. A 6" phone screen is like 12 square inches. A 27" desktop monitor is over 300 square inches. If you have a 98% yield of phone screens without defects, that implies only one defect per ~600 square inches, which means that at a handwave level, you'd have a 50% yield of desktop monitors with that same tech. "We have to throw away half the screens we make as junk" is a lot worse than "we have to throw away 2% of them.")
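
To put slightly firmer numbers on that handwave, here's a minimal sketch using the textbook Poisson yield model, with made-up but plausible figures (the same assumed 98% yield on a ~12 sq in phone panel):

    import math

    # Poisson yield model: P(zero defects) = exp(-defect_density * area).
    # Calibrate the defect density from an assumed 98% yield on a ~12 sq in
    # phone panel, then apply it to a ~300 sq in desktop panel.
    phone_area_sq_in = 12.0
    desktop_area_sq_in = 300.0
    phone_yield = 0.98

    defect_density = -math.log(phone_yield) / phone_area_sq_in
    desktop_yield = math.exp(-defect_density * desktop_area_sq_in)

    print(f"defect density: {defect_density:.5f} per sq in")
    print(f"desktop-panel yield: {desktop_yield:.0%}")  # ~60%

The simple model lands nearer 60% than 50%, but the conclusion holds: a defect density that's negligible at phone sizes wipes out a large fraction of monitor-sized panels.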


The missing factor here is how pixel density affects yields. How does that scale for a same-size screen at 1080, 4k, 8k?

And smaller defects are better when you do sell off the imperfect panels.


I think part of the reason is that high-DPI support was completely broken under Windows for at least a decade after it was working fine in macOS and Linux (pre-Wayland and the rise of "HiDPI support").

Most PCs target Windows, and on a high-DPI monitor everything would render unreasonably small compared to a midrange monitor from the early 2000s. So monitor manufacturers stopped offering monitors with reasonable DPI.


Because people have switched to larger screens (32+ inch) that they place farther away, lowering the need for pixel density.

I'm on a 4K 27" monitor. That's a PPI of 163. I don't feel it's dismal, and I sit relatively close to my monitor.
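
(If you want to sanity-check density figures like these yourself, it's just the diagonal in pixels divided by the diagonal in inches; a quick throwaway Python sketch:)

    import math

    def ppi(width_px, height_px, diagonal_inches):
        """Pixels per inch from resolution and diagonal screen size."""
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(ppi(3840, 2160, 27)))  # 4K 27" -> 163
    print(round(ppi(3840, 2160, 32)))  # 4K 32" -> 138
    print(round(ppi(5120, 2880, 27)))  # 5K 27" -> 218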


I’m using a 4K 28” (I didn’t think there were any 4K 27” given the geometry), and it’s ok but I would really prefer a 5K 27”, those are just more clear although not incredibly so. But they are so expensive to get outside of an iMac. I recently bought my wife a 24” 4K, which has a similar density to the 27” 5K, and her screen looks much better than my 28” 4K.


Which 24” 4K did you get her? There doesn’t seem to be many on the market?


I have followed this approach as well. I use a larger display from further away. I did get a deeper desk to enable this in my current setup. Previously, I mounted my monitor on the wall and moved the desk out ~10" (~25cm).

Either way gives me a greater range of suitable monitors, and they are all cheaper than going for unicorn displays.


Apple defines 218 DPI as Retina BTW.


Most people buying high-end monitors prioritize high refresh rates over pixel densities.


High refresh rates seem to have gotten quite cheap. In addition to refresh rates, the high end is concerned with HDR, resolution, color reproduction, curve, and width, depending on the professional segment being targeted.

The products exist but they're not as numerous as I'd expect (certainly not outside of the Apple ecosystem), and it's an eye-watering leap as far as prices go, hundreds of dollars at one end and multiple thousands at the other with very little in between.


My experience is that HDR is rarely worth it on a computer monitor because OLED is so rare. FALD or other non-OLED HDR displays don't get near a decent OLED TV's HDR performance.


Gamers do. For coding I prefer high contrast and clarity. I’m disappointed there haven’t been any huge genuine OLED screens. MicroLED could be nice too.


Yeah the extra frames do nothing in my text editor. The only part of your typical gaming display I want in a work/daily driver monitor is adaptive sync to reduce video judder and to allow the GPU to idle even more deeply when nothing is being redrawn (which is part of why M-series 14/16” MBPs get great battery life: their screens can go down as far as 1hz).

I won’t turn down a higher peak framerate if it comes without cost to the rest of the monitor’s functionality of course, but it’s not worth trading anything away for.


Right, and most people buying high-end monitors are gamers.


Interesting, I’d assume it’s professionals in the video/photography/production space spending the serious money on high end displays. Where are you getting your data?


Well there's "high end" consumer and then there's "high end" professional which is a few times higher in price


People are happy using DSC for a pixel perfect monitor? I'd never trust it for graphics work.


I wouldn’t say happy, but it’s currently the only choice. Maybe as drivers advance we can have non-DSC links — the hardware should support it per the spec…


I switched from a pair of 27" 4K to a 34" Ultrawide and I like it but I miss the pixel density. Hope we get a "Retina" Ultrawide before too long.


I have the Alienware 38" ultrawide. The ratios are just slightly off for my eyes' comfort. At a full 1:1 pixel ratio, the text is too small. Scaled up slightly, the size is right, but the blur is really annoying. I think it would be perfect for me if it were like 45" with the same resolution, or if it were still 38" but its native width were 3200 instead of 3840, or something like that.

If I scale it to 50%, it looks phenomenal, but who wants to work on a screen with the equivalent real estate of 1920x720?

I now spend most of my time on the built in display of my 16" M1 MBP. The text sizing is fine for me.


This is why I've stuck with my pair of older 24" 4k dell monitors.

Before them I was using regular 2k monitors, and staring at one for a full 8-10 hour workday would give me headaches; my eyes were so strained that I usually ended up just going to bed right afterwards. I've never used an ultrawide, but I'd be really interested in trying one out IF I can get a similar PPI (~192 or higher).


We have multiple; there's a "5k" LG/MSI and a 4k x 2 Dell.


Monitors are frustratingly slow to evolve.

What would really make a dent in experience is OLED + HDR. In the TV world, the combination with HDR is relatively new as OLED displays struggled with brightness. This particular combination working well is a sight to behold. It makes you consider what absolute crap image you were looking at before.


Link to the monitor in the US site if your German is rusty (the monitor is $2.4k)

https://www.dell.com/en-us/shop/dell-ultrasharp-32-6k-monito...


I don't understand why these monitors are so expensive. I've been using a 55" 8k TV as my main monitor for years and paid less than USD 1000 for it new. Recently I saw a similar deal for a 8k tv again, so making panels at 8k and shipping them around the world at around USD 1000 is certainly still possible, but the Dell 8k is 4000+ and the 6k is 2300+.

Using a tv as a monitor has downsides (Only HDMI, doesn't detect inputs automatically etc) but having 55" of retina screen real estate is such a great experience that I'm sticking with it. The same panel sold as a PC monitor with proper inputs (and maybe a matte screen) would be ideal.


It's because a smaller screen with the same (high) resolution means higher pixel density and smaller pixel size. It's something noticeably more expensive to manufacture.


>It's because a smaller screen with the same (high) resolution means higher pixel density and smaller pixel size. It's something noticeably more expensive to manufacture.

That's a fair point, but I don't think it applies at these scales. Even an 8K 32" has a fairly low pixel density compared to a cellphone screen, for example.

Acer has a 16" portable 4K screen for around 600 USD, which they presumably sell at a nice profit; four of them would make an 8K 32".

Anyway, pricing aside the main problem is that the products don't exist in the first place. A monitor that's simply 4x 27" 4k screens (available from 300 usd) combined would be perfect for a programmer, data analyst etc even if it isn't strictly "retina".

The manufacturers don't make monitors like this for one reason or another. I think it's a marketing/economic strategy more than anything else.

My pessimistic take is that LG and Samsung etc have tons of investments in old tech and want to keep selling old tech to consumers for as long as they can because that's where they get the highest margins.

A slightly less pessimistic take is that monitor makers don't believe that there is a market for really large screens because when you do a survey most people think 32" is "big" and anything larger is absurd.

But it's wonderful! :)


I bought this 6K monitor, but I did not like

1) The scaling on macOS. Everything was way too big. There were not enough options for fractional scaling.

2) The matte finish. I came from another matte Dell monitor, but the whole screen looked slightly fuzzy, like a permanent oil stain all over the monitor.

3) The contrast sucks. They call it 'IPS Black' but I was not impressed. The colors look different depending on the angle. This is probably an issue with the size of the monitor as well: when you look at the left side, the angle to the right side is pretty big.

Lastly I did not like the size either, but I will have that with every 31.5 inch monitor.


You'll never pry my Retina MacBook from my hands, and I love my original USB-C LG 4K, but I've really started to like the 42" 4K TV that I have been using for the last two years.

It was $300, the color isn't perfect, it has a little glare, it can be choosy when it comes to ports and cables, but having such a large desktop is great. It's a large enough screen to run at native resolution.

I'm interested in things like Apple's 6K monitor, but the cost and the fact that it would only really work with my Mac, as well as not giving me that much usable space, make it an easy decision not to buy.


Is the TV wall mounted? I‘d be interested how this could be done in an ergonomic way (eyes in line with top of the screen).


Wendell from LevelOneTech has a custom setup that does this, but in my own experience it's better to have a "main part" of the screen in the ergonomic zone and use the rest of the space for auxiliary windows. Window management tools are a must for me when using the 42" TV.


The problem is that 16:10 22" at 1680x1050 has always been the sweet spot for me. No zooming, everything perfectly visible, ideal font size for coding, etc.

I'm at a 16:9 (nearly impossible to get 16:10 today :/) 4k 32" and honestly this is freaking small. What's the point of all the space if, without zooming, it only looks really good at maybe 2500-ish pixels wide? I'd gladly get a good 42" display at 4k; that would be perfect.


Having owned Dell's 5K for years, last year I finally upgraded to their 8K model. I could never go back to anything less. If you need it and can justify the cost, you will know; but I don't see either being justifiably true for very many people at this point. While higher refresh would be nice, 60Hz is more than adequate for me, as it's the screen real estate that matters, and you really don't have many options to choose from.


The larger the monitor, generally the better. However, considering GPU performance and power consumption, compromises must be made somewhere. I have a $500 Dell 4K 32-inch monitor, which is the maximum I can accommodate in terms of desk space. (The desk space is greatly influenced by real estate prices, currently estimated at $12,000 per square meter based on my home.)


Recently there was an article here on HN declaring that LCDs are on their way out now in favor of OLED. That doesn't seem true at all for the monitor market. The author doesn't even mention that both monitors are LCDs.


In that post [1] some commenters said that the backlight technology for LCDs is the hot stuff at the moment, apparently.

You don't want an OLED PC monitor because it will very quickly suffer from burn-in at the typical locations of taskbar and window controls.

1: https://news.ycombinator.com/item?id=36398596


He may be referring to mini-LED, a backlight technology which enables more fine-grained local dimming.

As for burn-in, this doesn't seem to be an issue in smartphones for many years now, and monitors are also not normally used with damaging high brightness HDR content like OLED TVs. My guess would rather be that OLED monitors are too expensive, maybe because they have much higher PPI values than TVs. (A 55" 4k TV has only 80 PPI.)


Great review. Are you returning the monitor? Also, have you kept the Kinesis 360 keyboard, and did you get used to the palm pads? Personally, I'm using the palm pads of the Advantage 2.


Yeah, I’m returning it.

I kept the 360, but am currently using the Advantage 2 again. I never got used to the 360 palm pads.


Funnily enough, I was trying to buy the monitor as well last Friday in Austria, but was informed that the inventory number on the website was wrong, so my order was cancelled. After your review, I guess I'll be waiting for the Samsung ViewFinity S9 (5k).

Re Kinesis: I have mixed feelings as well but in the meantime I got used to the 360 model. With the old palm pads, it feels more like the Advantage 2. I'm a bit sad that I didn't order the pro model. I really liked my Advantage 2 as well, but I can't go back to it as some electronic part seems to be damaged. It just seems to send random characters without touching the keyboard.


Why are you returning it? I am keeping mine because I don't know of anything better.

Also, check out the MoErgo Glove80 keyboard.


I wonder what impact Apple's VR headset is going to have in, say, 10 years. Hopefully prices for their headsets as a complete unit will have dropped, and at that point, while I'm no optical scientist, we must be approaching the limits of eye clarity, no? So eventually VR headsets will offer effectively 360 degrees of "monitor" that can't be matched by any panel, and the only actual pixels they will need to produce are a postage stamp sized thing in front of your eyeball.

I can imagine that a lot of personal computing is going to transform into some kind of "screen+lens+eyeball", as the COGS for these kinds of headset displays is going to be lower than for traditional monitor (or possibly even smartphone) displays.


Unless they can manage to shrink the headset to the weight of a pair of glasses, I don’t see this as competition for monitors. Just to name one out of dozens of issues: if you have your face supporting a heavy VR mask for hours a day, it’s really going to show after a few years.


The top part looks really bad to my eye at least. Would totally buy it if it didn't look this ugly.


6k is even way too much to spend.


> Dell’s 32-inch 6K monitor (U3224KBA), a productivity monitor that offer..

I stopped reading after this lol


I just purchased 2 of the 6Ks and I am keeping both. AMA.

I do not understand the complaints about pixel density. The display is 223 ppi, which is the same or slightly higher than Apple displays. I am using them in a dual display configuration with an M1 Macbook Pro via TB4 and a Windows PC with an RTX 4090 via DisplayPort. The KVM feature makes switching back and forth a breeze. I am using 175% scaling (in Windows) and I don't think 8K with 1:2 scaling would give suitably sized OS components for me, so it would be fractional scaling either way.

The startup time doesn't matter in practice because they still wake up from sleep instantaneously.

The mini-DP IN port would be inconvenient but Club3D makes a bi-directional adapter that makes it regular DisplayPort for about $20.

I am fine with the camera defaulting to ON for Windows Hello to work in my 2 computer setup.

My only serious complaint is the piss poor speakers and the fabric covering them.

I would also prefer a glossy display, but in practice I just don't find it makes any difference. I arrange my office for no reflections anyhow, which results in an equivalent experience. Additionally, the Dell 6K has excellent blacks. It has much better blacks than any IPS panel I have used.


> I do not understand the complaints about pixel density. The display is 223 ppi, which is the same or slightly higher than Apple displays. I am using them in a dual display configuration

I don't even understand all this ppi hype, nor even dual displays anymore. I always used dual/triple displays in the past, but nowadays I just bought a single table-size (49-inch or something like that) 4k display and put it right in front of my desk, just to use it in plain old 100% mode (no "hi-dpi" scaling, just tweaked some font sizes in editors' configs). This way it effectively replaces 4 displays with a single seamless space: no additional frames, no constraints, almost no head turning, beautiful tiling (when on Windows - PowerToys' FancyZones are great).


If I were just using text editors, I would do something similar. However, many people do other things with computers besides edit text. DCC applications are obnoxious on displays larger than 32" and reducing the window size just leaves you with little border areas for multitasking. It's just sloppy. So 2x 32" is a pretty obvious choice here.


I don't edit much text in the way people here would mostly imagine (like when your job is predominantly coding/terminal work); I'm a GUI lover. What I hate the most, however, is switching windows (and turning my head to look at another monitor feels only slightly better), and also scrolling - I want everything I currently work with (+ a player window for a background vibe + two messengers) to be open side by side and fully visible. As soon as I switch context (like hiding a window to see another one full-screen), I often immediately forget what I was doing/thinking, like when you exit a room [1], and this harms my productivity tremendously.

By the way, I once noticed that vertical stacking of 2 monitors feels much better for me than placing them side by side horizontally. This is why a single big (big in both ways, including vertically) display works much better for me than fancy ultra-wide monitors.

What is DCC?

[1] https://en.wikipedia.org/wiki/Doorway_effect


> The display is 223 ppi, which is the same or slightly higher than Apple displays.

For context, my MacBook from 2015 had a PPI of 220, so this comment is extremely accurate.


Forget 4K, 6K, or 8K. Pixel density is the spec when considering any display. It doesn't matter if it's a wearable, phone, laptop, monitor, TV, or stadium wrapping LED surface, the only important factors for resolution are the distance to the viewer's eyes and the distance between the pixels.

For context, 20/20 vision is the ability to perceive 1 arcminute of angular resolution. Beyond that point there are diminishing returns.

To see what this looks like for different viewing distances: https://www.desmos.com/calculator/dtrszipgmd (black line is mm, red line in freedom units, green is 220PPI reference).
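
If you'd rather compute it than read it off a graph, here's a minimal sketch of the same relationship (my own Python), assuming the 1-arcminute (20/20) benchmark; as noted elsewhere in the thread, that's a floor rather than a ceiling:

    import math

    def ppi_limit(viewing_distance_mm, arcmin=1.0):
        """PPI at which one pixel subtends `arcmin` arcminutes of visual angle."""
        pixel_mm = viewing_distance_mm * math.tan(math.radians(arcmin / 60))
        return 25.4 / pixel_mm

    for d in (250, 400, 600, 900):  # roughly phone, laptop, desktop, TV distances
        print(f"{d} mm -> {ppi_limit(d):.0f} ppi")
    # 250 mm -> 349 ppi
    # 400 mm -> 218 ppi
    # 600 mm -> 146 ppi
    # 900 mm -> 97 ppi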


20/20 vision is a low bar though. I'm bothered by how much my vision has deteriorated since I was a teenager. The difference is obvious, as I now need to walk up to things I used to read from a distance. I tested 20/20 a few weeks ago.


> 20/20 vision is a low bar though.

Absolute preach right here. I would say I laugh when someone says it's not perceptually possible for me to differentiate the pixels of my 4K display from a typical viewing distance, but that's actually become such a common assertion that I'm dying to know if my vision is just that much better or theirs is just that much worse. :/


That's a good point to highlight: 20/20 is not perfect vision. It's simply a benchmark for "good enough" vision. IIRC there are physical limits around 0.4 arcmin. People with good eyesight sit somewhere between the two.


It's the opposite for me; I have to move small things away in order to focus on them.

I always joke that if it gets much worse, my arms won't be long enough to hold my phone :-)


So too close and they're out of focus / blurry?

Does VR work for you?


This happens naturally with age, usually starting sometime between age 45-50, as your eyes shift from variable focus to fixed focus (at distance). Google presbyopia.


Have not tried.


So viewing from 40cm away is about where 220PPI crosses the threshold of being able to tell any difference? That makes me think 220 is quite a well-picked number, as that's about how far away someone might typically be viewing.


There's no sudden switch. All perception is a little squishy and dependent on the individual. For most people, though, once you're creating an image that's 220PPI and viewed from 400mm or so, it's probably better to dedicate bandwidth to better temporal resolution, colour depth, or lessening the compression/compute/power/cost needed to balance those three.


> So viewing from 40cm away is about the point where 220PPI crosses the point where you can tell any difference?

There is a difference between your ability to resolve, and whether you can tell _any_ difference. The usual measure of sight is visual acuity, which is your ability to discern details of a certain size at a certain distance. Specifically, in most exams, it's the ability to tell strokes from blobs.

However, even when the pixel density is beyond your ability to discern details, they can still affect your perception in other ways. For example, misalignment of borders can be detected far beyond what your visual acuity is.

On computer screens, particularly, pixel density also affects how things are rendered, think diagonal lines, curves, hollow spaces, etc. Take an extreme example, a box rendered on a 4x4 screen is going to look better than on a 2x2 screen even if the former is twice the distance away.


> Forget 4K, 6K, or 8K. Pixel density is the spec when considering any display.

I'm not really sure what you're refuting, if anything. Either I personally find 220ppi now rather unimpressive, or I found the 2015 MacBook display to have been especially impressive, but I am not sure which; all I was really saying is that they do indeed have quite similar pixel densities.


Nothing to refute - I was simply continuing from your highlight of pixel densities.

While PPI has stayed mostly static (which is a good engineering choice), plenty of other display parameters have improved: better contrast, faster response times, deeper colour, larger continuous surfaces. The 2015 MacBook display was impressive. Modern panels are too!


> The 2015 MacBook display was impressive. Modern panels are too!

Apple's always been pretty impressive with their color reproduction. Which makes sense, since they are really, really pushing hard for designers and content creators (that's one of the big central themes of nearly all Mac-related advertising). Their newer, more premium machines are mostly sporting HDR & WCG displays, and macOS has had incredibly good software support for many years.

The MacBook was a premium product for its time, so even 5 years later, there were brand new higher-resolution (and higher-PPI) panels that performed much worse! My 2020 ASUS laptop is a great example of that. The display was 4K, Adobe RGB, and had a higher PPI, so it sounds like it should have been better, but it only supported 24 bpp and had awful ghosting (around 6 frames to change a pixel).

But at least then I won't screenshot something and have the file saved to disk in Adobe-RGB-interpreted-as-sRGB (which is literally what would happen when the dedicated GPU did the color space conversion, but that was then taken as an input to the Intel APU and interpreted as sRGB, even though it was rendered for an Adobe RGB display) which even if converted back still loses a ton of precision due to Adobe RGB being squeezed into 24 bits in the middle. I care about that a lot. Something that annoys only me is possibly tolerable, but something that affects the content I share with others is like life and death (in terms of importance to me).

They also connected that display to the Intel APU so Windows could not properly do the color space conversion in hardware, resulting in artifacts that were so terrible I had to turn off the conversion entirely and just deal with viewing everything in sRGB-interpreted-as-Adobe-RGB. (yes I know "switchable graphics" is a near-universal configuration in laptops, but it's incredibly cursed and causes so many fucking problems in a completely irreparable fashion. Why not just connect the display input to the display output as it was intended, ASUS turds.)

They market it as "Pantone validated", so I guess that means that they added negative value for the sake of a marketing sticker. (Honestly, put that way, it definitely sounds like something that they couldn't possibly not do. Companies just can't help themselves when it comes to this stuff.)

When I switched to a desktop, I grabbed a random 15" 4K monitor from China for $70. It's sRGB, but it's proudly sRGB, and didn't half-ass itself into something that is hopelessly broken like ASUS did.

When I have a spare $400 I'll dish out for one of the 10-bit DCI-P3 displays on AliExpress and then be happy.


So, if 25 cm is an optimal viewing distance, 350 ppi is the maximum phone screen density required?


Generally Apple picks the optimal ppi and doesn't market the ppi value, only that it's "retina". They will prefer odd resolutions over suboptimal ppi. For example, the M2 MacBook Air is 2560 by 1664 at 224 ppi. The maximum ppi you can get on an iPhone is 460 ppi, so I assume in their testing they found that even people with perfect vision holding it really close had no use for sharper screens than that.


> Generally apple picks the optimal ppi and they don’t market the ppi value, only that it’s “retina”.

First world problem, but I hate when companies choose a one-size-fits-all value that is allegedly supposed to reach the limits of human perception, but that value still turns out not to be enough for me because (surprise) some people are naturally slightly better than average.

For example, the new Apple VR headset, despite having a resolution of around 4k*4k pixels per eye, would probably still have to make up for it in other ways, because I can still see significantly more pixels than that.

Apple actually seems to know this, and they know it so well that their entire marketing push is based on augmented reality and immersion. Blending things with the environment, moving around and interacting- basically things that can never really be disappointed by the screen resolution being insufficiently high.


The "human perception" crap is always marketing. Everything is supposedly at the limit of human perception, until the next generation of technology comes out, and then this time it's for real at the limit of human perception! I think the last 20 years of tech products I've bought were always marketed as at the limits of human perception.

Step aside 8-bit color, 16-bit color displays are here: 65536 colors! Nobody can perceive more than that! Then 32-bit color came along: 16+ MILLION colors!! How can it get better than that? Humans can't perceive more! 30FPS gaming as the gold standard for what the eye can behold, then it became 60FPS, now it's 120FPS! The PPI arms race is the same thing.


Apple does not use a one-size-fits-all in ppi.

The phones are assumed to be held close to your eye and have a 460ppi.

The MacBooks are assumed to be seen from a medium distance and use around 225ppi.

The Studio Display and Pro Display viewing distance is a little longer, and they use 218ppi.

Retina is a property based on subtended angle from the viewer's perspective, not on a physical property of the device.


And none of those devices have a PPI that you can customize when you purchase the device, in contrast to many PC laptops that generally offer multiple options (say, either 1080p or 4K on the same device), meaning each device indeed has a one-size-fits-all PPI. I never said Apple uses the same PPI for all their devices.


iPhone uses a pentile arrangement that requires higher densities to appear the same.


The iPhone is still pretty much the same ~330 PPI as before; only the OLED numbers are inflated. If you only count the smallest PPI within the OLED subpixel layout, newer iPhones are still, magically, ~330 PPI.


25cm is a pretty bad viewing distance, and a good way to mess up your eyes long term.


Good to know. I meant that's where humans can discern smallest details.


I will leave this here for those looking for detail on what PPI they need at what size for a Retina display like sharpness. https://bjango.com/articles/macexternaldisplays2/


Also, the current Apple Studio Display and Apple Pro Display XDR are both 218 ppi.


218 ppi matches the native displays on current Macs.

The Dell 8k has about 280 pixels per inch, roughly 4x the linear detail of the original Mac.

Macintosh 128k was released with a 72 ppi display, so every pixel was the size of a "point", a unit used in more traditional, paper-based graphic design.


The history of the point [1] is equally interesting and messy. Back in school, my old teachers (mostly old printing-machine operators and typesetters) preferred Apple over anything else, especially because of its native 72ppi vs Windows' usual 96ppi.

[1] https://en.m.wikipedia.org/wiki/Point_(typography)


And these days, 96 ppi is the de facto standard, via the CSS specification:

https://webplatform.github.io/docs/tutorials/understanding-c...

It's as if the Windows spec were retconned into being the historical norm all along, but I need to dig around to find the origin story here... Why 96 dpi?

My simple guess is that 96 is also divisible by 12, and thus easy to split up into 2, 3, 4, 6, 12, 18, 24 parts... and three pixels at 72 dpi span the same distance as four pixels at 96 dpi, the 3:4 ratio.

  72 = 8  x 9 = 2^3 * 3^2
  96 = 32 x 3 = 2^5 * 3
Huh. Never really realized that...
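
Fun fact along those lines: the CSS absolute units bake in exactly that 3:4 relationship (this follows straight from the unit definitions; the snippet is just a sanity check):

    # CSS absolute units: 1in = 96px and 1in = 72pt by definition.
    CSS_PX_PER_INCH = 96
    CSS_PT_PER_INCH = 72

    # One CSS px is therefore defined as exactly 0.75pt -- the 3:4 ratio.
    print(CSS_PT_PER_INCH / CSS_PX_PER_INCH)       # 0.75
    print(16 * CSS_PT_PER_INCH / CSS_PX_PER_INCH)  # 12.0 -> a 16px default font is 12pt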


The wiki has this paragraph:

> In 1996, it was adopted by W3C for Cascading Stylesheets (CSS) where it was later related at a fixed 3:4 ratio to the pixel due to a general (but wrong) assumption of 96 pixel-per-inch screens

I actually never really questioned this.


Some Macbooks don't match 218 ppi. MacBook Pro 14" and 16" have 254 ppi. I don't know why Apple decided to have such a high ppi for them.


So close... The display with the smallest pixels in my collection has been the quirky Essential PH-1; its display is a silly 504 ppi. I loved that thing, but the device is no longer supported by official updates.

It looks mighty nice, just 4 ppi short of exactly 2x pixels per linear inch of these new Macs. I have no idea why.


32" is an extremely common size for 4k at 100% scaling so I can see the desire for 8k using 200% scaling at the resolution. 225-250% would probably be more "proper" densities for such a display but many like to go a bit under standard scale.


I have a 32" monitor with 4k resolution. With macOS, it's unusable at 100% scaling. It's far too dense. Sure, you can change fonts in many applications, but not all.


Many would disagree with you, which is the point I was making, not that everyone should agree. Some don't think 32" 4k at 200% scaling is large enough, while others have no problem running a ~4k 16" MacBook at 100% scaling. What's relevant is that 4k 32" at 100% scaling remains popular and should not be surprising to read about as a use case, whatever one's personal preference is.

Some of this comes down to visual acuity, viewing distance, and what one is used to as feeling normal. Things like changing font sizes are just another way of upping the scaling factor, though.


It's quite popular, but I personally don't get it. It's too low density for text to look good; 24-27" is around the maximum. It's too small to be a single monitor, while being too big to comfortably fit in multi-monitor setups.

I guess it's popular because 32" 4k TVs make the panels cheaper?


I've found 28" 4k is the sweet spot if you are of the age that requires you to use reader glasses.

At 2x things don't look too big, just comfortably big, and the PPI is still Retina-like.


What would be more proper in this case is to make it a 64" display. Otherwise it is just a waste of resources. But one cannot sit in front of a 64" monitor at a normal viewing distance. I would guess that, except for some very limited niches, an 8K PC monitor makes little sense.


I don't think so. From where I'm sitting, I can see the rainbow effects of antialiasing on my 4k 32" monitor. The easiest way to observe this is some skinny light text on a dark background.

Other than that, I'm quite happy with my screen real-estate. So, I would indeed love it to have a higher resolution. I could probably use a somewhat bigger screen, a bit further away, that would do a better job as a TV the rare times I use it as such (don't have an actual TV). But all in all, I just want smaller pixels.

Others are talking about higher refresh rates, but for my use case it wouldn't be as useful since I rarely drag things around (I use a TWM) and don't use smooth scrolling in apps.


> I can see the rainbow effects of antialiasing on my 4k 32" monitor

Are you talking about subpixel rendering?

Leveraging the individual R, G and B dots to achieve increased resolution requires knowledge of the panel layout, and the algorithm should also adjust (best effort) to compensate for human colour perception, using knowledge of the subpixel layout and subpixel size.

But often the screen has a different layout than the one the OS picked, e.g. RGB vs BGR, vertical vs horizontal stripes, or a peculiar layout like RGBG. See down the page for an example of what happens when it doesn't match, which usually gives the perceptual effect you described: https://www.grc.com/ctwhat.htm

At some point, though, I found that with increasing DPI there are diminishing returns to subpixel rendering, and it's simply easier and just as good to go with grayscale AA and be done with it.
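
Toy illustration of the difference, in case it helps (my own sketch, assuming a plain horizontal RGB-stripe layout and a hard vertical white-to-black edge; real rasterizers also run a colour filter over this to tame fringes):

    def coverage(left, right, edge):
        # Fraction of the span [left, right) that lies left of the edge (still white).
        return max(0.0, min(right, edge) - left) / (right - left)

    def grayscale_aa(edge):
        # One coverage value shared by all three subpixels of the pixel.
        c = coverage(0.0, 1.0, edge)
        return (c, c, c)

    def subpixel_aa(edge):
        # Each subpixel covers one third of the pixel and gets its own coverage,
        # which is where the extra horizontal resolution comes from -- and where
        # the colour fringing comes from if the assumed layout is wrong.
        return tuple(coverage(i / 3, (i + 1) / 3, edge) for i in range(3))

    print(grayscale_aa(0.5))  # (0.5, 0.5, 0.5)
    print(subpixel_aa(0.5))   # roughly (1.0, 0.5, 0.0)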


I'm familiar with the subpixel arrangement issues. My display is RGB, according to its specs and to some review I've seen before buying it (I made a point of avoiding "non-standard" panels). I only notice this with very thin fonts, since sometimes the vertical lines end up "in between" pixels.

I agree that at some point grayscale AA is good enough. But I've found that that point is at a higher DPI than I get on a 4k 32" screen. On my 24" 4k screen grayscale is good enough, so the limit is somewhere in between for me.


I also find 24" to be the "perfect" maximum size for a 4k monitor. 27" is close, 32" is clearly too big.


Does it support DDC over i2c? I want at least sleep mode, input, and brightness control.


Lunar dev here. Looks like it does support DDC brightness control: https://f.alinpanaitiu.com/nAa7gs/Image.png

I checked this on https://db.lunar.fyi using the following query:

    select control, name from displays where name like '%U3224K%'
Can’t be sure about DDC input switching though, I have no way of knowing that it works without trying it in front of the monitor. DDC reports success even if the command failed to switch the input.


Just wanted to say, thanks for the fantastic app. It's the first thing I install on a new computer.


Hey thanks! So happy to hear that ^_^


I'm really glad I stumbled upon this thread. I need this app badly.


I know my old Dell UltraSharp U2415 supports this feature. There is a tool on Linux with which you can switch inputs, and Dell also has a display manager on Windows which allows you to perform the switching and controls. I totally love it!


Sorry, I am not quite sure what that is. Sleep works great with my PC (motherboard is ASRock WRX80 Creator) and M1 MBP, but I can only adjust brightness through the OSD.


There’s a Mac app called Lunar that uses DDC (a protocol for sending commands to a screen) to allow you to adjust the brightness and volume of an external monitor with the usual keys on your MacBook keyboard (in clamshell mode at least) or with a slider. Very convenient, can’t recommend enough.


Oh OK sorry but yeah I wouldn't use that. If this is usually supported on Dell monitors, I would bet it works on these.

Personally, I rarely adjust brightness on my desktop displays and my ZMK keyboard volume buttons work reliably cross-platform.


If you don't need it, you don't need it, but I don't understand the first sentence. Many different displays support DDC; my Philips at home does, and the Dells and I think the BenQs we have at the office do too.


My macbook is my work laptop and we're encouraged to minimize installing extra software.

I wish my personal machine were a Mac as well, but I had to abandon Macs when they abandoned me by not upgrading the Mac Pro for like a decade, and then they kept me away by later ending support for eGPUs.


I have 3 24" Benq 2K video grade monitors running at 2560x1440 giving me nearly 11 million usable pixels. It has a huge boost in productivity for me.

4K TVs only seem to become usably readable with Chroma 4:4:4 at 38" or above, ideally curved, to get 4 usable 1920x1080 quadrants.

When the screen sizes go smaller, detail and quality absolutely increase, but as far as software development or text-centric work is concerned, am I missing something when it comes to 6K or 8K monitors?


The point isn't to have more pixels, it's to have more detailed pixels. You use scaling to get whatever display size you want.

4k 27" with 2x scaling is effectively the same size as 1080p 27". But one looks like absolute crap, the other is reasonably sharp and detailed. Text is much better with higher PPI. And 4k27" is just 160 PPI, the 6k 32" monitors are 220 PPI so text is very crisp.

I don't know about other OSs, but fractional scaling "just works" on KDE Linux. The size of displayed content has nothing to do with the pixels you have. You control the physical screen size (inches) and scale factor to get the content size you want, and you buy as many pixels as you can afford to get good quality at that size. There's a parent comment on this submission talking about how his 8k 32" monitor is visibly better than 6k 32" for text sharpness; TFA says the same thing. You really want as many pixels as you can afford to buy.

Of course, the minimum (densest) content size you can use is constrained by PPI, low PPI won't be able to display very small content.
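
If you want to sanity-check the numbers, the PPI arithmetic is simple (a quick sketch; the 8K line uses the 31.5" panel size Dell actually ships):

    import math

    def ppi(h_px, v_px, diagonal_in):
        return math.hypot(h_px, v_px) / diagonal_in

    print(round(ppi(1920, 1080, 27)))    # ~82  -- 1080p 27"
    print(round(ppi(3840, 2160, 27)))    # ~163 -- 4K 27" (same UI size at 2x scaling)
    print(round(ppi(6144, 3456, 32)))    # ~220 -- 6K-class 32"
    print(round(ppi(7680, 4320, 31.5)))  # ~280 -- Dell 8K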


Yeah I love my 4K 24" monitor but nobody seems to be making them anymore :'( especially at this price point ($250). It's the minimum DPI I'd consider HiDPI. Really worried about it breaking at some point. I wish I had bought two.

27" 4K just won't cut it. And I don't want a monitor that big anyway.


My Dell P2415Q is thankfully still kickin' (knock on wood).

I have had my eye on an LG[1] out there at 24" 4k (currently about $300) -- as that size/resolution has been the sweet spot for me for the last few years. I'm planning to pick one of these up and stick it in a closet just to have on hand in case the Dell dies on me.

[1]: https://www.amazon.com/gp/product/B01LPNKFK0


I've got two of the P2415Q's and will probably keep using them until they bite the dust. Every 6 months or so I look at monitors, and anything with an equivalent or higher DPI is too expensive for me to justify ($1200+ per monitor).


That's the exact one I have! It's great, perfect for me. It's old stock I think, in Europe these aren't available anymore :'( I recommend picking one up.


Wow, that Dell P2415Q looks very tempting for replacing the junk 1080p 24" monitor work gave me.


I'm using a 30" Dell Ultrasharp which would be slightly less dense than yours. It's great, but I'd just like the same visual appeal as the macbook pro I have plugged into it. Text looks far better


32" displays are a sweet spot in size for me as long as the pixel density is >200 ppi, but more for graphics applications than text editors. For 3D applications especially, it's very beneficial to have a lot of area on one screen since the GUI default layouts tend to fit best on 16:9 screens or close to that and larger screens give you larger multi-viewport layouts. 24 inch displays just couldn't come close here. 32" allows me to work in a 4 viewport layout all day long. It's also nice in IDEs, but my main purpose is various graphics applications. And sure I could fiddle endlessly with multi-display layouts but that's such a time suck and every application is completely different so I wouldn't bother no matter what my screen size is.


It makes total sense for graphical applications, especially with 3D and the emerging VR/AR stuff, and it's something I'll have to keep in mind.

24" definitely can be a bit tight, but it's realistic for a 3-monitor setup.

If I were going with only 2 monitors for software development, or mainly text work, there are nice 2K monitors at about 25" that would be perfect.

I have two older 27" 2K monitors as well and they tend to be too big in the long run, and 3 were untenable.


Can you elaborate on the kvm/switching feature? My current monitor is awful for that, I really just want a dedicated "cycle input" button. Or ideally, an N-position slider switch, but I can't imagine anyone would ever build such a thing.


If it works like the 3419W, you have one button press to wake the OSD and then a press on the input you want to switch to. Not terrible, not great either, because on the 3419 the OSD is a bit slow and the buttons are below the screen, so you can't reliably double-press to switch.


If you ever try using the KVM feature on, for instance, a Philips monitor, you'll realize the Dell feature is a dream!


Oh I’m sure, it’s mostly fine, my biggest annoyance is actually that the OSD buttons take just a bit too much force (especially for how small they are) so you pretty much always lift the corner, which results in the display “bouncing” for a few seconds before it settles.

And that sometimes it’ll get confused and won’t switch the USB peripherals correctly, but that’s not too common (hasn’t happened in a long while in fact, maybe there was a firmware update I didn’t notice which fixed it).


The Dell 6k has the buttons on the back so this is not a problem.


This is more or less how it works. If you want to install software then you can control it with keyboard shortcuts, but the shortcuts respond much slower than just pressing the button so I doubt anybody actually uses them after trying them once.


Software hotkey/gesture can select monitor input via DDC, https://www.reddit.com/r/ultrawidemasterrace/comments/hm2mvk...


We have ultrawides at the office that send USB to the video source automatically, but they also have a button to override that; you press it to toggle between targets. Not sure about the brand and model. Can check it when in the office.


The built-in KVM feature is very basic and I don't even set up the keyboard shortcuts, because that requires software to be installed on the computers. Basically it just lets me switch between computers with 2 quick button presses on each monitor. I use an additional USB 3.0 switch from Cable Matters so my peripherals can be switched between each display in case I want to have both computers up side by side.

If you only need 4K, there is this extremely badass TESmart KVM switch that I have used before and would highly recommend. It also has a cycle mode. It's $700 but the thing supports 3 4K monitors and 4 computers and works exactly as described. https://www.tesmart.com/products/4-port-triple-monitor-hdmi-...


Extremely badass but still 60Hz :(

I wonder if I’ll ever find a way to drive 4k@144Hz and switch inputs without the current cable replugging ritual.


The best option I've found is to hook both computers up to a USB-only KVM, and plug them in via separate video cables to the monitor. Then set up the computers to tell the monitor to change inputs (via DDC commands, which most but not all monitors support) when you change devices on the KVM.

There's software to help do this automatically (https://github.com/haimgel/display-switch)
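
On Linux you can script the same idea directly with ddcutil; a minimal sketch (VCP feature 0x60 is the standard input-select control, but the value for each input is monitor-specific, so treat these as placeholders and check `ddcutil capabilities` first):

    import subprocess

    # Placeholder input values -- read the real ones from `ddcutil capabilities`.
    INPUTS = {"displayport": "0x0f", "hdmi": "0x11"}

    def switch_input(name, display=1):
        # Ask monitor <display> to switch inputs via DDC/CI (VCP feature 0x60).
        subprocess.run(
            ["ddcutil", "--display", str(display), "setvcp", "60", INPUTS[name]],
            check=True,
        )

    switch_input("hdmi")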


I set this up on my M1 Mac yesterday; it's awesome, and because I have the Mac on all the time I only need it installed on that machine.


Gigabyte m32uc


I think displays with built-in KVM switching are your only option right now.


Does this KVM work with a Bluetooth keyboard and mouse? Do you have to reconnect the devices every time you switch?


I don't think any KVM supports Bluetooth. You either need to use software like Barrier or get devices that can switch between multiple hosts. The other option is wireless peripherals that use a USB dongle, like the ones Logitech sells, which can work fine with KVMs.


If you want unified switching, you need to go through USB ports AFAIK.


There’s an app called Synergy that lets you seamlessly move mouse and keyboard input between devices, if you ever feel the need to run the Mac and the Windows machines side by side rather than using a KVM switch.


But... it can be a bit spotty. A physical KVM will be far better.


Do they make KVMs for moving the mouse between screen borders, or just for switching displays/inputs to a separate PC?


Not sure, but that's definitely a particular thing Synergy does alright. I do find it's still a bit meh between OSes, but only in some cases where you might have a full-screen game open on one computer and other non-full-screen content on another.


I personally just use a KVM to switch between computers on my desk, with monitors that are dedicated to them, because of issues with Synergy. But also, Synergy doesn't work with VPNs.


Have you tried that? How does it actually work? Does it not add lag?

I use a USB switch connected to both displays for my input devices so I can switch between screens/computers: https://a.co/d/ckNXGpt


Synergy worked great without perceptible lag for normal desktop stuff in the late 90s (including across all the steam-powered PCs in the school computer room).


Does the refresh rate go higher if you configure it to 4k or 2k?



Sorry, I haven't tried and am not near them atm, but I am quite confident it does not.


Would you mind elaborating on the cables and connections? Do you plug in a DisplayPort and a USB cable from Windows and get KVM working? Are the monitors daisy-chained?


> Do you plug in a DisplayPort and a USB cable from Windows and get KVM working?

Yes. There is one upstream USB-C port on each display that is used for this purpose.

> The monitors are daisy chained?

Daisy chaining with Thunderbolt 4 (not Thunderbolt 3) is supported, but when daisy chaining the second display will only do 4K@60Hz or 6K@30Hz. For this reason, I plug both Thunderbolt 4 cables directly into my Macbook Pro M1, both on the left side of the laptop so it's a clean setup. Thankfully, the included Thunderbolt 4 cables are plenty long enough to reach across to the laptop positioned on either side. Apple's displays include such short TB cables that you'd have to buy another longer one to do this.

In addition, you may want to know that HDR mode is great but only supported over HDMI for some reason, and I haven't found any DP->HDMI adapters that support it, which rules out dual HDR displays with an RTX 4090, which only has 1 HDMI port. I think this is due to HDR needing HDMI 2.1. But I can have 1 monitor connected with HDMI for HDR when I need it. I just haven't bought a long HDMI cable for that yet. The KVM switching could support that setup just the same.


Thanks for providing all of that information. Maybe I missed you answering this:

What do you do with the monitors? Do you work with images, videos, code, etc.?


My day job is software development for creativity/art related software and I do a lot of visual creative work on the side, both personally and professionally. I use the PC for 3D rendering.


Does the ugly camera blob up top bother you at all?


I think I misunderstood this question earlier so here's another go at it...

Aesthetically, no. I am grateful that there aren't any gamer aesthetics involved, but my desk is not a showroom; it's a workspace. However, the cloth that wraps the speaker cluster seems cheap, like it could rip or get pilly over time. That is the only aspect that bothers me. I like that the cameras are attached because they are very nice cameras and support Windows Hello and I prefer not having a corded camera balancing on my displays.


Not enough to have a lower resolution display, and if a higher refresh rate will get rid of it then that is news to me. Will it?


Although I haven't used these monitors, my experience with high DPI on Windows and Linux has been a nightmare compared to OS X. It surprises me that the non-Mac world hasn't made this a bigger priority.


I actually think Windows does it better than OS X (and GNOME, Wayland, and anything that does not support true fractional scaling). OS X just scales the entire surface and as a result it always looks blurry.


> OS X just scales the entire surface and as result it always look blurry.

I genuinely have zero idea what you are talking about. Typing this from my macbook connected to a 5k LG ultrawide monitor, and it is as crystal sharp as it can get. As opposed to my windows 10 desktop (connected to the same monitor) having some occasional application windows render fairly blurry and inconsistently (one of the main offenders of this is, ironically, task manager). And don't even get me started on font rendering in general.


When I used a 5k LG, on the lowest scaling above 100%, I would get shimmering effects when I moved windows. You could see the same art/glyph rendered differently depending on if it was on an even or odd line; move the window 1 pixel and the text totally changed. If you only ever run at integer scaling, this wouldn't be apparent.

Windows does a much better job with non-integer scaling because hairlines are 1px no matter what the scaling, and text is rendered with pixel hinting instead of macOS's new, lame strategy of supersampling.


Surprisingly, Macs can't actually scale the UI like Windows. All you can do is simulate higher or lower resolutions. Which is fine if your DPI is sky-high, but a real pain in the arse if you're working with a QHD 24", for example, and just want everything to be a bit bigger.


OS X not only uses a lame hack to scale, it completely muddles the issue by introducing the concept of "HiDPI" UI elements. Somehow I can set my 4K monitor to use "native resolution" at 3840 x 2160, and yet the UI and fonts look fuzzy! Absolutely terrible, and a complete embarrassment for Apple imo since they are supposedly the UI kings. You only don't notice the issue because you're using a 27" 5k display, which has been "blessed" by Apple as the "correct" DPI to match native screens. For those of us with 4k screens (95% of the market), I guess we're just supposed to enjoy a subpar experience. Even X11 looks better.

For me, I only closed the book on the issue after finding BetterDisplay [0]. Basically a 3rd party program that gives you complete control over resolution, display density, and a ton of other options on macOS. It has a trial mode but it is well worth the money. With that + the CLI tweak to set font smoothing to 0, the 4K experience on macOS looks decent. You can even decrease the effective scale of the native screen past "More Space", so those of us with good eyes can actually take advantage of the screen real estate.

Also, if you're curious to explore this issue beyond my subjective thoughts here, these [1] [2] blog posts do a great job diving into what is so bad about MacOS scaling, why 4K 27" or 32" screens end up looking bad, and why 5K 27" look okay.

[0] https://github.com/waydabber/BetterDisplay

[1] https://bjango.com/articles/macexternaldisplays/

[2] https://bjango.com/articles/macexternaldisplays2/


macOS will render at the next highest integer scale factor and then downscale to fit the resolution of your monitor, instead of just rendering at the fractional scale in the first place.
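
Concretely (my own arithmetic, using the popular "looks like 2560x1440" setting as the example):

    # "Looks like" 2560x1440 -> macOS renders a 2x backing store of 5120x2880,
    # then scales that to whatever the panel actually is.
    logical = (2560, 1440)
    backing = (logical[0] * 2, logical[1] * 2)   # 5120x2880

    panel_5k = (5120, 2880)   # 27" 5K: 1:1 with the backing store, no resampling
    panel_4k = (3840, 2160)   # 27" 4K: 5120/3840 = a 1.33x downscale, i.e. resampled

    print(backing[0] / panel_5k[0])  # 1.0
    print(backing[0] / panel_4k[0])  # 1.333... -- why the same setting looks softer on 4K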


It’s effectively supersampling. The resulting image looks excellent.


There are several scenarios where it clearly doesn't look that good, and where Windows objectively does a much better job.

Most people (and companies) aren't willing to spend $1600 on Apple's 5K monitor, so they get a good 27" UHD monitor instead, and they soon realize macOS either gives you pixel-perfect rendering at a 2x scale factor, which corresponds to a ridiculously small 1920x1080 viewport, or a blurry 2560x1440 equivalent.


The 2560x1440 equivalent looks tack sharp on macOS. It renders at 5120x2880 and scales it down to native, as I said it’s effectively supersampling. I used this for years and never experienced a blurry image. I now run a 5k2k monitor, also at a fractional scale and again it looks excellent.


It very obviously is blurry, though. There's a reason so many people notice this issue, you're not going to be able to explain it away.


I have eyes, and recently updated prescription glasses; I can see what it looks like and it's tack sharp.

Are you sure you aren't confusing Windows' terrible font rendering with sharpness?


Does macOS support any scaling factors above 2x?


I had the LG 5k ultrawide monitor and couldn't get used to it. I gave up and got the XDR display. Expensive, but worth it.


Modern Linux DPI support is a nightmare. It's a shame, since if you just run an old-school software stack (X11; minimal window manager; xrandr to adjust DPI if you hotplug a monitor), then it has much nicer font rendering than Mac OS.

This is particularly frustrating since I've been using high DPI displays since the CRT days. Everything horribly regressed about a decade ago, and still isn't back to 1999 standards.


IDK, high DPI worked fine for me under Linux. I just set the desired DPI in Xfce settings, and everything scales properly. (Except Firefox, which has its own DPI setting! But it works equally painlessly.)

Where things go haywire is mixed resolution. It's best avoided :-\ Hence now I have a 28" 4k external screen which is exactly like four 14" FHD screens of my laptop, so the DPI stays strictly the same.


Mixed resolution can look great on X11 if you do it with super-sampling, especially if you're able to do it via integer downscaling.


> Hence now I have a 28" 4k external screen which is exactly like four 14" FHD screens of my laptop, so the DPI stays strictly the same.

I did the same and just bought monitors that all have the same DPI, so I can easily use scaling that matches on all of them.


It actually got really really good with Ubuntu 23.04 and KDE. It's finally working as it should, and perfectly sharp.


I'm actually not sure what your complaint is at this point. I've long been using 3 mixed dpi displays on windows 10 for gaming as well as normal desktop stuff. Any relatively modern software scales fine to high dpi. Some old software using old APIs has to be upscaled by the OS and is blurry, but that's stuff like... Winamp.


I guess you've not used VMware, VirtualBox, DaVinci Resolve, or most anything written in Java. There's more, but that's off the top of my head. There's plenty of software out there with unusably small text even with just one display.


High DPI screens with Windows really show how bad the font rendering is.


At high DPI the differences in font rendering between ClearType, FreeType and macOS diminish greatly; it's mostly a matter of taste, and at least Microsoft hasn't crippled low-DPI rendering in recent Windows versions like Apple did with macOS.


I'd guess gaming is at least partially responsible. For anything more than 2K you need a high-end/expensive video card, which just isn't that common. Just look at the Steam stats right now... 62% of users are on 1080p.


Dell cuts cost anywhere they can. Their power cable, keyboard, and mouse have the tiniest gauge wire I have ever seen and this is a $2500 system. Do not buy Dell.


The mouse cables are hilariously short these days. I know, I know, get a $30 bluetooth mouse that is better in every way, but I remember the days of like 8 ft mouse cables.


Friends don't let friends buy Dell monitors.


What's the issue with Dell monitors? I'm still using my Dell P2715Q (4k) from > 5 years ago every day and the only reason that would make me switch is that it doesn't support the latest HDMI connections any more.


Customer service / warranty is a joke. They'd rather spend hours slow walking an RMA, hoping you give up, rather than actually dealing with the problem.


I guess that might be very region specific. I once had to RMA one for a firmware update and it was very painless and quick in the EU.


Yeah, until I got enamored by 144Hz, I'd exclusively buy Dell monitors because of their great price/performance.


Every Dell monitor I've had from the UltraSharp line has been incredible, especially for the price. So, not sure what you're talking about.


Had an Ultrasharp 27" 4k USB-C that passed the sniff test spec-wise on paper, and was a great deal, again, for the specs.. but it just had awful glare/reflections, low brightness.. backlight failed in under 2 years, warranty service was awful, and they sent me a replacement with a green pixel line.

Just an absolute clown show of warranty service.

The ROI on the replacement was negative, I'd have been better off throwing it in the trash.

If you are happy with it in practice, and not just on specs, and are not banking on warranty being fulfilled - enjoy.


I've owned 6 UltraSharps and 2 of them are even nearly 20 years old and still work. I gave them to someone a few years ago. My old office had UltraSharps and they were flawless, I even requested them because the monitors they originally gave me hurt my eyes. Maybe you just got a bad egg, which happens to virtually every manufacturer.


I have had warranty service with other companies, my Dell experience was such that I would write off any damaged product I was gifted in the future, and not buy another one myself again.

Hours in iMessage chat with offshore reps, having to record videos of various diagnostics before they'd issue an RMA. Just to be sent another broken monitor. Clown show.

Honestly, make me pay to ship it to you and if I lied about it being broken, send it back. I do not have time for multi-hour 200 message iMessage conversations over a $400 monitor.

Eizo warranty service could not have been simpler. And the product they sent back worked!


What do friends let friends buy?


Eizo, Apple, LG, maybe a Viewsonic or Samsung..

Basically go read rtings.com for whatever size/resolution/price class you are looking for, and look at the in depth reviews for the things that matter to you.

In my case I have a bright office, so glare/reflection and backlight brightness are extremely important.

A good monitor can last you 2-3x as long as the compute you hook up to it, and your eyes are important... I think it's not the place to skimp on cost in the total home office setup.


Have you tried LG or Samsung service? Not better. Generally, my observation is that, warranty problems aside, HP and Dell monitors have better calibration and a more balanced feature set in the OSD compared to the Asian brands, which have all kinds of unnecessary things in their settings and lack what I want (unlocked sRGB, for example).


I find a 2K-ish resolution (2560x1440 or 2560x1600) on a 24-27" monitor to be the most usable for text.

I understand that things like fonts render more smoothly on a 4/6/8K monitor, but it also usually draws more power to drive all those pixels.

The use of a Chroma 4:4:4 capable 4K TV is also often really helpful, because one 4K screen at a large enough size gives four 1920x1080 working surfaces. 4K seems to become useful at 38" or so in a TV.

Is there something I'm missing out on in terms of options that are superior to 2K for external monitors in terms of usability?

Many of the ultrawide monitors have worse or close to the same resolution as a single 2K monitor.


Why are you conflating 2K (2048x1080/1920x1080) and 1440p (2560x1440)? They're not even close to the same resolution.


1440p monitors can be referred to as 2k monitors and I did clarify the resolution.

Confused, not conflated.

Same comment and question about 1440p then, please :)


> 1440p monitors can be referred to as 2k monitors and I did clarify the resolution.

By whom? The only reference to 1440p being called "2K" I can find are cheap rebranded monitor sellers that defraud their customers with all other specs anyway.


This seems to really be upsetting to you. I like monitors too.

“2K” monitors are widths that fall around the 2000 pixel range. It’s a resolution category, not a specification.

The 2K monitor categorization emerged as a rough midpoint label once 4K started to be used.

2560x1440 is often shortened to 1440p, but 2K is often easier to remember. Still, officially it’s called Quad HD (QHD) or even WQHD. With slight variations beyond it.

Since we’re trying to be precise the category for this res can also be called 2.5K.

But like you said, we won’t always be able to search for that.

Finding the spot between FHD and 4K has this fuzzy category. Every retailer may display it differently depending on how and when their sites were built and evolved.

The newer ultra wide monitors can do a lot of sleight of hand to stretch a lot more pixels horizontally to make the screen seem bigger.

“5K” monitors might give as many pixels as four 2560x1440, but it won’t be 4 monitors worth of work space.

All I know is my wife won’t give me back my QHD screens since she started using them because she can fit more on 2 monitors.


> It’s a resolution category, not a specification

That's where you're wrong, 2K as a term (just like 4K) originated in the cinema industry, and it's absolutely a specification.

DCI Digital Cinema System Specification v1.4.2, Section 4.3.1.

A 2K distribution – the resolution of the DCDM[3] container is 2048x1080

A 4K distribution – the resolution of the DCDM[3] container is 4096x2160

UHD was already stretching it with "4K", but 1440p is absolutely not a spec-compliant 2K resolution.


Ok, I’m wrong.

Benq lists their monitors as 2k. They’re wrong too.

https://www.benq.com/en-us/monitor/professional/bl2420pt.htm...


In a Desktop or plugged in laptop, does the extra power really matter? I greatly prefer the sharper text and images.


Oh, I meant the horsepower it takes to drive 3-4 separate monitors with all those pixels versus a single (or maybe two) 4/5/6K displays.

You're absolutely right about clarity - if I was running two screens I'd probably get what you're referring to. But having a workable third monitor at QHD proves priceless quite regularly for my work, and most of it is text-centric.


The comments regarding matte vs glossy and text sharpness resonate with me. It's really hard to use any non-Apple screens, since their color is completely washed out and lifeless and text lacks crispness on every screen I've tried across a range of price points. What's even more puzzling is how a large number of people online seem to adamantly defend widely sold lower-DPI 32" 4K matte screens as a superior product. It's been over a decade since the first Retina products shipped, and somehow a 200+ dpi desktop screen is still an extremely niche product.


The M1 MacBook Pro 14" shipped with a GtG response time of 58.4ms.

The U3224KB featured in this review has a GtG response time of 5ms in fast mode, which is considered average but not great. Most high-end monitors manage GtG response times of around 1ms. With OLED it's closer to 0.5ms.

58.4ms is unacceptably bad.


Sure, they are not fast response gaming screens, but I am specifically calling out reading all forms of text (web/email/documents/coding) as a main use case.

OLED would be great if we could finally get 27" 5K. Currently all the gaming screens sit at around 1440p pixel density and are often coupled with a non-RGB pixel layout that causes color fringing on text: https://pcmonitors.info/articles/qd-oled-and-woled-fringing-...


You should be able to set the font anti-aliaser's subpixel order unless you're running a terrible operating system.


To be clear, a 58ms response time is so bad that it produces a ton of ghosting while just scrolling text. Apparently Apple screens have always been this bad and Mac users just live with it.


> somehow 200 dpi+ desktop screens is still an extremely niche product.

Resolution and color accuracy are subject to diminishing returns.

I could be wearing my glasses right now. They're not a strong correction, but I could see slightly better. In fact, they're within arm's reach. Meh.



