The technical details are all right (or seem right to me, anyway), but this is too opinionated for my liking. No, you do not need a 4K monitor for software development. Some people might like them, some won't. [edit/clarification: someone rightfully pointed out that nobody will actively dislike a 4K monitor. I was unclear here: I meant "some people won't need them" more than "dislike them"]
This sounds like when Jeff Atwood started that fad that if you didn't have three external monitors (yes, three) then your setup was suboptimal and you should be ashamed of yourself.
No. Just no. The best developers I've known wrote code on tiny laptops with poor 1366x768 displays. They didn't think it was an impediment. Now I'm typing this on one of those displays, and it's terrible and I hate it (I usually use an external 1080p monitor), but it's also no big deal.
A 1080p monitor is enough for me. I don't need a 4K monitor. I like how it renders the font. We can argue all day about clear font rendering techniques and whatnot, but if it looks good enough for me and many others, why bother?
Hello! Person who actively dislikes 4k here. In my experience:
1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS, that can mean it renders tiny, or that the whole thing is super ugly and pixelated (WAY worse than on a native 1080p display)
2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could
My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density. I also have 20/20 vision as of the last time I was tested.
My argument in favor of 1080p is that I find text to just be... completely readable. At various sizes, in various fonts, whatever syntax highlighting colors you want to use. Can you see the pixels in the font on my 24" 1080p monitor if you put your face 3" from the screen? Absolutely. Do I notice them day to day? Absolutely not.
I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode
You lost me right here on line 1.
If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
Things like this are exactly why I left Linux for MacOS. I absolutely get why you might want to stick with Linux, but this is a Linux + HighDPI issue (maybe a Windows + highDPI issue also), not a general case.
> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required.
You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.
I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.
> You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.
I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
>I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
Are you sure you have 20/20 vision? I can absolutely resolve individual pixels with zero effort whatsoever on 1080p 27-inch displays.
Back when I had a 27-inch 1080p display at work, my MacBook's 13-inch Retina display effectively became my main monitor. The 27-inch monitor was relegated to displaying documentation and secondary content, because I found its low resolution totally eye-straining.
Edit: I might have found it so eye-straining because MacOS does not support subpixel rendering. That means a lot of people will need a 4K or Retina monitor to have a comfortable viewing experience on the Mac.
MacOS does support subpixel rendering, and has since at least the early-to-mid 2000s. One or two versions back, though, they turned it off by default since it isn't necessary on HiDPI "Retina" displays, and they only ship HiDPI displays now.
You can still turn it on although it requires the command line.
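For completeness, the commands usually cited for this are along these lines (quoting from memory, so treat them as a sketch and double-check the exact keys for your macOS version; you need to log out and back in afterwards):

# Re-enable font smoothing globally on Mojave and later
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO

# Optionally tune the smoothing strength (0 = off, 1-3 = light to strong)
defaults -currentHost write -g AppleFontSmoothing -int 2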
Subpixel rendering dramatically slows down text rendering. When you have a high-res screen and want everything to run at 120fps, even text rendering starts to be a bottleneck.
That, combined with the fairly massive software complexity of subpixel rendering, is probably why macOS dropped it.
Been a while since my eyesight was tested, but I think so! I can see pixels if I focus, but not when reading text at any speed. I have also checked and my display is only 24" (could've sworn it was more!) so maybe that's why. I retract my comment :)
> I think the point the parent is making is that human vision has limited resolution.
If you can't see the difference between 4k and 1080p on a 24" monitor, then you probably need reading glasses. On a 27" monitor it's even worse. It's not so much that you can "see" the pixels, sub pixel rendering and anti-aliasing go a long way to making the actual blocky pixels go away, the difference is crisp letters versus blurry ones.
Yes, I can see the difference, but I (personally) don't notice that difference while reading. I do notice a big difference when using older monitors with lower DPI compared with 1080p on a normal-sized desk monitor, however.
> I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.
Haven't seen any scaling issues on Windows in years. Last time was Inkscape but they fixed that.
I see these issues all the time, with enterprise desktop apps. The scaling is only really a problem because it is enabled by default when you plug in certain displays. If the user made a conscious choice (which they would easily remember if they had trouble), it would be fine.
For many, many years there were at the very most 120 dpi monitors, with almost all being 96, and I imagine a lot of enterprise applications have those two values (maybe 72 as well) hard-coded and don't behave properly with anything else.
I'm currently working from home, accessing my Windows 10 desktop machine in the office via Microsoft's own Remote Desktop over a VPN connection. This works fine on my old 1920x1280 17" laptop, but connecting from my new 4k 15" laptop runs into quite a few edge cases, and plugging in an external non-4k monitor has led to at least two unworkable situations.
I've now reverted to RDP-ing from my old laptop, and using the newer one for video calls, scrum boards, Spotify and other stuff that doesn't require a VPN connection or access to my dev machine. It mostly works OK in that configuration.
I've seen other weird things happen when using other Terminal Services clients, though.
> Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering, with fonts being a blurry mess. You can only really use MacOS with high DPI monitors now for all-day work. It's a huge problem for everyone I know who wants to plug their MacBook into a normal-DPI display. Not that the subpixel/hinting was ever that good - Linux has always had much better font rendering in my opinion, across a wider range of displays.
> Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering
Nonsense, fonts look fine on non-Retina monitors; they were fine on my old 24" 1920x1200 monitor and are fine on my new 27" 2560x1440 one. Can I see a difference if I drag a window from the external monitor to the built-in Retina display? Yes, but text is not blurry at all on the external monitor.
If it matters, "Use font smoothing when available" is checked in System Preferences (which only appears to have an effect on the Retina display, not the monitor).
That's been my experience, too. I prefer high-DPI monitors, but back when I was going into the office (remember going into the office?) and connecting my MacBook to a 1920x1200 monitor, text was perfectly readable. I suppose if I had two low-DPI Macs, one running Catalina and one running, I don't know, High Sierra, I might be able to tell the difference at smaller font sizes.
As an aside, I wonder whether the article's explanation of how font hinting works -- I confess for all these years I didn't know the point of "hinting" was to make sure that fonts lined up with a rasterized grid! -- explains why I always found fonts to look a little worse on Windows than on MacOS. Not less legible -- arguably hinted fonts are less "fuzzy" than non-hinted fonts on lower-resolution screens, which (I presume) is what people who prefer hinted fonts prefer about them -- but just a little off at smaller sizes. The answer is that they literally are off at smaller sizes.
These things are fairly subjective. But it’s hard to argue that Catalina has good font rendering on regular DPI screens. I dealt with it when I had to, but it was very poor. There are also tons of bugs around it. Like the chroma issue - Apple doesn’t support EDID correctly so fonts look even more terrible on some screens. A google search will confirm these problems.
This is an interesting position. I have always thought that fonts and font rendering were an especially pernicious issue on Linux and a relative joy on MacOS?
I think that is a historical artifact. Ubuntu had a set of patches for freetype called Infinality, developed around mid-2010, which dramatically improved font rendering. Since then, most of those improvements have been adopted and improved upstream. [1] Any relatively modern Linux desktop should have very good font rendering.
As with most things Apple, it is a joy as long as you restrict yourself to only plugging the device into official Apple peripherals, preferably ones that are available to buy right now. It’s when you start hooking your Mac up to old hardware or random commodity hardware that the problems surface.
I recently started using Linux some on the same 4K monitor I usually have my Mac connected to. I was shocked at how much sharper and easier to read the text was on Linux.
I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps. I'm also surprised when my wine programs scale properly as well.
What does not work perfectly is mixing hidpi and lowdpi screens. On Wayland with Wayland-compatible apps it works fine, but on X11 or with xwayland apps like Electron it will not scale properly when you move the window to the other screen: it will scale to one screen and be wrong when moved over. Overall I don't find this to be too much of an issue, and when Chrome gets proper Wayland support the problem will be 99% solved.
I can confirm this anecdata. Single and dual 27" 4k is fine, but mixing with a 27" 1440p is messy (tried with GNOME and KDE on Manjaro during early Corona home office).
> I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps.
It's good to hear things aren't as bad as some have suggested.
I think it was bad, since when I look through the issue trackers a lot of hidpi bugs were closed less than one year ago, but I have not really noticed much other than what I noted about multi-monitor setups.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
PyCharm had high CPU consumption issues with a MacBook connected to a 4k display running at a scaled resolution. Native 4k was fine, using the default resolution was fine, but "more space" made it use tons of CPU since it had to rescale text and UI elements on the CPU.
I think all the JetBrains tools did. I remember a few years ago there was some big switchover (High Sierra maybe?) and the JetBrains tool fonts were janky for a while - something to do with the bundled JVM and font rendering. I think it's sorted now, but there was an 'issue' there for a while. Maybe it's still an issue in some configurations?
It's why I actively avoid monitors with small pixels. My trusty old Dell U3011 and the two rotated 1600x1200's flanking it suit me just fine.
I've no inclination to change my OS, either, just for the sake of fonts.
Kudos to those commenters still on CRT displays. One complaint I have with LCDs is reset lag time, which can make it tricky to catch early BIOS messages.
Sure. The fastest I've seen lately is that ARMv7-A board submitted the other day which boots in 0.37 seconds, or with networking, 2.2 seconds. That's time to user land, and to achieve it took highly specialized firmware and a stripped-down kernel compiled with unusual options. I've yet to see a PC come anywhere close to that, and personally I won't consider the boot problem solved until a cold boot completes quicker than I can turn on a lightbulb.
In fact, historically, some of the higher-end hardware yielding the best performance during operation (an axis along which I optimize) actually added time to the boot sequence. The storage subsystem on my workstation is backed by a mix of four Intel enterprise-grade SSDs in RAID-0 (raw speed) and 8 big spinning platters in RAID-6 (capacity), plugged into an Areca 1882ix RAID card w/ 4GB dedicated BBU cache. Unfortunately that card adds a non-bypassable 30 seconds to the boot sequence, no matter what system you plug it into. But once there, it screams. It's only in the last couple of years that PCIe NVMe drives have come out that can match (or finally beat) the performance metrics I've been hitting for ages.
So I actually kind of feel like I've been living in the future, and the rest of the world just caught up ;-).
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).
Same. I haven’t run into any apps that don’t support high dpi mode. Even terminal apps look great on my Retina 4k iMac screen.
Before getting this machine nearly a year ago, I couldn't natively view high dpi graphics for web projects I’d work on, which was a problem since there are billions of high dpi devices out there.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).
The version of pgAdmin not based on electron had pixelated fonts on a 5k iMac. I haven't checked recently.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them
Audacity has extremely low framerates and chugging when interacting with the waveform on retina screens. Even running in low DPI mode doesn't fix it. Only runs nice on non-retina displays.
Agreed. I have a retina MBP that I use as the 3rd screen, plugged into a 4K monitor over USB-C and a 1920x1200 over DisplayPort. Everything works fine. Windows stay in the right place when plugging/unplugging, etc... My eyes also thank me every time I look at the 4K text on my main monitor. I’m debating buying another one and dumping the 1920 monitor.
The post I quoted was: "No matter what operating system you're on". I quoted that particular bit for a reason.
The ergonomics of better screen resolution don't change just because your OS isn't good at dealing with high resolution.
When you pick an operating system (if you have the choice), there are a lot of factors, it's good to know what's important to you and choose appropriately.
That was a direct reply to the “No matter what operating system you're on” quote. The fact that there are some operating systems that have issues doesn’t make it true that all do.
FWIW, my main monitor is a 43" 4k display, and it works perfectly fine on AwesomeWM - but I don't use any scaling, the 4k is purely for more screen real estate - literally like having 4 perfectly aligned borderless 24" monitors. I can fit 10 full A4 pages of text simultaneously.
I recently upgraded to a 43” 4k monitor and use it the way you describe. I’m not sure I am happy with it. The real estate is nice but it might be too much. UI elements end up very far away. I rarely need all that space.
I either need a bigger (deeper) desk to sit back farther or just a smaller monitor physically with the same resolution.
What I found helped when I moved to a large 4k screen is when I stopped trying to eke out every last bit of space and started using my desktop as an actual desktop analogue again. Whereas I used to full-screen, and snap to halves or quarters of the screen, now I have a few core apps open that take up roughly a quarter, usually a browser and email, and other apps I open up and move around to organize as I feel the particular task warrants (generally some terminals that are all fully visible). I often drag the current thing I'm working on to the bottom half of the screen so it's slightly closer and easier to see directly, and leave reference items or stuff I'm planning on revisiting shortly up top.
I also was thinking I didn't particularly like the large desktop and screen at first, but now that I treat it as a combined wallboard and desk space, I can't imagine going back (and using the large but not quite 4k monitor at my desk at the office always felt like a step backwards).
I do set default scaling for Firefox and Thunderbird to be about 125% of normal though, as I don't like squinting at small text. I generally like how small all the other OS widgets are though, so I don't scale the whole desktop.
Thanks for the input. I’m about a week in and that is the realization I am reaching. My window sizes are almost back to where they were before.
It has been an iterative process but I’m getting a handle on it.
I have a sit/stand desk, I’m considering moving my recliner in to the office and elevating the monitor. Working from a reclined position seems ideal but I also thought a 43” monitor was a good idea...
If you're on Windows, consider the FancyZones PowerToy. Break up your screens into an arbitrary set of maximizable zones. Highly recommended if the normal 4 corners isn't enough for you.
I'm on MacOS. I basically never maximize anything. I just resize the floating windows and put them where they make sense at the time. The dream would be doing this with the keyboard using something like i3. I have been tinkering with that on weekends but M-F I focus on paying work.
Check out Moom. You can save window layouts and replay, save window positions and apply them to any app, arbitrarily move, maximize to a range with a customizable margin, etc etc.
https://manytricks.com/moom/
Depending on how exact of a similarity you're looking for, gTile for GNOME/Cinnamon might be of interest to you. I've also found PaperWM to be very productive.
I use gTile for gnome. I had to play around to find a setup that I like. I eventually settled on ignoring most of the features that were offered out of box. Now I have configured a few simple keyboard shortcuts.
For example, Super + Up Arrow will move a window into the central third section of the display. Pressing it again will expand the window a little bit.
Nice and simple. Makes working with large displays pleasant.
I use Rectangle. I’ve configured it with a few keyboard shortcuts that let me move a window into specific regions on the display. I use it to quickly have multiple non-overlapping windows.
I cannot imagine using a large display without it!
BetterTouchTool is _excellent_ for window placement/resizing; it can also be triggered externally if you want to combine it with Alfred or Karabiner Elements (though you don't have to, you can define the triggers in BTT itself).
The 4K monitor at my office is a fair bit smaller than that, and I haven't used it since the COVID-19 epidemic sent me packing home. Since then I've just been using my laptop's built-in screen.
To be honest, I think I may stick to it. At first, the huge monitor was fun, and initial change to having less screen real estate was definitely a drag. But, now that I'm accustomed to it again, I'm finding that "I can fit less stuff on the screen at once" is just another way of saying, "it's harder to distract myself with extra stuff on the screen." My productivity is possibly up, and certainly no worse.
A major pain (literally) point for me with laptop screens is posture. My neck aches after a day of looking mostly down. I suppose an external keyboard and mouse would help but I would have to get a stand and blah blah.
Also for my particular workload real estate is very handy. I totally agree with there being some virtue to constraints but several times a day I really need the space.
I think this really depends on the work you do, also. Pure development or content creation and I'm good with just a laptop. For research, with team communication, concurrent terminal sessions, debugging, management - I really do want at least 3 screens.
This. After doing some research, as far as I can tell a 32" 4K maximizes the amount of content you can see at one time within a comfortable viewing angle and without needing scaling to make text readable.
At typical desk monitor distances you shouldn't be able to see distinct pixels anyway.
Agreed. My work monitor is a 32 or 34" ultrawide. It works well but I would really like more vertical real estate. I'm definitely shopping for 32-34" 4k displays right now.
Looks nice also, but I already have a 4k 32" Eizo Flexscan that I'm happy with - I'm after the 4:3 aspect ratio, everything just seems to go wider and wider these days.
Vertical is the reason I haven't upgraded from dual 1920x1200 (but most of the time I use a single one anyway). Although I'm looking at 1440p mainly for 120Hz+ (1440p at 30+").
43" seems rather large--how far away do you sit? If it were as close as a more "normal" sized monitor (~2-3 feet), wouldn't you be craning your neck all day trying to see different parts of the screen?
Nah. I have a 49" curved 1440p monitor. Things you look at less often go to the sides. You can fit 4 reasonable sized windows side by side. Code editor holds over 100 columns at a comfortable font size for me 40 year old eyes. It's the best monitor setup I have ever had. You can spend less and get the exact same real estate with two 27" 1440p monitors. Either way, it is a fine amount of real estate and not at all cumbersome for all day use in my case.
I am getting the same Dell 4919DW monitor, transitioning from two 25" Dell monitors. I think the built-in KVM will be a great addition as I have two workstations. Ordered the new Dell 7750 to pair with a WD19DC docking station. I hope the Intel 630 UHD built-in graphics will do, as stated in the knowledge base.
The 4919DW only has a 60Hz refresh rate, but I am not concerned about that. A great alternative would be the curved Samsung 49" C49RG9 at 120Hz.
It’s quite a piece of hardware. It lets me plug a USB hub into it. It supports USB-C for its display adapter. So it’s the dream: one USB-C cable to charge the laptop, drive the display, and provide a USB hub for keyboard, mouse, etc.
I'm in the same boat. More real estate is the big win. I made a pandemic purchase of a TCL 43" 4k TV to use as a monitor, primarily for programming. I sit a bit further from it: 30" rather than 24-ish when working on the laptop. I drive it with an inexpensive 2019 Acer laptop running Ubuntu 20.04 and xfce. Every so often an update kills X, but I can start it in safe mode and get things working.
I do find my head is on a swivel comparatively, but while that's noticeable it isn't a negative. Overall I like it. A lot. The only thing that is painful is sharing the desktop over Webex/Skype. That does bog the system down and requires manually inflating the font size so that viewers on lower-resolution systems can cope with it.
I am somewhere in between. I don't go for hi-DPI but am using 28" 4K on the desktop and 14" 1080p on my laptops. So identical dot pitch and scaling settings. I just have more display area for more windows, exactly as you say like a 2x2 seamless array of screens.
I actually evolved my office setup from dual 24" 1920x1200 and went to dual 28" 4K. But with the COVID lockdown, I only have one of the same spec monitor at home for several months, and realize that I barely miss the second monitor. I was probably only using 1.25 monitors in practice as the real estate is vast.
People who complain that a monitor is too large should stop opening a single window full-screen and discover what it is like to have a windowing system...
In the same boat here. I use a $400 49 inch curved 4k TV as my monitor along with i3wm and while I waste a lot of the space on screen to perpetually open apps I don't touch, having the ability to look at my todo list or every app I need for a project at the same time has its benefits. I just wish I could lower the height and tilt the TV upwards a bit so I'm not breaking my neck looking at the upper windows.
I had a similar setup at a previous job -- one of the early 39" TVs. It could only drive 4k at 30Hz, but for staring at text, nothing could beat it. It takes a good tiling window manager to get the most out of this setup. By the same token, a good tiling WM also makes a tiny little netbook screen feel much bigger. So I guess what I'm really saying is, use a tiling WM!
43" 4k is approximately 100dpi, like 21" 2k. It seems like a reasonable form factor to me (at 1x), but there aren't many of them that do high refresh rate, and they're all very expensive.
I've only seriously tested the Dell P4317Q that I have in the office. Others have had good success with small 4k TVs. Can't say I've noticed anything about the latency, but I've never gamed or watched movies on it, so IDK
I use Samsung 4k TVs (55in and 43in) at work and home and the experience is absolutely fantastic. In game mode the latency is reported to be 11ms and there's no difference visible to me compared to 60Hz computer monitors.
Do you have the specific model numbers? I'm buying a new monitor soon, and considering using a 4K TV. Would be great to check out the ones you're using!
If you are on macOS, all is good. Never had a problem with any of my 4 monitors (3x4K, 1x5K). I set the scaling to a size I like, and the text is super crisp. I don't see how any programmer can NOT like that.
How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
In my experience MacOS multi-monitor support is effectively non-existent.
Recently I picked up a 49” ultra-ultra wide monitor (basically 2x27” panels). It is one monitor, but MacOS can’t drive it; it just doesn’t detect that resolution. I switched to a 43” 4k monitor (technically more pixels) and MacOS drives it fine.
My experience with MacOS is not “it just works” unless you are doing something Apple already predicted. That’s fine for me, I just wish they still sold a reasonable monitor themselves so I could be assured it would work properly.
> How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
I finally got sick of this and wrote a Hammerspoon script to deal with this. The config looks like this:
> every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first
Maybe we can get to the bottom of this. What is your use case?
I ask because as long as I plug them into the same ports, it remembers how I arranged them previously (2018 MacBook Pro 15"). I haven't had to arrange them in over a year... it even remembered when updating to the latest operating system. Occasionally, I even plug in my LCD TV as a third external monitor and it remembers where that one should go in the arrangement too.
MacOS cannot drive one 5120x1440 display using Intel display hardware. It will happily drive two displays at 2560x1440. The monitor had multiple inputs, so by putting it in PBP mode I was able to drive one input as USB-C and another as HDMI through a dock converter. This means the wakeup was not in sync. MacOS would see one monitor, arrange everything on that, then realize there was a second one and fail to move anything back in this "new" arrangement.
The fact that it was all one physical monitor may have further confused the OS as a sibling comment mentions.
The solution was to sell the monitor to a Windows-using architect friend and buy a different panel with a resolution MacOS supports. She has a macbook too but it's the fancy one with discrete graphics which can drive 5120x1440.
The value proposition of MacOS to me is that I plug things in and they work. Any fiddling beyond that destroys the benefits of using this platform. I'm willing to iterate on hardware until I find something that works.
I do not have a 2020 MacBook so I cannot test but the Pro Display XDR is not 5120x1440, it is 6016x3384. The problem with my current MacBooks ('14 15" RMBP and '17 13" MBP, both with Intel Iris graphics) is that while they can drive 4k displays they cannot drive the 5120x1440 resolution specifically.
This limitation is specific to the MacOS drivers. Windows in Bootcamp is able to drive 5120x1440 on these devices.
Yeah I read through all those. It’s a work laptop so I’m not comfortable doing things like disabling SIP or mucking around in any system settings. That machine is my livelihood so I don’t mind finding devices that just work.
Ah ok, ya maybe it's related to it being the same monitor.
I have two different monitors that wake up at very different speeds and it's no problem here. My 15" 2013 and 2015 macbook pros had no problem with this either, and I've had 4 different monitors in the mix through those years too. I've transitioned to a CalDigit Thunderbolt 3 dock now and still no problem with it remembering.
So there's definitely something unique about that monitor. That is sad news for me too -- I'm hoping they make a 2x4K ultra wide monitor like that someday. Hopefully they've solved this problem by then.
That might work but it breaks my workflow in another way. Physically the display is a single panel. I organize workspaces by task so changing to a new one needs to change "both" panels because I'm actually using them as one.
>How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
At least for apps that are dedicated to one screen + virtual desktop, right click its icon in the dock and assign it to that display and workspace.
Note that the effectiveness of window restoration also depends on the make/model of your monitors – many manufacturers incorrectly share EDID's across all units of the same model and sometimes across multiple models, making it much more difficult for operating systems to uniquely identify them.
That used to happen occasionally to me as well in earlier macOS versions. Didn't have to do any rearranging since Mojave, I think, definitely not on Catalina.
I use a single 34" 4K monitor with an arm mount on my Mac Mini. The power button on this monitor is one of those touch sensitive ones on the bottom right that I sometimes accidentally brush past. When I switch it back on, every single window gets reduced into an incoherent size and everything gets moved to the top left. It's really annoying.
I'm thinking of flipping my monitor upside down so I'll never accidentally brush that area while picking up something on the table.
That's likely a firmware bug in the monitor. It probably reports some ridiculously small resolution during its boot process and macOS queries it that time and rearranges the windows accordingly.
macOS could implement workarounds, of course, but it probably just follows whatever process the display ID protocol prescribes...
What kind of MacBook do you have exactly? Year, size, graphics hardware and OS.
Reports I read stated that while you can select it with SwitchResX, it was scaled.
I never tried installing it myself because I’m not a fan of modifying the system on a Mac, especially one I don’t own.
From my poking around I think the horizontal resolution is the problem. The system scans possible resolutions to see what works. Apple just never expected a single display that wide.
There are some reports that newer MacBooks with discrete graphics on Catalina can indeed run this resolution. It used to not work regardless of hardware; now apparently MacBooks with discrete graphics can run it. Maybe because they updated the drivers/system for their new super fancy monitors.
Go to the Displays panel, switch to the "default for display" option, then switch back to "scaled" while holding down the option key. Do you see that resolution in the list of options?
I’d say all is bad with MacOS and external monitors... It can’t manage text scaling like Windows, so you either have to downscale resolution and get everything blurry or keep the ridiculously high native resolution and have everything tiny :(
Is it not visible for you in the displays settings? You DO need all the monitors to have the same DPI or you’d have a window rendered half in one dpi and half in another when dragging across a display boundary.
No, when it’s an external screen I don’t get any scaling options, only the choice of resolution. I have a 24” QHD, so either it’s ridiculously small 2500xSomething or it’s blurry HD :(
I have this problem as well. I actually run my 27-inch 4K screens downscaled on MacOS because the tiny font at native-4K gives me a headache.
The worst thing about it is that scaling seems to use more CPU than running natively and the OS has some noticeably additional latency when running scaled.
Odd, my main setup is an external 4k monitor and I only use it with the “large text” text scaling and I have no complaints, the text is clear and large and easy to read. Perhaps you’re also using your laptop screen as well?
At work I have a mac mini and a Windows box, and I use three crapola Asus monitors between them, and my impression has been that macOS does a better job rendering text on said crapola monitors (the Windows box does a better job at compiling C++ in a timely fashion, though, so I mostly work on that one).
It's just a different stylistic choice. A lot of font nerds prefer the OSX choices because they try to stay true to the original font spacing without regard to the pixel grid.
Missing sub-pixel antialiasing is plain technical deficiency, not a stylistic choice. I agree arguments can be had about hinting and aligning the glyphs to the pixel grid, but not much beyond that.
Yeah they didn't completely remove it, but they did a good job of hiding it by not making it an option to turn on in the GUI. Have to use a terminal command to enable it:
https://apple.stackexchange.com/a/337871
In general with a HiDPI screen I don't find any need for it. But on a low-res display like the typical 24" 1080P models it certainly helps.
Completely agree! Went from a mediocre 2x1440p to high quality 2 x 4K, then back to a pair of equal quality 2x1440p.
I would also add, when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters/dongles/docking stations just must match up for everything to work in proper 60fps, and it gets worse if you have two external displays.
As for my home setup, I also stayed at 25" 1440p. A nice balance for work, hobby and occasional gaming without breaking the bank for a top-tier GPU.
>I would also add, when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters/dongles/docking stations just must match up for everything to work in proper 60fps, and it gets worse if you have two external displays.
I agree it's a bit of a mess, but USB-C monitors solve all those issues. I just plug my MacBook in with USB-C, and instantly my 4K (60 Hz) display is connected, along with external sound and any USB peripherals. No fussing with a million different cables and adapters. It's the docking workstation setup I've dreamed of for a decade.
It doesn’t solve all of the issues. The USB-C port can support DisplayPort 1.2 or 1.4 bandwidths and you have to make sure it matches up for some high-resolution monitors to work.
Why not both? If I'm on linux, with no interest in changing and perfectly happy with my display, and 4k doesn't work easily on my system, why would I be interested in a 4k screen?
Strange. I'm not seeing any issues with Linux and 4k. I'm running plain Debian 10 with OpenBox on 4x 4k monitors (3x 28" in a row and one 11" 4k under the right-most), though granted, I normally only have one web browser that follows me around across workspaces pinned to the right monitor, a mostly maximized Sublime on the middle monitor and a pile of alacritty/xterm windows on the left-most monitor. The small monitor, whose content also follows me around, holds clipboard, clocks, Slack and monitoring.
What is the software that people are using that creates problems?
So far, I've never had an issue with KDE Plasma and 4K@60Hz on linux, once I realized that you can't just use any old HDMI cable: you need DisplayPort or HDMI2
FWIW, switching between resolutions in my favorite desktop environment, Xfce, is two steps:
# This affects every GTK app.
xfconf-query -c xsettings -p /Xft/DPI -s 144
The second step is going to about:config in Firefox, and setting layout.css.devPixelsPerPx to a higher value than 1.0. I really need to write an extension to do that in one click.
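If you'd rather not click through about:config every time, the same pref can be pinned in the profile's user.js (a sketch; 1.5 is just an example value):

// user.js in your Firefox profile directory
user_pref("layout.css.devPixelsPerPx", "1.5");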
What is really tricky, though, is having two monitors with different DPI. Win 10 does an acceptable job with it; no Linux tools I'm aware of can handle it reasonably well. Some xrandr incantations can offer partial solutions.
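For the record, the xrandr trick usually suggested for mixed DPI is to run everything at the high-DPI size and upscale the low-DPI panel, roughly like this (output names and positions are assumptions; `xrandr --listmonitors` shows yours):

# 4K panel at native resolution on the left, 1080p panel upscaled 2x to its right
xrandr --output DP-1 --mode 3840x2160 --pos 0x0 \
       --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0

The downside is that the scaled output comes out noticeably blurrier, which is why it only counts as a partial solution.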
Even Win10 struggles when you move windows between different DPI domains. Apps will slide in HUGE or tiny until you get past the midway point. And when the system goes to sleep everything can go to hell. You can come back to small message windows being blown up to huge sizes or windows crushed down to a tiny square. You can forget about laying out your icons perfectly on your desktop too; they'll get rearranged all the time. Even more fun when you remote into a high DPI display with a low DPI display. It actually works pretty well, but stuff will get shrunk or blown up randomly when you go back to the high DPI display.
Logically having your OS maintain a consistent UI size makes sense until you try it without.
I'm running a couple of medium-high-density monitors alongside one of the highest-density ones available. I don't scale the HiDPI monitor at all, which means that when I drag windows to it they are tiny. Instead it works in two ways: as a status screen for activity monitors/etc., and as a text/document editing screen. I.e. putting Adobe Acrobat, Firefox or Sublime/Emacs/etc. on the high DPI screen and then zooming in gives all the font smoothing advantages of high DPI without needing OS support.
So the TLDR is, turn off dpi scaling, and leave the hidpi screen as a dedicated text editor/etc with the font sizes bumped to a comfortable size. Bonus here is that the additional effort of clicking the menu/etc will encourage learning keyboard shortcuts.
I don't think the reasons you illustrated support that conclusion. You don't actively dislike the extra pixel density of a 4K display. You seem to only dislike the compatibility issues relevant to your use case.
>No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
FWIW, I can't recall the last time I had a problem with apps not rendering correctly in hidpi mode on MacOS. Unless you've got a very specific legacy app that you rely on for regular use, it's a non-issue.
>Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming
Ah, I think I found the real issue ;-) If your linux desktop rendered 4K beautifully, seamlessly, and without any scaling issues right out of the box, I could all but guarantee that your opinion would be different.
You know, I was in complete agreement with the article and I was considering a 1440p monitor or something until I saw your comment and reflected on it. The most productive periods/jobs I can remember were on i3wm with a Goodwill 900p/19" monitor and a 20" iMac 10 years old at the time. But it's because I had access to good tools then like Neovim/Atom respectively. My work now requires an RDP Notepad.exe session so there's no monitor that will help me there. I guess software tools are way more important.
I have a 49" curved monitor. It is effectively two 27" 1440p monitors stapled together (5120x1440). It is the best monitor I have ever had. 1440p has a very decent [higher than typical] pixel density but is not "retina". Fonts look pretty smooth, but you can still see pixels if you try really hard. Overall, I do think high density screens look amazing, but the software has not quite caught up to them. The benefits are on the softer side, and if I could just have magical mega-high-DPI displays with no side effects, sure why not? As it stands, 49" curved monitor is pretty fine. It fits four windows side by side at reasonable resolutions.
Primary apps go in the middle, such as code editor, etc.. Tertiary windows, such as documentation go on the outer edges. Still quite usable, but a little out of the way for extended reading.
Hey, do you mind sharing more info on how to get the monitor? I'm looking to invest in a curved one since it's an experience I've never had. And are there retina models out there, or is it not worth it, in your view?
I couldn’t agree more with this. A 49” 5120x1440 curved monitor is brilliant for productivity. It’s better than two or three separate monitors. I do miss high DPI but I wouldn’t trade this type of monitor for the current batch of smaller high DPI ones.
There are only two or three things that would make this better: a high DPI variant, more vertical space and a greater refresh rate. Given those, I think that’s the endgame for monitors (in a productivity context).
(I think that’s 8x the bandwidth so it’s a while away!)
I have a 4k 27" monitor and I have had to run it in 1440p recently because its all my dell xps can manage and its usable but a noticeable downgrade. My 24" 1440p monitor at work looks perfectly fine though.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could
What issues, actually? Xmonad on Arch user here, and I find the sweet spot for me is 27-32" 4k, 1440p on a laptop (I guess 4k would be nice here too, but not sure if it justifies the increased power draw). After getting used to the increased real estate, I do feel limited on my older 1080p laptop screen; fonts smaller than 8pt (which is still all good) noticeably impact readability to the point that I can feel my eyes strain faster. It did take a bit of playing around with DPI settings to get it right, though; out of the box it's not great. The Arch wiki has some great material.
The only frustration I do have (which IS super-frustrating, specifically for web browsing) is with multi-monitor setups with different pixel density - your point 2, I guess. Even plugging anything larger than 19" with 1080p into my 1080p Thinkpad is annoying.
I think it should be possible to configure it correctly, but I just gave up and end up zooming in/out whenever I do this and move windows between screens. I haven't looked into it, but maybe a mature DE like KDE or GNOME (which, if you don't know, you can still use with i3) would be able to take care of this.
Also, this is all on X11, have no idea if and how wayland differs.
My day job involves untangling SQL that was written under fire. I consume more spaghetti than a pre-covid Olive Garden. Every vertical pixel is precious for grokking what some sub query is doing in the context of the full statement.
I used to when I ran dual 24" monitors! We had really sweet old IBM monitor stands that could tilt, raise and rotate. You had to be quick to grab them from the copy room before the electronics pickup. I swiped one from the still warm desk of a colleague on the way to their farewell happy hour.
So I had a '14 13"(? Might have been 15" but probably not) RMBP on the left with email and chat stuff, 24" main monitor in landscape and then another 24" monitor in portrait on the right. I think at some point I put a newer panel on the IBM mount. It was sweet.
Back then we still had desktop PCs at our actual desks so there was a KVM on the main monitor! What a time to be alive!
These days I do prefer a single-monitor workflow if possible. It's just cleaner and more convenient.
Being used to MacBooks with retina screens, 4K at 30” is perfect to me as “retina”. Anything larger needs to be 5k or 6k. 1440p is passable on <24”.
27" x 1440p has been my go-to for a while now. Works well without scaling between win/mac/linux, does not dominate the desk completely, high quality monitors are readily available in this resolution etc etc.
I think my dream setup would be a 27" 1440p (what I currently have at home), with a pair of smaller (19" maybe?) 1080p screens on either side set up in portrait. Basically a similar screen area to 2x27", but without a bezel right in the center of my field of view, and the 1080x1920 screens will be a good size for displaying a full page (e.g. a PDF) at more or less full screen.
I actually like having multiple screens. I know I'm weird in this, but I actually like running certain apps maximized... though I wouldn't want one maximized across a whole ultrawide.
Plus, if I take a working vacation somewhere it's a lot more practical to schlep around one 24" or 27" than an ultrawide.
Also, just as an ergonomic thing, I could angle in the two outer screens a bit while not having one of those icky curved screens.
>My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density.
I'm working on an old 24" 16/10 display (the venerable ProLite B2403WS) and an OK 32" 4K display with a VA panel. Both are properly calibrated.
There is no amount of tinkering that can make fonts on the 24" look good. It looks like dog shit in comparison to the 4K screen. It might not be obvious when all you've got in front of your eyes is the 24" display, but it's blatant side by side.
On top of that, the real-life vertical real estate of the 4K display is also quite a bit larger.
I've never been a big 16/9 fan, but frankly, at the sizes monitors come in today and at current market prices, I don't see a reason not to pick up a few of these for developing.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
I haven't had this experience (MacOS, 4K monitor for 2.5 years)
> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
shrug - 4K and 1080p seem to work together just fine for me. I've currently got a 27" 4K monitor and a 24" 1080p monitor both running off my 2015 13" MacBook Pro; the 4K is on DisplayPort (60Hz @ 3840x2160) and the 1080p is on HDMI (and it happens to be in portrait mode). I use all three screens (including the laptop's), and while the 1080p is noticeably crappier than the other two, it's still usable, and the combination of all three together works well for me. A couple of extra tools (e.g. BetterTouchTool) really help with throwing things between monitors, resizing them to take up some particular chunk of the screen, etc. - my setup's quite keyboard-heavy with emphasis on making full use of the space inspired by years of running i3 (and before that xmonad, ratpoison and others) on linux and freebsd.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)
> > 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
> That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)
It is also wrong. I am a long time i3 user. Never had a problem with it, never done anything special. Most of the time I'm running Debian stable, so I even use software versions that most people consider 'old'.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS, that can mean it renders tiny, or that the whole thing is super ugly and pixelated (WAY worse than on a native 1080p display)
Never happened to me in 4 years, see below. That said, I barely use any graphical programs besides kitty, firefox, thunderbird and spotify.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could
This is just not true. I have used the same 32" 4k monitor for 4 years running NixOS with bspwm (a tiling window manager, which does even less than i3) on 3 different laptops - a ThinkPad x230 (at 30 Hz), an x260 and an x395 - and it all worked completely fine.
It depends on a very simple tool I wrote because I was sick of `xrandr`: https://github.com/rvolosatovs/gorandr , but `xrandr` could easily be used as an alternative.
Recently I switched to Sway on Wayland and it could not be smoother - everything just works with no scripting, including hot-plug.
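For anyone trying the same, per-output scaling in Sway is just a couple of config lines (output names and factors below are assumptions; `swaymsg -t get_outputs` lists yours):

# ~/.config/sway/config
output DP-1 scale 2
output eDP-1 scale 1

Wayland-native apps then render crisply at each output's scale; XWayland clients, on the other hand, are typically upscaled by the compositor, which is the blurriness mentioned elsewhere in the thread.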
> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.
Indeed, screen size is way more important than resolution. In fact, even 4k at 27" seemed too small for me when I had to use that in the office - I would either have to deal with super small font sizes and strain my eyes, or sacrifice screen space by zooming in.
> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
I have been running two 1440p displays on a 4K retina MBP and the experience has been impressively seamless, both with Catalina (the latest) and High Sierra.
The biggest problem is with Apple port splitters; they are crap, and sometimes monitors wake from sleep with a garbled picture.
So I'm a little nuts in that I run 2 x 27" 4K monitors side by side with no scaling. 27" is about the smallest size at which I can tolerate 1:1 pixels.
Since aging has forced me into wearing reading glasses, I wear single vision computer glasses that are optimized for the distance range of my monitors' closest and furthest points.
Because I don't have scaling enabled, I don't get any of the HiDPI issues that I've gotten on my laptops with Windows.
I have found that I am still wanting for even more screen real estate, and for a time I had a pair of ultrawide 23" monitors underneath my main monitors, but it created more problems than it solved and I recently went back to only two monitors.
That's an interesting idea. I should look into that when I eventually upgrade. A stubborn part of me left the monitors in landscape because I occasionally play games, but I end up never doing that on my desktop.
I prefer to use the same model of monitor when doing a grid, so I don't think I want to add to my existing setup: my monitors are discontinued, and they're DisplayPort-only for 4K60.
I think they have more than a few years of life left in them, but I'll definitely look into a configuration like yours at upgrade time.
> Good luck ever having a decent experience plugging in a 1080p monitor.
A 4k monitor is now $300 (new). Used are even cheaper.
> Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow
I use Gnome 3.36 and many HiDPI issues I was having before are now gone, without any extra configuration.
> My argument in favor of 1080p is that I find text to just be... completely readable.
It is readable but fonts are pixelated, unlike 4k.
My only problem is that macOS has some artificial limitations when it comes to using non-Apple monitors. Like a lower refresh rate. My solution? Use Linux.
$300 is a lot of money where I'm from. And they aren't available at that price here, anyway.
What do you mean, fonts are pixelated at 1080p? Whether you can see the pixels probably depends on pixel size. I certainly can't see them on my 23" LG monitor unless I try really hard.
> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
I am using Emacs and tmux under Linux, in the Gnome desktop. Gnome has HiDPI scaling. For me, that works fine with a 4K 43 inch display. The thing you need to watch out for is to get a graphics card with proper open source (FOSS) driver support. Some cards are crap and don't come with FOSS drivers. You can get them to run, but it is a PITA on every kernel update. Don't do that to yourself; get a decent card.
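If anyone wants to poke at that scaling outside the Settings GUI, GNOME exposes it through gsettings; a minimal sketch (the factor 2 is just an example, and fractional scaling needs an extra experimental mutter setting on top of this):

# Scale the whole GNOME UI by an integer factor (1 = 100%, 2 = 200%)
gsettings set org.gnome.desktop.interface scaling-factor 2

# Text-only scaling can be fractional
gsettings set org.gnome.desktop.interface text-scaling-factor 1.25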
I miss i3 so much. But I've succumbed to laziness and have been using my various MacBooks. Agree that it's a huge productivity gain, more so than any font improvements.
Another linux+i3 user here, I've not tried 4k yet but you confirmed my suspicions.
I did a lot of research before buying an xps-13 and went with the 1080p version due to basically all the reasons you just stated + poor battery life and video performance.
I have hope for the future though... what would really make transitioning easier is a way to automatically upscale incompatible programs; even if it means nearest-neighbor scaling, at least it would make them usable on super hi-dpi monitors.
I have 4K monitor on laptop and also as external monitor but I have no problem with linux (using debian testing with gnome 3). I can easily combine it with 1080p monitors. Everything works out of the box.
I still switched to 1440p, as 4K is just better-looking 1080p: you cannot fit more information on the screen with scaling, and without scaling everything is too small. I work as a backend developer, so space is more important for me than visual quality.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could
Happy i3 arch linux 4k monitor user here for over 2 years. I only set an appropriate Xft.dpi for my monitor size/resolution in ~/.Xresources once and that was it.
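For anyone who wants to try the same thing, it really is a one-liner; a minimal sketch, assuming a roughly 2x panel (192 is just an illustrative value, pick whatever matches your monitor):

    ! ~/.Xresources - 192 DPI is an example value (2x the X default of 96)
    Xft.dpi: 192

Reload it with xrdb -merge ~/.Xresources (or restart X) and Xft-aware apps pick up the new DPI. Note this only scales fonts; some GTK/Qt apps may additionally want GDK_SCALE or QT_SCALE_FACTOR set.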
I'm not going to argue your preferences, but then why don't you get one 50 inch 4k display? That's about four of your current displays at similar density on a single cable. And probably at a similar price point, too.
Or, if you are using decent graphics hardware, you could even get two of them and have four times more display space than you have now.
> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
It might depend on the program, too. Some might only work in pixels. Fortunately, it is usually not a problem if you are trying to run a program designed for the Game Boy; the emulator should be able to scale it automatically, subject to the user setting. I don't know if any X server has a setting to magnify mouse cursor shapes, but it seems like it should be possible to implement in the X server. Also, it seems like SDL 1.x has no environment variable to magnify the image. My own program Free Hero Mesh (which I have not worked on in a while, because I am working on other stuff) allows icons of any square size up to 255x255 pixels; a puzzle set may contain multiple sizes, and it will try to find the best size based on the user setting, using integer scaling to grow them if necessary. However, it is currently limited to a single built-in 8x8 font for text. If someone ports it to a library that does allow zooming, that might help too. Still, it is not really designed for high DPI displays, and it probably won't change unless someone with a high DPI display wants to use it and modifies the program to support a user option for scaling text too (and possibly also scaling icons to sizes bigger than 255x255), in which case I might merge their changes.
Still, I don't need 4K. The size I have is fine, but unfortunately too many things use big text; some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.
> some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.
Not that it excuses bad UX, but you might consider keeping your browser window at something below full width. I find this more comfortable anyway.
Total aside: I've noticed Windows and Linux users tend to keep their windows fully maximized, whereas Mac users don't. Doesn't apply to everyone of course, but enough to be noticed. This was true even before Apple changed the behavior of the green Zoom button, and I've always wondered why.
I find Spaces/Expose/Mission Control (or whatever they call it these days) way more comfortable than dealing with Windows. I especially like that if I hide a window, it doesn't pop up when using Mission Control. Opening the Windows equivalent shows me every window, even stuff I minimized/hid. It feels cluttered.
I don't see how alt-tabbing through maximized windows on macOS is different from Windows and Linux like the OP is suggesting. Though I do keep my browser at half-width on my ultrawide monitor because it's somewhat of an exotic/untested aspect ratio for websites.
Also any power user that cares will use a tool like Divvy on macOS for arranging windows with hotkeys.
> I've noticed Windows and Linux users tend to keep their windows fully maximized
Interesting, I've noticed the exact opposite. Mac devs, especially younger ones, tend to have full-screen IDEs and browsers and constantly flick back and forth between apps. My theory was always that Windows and Linux users had gotten comfortable with the desktop metaphor while a large percentage of newer Mac users grew up using iPads which were all full-screen, all the time.
Quick note I perhaps should have clarified: I wasn't thinking about the Mac's "full screen mode". This was something I noticed about other students in my high school a decade ago (why it's coming to mind now, I have no idea), before full screen mode existed on the Mac.
It used to be that if you clicked the green button on Mac, most apps (not all apps, for weird aqua-UI reasons, but certainly web browsers) would grow to fill the screen without outright hiding the menu bar and dock, just like the maximize button on Windows.
My experience pre-full screen on Macs was that the green button would do just about any random thing except make the window fill the screen. It would certainly change the window, usually filling it vertically (but not always), and almost never horizontally.
To this day I still rarely press that button because of years of it doing nothing but unpredictable nonsense.
Technically, that's true, but "Wayland is inherently better at security/DPI scaling/other" is one of those cultural myths that eventually come true because of the people who believe in it. It would be possible to add these improvements to the X server, but no one wants to maintain or improve the X server anymore. All the developer effort is behind Wayland. So to get those benefits, you have to use Wayland.
I'm on Gnome and use fractional scaling. 2x and everything got too big, but 1.6 looks OK. It's actually not at the app layer; it's the screen that is scaled up. Although some low-level programs can have issues with mouse pointer position if they don't take the scaling into account.
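For reference, and only as a sketch (the exact keys depend on your GNOME version and on whether you're on Wayland or X11), the usual knobs look like this:

    # Wayland session: expose fractional scale factors (1.25, 1.5, 1.75, ...) in Settings -> Displays
    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
    # Alternatively, scale only the fonts by an arbitrary factor such as 1.6
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.6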
i3 user on a 4K screen here; it has worked fine for me since 2014 (with the exception of the occasional super old Tcl/Tk app): https://i.imgur.com/b8jVooO.png
Since nobody else has mentioned it, if you like i3 you should give Sway a test drive. Wayland still has some rough edges (screen sharing, for example) but it supports high DPI monitors with fractional scaling almost out-of-the-box.
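For anyone curious what that looks like in practice, per-output scaling in Sway is a single config line; a sketch (DP-1 is just an example name, swaymsg -t get_outputs lists the real ones):

    # ~/.config/sway/config
    output DP-1 scale 1.5    # fractional scaling for that output; "scale 2" gives integer 2x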
> I genuinely think 4k provides no real benefit to me
Could you clarify when in your life you've had a 4k screen with good OS support, so that we know what experience you are speaking from when you say that?
One can definitely still see the pixels in a 4K 24'' monitor. That is not the point.
But I do agree with points 1 and 2 (they tend to work better on Windows, though).
On the other hand, what about 3? I would be surprised if it took you more than 5 seconds to bump up the DPI (single monitor, at least) even on the weirdest of X11 WMs. X11 is designed for this.
I agree with all your points, however I've found the Mac is extremely friendly to variable DPI. I think games with custom UIs (Europa Universalis IV comes to mind) are the only things that haven't adapted, and it's hardly a problem if you set the scaling to "large text" or whatever, just a little pixelated like you would see on a 1080p screen.
I feel like HN as a whole want blogs to come back, and part of that is well written exposition (the technical details) followed by near-outlandish opinions that tend to generate discussion. It's more fun, it leads to more conversation (here we are!), and clarifies our own values.
I think three monitors is distracting, but that whole community discussion gave me the opportunity to compare how I use a multi-monitor setup compared to others.
I generally dislike how fonts are rendered at floating-point scaling or lower resolutions, but I too just learned to live with it over years of ignoring it. Unfortunately, now I can't unsee it!
Could you elaborate on this? Specifically, do you feel that having three/multiple monitors is necessarily distracting (no matter what you do with them) or just encourages distraction?
My experience is the latter, and that if I am disciplined and only put windows from one task at a time on all three, then I have no more temptation to be distracted than I usually have at my internet-enabled glowbox, but maybe I'm an outlier.
For me, I can't use three monitors. It just takes too many brain cycles to process and remember "where that window is".
I can process two fine: one "the workspace" the other "the reference/product/outcome/tests".
I tried really hard to teach myself structure, but with three monitors I find myself always losing windows, losing the cursor, etc. Hacks like "find my cursor" (not default on Linux) help, but I'd rather have a clean mental map of where my cursor, focus and windows are at all times.
What stuck best was having the third monitor hold the API/documentation/reference, but still, the mental effort to keep all my whereabouts mapped in my head was just too much.
Also note that I went from one to three monitors, so this is not "just too used to two monitors to ever change" it was the other way around.
Ubuntu has nice tiling features without requiring tiling for everything ([meta]-[→] and [meta]-[←]) that let my now-ingrained two-monitor habits carry over to my laptop screen when I'm somewhere without a second monitor. Another thing that I disliked about my three-monitor setup: it does not map in any way to a single monitor. Using virtual desktops worked, somewhat, but it was still too different.
My experience is that my attention span sucks enough that there's not a functional difference between encouraging distraction and being necessarily distracting.
I do basically all my work on my laptop without an external display. I use two displays for some specific tasks where I need to look at two things at once.
As for me, three monitors is just too much information. There's no thing I do that requires me to see so many things at once. I do make extensive use of virtual desktops instead.
I like one large 4k monitor right in front of me, and the window manager is set to allow splitting windows in thirds. Laptop off to the side with music, terminal, and other auxiliary stuff.
>I feel like HN as a whole want blogs to come back
Blogs never left— they may have withered to near-nothingness after the same exact people who claim to want them abandoned them, but they’re still here.
I may be wrong but I believe the people who claim to want blogs abandoned them for sites like Hacker News.
HN is a link aggregator that gives a lot of blogs more exposure than they would otherwise get. I would say something like Twitter (micro-blogging and social networking) has contributed to the decline of blogs.
If you are going to blame anyone, it's easiest to blame Google's failed EEE attempt on the blogging world, with the sunset of Reader and the attempt to push everyone to Google+. There were so many blogs and meta-blogs I saw directly lost in that fumbled transition. In trying to build their own Facebook, Google did irreparable harm to the blogging world in the process.
"Relatively unchanged" is such an interesting POV. It's been in such a stasis that I pretty much assume it is as dead as LiveJournal. Maybe not in literally the same way that LiveJournal got shuffled around in a shell game to some strange Russian owners, but in a very metaphorical sense.
At one point Blogger in the early oughts was ahead of the pack in leading mainstream acceptance and usage of blogs. At one point my feed list was over half Blogger sites, but today I can think of only a few Blogger-hosted blogs left at all in my feed list, none of them have been updated recently, and those that have updated recently were claimed by spammers and (sadly) dropped from my list.
I can't imagine there's much more than a skeleton crew at Blogger keeping the lights on, and, to push the LiveJournal metaphor further, I would be unsurprised to learn that they were being kept in the Bay Area equivalent of Siberia by some half-mad state actors that need Blogger's zombie to keep up its current undead behavior in some strange psy-ops nightmare.
I love my 3 monitor setup. In 20 years of coding, I finally reached peak productivity in my workspace. No switching between windows, ease of throwing a window aside for monitoring, laying out information and tools side by side. It's incredible once you readjust your habits and learn to use that new space. I compare it to having a large, wide desk when you're working on something. Who wouldn't want that?
I was one of the people who worked on 15” retina MBP everywhere, even in the office where I had 30” on my desk, to not have to readjust and keep optimal habits for the screen size. Now I simply refuse to work on a laptop at all, it feels like being trapped into a tiny box and I get literally claustrophobic :)
> I compare it to having a large, wide desk when you’re working on something. Who wouldn’t want that?
I have used between 1-3 monitors over the last decade, and there sure are advantages to having 3 for certain tasks. However, I noticed that having multiple monitors resulted in me having a dedicated screen for email (usually the smallest, my laptop screen). This decreased my productivity.
Perhaps not everyone has this weak spot, but for me using multiple monitors has a downside from an attention/focus perspective.
Coronavirus has robbed me of one of my favorite productivity hacks, which is coding on old Thinkpad with 4:3 ratio display, at a coffeeshop or library with Internet turned off. No distractions, no multitasking, just pure focus on a problem.
My friend, when I moved into my new apartment I went without WiFi for 2 years and did nothing but code on my T43. In that time I managed to rewrite (a variant of) Age of Empires 2.
IMHO, two monitors is an amazing upgrade. One screen for code, another for reference material or for the app being debugged. Better than one huge screen in many cases, as it’s two 16:10 spaces that you can maximize things to.
But with a 3rd monitor, you’re well into diminishing returns, it may even end up being a distraction if it becomes a Slack/Mail/web screen.
> I noticed that having multiple monitors resulted in me having a dedicated screen for email (usually the smallest, my laptop screen)
I'm currently using two external monitors, with my laptop docked and closed. I find the "dedicated screen for <distraction>" was a problem for me when I had my laptop screen open, because it's a different size/resolution/position than my actual monitors. On the other hand, I never have that problem with my dedicated monitors - in my mind they're a part of the same "workspace" because they're the same size, resolution, and positioned together - so I could see myself going to 3 desktop monitors one day.
I have two monitors right now and wish I had a third. One for code, one for documentation, and one for running whatever I'm working on (website, android emulator, etc). Currently I have the code monitor vertical and swap between workspaces on the horizontal monitor for the running thing and documentation.
I've solved this problem by having my email client (actually it's Slack in my case, but the same principle) and terminal share a screen. This works pretty well because I rarely want to use my terminal and chat at the same time.
I understand this and I don't want to argue against anyone's preferences. You know what's best for you.
I'm just pushing back against these out of touch fads. Most developers worldwide don't have a three-monitor setup. There is no proven correlation between quality software and 4K displays or mechanical keyboards (to name other fads). More importantly, the best devs I've known -- people I admire -- used tiny laptops with tiny displays, and shrugged when offered even a single external monitor; it just wasn't a big deal for them.
Mechanical keyboards don't make you a better coder.
But the only thing that will is writing lots of code -- over years and decades. And about 12 years ago, I started running into this anti-feature of human physiology known as "aging". And whereas in my think-I'm-so-l33t 20s I could bang out code on crappy desktop and laptop keyboards, by my 30s they were turning my hands into gnarled claws.
The remedy for this, for me, was a keyboard with Cherry MX switches. The crisp feedback let me know when a stroke was registered, so I unconsciously pressed each key less hard and was able to type faster with less pain.
Yeah, I wouldn't say having a mechanical keyboard makes your code any higher quality - that'd be pretty silly.
I think in general the thought is, if you care enough about your craft that you seek out refined tools, that care will be reflected in higher-quality development. Whether that's true or not, I don't know, but I'm inclined to believe there's a correlation.
I mean, it would be weird to visit a professional carpenter's house and see Harbor Freight tools, right?
Thanks for the reply. I think there is little to no correlation, but like the opposite opinion, I've no proof other than the anecdotal: the best hackers I've known didn't care about these things.
Other bizarre opinions I've read from Atwood and his followers: that you should be an excellent typist (this is also related to owning a mechanical keyboard). No. Just no. Typing speed is not the bottleneck when writing software. The bottleneck is my brain. I've never seen a project fail because people typed too slowly.
I do think there's a "CrossFit" mentality among the typer-coders who swear by mechanicals and end up with wrist braces - a kind of "more is more" approach that drives them to write lots of code, put in lots of hours, memorize innumerable details, and min-max their output in Taylorist fashion. It's optimizing for reps, versus mobility, stability, flexibility.
I have let my WPM drop a fair bit over time. I'm still relatively young yet, but I see no reason to go fast when I realize that most of the typing amounts to disposable bullshit. It's better to spend time thinking and developing thought patterns, and then just type a little bit to jog your mind and clarify. I allow myself to write some cheap code, but the point of that is to sketch, and the sketch should be light and quick, for the same reason that artists will say to favor long, confident strokes instead of chicken-scratch markings.
My main gripe is that as time has gone on, and I've racked up the RSIs, the brain-to-text latency has gone up notably.
This scares the shit out of me. I'm not in the older subset of programmers (<30 atm), and this has gotten to the point where the latency actually affects my workflow.
> Typing speed is not the bottleneck when writing software. The bottleneck is my brain.
I agree except with a caveat: the mechanical action of typing, formatting, refactoring, fixing typos and missing semicolons, and moving code around actually distracts the brain from the higher level task at hand. And when the brain is already the bottleneck, I don't want to make it worse by wasting brain cycles on mechanical drudgery.
As one might expect, I feel far more productive when I'm using languages and tools that require me to type less and refactor & re-edit code less. I think the language would matter less if I could just wish code onto the screen. Until then, learning to touch type (with as few errors as possible! not necessarily as fast as possible) and use editor features to make it more effortless is the next best thing.
It’s the opposite for me, having fewer monitors makes it harder to find a window. I have to go through alt-tabbing slowly to get to the one I’m after.
With three monitors, I know exactly where my windows are. If I have more than three windows that’s annoying, but I keep the extra ones on the middle monitor to simplify things.
I think the Python ethos applies directly to typing speed: "code is more often read than written".
I agree; if your typing speed is the bottleneck in getting code written, perhaps you should be coding smarter, not harder.
I think there is some wisdom that you should try to be a "good" typist, in that better typing skills reduce the risk of injury (RSI), but that's self-care/ergonomics, and while still very important, there are plenty of good software developers that hunt-and-pecked their way to an early retirement (and/or multiple carpal tunnel surgeries).
I went through a phase of getting mechanical keyboards, but I always found myself typing slower on them. The added travel time, even on the "low profile" mechanical keyboards, was making me type slower. I am back to scissor switches and I couldn't be happier, although I prefer low-profile keyboards in general. One of my favourite keyboards is the butterfly MacBook keyboard, but I know it gets mixed opinions.
> Typing speed is not the bottleneck when writing software. The bottleneck is my brain.
I see typing like the ability to do mental arithmetic: being able to do it well isn't the thing that's directly valuable, but it removes distraction from the activity that is valuable, and that ends up making it valuable as well.
Another way to look at it: the faster you think, the faster you need to type in order for it not to become the bottleneck (during the times where you're actually implementing an algorithm as opposed to designing it). Of course, that's not just a function of raw typing skill, but also of the tools you use and the verbosity of your programming language. (An amusing corollary of this is that for society at large, it's useful to have great hackers who are bad typists: they will be tempted to develop better tools from which the good typists can also benefit!)
I've never known a great developer who did hunt-and-peck typing though. I do know great developers who have their own typing system. They simply never bothered to learn the "proper" way to do ten finger typing, and that's fine (unless those typing systems are worse for RSI, which was the case for me personally).
I understand what you're saying, but that's simply not my experience (either with myself or observing others).
Note Atwood claims you must be an excellent typist, training yourself to become one. I find this fetishization of a mechanical skill bizarre. I'm not advocating clumsily struggling with the keyboard like an elderly person, but past the point of "I'm a decent typist", I find that's enough.
I find there's no correlation between the problem-solving ability needed to be a great software developer and being a fast typist of the sort Atwood et al advocate.
I file this under "weird things some programmers believe without evidence" ;)
Yeah, I think we agree on the excellent typist point. It needs to be fast enough, but I suspect what happens is that pretty much everybody using a computer sufficiently to become a great developer reaches "fast enough" naturally through implicit practice.
I agree, but would say that in real life most of the people I know who like mechanical keyboards like them for hand-strain reasons. They find them more comfortable to work with. The code written is the same either way, but things like 4K monitors (eye strain) and mechanical keyboards (hand strain) are better for the long-term health of the programmer. I've not gotten on the 4K train, but I do like having my keyboard for that reason.
Multiple monitors though is purely personal preference I think. While having the documentation on another screen is something I personally find useful, if anything it probably makes me lazier about trying to remember things.
I think it's valuable to be able to type effortlessly without having to think about it too hard. Typing is a distraction that takes brain power away from the important things.
Typing speed is probably not the bottleneck but I found that since I started touch typing I get a lot less headaches because I don't have to look up and down all the time.
Unlike in carpentry, the quality of our tools (keyboard and monitor, specifically) doesn't affect the quality of our output.
I think in many cases, people hide the fact that they're not competent behind high-cost professional tools, because laymen use them as a proxy for talent that they cannot evaluate.
I think that's also why many exceptional programmers just use a 5-year old laptop -- they don't need to compensate.
A day-trader having 12 monitors mounted on the wall doesn't make him profitable.
I still use my 11-year-old Thinkpad together with my newer Thinkpad. I use both laptops alternately. My productivity does not increase when I use the newer one.
That said, my extra 20 inch monitor helps me to visualize.
I've seen plenty of professionals using Harbor Freight tools, usually not carpenters but the tile saws and wrenches seem popular for professional use.
The correlation seems more likely to me that if you can afford the fancy tools then you've already had some level of success. Though there are those new mechanics who bury themselves in a mountain of debt buying a whole chest full of Snap-On stuff...
A professional will eventually wear even the best tools. And since they use those tools every day, they can keep an eye on wear. So it's not that crazy for them to use relatively cheap stuff.
Plus, a tradesman will also sometimes lose tools, drop them in places they can't recover them from and so on.
On the other hand, the day I want to fix some issue at home, the last thing I want is the tool I use perhaps once a year to be an additional source of issue, because it involves a round trip to the store.
In the last few years they've addressed that. Their Chicago Electric tools should generally be avoided but the Vulcan line of welders and Bauer/Hercules hand tools are all perfectly serviceable for light/occasional use.
The issue with heavy use is not that they don't work but that they're heavier, less ergonomic, and less robust/repairable than the name brands; if you can afford the name brands and will be using the tool until it breaks, fixing it, and then using it more then you'll want to go with the name brands.
20 years ago, when the GeForce 2 MX appeared, I switched to 2 monitors; in the next 6 months the software department (that was the name) switched to 2 monitors by contagion. I was in the infrastructure department; they just saw the benefits. Since then, I have never worked with fewer than 2 monitors. I can productively use 3 if I have them, otherwise (and most of the time) I use 2.
I am not a good developer, it is not my job, but I started coding on a Commodore 64 in text mode, then I did Cobol and FoxPro for DOS on an 80x25 screen with no problem. But when larger monitors appeared, I used them; when the possibility to use more than one monitor appeared, I used it. It is a case of technology helping you, not making you better but helping: I am more productive using 2 monitors than limiting myself to just one. Because of this, I use the laptop (1366x768 screen) only as a portable email tool; everything else is on a pair of 24" monitors, in the office or at home. Sometimes I pull a monitor from another desk (in the office) or another computer (at home) when I do specific work that benefits from 3 monitors, but it is not a matter of preference, just specific use cases where 3 is better than 2.
My favourite dev environment was a 7" Android 4 device running Debian. I got plenty done with an external keyboard.
I bang away at my 13" 2013 MBA these days, and the only real gripe I have is the lack of a delete & backspace key combo: I've never gotten comfortable without it.
That said, the only reason I could possibly use more screen real estate is web debugging. But to me that's more of an indictment of the environment I'm "coding" in.
The only time I ever needed two monitors was back when I was writing 3D games on a 3dfx (before Nvidia headhunted their engineers) and needed to debug something while running full screen.
While I understand this argumentation, to me, monitors, their size & their number have always been pretty much... meh. Instead it's the quality of the monitor itself (refresh rates, contrast, brightness) that matters.
> ... the only real gripe i have is the lack of delete & backspace keys combo
I’m not sure I follow. Are you complaining about the lack of a dedicated Delete key on Macs, having to use only one key for both Backspace (Delete key) and Delete (Fn + Delete keys)?
That personally bothers me a lot, as does the lack of home and end keys. Yes I know there are key chords to accomplish the same function, but having a dedicated key as part of your workflow makes a big difference. Maybe if I worked on my Mac 100% of the time I wouldn’t mind, but I only use a Mac about 20% of the time and it is incredibly infuriating.
Agree with other commenter who said you know what's best for you. Good job on iterating toward an optimal setup!
But I will tell you why multi-monitor setups aren't the best for me. It doesn't feel like having a nice big desk to work at; rather, it's like having a separate desk for each monitor, and I have to move from one to the other to use it. With more than one monitor, I have to move my head, or my entire body, to face straight towards whichever monitor I'm currently looking at. I've tried going back to multi-monitor setups many times, and every time I get tired of it faster due to straining my neck, eyes, elbows and shoulders with all that turning back and forth.
For me, it's one very nice monitor, with my laptop plugged in in clamshell mode (although now I leave my rMBP cracked so I can use TouchID).
I've also been using a window manager (Moom) with hotkeys to be able to set up three vertical windows on my screen. That seems to be the sweet spot for me: I can have multiple different code editors, or editor+terminal+web, or throw in email/slack/whatever into the mix. (I can also split a vertical column to two windows to achieve a 2 row x 3 column layout, and lots of other layouts, 1x1 vert/hor, 2x2, centered small/large...) I feel like I've arrived where you're at, my perfect setup!
I also still enjoy the 13" rMBP screen, although I can't get to 3 columns, and lately the keyboard hurts my wrists after extended usage. I use a Kinesis Freestyle 2 with the monitor+rMBP which has been absolutely fantastic for typing ergonomics.
I don't want that. There was a time when I used multiple monitors, but I've found that just working on a laptop works better for me. It's less distracting, and I find switching between windows to be both faster and less disorienting than turning my head to look at another monitor.
I can definitely understand other people preferring multiple monitors, but not everyone has the same preferences.
I appreciate your fervour. I am probably an exception, but I like to code on just a single 15 inch MacBook. I switch screens by pressing key combinations. I believe it's faster, but also more convenient than moving your head around constantly.
For me everything must be accessible by various key combos. Once I have that working I hardly need to use the trackpad or mouse anymore.
The truth is that most people on my team work with dual or triple screens.
I have always used a single monitor and I am productive like crazy. I mainly need a code editor and terminal. Sometimes switch to a browser and back. And that's enough. More monitors doesn't automatically imply more productivity IMO. Maybe in some specific cases. You can't focus on all monitors simultaneously anyway.
Have you used a 4K display, for at least a few days? If you have, I still disagree with you, but if not, I’m going to completely ignore your opinion, because I find the difference in how pleasant it is to use a good screen just so vast. Sure, you can do things on a lousy monitor, but it’s terrible and you’ll hate it. :)
(My first laptop was a second-hand HP 6710b at 1680×1050 for 15″, and that set me to never accepting 1366×768. So my next laptop was 15″ 1920×1080, and now I use a 13″ (though I’d rather have had 15″) 3000×2000 Surface Book, and it’s great. Not everyone will be able to justify the expense of 4K or similar (though I do think that anyone that’s getting their living by it should very strongly consider it worthwhile), but I honestly believe that it would be better if laptop makers all agreed to manufacture no more 15″ laptops with 1366×768 displays, and take 1920×1080 as a minimum acceptable quality. As it is, some people understandably want a cheap laptop and although the 1920×1080 panel is not much dearer than the 1366×768 panel, you commonly just can’t buy properly cheap laptops with 1920×1080 panels.)
I'll go one step further: I used the LG 27" 5K Display for two whole years before returning to a 34" Ultrawide with a more typical DPI.
Obviously I preferred the pixel density and image quality of the high-DPI screen, but I find myself more productive on the 34" Ultrawide with a regular DPI. (FWIW, LG now has a newer pseudo-5K Ultrawide that strikes a balance between the two).
I look forward to the day that monitors of all sizes are available with high DPI, but I don't consider it a must-have upgrade just yet.
Also note that Apple made font rendering worse starting in macOS 10.14 by disabling subpixel AA. Using a regular-DPI monitor on Windows or Linux is a better experience than using the same monitor on macOS right now. If you're only comparing monitors based on macOS, you're not getting the full story.
I also have 3440x1440 on my Dell monitor at home, and I love it.
My work monitor is a really nice 27" 4k LG monitor, which a coworker picked out. He's a real monitor specs nerd and made a lot of assertions like the OP. The scaling issues are endless and really bother me, and I don't notice the higher PPI at all. I much prefer the ultrawide Dell - it gives me a feeling that I don't even need to maximize my windows and I can still have lots of space.
It takes up zero desk space (one monitor arm that lets me adjust it anytime I want), and I don't need to rotate it. I've never found that a useful thing to do.
On the other hand, when you're done working it's amazing for a flight simulator or other games that support the aspect ratio properly.
Not parent poster but, as somebody who works on macOS, my problem is that the switch between the high-density laptop screen and a “pixelated” external one is extremely jarring. At one point I had an additional, average Dell monitor and every time I moved my eyes I cringed. After a couple of days I just removed it - better to have fewer screens than forcing my eyes to readjust every few minutes.
So yeah, if your primary device is a modern Mac, you really want a high-res, high-density screen.
Yes, Apple never completely embraced the somewhat hacky techniques of font rendering (hinting and subpixel AA), thus their fonts always appeared less crisp.
Not really.
There is an end point.
Also, resolution is not free; this is a tradeoff. When you push more pixels on screen, it means more work for the GPU (or CPU in some cases), and more loading time and space for all those high-DPI resources...
At this point I clearly prefer lower latency and higher framerate over more pixels.
Sure, to a point. But the step-up in question here (1080p to 4K, at sizes like 15–27″) has clearly visible benefits.
And sure, resolution isn’t free, but at these levels that was an argument for the hardware of eight years ago, not the hardware of today. All modern hardware can cope with at least one 4K display with perfect equanimity. Excluding games (which you can continue to run at the lower resolutions if necessary), almost no software will be measurably affected in latency or frame rate by being bumped from 1080p to 4K. Graphics memory requirements will be increased to as much as 4×, but that’s typically not a problem.
That's true, undeniably. But I think the better tradeoff still is to go up in size: IMO the optimal monitor size is 38". Big enough, but not too much head turning. Would I get a sharper 38" if possible? Sure. But I wouldn't compromise on size to gain higher DPI.
I have a laptop with a 13" 3200x1800 monitor; not quite 4k, but higher resolution than my eyes' ability to discern fine detail. My other laptop is 1366x768. The other laptop is a lot better for a wide variety of reasons (and also an order of magnitude more expensive) but the display resolution is genuinely something I don't give a crap about. It isn't terrible and I don't hate it. There's plenty of stuff I don't like about the display; narrow viewing angle, poor contrast, prone to glare- but the resolution is fine.
Having used CRT monitors, 1920x1080 displays, 4K displays and 5K displays, as well as various Retina Macbooks over many years, mostly for coding, here's my opinion:
The only good solution today is the iMac 5K. Yes, 5K makes all the difference — it lets me comfortably fit three columns of code instead of two in my full-screen Emacs, and that's a huge improvement.
4K monitors are usable, but annoying, the scaling is just never right and fonts are blurry.
Built-in retina screens on macbooks are great, but they are small. And also, only two columns of code, not three.
One thing I noticed is that as I a) become older, b) work on progressively more complex software, I do need to hold more information on my screen(s). Those three columns of code? I often wish for four: ClojureScript code on the frontend, API event processing, domain code, database code. Being older does matter, too, because short-term memory becomes worse and it's better to have things on screen at the same time rather than switch contexts. I'm having hopes for 6K and 8K monitors, once they cost less than an arm and a leg.
So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.
> So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.
This opinion seems bizarre to me. You start by offering personal (and valid) anecdote, then end up saying "I don't think you can develop [...]". But this flies in the face of evidence. Most people by far do not use your preferred monitor setup (iMac 5K) and in my country a vast number of developers use 1366x768 to develop all sorts of high quality software.
It's one thing to say "as I grow older, I find I prefer $SETUP". No-one can argue with that, it's your opinion (and it might very well become mine as I grow... um, older than I already am!). It's an entirely different thing to claim, as you do here and I think TFA does in similar terms, "you cannot prefer lower tech setups", "you cannot develop software this way", "it's very difficult to develop software without $SETUP". The latter is demonstrably false! I've seen it done, again and again, by people who were masters at their craft.
I don't think they were doubting that somebody does develop in those random setups, they were disagreeing with the people that say it doesn't matter and you can code anywhere. In your quote, a royal you.
But they are not random setups. They are extremely common setups in my part of the world. People -- who are pretty good at what they do -- can and do develop using these tiny screens. In this regard, "it doesn't matter". Or taking less literally, they wouldn't complain if they got a better monitor, but it's not the primary concern for them. So taking a cue from TFA's title: "no, it's not time to upgrade your monitor".
Following your logic, I can't claim anything, because there is always someone, somewhere, who will come up with a contrary opinion.
I do respect your opinion, but I still hold to mine. I also think this discussion won't lead anywhere, because we are glossing over the terms. Not every "developer" is the same, not every "software" is of the same complexity. I can fix CSS in a single 80x25 terminal, I can't do that when thinking about changes in an ERP system.
Note I do not dispute your opinion. You're entitled to it and you know what works for you.
Regrettably, following (my) logic does mean you cannot say "you [the generic you] cannot develop like this", because this is indeed easily disproven. People can and do (and sometimes even prefer to). That's the problem with generalizing from your personal opinion ("I don't like this") to the general ("people cannot like this").
Ya, the iMac 20" 5K is deal. I got an LG 28" 4K just to interface with a MBP..it was a painful compromise but stand alone 5K monitors are just too expensive right now.
When we are talking pixel count, we have to talk about size of the display also. a 28" 4k is acceptable, a 40"+ 4K is best used as a TV.
The best display I use right now is a 11" iPad Pro with a refresh rate at 120 Hz. You really can feel it, especially for stylus work.
Atwood is a bit of a prima donna, and part of being a blogger is to make big statements.
One of my roles for a long time was speccing/procuring computer equipment (later overseeing the same) for a huge diverse organization. I took feedback, complaints from users and vendors, etc. People are passionate about this stuff... I was physically threatened once, and had people bake cookies and cakes multiple times as a thank you for different things.
The only monitor complaints I recall getting in quantity were: brightness, I need 2, and I need a tilt/swivel mount. Never heard about resolution, etc. Print and graphic artists would ask for color-calibrated displays. Nobody ever asked for 3, and when we started phasing in 1920x1080 displays, we literally had zero upgrade requests from the older panels.
You don't need a good display until you have used one. It's like glasses: you think you've got perfect vision, and then you get glasses and it's night and day in comparison. You don't know a good display until you have seen one. The thing with high-res monitors is that you should upscale, or everything will look tiny.
IMO, there are two key criteria for monitors: real estate and pixel density. In some cases, you can't get both affordably.
I have had 15" laptops with 4K displays for some time now. I love the pixel density. But I can't do certain types of tasks on them because I end up getting real estate anxiety. I feel so constricted on a laptop screen, even when I add a second monitor.
My desktop has 2 x 27" 4K monitors running without scaling. So I have plenty of real estate, but the text could look nicer. Having said that, I don't miss sharp text in the same way that I miss real estate when using my laptops, at least from a productivity perspective.
I don't think a 5K screen is an answer for me, because my first urge would be to try to use it without scaling for more real estate.
On the other hand, a pair of 27" 8K screens (does such a beast even exist, and is it affordable?) would be ideal, because 1:1 scaling on such a beast is impossible at that monitor size, but 200% scaling would basically give me the same workspace real estate that I have now but with super sharp text.
A long time ago, resolutions got better, but when we got more pixels...we didn't make the characters smaller, we just made them more crisp. What changed? Why is the transition from 110 PPI to 220 PPI being handled differently than the one from 50 PPI to 110 PPI?
> What changed? Why is the transition from 110 PPI to 220 PPI being handled differently than the one from 50 PPI to 110 PPI?
Well that is a user setting, and I tend to think the first retina iPhone is what started the trend.
Also, at certain screen sizes, 1:1 scaling is just impractical due to limitations of the human eye.
My 27" 4K screen is pretty much at my personal limit in terms of what I can handle at 1:1 scaling for my normal viewing distance. Had I gotten a 24" 4K screen, I'd have to set the text scaling to ~125% which would also translate into some lost screen real estate.
> Also, at certain screen sizes, 1:1 scaling is just impractical due to limitations of the human eye.
Scaling isn't even relevant. It makes sense to talk about scaling pixel artifacts, but not vector ones. The font is 8pt or 20pt; you can configure that on your own. 1:1 scaling is just some arbitrary, artificial fixed pixels-per-letter choice.
A 24" 4K screen (basically a 24" imac) had text the same size as a pre-retina 24" imac, they just used more pixels to render that text.
Windows 8+ does; Windows 10 has no problem with resolution independence. Some legacy apps that run on Windows break, but nothing modern like Visual Studio, Chrome, etc.
Windows has issues with HiDPI once you have multiple screens with different densities, and it's not just with legacy apps.
In any case, Windows is just using a scaling feature. But when you scale, you are losing workspace in exchange for sharpness.
When 4 physical pixels translate to 1 logical pixel, you've actually lost 3 pixels of potential real estate on a screen size where your eyes can actually discern those pixels.
On a 15" 4K monitor, losing that real estate is not a big deal, because >90% of people can't easily read or see anything scaled at 1:1 on a screen that size. When you scale that screen at 200%, you're basically using a 1080p screen with sharper text. You don't gain any real estate at all from the higher resolution.
On a 40" 4K screen, it's a whole other story. The text may not be sharp, but you can have way more windows open on that screen, which makes it easier to multitask. It's like having a grid of 2x2 20" 1080p screens.
Addendum: my visual limit is 1080p @ 1:1 on a ~14" screen, which is why I am fine with no scaling on my 4K 27" screens.
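To put rough numbers on the real-estate vs. sharpness trade-off described above, here is a quick back-of-the-envelope sketch in Python (the panel sizes are just the examples used in this thread):

    # Physical pixel density vs. logical workspace at a given scale factor.
    from math import hypot

    def workspace(width_px, height_px, diagonal_in, scale):
        """Return (physical PPI, logical resolution) for a panel at a given scale factor."""
        ppi = hypot(width_px, height_px) / diagonal_in
        logical = (round(width_px / scale), round(height_px / scale))
        return round(ppi), logical

    print(workspace(3840, 2160, 27, 1.0))  # 27" 4K unscaled: (163, (3840, 2160)) - maximum real estate
    print(workspace(3840, 2160, 15, 2.0))  # 15" 4K at 200%: (294, (1920, 1080)) - a sharper 1080p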
If you use an OS that handles high DPI very well, and where 98% of all apps handle it just as well (including resolution scaling), it is an absolute joy to my eyes to be able to use 4K at the same effective resolution (so basically 1080p, but double the resolution, double the sharpness).
Every time I use a 'non-retina' type of display (like an old laptop I use for testing or my 19" 1080p monitor), it feels like I'm looking through some dirty glass because of the blotchy effect.
I tried 4K in one Linux environment (and documented the experience[1]), and according to numerous responses, my situation was not unique: if you try 4K on any Linux environment and don't enjoy everything being tiny, it's not a fun time trying to get everything to behave the way you can with Apple's built-in resolution scaling options.
I certainly agree that some like them, while others don't.
And some grow to like them - I used to scoff... before I tried it for myself.
I think as well, it depends on your eyesight, and also your workflows and apps. I use Rider (and previously Visual Studio), and these have a lot of panes/windows - it's really nice to be able to have the code front and centre, with debug output, logs and a command prompt in another screen, for example. Another example would be to have zoom/webex in one screen, while I'm taking markdown notes in another.
A good while back, I moved to a dual monitor setup at home (1x 3k in landscape, 1x 1200p in portrait), and a triple monitor setup at the office (1x 1080p in landscape, 2x 1200p in portrait). I also use a Dell screen manager, so I can easily set up regions that windows can snap to - for example, the portrait displays are usually split in 2 horizontally.
The triple monitor setup is admittedly gratuitous, but I'm never going back from a dual setup - it's just so convenient to have everything I need always there, instead of constantly flipping back and forth through the many windows I invariably have open. It feels like I'm context switching much less.
No problem. Bear in mind I'm from Latin America, so we're usually several years behind the US (i.e. older tech, sold at excessive prices).
My work laptop is a Dell Latitude 7480. Its display resolution is 1366x768. I normally use it with an external monitor, because the screen is not only small but also low quality. This... fine... piece of hardware was bought by my current employer, I think 2 years ago.
In all workplaces I've seen, either you get a macbook (with their high quality display) or an entry level laptop from Dell/HP/Lenovo (or some no-name brand) with a low quality 1366x768 display, which is what passes for entry level where I live. In many companies, macbooks are usually given to people who can explain why they need them, or to reward good performance. However, newer startups seem to default to macbooks.
Put yourself in the mindset of a typography geek. Then, by default, you will care about almost all of these things. I'm not saying you should care about all of these things all or most of the time, but that's the correct mindset to put you in sync with the author's conclusions (mostly -- I don't really think 120Hz is that big a deal).
120 Hz is a huge deal. It makes every mouse move better, and that alone is worth it. It reduces the pain of using poorly designed software that adds extra frames of latency to everything. It makes it easier to track and read moving/scrolling content, and it reduces judder when playing low frame rate content (e.g. 24 Hz Hollywood movies).
I'm surprised the author doesn't mention backlight strobing, which is another huge upgrade. Kind of hard to describe to people who have never seen it though, as is the case with most of these improvements.
The second and third monitor recommended in the article have 4ms latency, so there's essentially no reason to buy them.
There are good 4k 120hz monitors with strobing for way less than the $2000 mentioned. E.g. the zisworks kit for Samsung U28H750 (first available already in 2017): http://www.zisworks.com/shop.html
IMO 120 Hz monitors are overrated for pretty much everything except for FPS games (and similar fast-paced interactive titles).
I have a 144 Hz display and I would not want to go back to a 60 Hz display for gaming. At some point, my monitor changed to a 60 Hz refresh rate without telling me after some driver update. The first time I played Overwatch after that happened I could immediately tell that something was wrong, since the game just didn't feel as responsive and smooth as it usually does.
Another OW player in HN? :)
I feel the same about the 144Hz monitor I have; it's only really worth it for playing OW. The rest of the time it's nice, but I feel like I'm not getting enough out of it.
Maybe next monitor I get will be 4k, but not sure for now.
After switching to a 144 Hz monitor, the difference from 60 Hz is so clear even in desktop mouse movements that it's immediately obvious if the refresh rate has changed.
I get where he's coming from, but then again, that's not me. The cheapest monitor he recommends costs as much as my entire setup would cost in its current, used state. I'm not gonna buy a single monitor for that price - and if I did, I wouldn't downscale it to the resolution [edit: real estate] I had before!
I've worked on a range of setups from dual screens, 120hz to little CRTs and laptop displays, and even a bit of graphing calculator and smartphone coding. My conclusion is that the difference depends mostly on the workflow, and what the workflow changes is mostly how much information you're scanning.
If you're scanning tons of text, all the time, across multiple windows, you need a lot of real estate, and in times of yore you would turn to printouts and spread them over your desk. A lot of modern dev goes in this direction because you're constantly looking up docs and the syntax is shaped towards "vertical with occasional wide lines and nesting" - an inefficient use of the screen area. Or you have multiple terminals running and want to see a lot of output from each. Or you have a mix of a graphical app, text views and docs. A second monitor absolutely does boost productivity for all these scenarios since it gives you more spatial spread and reduces access to a small headturn or a glance.
If you're writing dense code, with few external dependencies, you can have a single screen up and see pretty much everything you need to see. Embedded dev, short scripts and algorithms are more along these lines. Coding with only TTS(e.g. total blindness) reduces the amount you can scan to the rate of speech, and I believe that consideration should be favored in the interest of an inclusive coding environment. But I'm digressing a bit.
For a more objective ranking of concerns: pixel density ranks lower than refresh rates, brightness and color reproduction in my book. If the screen is lowish res with a decent pixel font that presents cleanly at a scale comfortable to the eye, and there's no "jank" in the interaction, it's lower stress overall than having smooth characters rendered in a choppy way with TN panel inverted colors, CRT flicker, or inappropriate scaling.
Book quality typography is mostly interesting for generating a complete aesthetic, while when doing computing tasks you are mostly concerned about the symbolic clarity, which for English text is reasonably achieved at the 9x14 monospace of CP437, and comfortably so if you double that. There's a reason why developers have voted en-masse to avoid proportional fonts, and it's not because we are nostalgic for typewriters.
And yet for some reason we have ended up with tools that blithely ignore the use-case and apply a generic text rendering method that supports all kinds of styling, which of course makes it slow.
> There's a reason why developers have voted en-masse to avoid proportional fonts, and it's not because we are nostalgic for typewriters.
This is sad in itself. Proportional fonts are vastly superior to typewriter fonts for presenting code. The kerning makes for better readability, even if it comes at the expense of the ability to do ASCII art.
I use a 55" curved 4k TV as my monitor. It's not pixel density that's important for reading text, but contrast. The best thing for coding is a high contrast bitmapped font with no smoothing.
My monitor is still available at Walmart for $500. It's like an actual desktop when you can spread out all your stuff.
> Suboptimal...No. Just no. The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment.
I've realized that a stable environment does wonders for productivity. What you got used to (stable) is path-dependent and personality-specific. But it doesn't matter. What matters is to train yourself to sit down and work. If you're opening that laptop lid and thinking about visiting Hacker News first, shut it down, get up, walk away from the desk and then retry. At this point, I feel like doing a "productivity, productivity" chant / rant a la Steve Ballmer's "developers, developers" chant:
In addition to what thomaslord said [ https://news.ycombinator.com/item?id=23554382 ], 4K monitors tend to be widescreen. 1280x1024 user here; you'll take my aspect ratio when you pry it from my cold, dead hands.
I also found that 16:10 is a better ratio than 16:9 when working with code.
I wish there were more 1920x1200 or 2560x1600 displays...
Lacking those options at reasonable prices leaves me with the option of using rotated 2K monitors, though 27" ones are uncomfortably tall and 24-25" ones are a little too high in pixel density.
Huawei's Matebook's 3K (3000x2000) display with its 3:2 (1.5) ratio at almost 14" size is an amazing sweet spot though.
I wish they would have it in 28" size for desktop use :)
In the end I'm working on 27" 2K and 5K iMacs for about a decade now and I see it as a good compromise, despite the aspect ratio, but I quite like my 30" HP ZR30w with its 2560x1200 resolution. Unfortunately its backlight is getting old and uneven :(
Speaking of 3:2 ratio, we have to mention the Microsoft Surface line of tablets, laptops and convertibles.
Definitely overpriced for a developer machine, but there is also the Surface Studio 2, with its 4500 x 3000, 28 inch, 192 DPI display, 192 being exactly 2x the "standard" 96 DPI.
Ah, and one solution to the font sharpness issue is to use bitmap fonts, indeed.
I haven't found good ones, though, that are available at 16px or larger sizes, which are necessary for 2K or 4K displays.
Also they are typically quite wide and don't have narrow options.
I have a 1920x1200 at home, 16:10 instead of the usual 16:9. I feel the same way, I won't be upgrading that monitor until it dies. That little bit of extra vertical space is so welcome.
Nope, bad aspect ratio. Let me know if you find a 1920x1536, though. (1600x1280 would also be nice, and a 2560x2048 would be a 4K monitor that's actually useful, although thomaslord's objections still apply.)
Edit: actually, wouldn't 1440x1280 (aspect ratio 1.125, which is even less than my usual 1.25) be a better use of such an LCD panel, rather than an obviously useless 1280x1440 (with an aspect ratio less than one)?
Yeah, pretty much. Also, standard video aspect ratios like 1280x720 fit the full width of the monitor with space above and below for toolbars and video player UI, which is... basically impossible to convey how frustrating it is to watch video on a widescreen monitor when you've tried a proper one for comparison. Any kind of fullscreen (ie maximized window) games or multimedia is similar.
I understand the bit about a video players, but you're not going to want to play a game in anything except exclusive fullscreen due to the massive input latency incurred when using windowed or borderless windowed.
Depends on the games you're playing, honestly. That "massive" latency is about 30ms from what I can find, which you're not really going to notice when playing a game like civilization or hearthstone.
We could still be programming at 640x480 like we did back in the early 90s. Thankfully, technology marched on and we went to 1024x768 and then even better resolutions. One would think that technology would march forward until we had ink on paper-resolution displays, because why not? But of course, we could still make do with 640x480 like we did way back when.
The bigger issue is that, as we've been doing for the last 40 years, a cutting-edge 2X display should eventually become the new 1X display, while the former 1Xs become obsolete. This makes building software a bit easier: even with resolution independence, it is difficult to support more than two generations of display technology (e.g. right now we have 100 PPI, 150 PPI, 220 PPI, even 300+ PPI at the bleeding edge).
I get what you're saying. In general, though, I disagree with the notion that newer tech makes old tech automatically -- always -- not good enough. It's the mentality that makes a lot of people rush to buy the latest Kindle (or mobile phone, or whatever) when the one they own does everything they need.
More importantly, it's one thing to say "I prefer this new tech/resolution/gadget" and another to claim "how can you work like this?" (Where "this" can be "without a mac", "without a mechanical keyboard", "without a 4K monitor", "without three 4K monitors", etc).
Software is developed successfully without any of those. It's not only not an impediment, it's not like a sort of martyrdom either. So no, it's not time to upgrade that monitor.
Someone else in this thread commented that for many devs, their gadgets are a proxy for actual skill. It's easier to show you have a 5K monitor "like all hackers should" than to actually be a good developer.
I just like 200+ PPI displays (24" 4K is the same pixel density as 27" 5K) because I have to spend a lot of time reading text and like not seeing pixels and/or anti-aliasing artifacts. Once you get used to it, going back is like going back from an iPhone 4 to some Windows Mobile 6.1 phone.
It should also be cheap enough now; at some point fabbing a 200+ PPI display will cost the same as an older one, and it won't make much sense to keep making the lower-res displays anymore (like it doesn't make sense to fab 32MB DIMMs anymore).
Don't get me wrong, I agree that when the tech gets better, with few or none of the downsides others mentioned, going forward is the only direction. I won't actively shop for older tech once the current one gives up the ghost.
I just disagree that it's "time to upgrade your monitor", or that the current tech is bad or makes it hard to be productive.
After trying virtually every display setup imaginable over the years, I've settled on a single, large, high-resolution display as the optimal setup. While I have written a lot of code on small, low-resolution displays, there are definitely disadvantages to that.
I don't understand the value of multiple displays for code work. Just about everything that I could do with multiple displays can be more ergonomically achieved with multiple logical desktops. I can swap a desktop with a keystroke rather than pivoting, and that workflow translates well from laptops to large workstations.
I use an over-the-top number of monitors to... monitor. The purpose is to be able to look at dynamically updating information without touching my mouse or keyboard. I have found myself making an omelette while watching a large number of displays in the distance, ready to panic-leap into action.
When I code I notice a few things about the multi-monitor setup. My cursor can travel 10 physical feet of screen space, and it's simply inefficient to point. The ergonomics are terrible. The entire setup basically feeds inattention and an ever-sprawling amount of crap being left open. More than that, I think having multiple large rectangles shooting light into your eyeballs kind of screws with your circadian rhythms.
When I code I use a laptop and enjoy how briefly my hands leave the keyboard before returning, because what little pointing I do can be done extremely quickly thanks to the small size of my workspace. Bigger-is-better is oversold: a workspace should match the work, and too small is just more common than too big.
It's not meant as an argument, but as an illustration: people who I respect more than the author of TFA, and whose skill and accomplishments I've witnessed first hand, weren't eager to upgrade from tiny screens. It's not that they would've refused a better screen if given one, it's just that they didn't ask for it and it wasn't a pressing issue. Indeed, for them it was not "time to upgrade the monitor".
It taught me the valuable lesson that creating software, to them, was mostly about thinking, not about the gadgets you use to write it down.
Like another commenter said, it seems gadgets work as proxies for talent for some developers -- "you should use a mechanical keyboard and a 5K three-monitor setup, like real hackers do" -- because actual talent is harder to gain and demonstrate. A sort of cargo culting by accumulating tech gadgets.
Why are you talking about screen size when the article is talking about pixel density? You can have a lot of pixels but low pixel density because your screen is huge, while my first 200+ PPI display was on an iPhone 4, my watch has a 200+ PPI display as well. I don't think many would accept phones or watches with lower pixel densities these days, but somehow the display we use for work is different?
You are right that there is a cargo culting going on, but it is going on both sides. There are a lot of strawmen out there as well.
You are still conflating display size with pixel density, so I'm not sure what you are arguing against. You talked in another post about how you had to scale your displays because your OS didn't support resolution independence; is that what you mean by tiny?
I'm really baffled by your comment. I never said that about scaling or my OS not supporting resolution independence. I think you're confusing me with someone else, which might explain the disconnect between your posts and mine. (It's understandable because my initial post seems to have gathered a ton of responses, so it's easy to get confused about who said what. Probably my most successful (?) comment in all my years here.)
I have a hard time using three monitors effectively, so that in the end they are distracting for me. Probably you need some particular personality tics to make proper use of them.
1. Main screen. For the code editor, intense web browsing, etc.
2. Secondary screen. For debugging visuals (since I work on web stuff, it usually hosts a Chromium window; for a mobile dev, I imagine it would be an emulator/simulator), documentation referencing (with the code editor open on the main screen), etc.
3. Third screen. For all comms-related things: MS Teams/Outlook/Discord/etc.
I didn't mention terminal, because I prefer a quake-style sliding terminal. For a lot of devs, I imagine that having a terminal on their secondary screen permanently would work great as well.
P.S. Not that long ago, I realized that the physical positioning of monitors matters a lot (to me, at least) as well. I used to have 2 of them in landscape orientation side-by-side and one in portrait orientation to the side. It was fine, but it didn't feel cohesive, and I definitely felt some unease. Finally got a tall mounting pole, and now I have the landscape-oriented monitors one on top of the other instead of side-by-side (with the rest of the setup being the same). That was a noticeable improvement to me, as it felt like things finally clicked perfectly in my head.
To clarify, my beef is with Atwood's opinion (and similar opinions) that you must use a three monitor setup, otherwise you're doing something wrong. Of course I understand for many devs this setup works, in which case more power to them!
I just dislike being told unless I follow these fads I'm a subpar developer. I don't own a mechanical keyboard or a three monitor setup. I don't own a 4K monitor (I suppose I eventually will, when they become the norm). When Apple came up with retina displays, I didn't feel I had magically become unable to write code because my display at the time was 1440x900.
> my beef is with Atwood's opinion (and similar opinions) that you must use a three monitor setup
It's weird to me to specify the number of monitors given how they come in a vast range of shapes and sizes.
For example, my dream setup used to be a single 55" curved 8K monitor. That's the rough equivalent of a 2x2 grid of 27" 4K monitors (I currently have two 27" 4K side by side in landscape @ 1:1 scaling).
The only problem with my so-called dream setup, though, is that I don't think my computer glasses, which are sharp only for surfaces between 21" and 27" away, would let me see everything sharp from corner to corner on that monitor, which sucks.
As a general rule, I like to have a computer that is sort of the average crap a person who doesn't care that much about computers might have around. Then, if the stuff I make works on that, I know it's going to work on the upscale stuff as well.
Also then when one of my disastrous kids destroys it I don't feel bad.
I used to use this exact setup, but specifically eliminated monitor #3 as I felt it was counterproductive to have an "always on" comms monitor. These days my main monitor has one workspace, while my secondary has the normal secondary stuff in one workspace, and comms in another.
I found it to be less distracting, and the two screens are more physically manageable and easier to replicate if I change settings (cheaper too!). The only thing I will change is whether I'm in landscape/landscape or landscape/portrait. I can never make up my mind about what I prefer.
This is my exact layout too! Though the screen with the code editor is ultrawide, so with window tiling I have the editor and the terminal side by side.
X230 at 1366x768 checking in, can confirm, no problem at all.
I'm primarily a C programmer. 1366x768 limits my functions to about 35 lines at 100 columns wide. My font is Terminus, sized at 14.
I thank my X230 for forcing me into a paradigm where functions are short and to the point. I watch my co-workers flip their monitors vertically and write code that goes on for days. While it's not my style, I'm thankful I've been forced to make do with limited space, and it has trained me well.
My laptop has a 4K screen, but I switch to an external 27" QHD (2560x1440) display when I'm at my desk and boy ... I used to like that external monitor. Now all I can see, all I notice, is how much worse and more pixelated text looks on it than when I'm reading on the laptop's screen. It practically looks blurry!
So yeah, nobody needs a 4K display. But I would think most people don't know what they're missing. (:
I think lots of fonts actually look worse on hidpi screens, especially on Windows, probably because these fonts were specifically designed for low resolution screens. And most applications use hard-coded fonts that you can't change. Text is maybe a little easier to read on hidpi screens, but it is less aesthetically pleasing.
Linux can be made to look good in that respect, but the situation for scaling legacy applications is pretty bad.
Yes, this. Even if you have only one hidpi screen, Windows still looks bad, because the fonts look terrible when rendered on a hidpi screen. This is one thing keeping me on "lodpi" screens for now.
Acclimation is a huge factor: if you're used to writing on a small laptop, that will seem normal, but few people won't see a benefit from moving to a larger display, even if they don't expect to. That's one of the few durable research findings over the decades.
Multiple monitors are slightly different: the physical gap means it’s not a seamless switch and not every task benefits.
I agree, and not. I like having many windows side-by-side (code, docs, thing I'm developing), but I also like having more vertical space. I keep one of my monitors vertical for coding, although that's a little too tall so there's some lost space at the top and bottom (9:12 would be better for me).
Really I want a single plus-shaped monitor that can act in several display modes:
- mimic 3 monitors. Left and right are 4:3 (ideal for single application), center is 9:12 and larger. Shape: -|-
- Single vertical monitor. Same as above but with left and right "virtual monitors" turned off for reduced eye strain. Shape: |
- Single ultrawide horizontal monitor (connect the left and right parts with the strip in the middle, turn off unused pixels at the top and bottom). Shape: ---
Yup, this kind of stuff is very much based on preference. I think some people have tighter comfort deadbands and others are tolerant of anything. These can be the same person, on different days / differing amounts of sleep / different tasks / different lunches / different music.
For example, I have a three-monitor setup: two 1080p screens and a central 1440p screen. Mechanical Cherry Blue keyboard and a nice gaming mouse. Big desk. One of those anti-RSI boob mousemats. Nice headphones. Beefy computer. Ergonomic office chair. Everything at the right height. Quiet room. Next to an open window, but one which is in shade.
Some days I will happily chug away writing reams of code.
Other days I will get nothing done and every minor annoyance will bring me out of focus; background sounds, glitches in my desktop environment, almost inconceivable hitches in computer performance, my chair feels wrong, the contrast of all text is too high, it's too dark in here, it's too bright, this is all bullshit, I'm so done with computers...
And on other days I'll get even more done, hunched like a Goblin in my living room armchair with my legs tucked beneath me, working through to the early morning, on my tiny 13 inch dell laptop, vaguely aware that I have searing pain* through one wrist, having not eaten for 12 hours, but completely and utterly content and in flow writing code or a paper.
Humans are crazy fickle.
There's no Platonic Ideals when it comes to Crushing It(tm) as a software developer.
I think learning the art of mindfulness and meditation does more to help you focus than expensive equipment fads and micro-optimizing your environment.
* Before anyone mentions it: yeah, I'm aware RSI is Not Good. I've got a bunch of various remedies for it. Ironically, I've found that bouldering helps with it the most. Just not able to do that during lockdown :(
By 'like them' are you talking about the full set of tradeoffs, or the raw visuals? Because I'd be very surprised if someone actually disliked how an all-else-equal 4K screen looks, compared to 1080.
Yes, the full set of tradeoffs. My comment is unclear. If someone gave me a 4K monitor for free I'd use it, but not having one is not an impediment for me (or most devs I know).
At least for me: I had a cheap 4K monitor. Managing the size of fonts/windows on it vs. the connected laptop was a PITA. Not to mention the display/graphics card overloading and glitching once in a while trying to power everything.
So I am still stuck with my regular HD monitor + laptop monitor - which is pretty good.
I agree with you, 1080p on a 23-24" screen is great for programming. I have an Acer Predator X34P at home (34" ultrawide 1440p @ 120Hz), and for games it is AMAZING. But for programming? I prefer 2-3x 1080p at around 23-24".
In the last 3 months I went from 1 screen to 2 to 3 as I progressed through particular stages of development. When we 'go live' I'll easily be using all 3, but I don't _really_ need them, it's a nice to have.
Yes, you're completely right. I remember when the Atwood article came out, and I was convinced I needed multiple monitors to be productive (I was quite junior) - it even caused me stress when I worked at places that didn't have multiple monitors. These days I'm pretty happy with a single screen - including feeling very productive on just a laptop screen.
It turns out the whole idea of needing tons of desktop space was vastly exaggerated. You can't focus on 3 screens at once, it is impossible. Even on a large screen you can only look at one place at a time. Programmers are focused on one thing for a long period of time anyways - your code editor. Maybe a terminal as well that you need to swap to occasionally. Having multiple monitors has limited benefits.
Where multiple monitors is helpful is if you're an operator - you are watching many dashboards at once to observe if things break or change. Traders for example will have workstations with 9 screens at once. But this makes sense - they are not focusing on one thing at a time, they need to quickly look at multiple things for anything unusual.
I've personally found that using a tiling WM gives me vastly more benefits than multiple monitors ever would. Although I was also pretty happy using Gnome's super-tab/super functionality during my non-tiling days.
4K is sort of a nice to have, but hardly a requirement. I'm as happy on a normal DPI screen as I am on a retina screen. Our brains adapt very quickly and unless the text is overly blurry (because of poor hinting implementations), a regular DPI screen is perfectly fine for coding on. My own personal preference is high refresh rates over anything else - I can't stand latency.
We used to write great code on the original Mac's 512 by 342 screen; that doesn't mean we were very productive at it.
A 5K monitor allows you to view an enormous set of workspaces and code, and sharply. It will cost you less than $300 a year, which for any decent developer is less than 1/2 of 1% of your annual revenue (and it's tax-deductible for contractors!).
There is no way the increase in productivity doesn't far outweigh that cost.
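Back-of-the-envelope version of that claim; the purchase price and lifespan below are my own illustrative assumptions, not figures from the comment above:

    # Amortizing a hypothetical 5K monitor purchase over its useful life.
    monitor_price = 1400          # assumed price of a 27" 5K display
    lifespan_years = 5            # assumed useful life
    annual_income = 60_000        # income at which $300/year is exactly 0.5%

    cost_per_year = monitor_price / lifespan_years
    print(cost_per_year)                          # 280.0 dollars/year
    print(100 * cost_per_year / annual_income)    # ~0.47 percent of income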
What do you mean "we weren't very productive"? People were plenty productive with the original Mac.
I was plenty productive on PC XT clone with a tiny monochrome CRT monitor and an Hercules video card. (I wanted to say I was plenty productive with my C64, but that'd be stretching it :P )
What I want to fight is this notion that once tech N+1 arrives, then tech N must have been unproductive. That's demonstrably false, a form of magical thinking, and not a healthy mindset in my opinion.
But it's true. Developing modern code on an Apple II is unproductive. It's suboptimal, and worse. Maybe you don't want to use the word "unproductive," and that's fine. But larger screens, faster computers, and more expressive programming languages all contribute to increased productivity over the older alternatives.
> Developing modern code on an Apple II is unproductive
Are we talking about developing modern code on an Apple 2 now? If so I misunderstood the parent post, because they were using the past tense.
In any case, consider this: how do you feel about your current suboptimal, unproductive hardware & software of choice right now? Your current programming language and monitor resolution cannot be used to code. It's simply not possible to be productive with them.
Think I'm exaggerating? Ok. Fast forward a few years and let's talk about this again. Maybe you'll understand my frustration...
No, I mean, it can. It's just going to (probably) be less productive than what exists in the future. All I'm saying is that the productive/unproductive dichotomy isn't useful and we should instead talk about productivity as a continuum.
> All I'm saying is that the productive/unproductive dichotomy isn't useful
Then we agree! I was pushing back against the notion that one cannot be productive unless using a { 5K monitor; three-monitor setup; mechanical keyboard; $YOUR_FAVORITE_TECH_GADGET HERE }. In other words: it's not "time to upgrade your monitor" ;)
I agree better tools have the potential to make us more productive. I also think some nerd/hacker types tend to overrate the importance of their tools and gadgets, because this stuff serves as a proxy for actual talent and it's easier to show off and compare (These are not my words, someone else said it elsewhere in this thread, but once I read it, it really clicked with me).
It's one thing to say "I'm more productive with $GADGET". It's another, different thing to say "we weren't productive before $GADGET". I assure you some of the old hackers, back in the days when they coded on a PDP-11 (or an Apple II, or whatever), were more productive than me and -- without knowing you -- I'm willing to bet they could be more productive than you. In my opinion, the mindset that old tech wasn't "good enough", which leads people to claim "we weren't productive before" in absolute terms, is really pernicious and unhealthy.
Mind you, I'm not saying all gadgets are equal. Faster CPUs to me are obviously more useful than a larger screen. Both are nice, but one is more important than the other.
However, the most important peripheral, the one that makes the most impact on productivity: my brain. And I can't show it off as easily as a brand new macbook or whatever.
The cost of buying a new high end laptop is trivial in relation to your work value.
I used to run a dev group of 40 people. Every developer got the fastest possible PC with a big-screen monitor, back when both were expensive, and their own office with a door. It added about 1% to the cost of each developer, but the benefit to productivity was at least 10-20 times that.
And that's not even counting the benefits of reduced turnover.
An Intel iGPU can run dual 4K displays just fine, if it has dual channel RAM.
I'm still upset at Lenovo thinking 16 GB should come in one RAM stick.
If you have a Thunderbolt USB-C port then you have a DisplayPort. Some monitors accept it directly and will act as a charger and docking station. Others need a fairly inexpensive USB-C to DP adapter.
I actually like monospace pixel fonts. I doubt it would make a difference. I have a (cheap) 4K display at home. Some things might look better, but I would recommend using one with a high refresh rate and good contrast.
Since I started using ebook readers, reading on a monitor always feels like a compromise in comparison.
I wouldn't want to go lower than 1080p though. It can work, of course, but it just doesn't have to be that way.
For me it all comes down to display size and density, and therefore out of all the monitors I've tried, I'd go 27" at either 5k or 2.5k resolution.
For me 27" is the perfect display size:
- Any bigger and I find I have to move my head too much to be comfortable in the long term, and I get a stiff neck.
- Any smaller and I can't fit enough on the screen at once, and my productivity decreases.
2.5K (2560x1440) on a 27" monitor puts pixels at a good size that most apps scale beautifully on by default.
5K is exactly double the density of the already-perfect 2.5K, so (at least on macOS) everything displays exactly as it does at 2.5K, except really crisp!
Therefore, I now run the 27" LG 5k as my primary screen, and my old 27" 2.5k vertically on the side for all those status like things that it's good to see but don't require my focus. Also, since the size and "looks like" resolution is the same, I can drag windows between them without any changes and size/scale. It just works!
I don’t mean to pick on you personally, but what a stupid argument that is.
How is more pixels and more legible text not strictly better than lower resolution monitors?
It’s like preferring 10 DPI mice to the modern 1000+ DPI ones, saying that PCIe 4x was better than 8x or actively disliking a modern DSL connection compared to a 56k dialup.
Yours is just an idiotic stance to feel superior to everybody else.
I built the majority of the software for my first startup on a tiny under-powered netbook, since I couldn't afford a more expensive laptop. Today I have a very wide monitor for work. Agree that the display size doesn't seem to matter that much, but you do have to learn how to take advantage of features like multiple desktops, etc.
I used to have #1 with two large-ish monitors and I hated it. Two monitors are effectively a single 2xW monitor and that means moving your head from side to side constantly. Having them X inches above your keyboard means moving your head up and down (I touch type but need to look at the keyboard occasionally).
A laptop means a hi-res screen and a keyboard in close proximity. Instead of moving your head around you can use Spaces.
The second laptop can act as a second screen if you need it — as well as a coffee shop runner and a backup if your main machine is down or getting serviced.
It’s the same cost or less than a powerful laptop and two external monitors, assuming you are buying hi-res.
Another conclusion I disagree with is the author's insistence on only using integer scaling multipliers. The article talks about subpixel rendering and then mostly ignores it for the rest of the article, instead assuming pixels are all small black squares (which they aren't - "pixel density" isn't a particularly meaningful term when you're dealing with clarity on modern screens). IIRC, macOS renders text on a higher-resolution internal framebuffer and scales down anyway; in practice I don't notice any difference in clarity on my MacBook screen whether running at an integer or non-integer scale of the native resolution.
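To make the scaling point concrete, here is a minimal sketch of the supersampling approach described above (my paraphrase of the behaviour, not Apple documentation): the chosen "looks like" resolution is rendered at 2x into a backing store, which is then resampled to the panel's native pixels.

    # "Looks like" scaling via a 2x backing store, resampled to the panel.
    def scaling_plan(native, looks_like):
        backing = (looks_like[0] * 2, looks_like[1] * 2)   # rendered framebuffer
        downscale = native[0] / backing[0]                  # resample factor to native
        return backing, downscale

    # 27" 4K panel set to "looks like 2560x1440": rendered at 5120x2880,
    # then resampled to 3840x2160 -- a non-integer 0.75x step.
    print(scaling_plan((3840, 2160), (2560, 1440)))   # ((5120, 2880), 0.75)

    # A 5K panel at the same setting is an exact 2x, so no resampling is needed.
    print(scaling_plan((5120, 2880), (2560, 1440)))   # ((5120, 2880), 1.0)

Whether that 0.75x resample is visible in practice is exactly what's being debated here.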
Had this debate when we bought monitors for the firm. Some in management said high DPI, good color etc. are not worth the money (which you clarify is what you really mean). But after splurging, the effect on productivity was measurable. If you pay someone to stare at a box for 8h, a nicer box pays for itself pretty quickly.
Personally I don’t mind a small UI so I currently just run (macOS) on 3840 × 2160 on two 24” LGs. With size 11 Fira this gives me 8 80char columns.
> The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment.
Every time I read something like this I cringe. Ergonomics is terrible with laptops. There are no OSHA-approved laptops.
Those developers should sit in a good chair, with a properly positioned keyboard, monitor, and body, and head off the problems their body will encounter (or has already encountered but ignored).
I just refuse to be told that I must use a { three-monitor setup, mechanical keyboard, 4K monitor, macbook, $LATEST_FAD } or otherwise I'm using a subpar development environment.
I still remember when Full HD monitors were all the rage. Nobody considered them cheap tools back then, and somehow code got written and developers were happy. There will come a time when someone on the internet will tell you that a 4K monitor is a cheap tool and that nobody can properly code using one.
I have a 2560x1440 monitor at home and 2x 4K monitors at work. I like my home setup better because the tools I spend the most time with (VS Code and Xcode) lock all their panels to one window, making the real estate more important than the pixel density. I still have not found a way to get a 4K display to not run at "retina" 1080p without things getting too small, blurry, or laggy.
Agreed. I was supplied two 4Ks for my last job and just ended up using them so zoomed in that it wasn't even like I was using 4K monitors anymore.
My ideal setup is two 1440p monitors, but even then I tend to zoom in quite a bit.
I also had a coworker who was fine with their laptop screen. People who are finicky with these kinds of things seem to have their focus elsewhere rather than on the actual work done.
> The technical details are all right (or seem right to me, anyway)
No, not universally. If I follow the advice and disable font smoothing, it looks worse than any font display I've ever used, including 8 bit home computers. https://i.imgur.com/Y0xkhTb.png
Aside from the added screen real estate, I find my external 1080p LG monitor simply looks better than my Dell Latitude's. The laptop display just looks... cheaper. It's hard to explain, it just looks worse.
The whole "HD ready" thing was an utterly sad part of the story of cinema/television that let's hope will never repeat.
As for the 1366x768, for some reason for many, many years the wide majority of monitors stopped there, with "Full HD" coming to the masses (but not all) only a few years ago.
"1366x768" meant "half baked HD" from the day it was introduced, so it's understanding that few quality monitors (and especially laptop screens) were made with that resolution.
The author is biased because of ligatures. These special glyphs are a little more dense than normal text, e.g the triple line equals and need a higher res display to look good. I personally dislike ligatures, I have no problem chunking two characters together in my head.
> (e.g. the triple-line equals) and need a higher-res display to look good.
Do you mean '≡' (U+2261 identical to)? Because no, it very much doesn't; in fact I'm not sure I've ever seen a font where it looked bad short of vertical misalignment with surrounding text.
Okay, tracked down this[0], which is probably what you were talking about(?), and what the actual fuck is wrong with that rendering engine? You render an A px horizontal line, B px of space, a second A px line, B px of space, and the third A px line, where A=B=1. Maybe you use 2px of space. Maybe, if you're rendering obnoxiously huge, you use a 2px line and however many pixels of space. You don't fucking use real-valued vector coordinates and then blindly snap them to integers, or smear bits of gray anti-aliasing around the screen like some kind of monkey with a hand full of its own feces.
If OSX is actually doing stupid shit like that, it's no wonder sane-resolution monitors look like crap. Stupid-absurd-res monitors will look just as bad, because the problem is the operating system driving them, not the monitor.
I'd take a 2560x1440 monitor (or 2 or 3!) over 1080p displays that the author seems to prefer, even if those 1080p display are actually 4k pixel-wise...
Being able to see more things >>>>>>> sharper text, once you're past 100dpi or so at least.
Are you talking about physical screen size, DPI or what? I've no idea what display a MacBook Air 11" has.
Did your eye doctor tell you your myopia developed because of staring at your laptop's screen? (Mine developed when I was a teenager and hadn't yet spent a lifetime of staring at CRT monitors, and my eye doctor told me it was likely unrelated to computer screens. But maybe medicine has changed its opinion, this was decades ago).
You as a coder also don't normally need a 120 Hz display like the OP, or to be limited to the three (!) models currently available on the market. It doesn't make sense. But hey, economics, whatever...
I love the old CRT text mode. I don't need finely anti-aliased fonts for my text; I find those fine fonts are too much for me. What's wrong with some non-perfect edges?
Oh, no argument from me. I said I knew plenty of people I admire, and who are way better than me professionally speaking, who can code with 1366x768 displays and dismiss the urgency of getting a better screen. In this sense I meant that it's not time to upgrade your monitor: it's simply not that urgent to own the highest-DPI screen you can buy.
Me? I can certainly code with a poor quality 1366x768 display (doing it right now with this Dell Inspiron!) but I hate it. I'm not a 4K snob, but I do like better screens!
I started coding in Windows Visual Studio again recently after decades of absence, and the enormous length of their variables, function names, and function parameter lists, has me >120 columns again. All my non windows code is 80 (or 100 in very rare cases).
So now I need a wider monitor. I was on an NEC 6FG until 2011 and then went to a small dell flat panel 1280x1024. I suppose it is time to join the 2020's.
I have multiple VNC and RemoteDesktops up at any moment, so maybe I'll really splurge and go for one of those fancy curved ones.
This article is written in a style I can really appreciate. That's a lot of research.
1440p, 75hz display is perfectly acceptable. If you game, the quality difference between 1440p and 4k is pretty much indistinguishable, but 1440p is much less performance intensive to render.
I upgraded from a 32" 1080p display to a 32qk500. A significant improvement in visual quality for a reasonable price (~USD$270)
I've found my journey through tooling evolution to be fascinating (to me at least).
When I was new and coding was slow, my worries were about so many other things (like how the language even worked) than even what text editor I used, let alone the monitor.
As I became experienced, a lot of newbie problems went away and the next topmost problems emerged, leading to be obsessing over text editor, shortcuts, and even a multi-monitor setup with a vertical monitor.
Then went through a minimalism phase where I was annoyed by how much time I was spending maintaining my tools rather than using them, so gone went almost all of that. Just one giant monitor and VSCode for better or worse (mainly because it does all six languages I use well enough).
I'm now at a phase of thinking, "who are all these people who do so much coding in a day that these things matter?" because reading and writing code is maybe.... 35% of my job now? What I need to optimize for is reading/writing human text. And that's where I currently am: figuring out how to optimize writing documentation/design/architecture docs, given how awful making and maintaining a sequence or block diagram is currently.
My conclusion so far is that I expect my needs to continue to mutate. I do not believe they are "converging" and do not believe there is any sort of golden setup I will one day discover.
And there is another phase - healthy: flicker-free monitor, ergonomic keyboard, standing desk, and so on. When are those 27" e-ink monitors coming?
And yet another phase - no screens. Every day I look forward to avoiding screen time as much as possible - more face-to-face time, more water-cooler chats, figuring out the solution with pen and paper, or coding in my head during an outdoor walk.
I loved it too on a mac. When I switched companies and went to an Ubuntu-based laptop, trying to keep the whole monitor scenario working consistently was frustrating, so I dropped it.
To echo the sibling comment: `arandr` makes it easy to save monitor configurations (as a script). Then you can run that script on startup. I.e. for Xorg I put it in my ~/.xinitrc.
Concerning the smoothing, I presume this is macOS’s glyph dilation. pcwalton reverse engineered it in order to replicate it in Pathfinder (so that it can render text exactly like Core Text does), concluding that the glyph dilation is “min(vec2(0.3px), vec2(0.015125, 0.0121) * S) where S is the font size in px” (https://twitter.com/pcwalton/status/918991457532354560). Fun times.
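For anyone curious what that formula amounts to in practice, here is a tiny sketch of pcwalton's reverse-engineered approximation (the function name is mine; this is his published formula, not Apple source):

    # Glyph dilation per pcwalton's reverse engineering: the outline is expanded
    # by an x/y offset proportional to font size, capped at 0.3 px per axis.
    def glyph_dilation(font_size_px: float) -> tuple:
        dx = min(0.3, 0.015125 * font_size_px)
        dy = min(0.3, 0.0121 * font_size_px)
        return dx, dy

    print(glyph_dilation(14))   # (0.21175, 0.1694) -- a 14px font gets ~0.2px fatter
    # The 0.3px cap kicks in around 20px (horizontal) and 25px (vertical).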
Trouble is, people then get used to macOS's font rendering, so that (a) they don't want to turn it off, because it's different from what they're used to, and (b) they start designing for it. I'm convinced this is a large part of the reason people deploy so many websites with body text at weight 300 rather than 400: macOS makes that tolerable because it renders it bolder. Meanwhile, people using other operating systems that obey what the font says are left with unpleasantly thin text.
I've blocked font-weight: 300 in my browser because of this permanent macOS bug. It's funny because people now abuse a thoughtfully-designed CSS property to disable this dilation on webpages, to the point that that CSS property is used exclusively to account for macOS's incorrect font rendering and is inert on other platforms.
How do you block font-weight: 300? On Windows I uninstall thin fonts, but on Linux, Firefox doesn't respect my fontconfig settings to avoid using thin fonts.
I tried turning off smoothing and frankly the text was uncomfortably thin to read. I guess maybe their font should just be bolder by default so you don't need the glyph dilation to compensate.
I didn’t actually know you could turn it off before today! I’ve never had a Mac, I’m used to FreeType and Windows font rendering.
I have a vague feeling that Macs also have somewhat different gamma treatment from other platforms, which would be likely to contribute to thickness perception.
In Leopard that font smoothing preference wasn't a checkbox, but a drop-down menu specifying how much font smoothing. If you select "Strong" you get a result that looks even bolder than normal. Even when a website is using an extraordinarily thin font, this setting makes it readable.
Apple replaced the drop down menu with a checkbox in Snow Leopard, but the underlying functionality wasn't removed until Catalina. Naturally I was very disappointed after my upgrade to Catalina because all text looked too skinny to me.
This is interesting, thank you! font-weight: 300 has been an old personal nemesis, and that explains at least part of it! Do you know if font dilation is on by default?
Maybe this was done to emulate the ink bleed that happens in the printing process. It's a common problem that LaTeX PDFs look too light, because the built-in fonts were copied from metal type, which assumed the ink would bleed on the page.
Eh, I think everyone serious about typography agrees that Computer Modern is just unreasonably thin. This is easily enough for me to consider it a lousy font that should literally never be used on screen, and I'm dubious about usage on paper. Unfortunately, Stockholm syndrome applies.
Do all that, of course. But when you’re writing applications for other people, go buy a shitty middle of the pack laptop, resolution 1280x768, with the windows start bar and menu bars eating like 150pixels of that, and try to use your software there.
While we're on it, make sure this is a low end laptop in terms of components too. Make sure your application is usable. No one needs a chat client that eats several gigs of RAM and a significant portion of your CPU.
> A good monitor is essential for that. Not nice to have. A MUST. And in “good” I mean, as good as you can get.
The whole discussion starts from a what I would say is an incorrect assertion, and then goes on to describe lots of ways that letters can be made better. If, in fact, you don't need better, then a lot of the points go away.
1. There is a point of diminishing returns, past which you are spending a lot more money for very little benefit.
2. There exists a point beyond which "better letters" are unlikely to contribute much to daily work.
Both of those points are, to some extent, the same point. But either way, the idea that you "MUST get as good of a monitor as you can" is, in my opinion, untrue and not worth basing an entire document on.
I'd rather see a discussion of which features of a monitor contribute the most (per $) to how well they function for daily work. For gaming, I want high refresh rate and high contrast. For TV, the contrast (real black) goes up in importance. For daily work, neither of those is a huge contributor to how well I can work.
While technically the article is correct, it feels very narrow-sighted to me by placing resolution/refresh rate above everything else. Some more things to consider:
- Price. If you are not a programmer in a first-world country working for a magic-money startup/Big 4, then you won't spend $1000+ on a single monitor. Mine was a better 27" 1920 one for $300 and I'm happy with it. I'm sure there are better ones, but I can't allow myself to spend much more on one.
- No PWM. This is crucial. PWM-ing monitors kill your eyes in hours, and then you won't be able to work for the rest of the day. Some people I know take the eye-protection game so far that they buy e-ink screens for development (sadly you can't buy large ones).
- Adjustable, optionally rotatable 90 degrees. If you have two monitors, one rotated 90 degrees is a blessing for reading code.
- Small bezels. Quite important if you have two or more; large bezels make my setup clumsy.
- Bright. It's quite annoying if I can't read something because the sun is shining into my room. Especially important for laptops.
None of these are mentioned, just that small letters should look nice and scrolling should be smooth.
PWM and flicker need to die. I can easily tell when a monitor is flickering, and I'm convinced they're bad for long-term eye health after using one for multiple years and not knowing why my eyes started hurting so often. Refresh rate above 60 Hz is a nice-to-have in my book. Size and resolution are much more important. Having a 32" over a 24" changed my life.
Affordable, large E-ink screens would be revolutionary for many occupations and I hope there's more investment in the space.
Macbook Pros use PWM, which is why I had to return the latest 16 inch model. I was really bummed; but I need to protect my eyes. It was causing severe eye strain.
After using a rMBP for 6 years, I realized that using a lower-resolution, lower-quality display makes absolutely no sense at all if both the graphical power and the budget are available (the first 13" rMBP had some serious issues driving the display). Better image quality is better image quality. I think Apple's biggest selling point over any vendor right now, despite numerous issues with its hardware and software in the recent past, is absolutely top-class input and output. It is such a simple concept. A great keyboard (seems to be fixed now) and an absolutely incredible trackpad experience, along with a display that is a huge step up from your past experience, mean that most users will prefer that setup even if they just use it for basic coding or web browsing. After looking at the first Retina displays, I realized that Apple didn't just change the displays; it changed how fonts behaved completely, because crisp and clear legibility was key to attracting customers early on. I'd say even in 2020 most computers are struggling with good displays, which can completely ruin the experience of using a product even if every other aspect of it is great.
I've been focused on Apple display products for the past few months as I'm looking to make an upgrade from the Dell P2715Q 4k 27" I use primarily for development.
There is a three page thread on using the Apple 32" XDR for software dev on MacRumors. [1]
I believe there is a major product gap at Apple right now in the mid-market display. Specifically a replacement for the LED 27" Cinema Display which was announced 10 years ago next month. [2]
I am speculating that Apple could announce a new 27" 5k in part because of the rumored announcement of a new Mac Pro but also because the build quality of the LG 5k Ultrafine is just not great and there are obvious synergies with XDR production and Mac Pro.
I think this should be announced at WWDC because developers specifically are being left out of good display products, and Apple should be looking out for what they stare at all day.
While there are no supply chain rumors of such a display, I wargamed what this product might be and its pricing anyway.[3]
In short, I speculate Apple will release a 27 inch IPS display, 5120 x 2880 @ 60Hz with standard glass at $1999, Nano-texture Glass at $2799.
I had not paid a lot of attention to the refresh rate, but it does seem like kind of a miss that the XDR does not offer this.
I have a Dell P2715Q and I'm happy with it. I haven't tried anything with a higher resolution or DPI, but I can't see any individual pixels on the Dell, so for the moment, it's good enough for me.
I can't go back to sub-4k though. Looking at a 24" 1920x1080 monitor tweaks my brain. Pixels ahoy. It's jarring. I'm not the kind of person who cares about superficial things or style or brand at all, but I just can't get comfortable with sub-4k anymore.
Be very wary of Apple monitors. If you can, try one out in the environment you intend to use it in, before you commit. Apple displays are highly reflective. The glare is obscene. The display quality is great, but I can't deal with the eye strain. It's like there's a mirror glaze on top. They used to offer a matte option, but I don't believe they do anymore. It's painful.
This is an important point. I have a pal with an aging 5k imac that he wishes he could use only as a monitor with a new mac mini.
It seems the display tech in the iMac is heavily discounted in order to sell the whole thing. And it does seem to break the pricing if you don't see them as different products offering value for different configurations.
Bringing back Target Display Mode, or allowing the iMac to act as an external monitor, would greatly increase the value of that product.
I can't explain how this pricing makes sense exactly, except that I can only assume Apple will want or need to price this stuff high to help differentiate the build quality in comparison to the collaboration on LG's ultrafine.
I bet somebody could make a business refurbishing those old 5K iMacs into usable monitors. Either yank the panel into a new case w/ a custom board to drive the display, or hack an input into the existing system (maybe with a minimal booting OS).
Either way, that'd be a cool product to see and seems like a decent side hustle to get some $. :)
I recently bought a 2015 27” 5K (5120x2880) iMac core i5 for cheap, put in an SSD, and upped the RAM to 24 GB. It handles everything I can throw at it as a developer (albeit not a game developer, just “full stack” web/java/node/k8s) and the screen is just incredible.
The SSD upgrade is not for the faint hearted, however. I used the iFixit kit with the provided “tools”, and lifting off 27” of extremely expensive and heavy glass that sounded like it might crack at any moment was not exactly fun. Having said that - I would do it again in a heartbeat to get another computer/display like this for the price I paid.
With regards to scaling: I have had zero problems with either my rMBP (since 2013) or this iMac, when in macOS running at high resolution. Zero. As soon as I boot Windows or Linux, however, it’s basically unusable unless I turn the resolution down.
I've got a Planar IX2790 and it's great. Article is spot-on re. scaling -- it's much nicer to look at all day (native 2x scaling) vs. 4k at 1.5x scaling.
Thanks for this idea. The design looks surprisingly like the old cinema display. Actually, apparently they use the same glass from the cinema display but no camera. [1]
It looks like the main concerns on this are around stuck pixels. Have you gone through calibration / QA on yours? [2]
Otherwise, seems like a compelling alternative to the Ultrafine.
I believe the theory that it's 5k iMac panels that Apple rejected. I have a few stuck pixels but I never notice because they're so small. I have a blue stuck-on in the lower-right quadrant and I can't even find it now.
I kind of think such a display's design could overlap with the new iMac. One reason Apple used to have a "chin" on the iMac was to distinguish it as a computer and not a monitor / Cinema Display. Judging from the leaks, the new iMac won't have a chin at all, and since there is no similarly sized Cinema Display in the lineup, this doesn't really matter.
I just wish they'd bring back Target Display Mode, or something similar to the iPad's Sidecar.
- Proved and debugged on the XDR release at the pro price point.
- Kept working on how it can be cheaper and fit into plans for whatever the ARM-based machine's initial graphics capability will be.
- Designed it to use a similar manufacturing line to the iMac then offer it at a lower price point to support the current mac pro, mac mini and a possible dev kit for the arm mac.
Or I suppose just keep making everyone buy the LG 5k ultrafine that is four years old. :P
I have just got a U2720Q, it worked at 60Hz right away. But I am connecting over USB-C, with the provided cable.
What I did find off-putting is that Dell says they don't test with Macs, and thus can't support them, but they just assume that Apple follows the same specification they do. A bit weak, but the monitor does work fine.
Thanks for this idea. The product page says it is 218 PPI, but I don't see how that is possible given it is 34" wide. Pixensity says the actual PPI is 163.44. Can you confirm the PPI on this monitor?
It's 163. I think you got the 218 from a Cmd-F that picked up the SEO blurb for the 5K UltraFine in the footer.
I cannot tell the difference between the LG and my MacBook's retina screen at my normal viewing range, but this may just be my eyes.
I also use a non integer scaling factor on both screens as I find it has the right combo of resolution and real estate for me, and I don't notice artefacts.
I honestly find MBP trackpads to be too big, which is admittedly a preference thing, but their keyboards are absolutely horrible, and I have difficulty understanding how anyone could think they were "great". There's not enough distinguishing keys from each other, so I can't ground myself to the home row. I can't think of many keyboards I've used throughout my life that I enjoy less than the MBP. And that's not even mentioning the quality issues (duplicated or broken keys), or the lack of Fn keys.
For the trackpad I agree, but definitely not for the keyboard. And Apple was late(!) to hi-res screens and to hi-DPI, and now has lower resolution and density than the competition (e.g. the better 16:10 4K on the Dell XPS, or 3:2 screens from MS and others).
Also, Apple had TN screens forever when the similarly priced competition had higher-res IPS screens.
Isn't 4k on a laptop a significant power drain? And isn't the point of "stopping" at Retina resolution that the human eye can't tell the difference between Retina and higher resolutions like 4K at typical laptop screen size and viewing distance?
>"Of course, the real highlight is that new Retina Display. Its resolution is 2,880x1,800 pixels, providing a level of detail never seen on a laptop before. The highest standard Windows laptop screen resolution is 1,920x1,080 pixels, the same as an HDTV."
>The signature feature of the new MacBook Pro is the new 15.4-inch (39.11 cm diagonal) Retina display. [...] So far, no notebook screen has topped resolutions of 1900 x 1200 pixels (WUXGA) or 2048x1536 (QXGA in old Thinkpad models).
If you manage to dig up some obscure laptop that had a higher resolution at the time I wouldn't be completely surprised. However, to suggest that Apple was "late(!)" with high DPI screens is provably false and frankly, ridiculous.
I had a 1080p Alienware laptop in 2004. Then for some reason, laptops all went even lower resolution for a long time, and I couldn't find a good one until Apple came out with their Retina displays. Not sure what happened. Manufacturers just became cheap?
I've found the keyboard on my Surface Book to be much better than the keyboard on any of the MBPs I've used and owned.
The trackpad was just as good, too. But of course, the OS was worse, mostly in terms of performance. Windows 10 just always feels slow for some reason.
I still use a CRT, because features I want are still ludicrous expensive in newer tech (although they are getting cheaper over time).
1. Arbitrary resolutions, great to run old software and games, even better to run new games at lower resolution to increase performance.
2. Arbitrary refresh rates.
3. Zero (literally) response time.
4. Awesome color range (many modern screens are still 12-bit, meanwhile Silicon Graphics had a CRT screen in 1995 that was 48-bit)
5. No need to fiddle with contrast and brightness all the time when I switch between mostly-light and mostly-dark content. For example, I gave up on my last attempt to use a flat panel because I couldn't play Witcher 3 and SuperHot one after the other: whenever I adjusted settings to make one game playable, the other became just splotches (the optimal settings for Witcher 3 made SuperHot an almost flat white screen, completely impossible to play).
6. For me, reading raster fonts on a CRT gives much less eyestrain and is more beautiful than the many fonts that need subpixel antialiasing on flat panels.
7. Those things are crazy resilient; I still have some working screens from the 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my newer flat panels failed within 2 years with no repair possible.
Unless you're using a really cheap screen, most LCDs are 8 bits per color. The cheap ones use 6 bits + 2 bits FRC. 12 bits (presumably 4 bits per color) seems insanely low because it would mean only 16 possible shades per primary color.
>meanwhile silicon graphics had a CRT screen in 1995 that was 48bit
Source for this? 10 bit color support in graphics cards only became available around 2015. I find it hard to believe that there was 48 bits (presumably 16 bit per color) back in 1995, where 256 color monitors were common. Moreover, is there even a point of using 16 bits per color? We've only switched to 10 bit because of HDR.
>7. Those things are crazy resilient, I still have some working screens from 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my new flatpanels failed within 2 years with no repair possible.
This sounds like standard bathtub curve/survivor bias to me.
I am talking about SGI workstations; indeed, the 1995 ones didn't support 48-bit (without modification), instead it was "only" 12 bits per channel, 3 channels, thus 36-bit.
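For reference, the per-channel vs. per-pixel arithmetic being argued about here is just powers of two, nothing vendor-specific:

    # Shades per channel and total colour depth for the bit depths mentioned above.
    # "48-bit" or "36-bit" displays are just 16 or 12 bits per channel times three.
    for bits_per_channel in (4, 6, 8, 10, 12, 16):
        shades = 2 ** bits_per_channel
        print(f"{bits_per_channel:>2} bits/channel -> {shades:>6} shades, "
              f"{bits_per_channel * 3}-bit colour")

So an 8-bit panel gives 256 shades per channel (~16.7M colours), a 10-bit one gives 1024, and the "4 bits per colour" reading of a 12-bit figure really would be only 16 shades, which is why the per-channel interpretation makes more sense.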
> 10 bit color support in graphics cards only became available around 2015.
That's off by a decade.
> 256 color monitors
Is that a thing that exists?
> We've only switched to 10 bit because of HDR.
You can get clear banding on an 8 bit output, and 10 bit displays are used at the high end. 10-bit HDR isn't immune to banding, since most of the increased coding space goes into expanding the range. There's a good reason for 12 bit HDR to exist.
So it looks like it supported 10 bit frame buffers, but not necessarily 10 bit output. A quick search suggests it only supported DVI, which didn't support 10 bit output. In the context of talking about monitors, this essentially means that 10 bit wasn't supported at the time. Otherwise you could claim you have 192 bit color by software rendering at 64 bits per color, but outputting at 8 bits.
Consult the dot pitch of your monitor for the actual resolution. Everything is "scaled" to that resolution. Of course, the algorithm is physics, which is much better than cubic interpolation, so it does look slightly better, but a modern hi-DPI LCD will provide a finer dot pitch and thus a sharper, more accurate picture.
Color CRTs do have something akin to a native resolution too, defined by the phosphor arrangement, so they do "scale" things. It just happens that the scaling is naturally blurry and artifacts aren't noticeable.
You raise some very interesting points. I've appreciated the physical lightness and ease of positioning of LCDs, plus the absence of flyback whine. And they can go to higher resolutions than the scan circuitry in a CRT can physically manage.
But all those other things, yes the color resolution, the smoother fonts, the response time. I might have to swing by the recycler and pick up a nice "new" Viewsonic. :)
3. A 60 Hz CRT refresh rate is 16.67 milliseconds of delay. Interestingly, I connected a VT220 to my 56" 4K Samsung TV via a BNC-to-RCA cable, and by comparing the cursor blinking on both screens there's a very noticeable and visible delay, like a 1/4 second.
4. CRTs are analog. It's as many bits as your output hardware can make up.
5. CRT is still supreme for contrast (at least over LCD) despite all the tricks and such for LCDs.
7. That VT220 I mentioned above is from 1986. It's monochrome, but works great.
> CRTs are analog. It's as many bits as your output hardware can make up
Actually the horizontal resolution of a CRT is limited by: the dot pitch of the colour phosphors, the bandwidth of the amplifiers, and the electron beam spread.
The vertical resolution is limited by a combination of: electron beam scan rate, delay for horizontal flyback/retrace, delay for vertical flyback, desired refresh rate, and electron beam spread.
There are more details for the resolution limitations, but I think I covered the main ones.
I kept a CRT for quite a while but when I switched, I realized I didn't miss it.
1- True, if there is one thing I'd miss, that's it. At low resolutions, CRTs sometimes get really nice refresh rates too (I've seen up to 180 Hz).
2- Modern gaming monitors have freesync/g-sync that not only give you arbitrary refresh rates, but they are also adaptive.
3- Also true, but not as significant as one might think. The monitor itself is zero latency, but what's behind it isn't. We are not "racing the beam" like on an Atari 2600 anymore; the image is not displayed before the rendering of a full frame is complete. The fastest monitors are at around 144Hz, that's about 7ms. So going from a 1ms gaming monitor to a 0ms CRT, you actually go down from 8ms to 7ms, to which you need to add the whole pipeline to get from motion to photon (see the rough arithmetic sketched after this list). In VR, where latency is critical, the total is about 20ms today. More typical PC games are at about 50ms.
4- CRTs are usually analog. They don't use "bits" and it is all your video card's job. Also, 48 bits is per pixel and 12 bits is per channel, so that's an apples-to-oranges comparison. CRTs do have a nice contrast ratio though, good for color range. Something worth noting is that cheap LCDs are usually just 6 bit with dithering. True 8 bit is actually good, and I'm not sure that you can actually tell a difference past 12 bits.
5- Never noticed that, maybe some driver problem. An interesting thing is that CRTs have a natural gamma curve that matches the sRGB standard (because sRGB was designed for CRTs). LCDs work differently and correction is required to match this standard, and if done wrong, you may have that kind of problem.
6- I hate text on CRTs. And unless you have an excellent monitor (and cable!), you have tradeoffs to make between sharpness, resolution, and refresh rate. And refresh rate is not just for games: below 60Hz, you get very annoying flickering. I wouldn't go below 75 Hz. And at max resolution, it can start getting blurry: the electron beam is moving very fast and the analog circuitry may have trouble with sharp transitions, resulting in "ringing" artifacts and overall blurriness. On old games, that blurriness becomes a feature though, giving you some sort of free antialiasing.
7- Some CRTs are crazy resilient, others not so much. Same thing for LCDs. And as you said, phosphors wear out; that's actually the reason why I let go of my last CRT (after about 10 years). My current LCD is 8 years old and still working great, in fact better than my CRT at the same age (because it doesn't have worn phosphors).
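Spelling out the arithmetic from point 3 above (same ballpark figures as in the comment, not measurements):

    # Rough motion-to-photon comparison using the ballpark numbers above.
    def frame_time_ms(hz):
        return 1000.0 / hz

    lcd_total = frame_time_ms(144) + 1.0   # ~7 ms scanout cadence + ~1 ms panel response
    crt_total = frame_time_ms(144) + 0.0   # same cadence, ~0 ms phosphor response
    print(f"LCD ~{lcd_total:.1f} ms, CRT ~{crt_total:.1f} ms, "
          f"difference ~{lcd_total - crt_total:.1f} ms out of a 20-50 ms total pipeline")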
Sadly, the CRT manufacturers decided to kill the technology early, back when it was still obviously superior to LCD and plasma; Sony was still selling CRTs faster than flat panels when it shut down its factories.
Many people today think that no one makes CRTs because no one buys them, but it's the opposite: you couldn't buy CRTs anymore because the manufacturers intentionally killed them.
There was even research ongoing at the time to make a slim flat panel CRT, but they cancelled that too.
CRTs, being analog, don't support DRM, which also contributed a lot to their rapid death (among other reasons).
People still use CRTs in some arcade machines, in the medical field (for example, diagnosing certain visual disorders, and some plasticity research I know of, requires zero update lag and is thus only possible with a CRT), and in industrial applications where flat panels are too fragile, but all of these are basically buying up existing CRTs, driving up the price.
There were some people trying to restart production, but... the companies that have the patents refuse to sell them, the companies that know how to make the tubes also refuse to sell the machinery, and the few independent attempts often failed due to regulations.
Also, in the USA someone invented a machine to recycle CRTs and ended up being shut down due to regulations too, so in the USA, instead of safely melting down the glass and lead, the rules effectively mean it all gets dumped in landfills.
My personal workstation has a Samsung SyncMaster whose model number I don't have handy right now. The maximum resolution is around what people call 2k, but only at 60Hz, which I don't like. My current resolution is 1600x1200 at 75Hz.
Reminds me of a Dell CRT I used to use - 1600x1200@80Hz, and 21" IIRC. Every monitor I've had until fairly recently has felt like a compromise in some way compared to it, but taking a CRT to university was a non-starter.
HD-DVDs playing on my Xbox 360 looked glorious on it (the Xbox 360 supported 1600x1200 output, as did some games IIRC, and any widescreen content would just get letterboxed).
Regarding the "120Hz dance", which sure is ridiculous, the author could probably give the nice little tool "SwitchResX" a try. I adore that piece of software because it allowed me to force my MacBook to properly supply my Dell U2711, which is a 2560x1440 display, with an actual 2560x1440x60Hz signal over HDMI (which was important to me because I needed the DP port for something else).
That older monitor has some kind of firmware bug or maybe it's a wrong profile in MacOS or whatever, which makes it advertise a maximum of 2560x1440x30Hz or 1920x1080x60Hz to the Mac when connected via HDMI (DP works fine out-of-the-box), effectively preventing me from choosing native resolution at maximum refresh rate. I haven't been able to make MacOS override this limitation in any way using the common OS-native tricks, but SwitchResX can somehow force any custom resolution and refresh rate to be used, and the monitor is apparently able to deal with it just fine, so I've been running this setup for years now with no complaints whatsoever.
Also no manual work was ever needed after display disconnect/reconnect or MacOS reboot. I had problems once after a MacOS update, which required a SwitchResX update for it to be working again, but other than that I'm in love with this nifty low-level tool.
I had a similar problem with an older MacBook that would only (officially) power my 4k display at 30Hz. SwitchResX was the magic solution to bring it up to 60Hz.
I found this article utterly baffling. The author clearly knows their stuff, having created Fira Code, but my experiences couldn't be more different.
I spend most of my days in a text browser, text
editor and text terminal, looking at barely moving
letters.
So I optimize my setup to showing really, really
good letters.
I certainly appreciate how nice text looks on a high DPI display.
But for coding purposes!? I don't find high DPI text more legible... unless we're talking really tiny font sizes, smaller than just about anybody would ever use for coding.
And there's a big "screen real estate" penalty with high DPI displays and 2X scaling. As the author notes, this leaves you with something like 1440×900 or 1920x1080 of logical real estate. Neither of which is remotely suitable for development work IMO.
But at least you can enjoy that gorgeous screen and
pixel-crisp fonts. Otherwise, why would you buy a retina
screen at all?
It's not like you really have the option on Macs and many other higher-end laptops these days. And I am buying a computer to do some work, not admire the beautiful curves of a nice crisp font.
So anyway, for me, I chose the Dell 3818. 38", 3840 x 1600 of glorious, low-resolution text. A coder's paradise.
For purposes of software development, I won't really be interested until 8K displays become affordable and feasible. As the author notes, they integer scale perfectly to 2560×1440. Now that would rock.
Legibility is a funny thing. When you're paying attention, your standards for legibility will be very low. I was totally happy to read text on a 1024x768 17" MAG monitor for years. When you're paying attention to something else, like designing the code you are about to type, I think crispness and clarity of text absolutely matters. Microsoft Research did a lot of work on this when they released ClearType. They seemed to believe strongly that clearer text measurably improved speed to recognize characters.
I'm in my 40s and my eyes aren't great. But I have zero issues with legibility on the non-high-DPI Dell 38".
Depending on font choice (typically I use Input Mono Compressed) I can fit six or seven 80x100 text windows side-by-side on the Dell with excellent legibility.
That is up to 800 total lines of code and/or terminal output on my screen at once.
That really is my idea of coder's paradise. You, of course, are entitled to your own idea. No two coders like the same thing. Ever.
Well, it's really about personal preference, and while I know a couple of colleagues who share yours, I feel it's the minority.
I take two 16:9 screens over one 3840x1600 screen anytime. No need to set up some additional inner-monitor split-window management. Split-monitor management and workspaces work very well, and I can even turn the 2nd monitor off when I don't need it and want full focus mode (i.e. reading). Also I prefer my main monitor exactly in the middle in front of me and an actual 2nd monitor to the right. If I have the luxury of a 3rd monitor, it's to the left (usually the laptop that powers the 2 big screens). Putting one half of an ultra-wide in the middle just feels wrong. And splitting into 3 is too small for me, and again there's the inner-monitor window management issue.
While I also strongly believed my old 1920x1080 24" (~92ppi) screens were good, I had the opportunity to use QHD 27" (~110ppi) screens for 3 months abroad, and when I went home I was baffled at how incredibly bad text looked on my old 24" monitors.
The benefit of a 3840x1600 screen over two monitors with lower resolution is that 3840/2 = 1920 and 3840/3 = 1280. Those are horizontal resolutions for 1080p and 720p respectively. The fact that these are standard resolutions means basically every app works as designed regardless of whether you have 2 or 3 windows side by side. This isn't true for ultrawides with resolutions like 3440x1440 that don't divide cleanly into other standard resolutions.
The default software that comes with the Dell mentioned above handles everything. If I want to simulate two 1080p monitors, I just do the standard Windows drag to the side of the screen. If I want to simulate three 720p monitors, I can press shift while dragging and that tells the Dell software to take over. It is more versatile than having individual monitors.
> The default software that comes with the Dell mentioned above handles everything.
Does it really handle everything? Does it handle all (or even just a single one) of the use cases I mentioned in my comment and that I care about? Can I turn off part of the screen, if I actually only need a smaller space? Can I position a 1920 default width space right in the middle in front of me without it looking weird (i.e. symmetry of screen hardware borders)? Are workspaces working correctly (in unix, windows, macos, all of them?). Just splitting monitors isn't everything; I heavily use workspaces to switch between contexts.
If all of this works (except the obvious middle problem that's physically impossible to solve) I might actually consider it.
It does for my use case, but obvious your mileage may vary.
>Can I turn off part of the screen, if I actually only need a smaller space?
Sort of. The monitor can split into dual source mode that has two 1920 sources side by side. You could potentially turn that on and set one side to an empty source. You can also always use a black desktop if you need the rest of the monitor to be dark to allow you to focus on whatever window you have open.
>Can I position a 1920 default width space right in the middle in front of me without it looking weird (i.e. symmetry of screen hardware borders)?
How do you accomplish this with two monitors currently? You would have to choose between symmetry and having one monitor front and center. The Dell software allows you to customize the drag-and-drop regions. I use three columns that are the full height of the screen. You could set it up to have a 1920 section in the middle with a 960 column on each side. You could also set up your physical workspace so one side of the monitor is centered in your vision instead of the center of the monitor. Also, I have mine mounted on a monitor arm that allows me to reposition it as needed.
> Are workspaces working correctly (in unix, windows, macos, all of them?)
It works in Windows, and that is the only native GUI I use. Everything obviously works fine in the terminal, and I rarely increase the resolution of a VM past 1920. I would frankly be shocked if there wasn't similar software available for Linux and OSX that allowed you to set up customized zones like Dell's software, if you need to run one of those natively.
That Dell tool sounds a lot like Microsoft's PowerToys FancyZones. Have you tried FancyZones? It can optionally take over the Win+Left/Right shortcuts from Aero Snap (the drag-to-side/quadrants tool built into Windows).
I've been drooling over the Samsung curved ultra-wides since like January as a possibility for my gaming desktop. In March one of my gaming desktop's monitors blew, so I've been down to just one monitor and started to use FancyZones, and regular usage of FancyZones has got me much more convinced I'd be happy with the ultra-wide. Now I'm just hesitant for merely financial reasons.
I haven't used FancyZones. I will check that out, thanks for the tip.
The financial aspect of this is definitely the toughest part to justify. A single monitor with this resolution is always going to be more expensive than multiple smaller monitors. The cost was justified in my experience, but that will vary depending on your use cases and budget.
Yes, the financial aspect isn't easy to get past right now, and the curved ones especially carry a premium just because the option is still so new. But the curved ones do roughly what I tried to do by manually angling my previous dual-monitor setup, with added advantages in gaming (because I can actually use the center point and periphery in-game, rather than the center being bezels and in the way when I tried to span games across both monitors). Plus, there are all the usual financial concerns from the current state of the world and everything that has been going on.
This is, of course, an issue of personal preference and I do not work for a monitor manufacturer so I'm not trying to talk anybody into buying anything hahaha.
Also I prefer my main monitor exactly in the middle
in front of me and an actual 2nd monitor
I don't understand how dual 16:9 screens help with this! But I agree with you that I hate having a "split" in the middle. I need my main monitor centered.
My monitor arrangement is:
- Dell 38" ultrawide (centered in front of me). Work happens here, obviously. MacOS virtual desktops, while not the most feature-heavy, cover my needs well enough here. Of course I respect that some people lean into virtual desktops/workspaces harder and need more.
- Compact ("portable") 15" 1080p monitor, centered in front of me under the ultrawide. This is essentially dedicated to Slack and my notetaking app. This leaves the Dell at a decent ergonomic height at my eye level.
- Laptop off to the side, for nonessential stuff - typically music app, sometimes Twitter or news feeds
I have a Dell 38" 3840 x 1600 Ultrawide and it is bliss. I don't think of it as two screens, I think of it as three. I can comfortably have three applications displayed side by side with no seam down the middle. For me, it isn't about the crispest font I can get. It is good resolution and tons of real estate.
Do you use it with Windows or MacOS? If it's Mac, do you experience any compatibility issues? I was researching quite a bit into this Dell 38" in the last few days and discovered that it does not have official Mac support, and it seems to have some issues with the USB-C connection.
That Dell would be the screen I would buy right now for coding. I am currently using a 5k iMac at home and a 30" Dell (2560x1600) at work. I really loved the resolution of the iMac, but for my work, I need screen real estate. The font rendering on the 30" at 1x scaling is good enough, and having a lot of screen real estate is essential for me. Having 50% more horizontal space would of course be great :).
The new 32" Apple display would be great of course, but the price is just off. For coding, I don't care how much reference quality the color setup is. My only hope is, that Dell soon offers an affordable display based on that panel.
I definitely have zero regrets about the Dell. Some of the best money I've ever spent. Depending on font choice (typically I use Input Mono Compressed) I can fit six or seven 80x100 text windows side-by-side on the Dell with excellent legibility.
I can't imagine anybody justifying the cost of that 32" Apple display for coding either. I can't even imagine many high-end creative professionals justifying that.
I mean, on one hand... if a person figures they'll get 5+ years out of that Apple monitor and they work 5 days per week... that's less than $4/workday for a $5,000 monitor. From that perspective it's somewhat reasonable. But most people would probably get more benefit from picking a $1000 monitor and spending that $4000 elsewhere.
The funny thing is, I would even think about the 32" Apple display if Apple made a computer to go with it. But the new Mac Pro's entry price is 2x that of its predecessor, and it's really not a great computer at the entry-level specs. It is amazing fully loaded, but I would rather get a Tesla :p. Not sure how well the 32" Apple display is supported by Linux :).
What's with the downvotes? HN is turning into Slashdot or Reddit, and that's not a good thing.
I clearly stated my point. Maybe you disagree. However, the downvote button is not a "disagree" button. It's for posts that actively detract from the discussion.
We need some kind of meta-moderation to ensure frivolous downvoters lose their downvote privileges.
I personally agree that the downvote button should be reserved for low-quality comments rather than disagreement, and that the diverse, high-quality comments are what make this site and community awesome. Unfortunately, neither the FAQ nor the Guidelines say anything about how to use the vote buttons. So how are HN users supposed to know when to use them? Is there even a consensus anymore (or has there ever been one?) that downvotes should not be used for disagreement? How did I form the opinion that downvotes should be reserved for low quality rather than disagreement? Somewhere along the way of contributing enough to this site to reach enough karma to downvote.
Maybe the guidelines should add a section on how to vote. On the other hand, how can this really be enforced?
It's a sad thing. I also stopped posting opinions that might trigger disagreement from a large majority.
However, the guidelines clearly state:
> Please don't post comments saying that HN is turning into Reddit. It's a semi-noob illusion, as old as the hills. [1]
It's not authoritative, and I don't know dang's feelings on the matter, but it's probably worth noting that long ago the founder of the site clearly stated that it was acceptable to use downvoting to express disagreement:
pg: I think it's ok to use the up and down arrows to express agreement. Obviously the uparrows aren't only for applauding politeness, so it seems reasonable that the downarrows aren't only for booing rudeness.
For completeness, here's the primary moderator 'dang' very explicitly confirming that downvoting for disagreement is still allowed at least as of 2 years ago:
dang: Downvoting for disagreement has always been ok on HN.
I don't care about the imaginary internet points, but it doesn't take many downvotes to bury/kill/whatever a post. If a few early responders "downvote disagree" then nobody will ever see it.
It's almost as if they thought, "How can we best encourage groupthink?" and this was the answer.
And yet, years after Dan took over as moderator, and over a decade after Paul posted his comment, the site is still going strong, and arguably is one of the best places for discussion on the internet. There is clearly a tension between allowing people to "bury" unfavorable opinions and denying them the ability to express themselves at all, but somehow it mostly works.
One thing I'd recommend, if you haven't done so already, is to at least occasionally browse with "Showdead" turned on (accessible through your profile link in the upper right). The majority of the things that are dead deserve to be, but the others can often be rescued by vouching for them.
It also may help if you think of voting as "rearranging the page" rather than killing an opinion. The opinions are still there for those who wish to upvote them, they are just at the bottom. Like the dead comments, most of the things greyed out at the bottom deserve to be there---but the rest one can try to rescue with an upvote.
And yet, years after Dan took over as moderator,
and over a decade after Paul posted his comment,
the site is still going strong, and arguably is
one of the best places for discussion on the internet.
HN is a success and does the vast majority of things well. But it makes no sense to sweep aside criticism with "well, it's working." It succeeds because of the things it does well and in spite of the things it does suboptimally.
HN moderation generally works well in spite of the actual technical implementation; it succeeds because HN has a high-quality readership who generally follows the internet etiquette of "downvote posts that harm the discussion, not simply because you disagree."
One thing I'd recommend, if you haven't done so already,
is to at least occasionally browse with "Showdead" turned on
(accessible through your profile link in the upper right).
The majority of the things that are dead deserve to be,
but the others can often be rescued by vouching for them.
Amen! I wholeheartedly agree and I do that for this very reason.
Interesting info, but overly opinionated. I tried turning off font smoothing just now but the lack of boldness made everything appear dimmer, and I had to strain my eyes a bit when reading my code. Also I use my MacBook without a separate screen, with no scaling, so I really have to disagree with this:
> This will make everything on the screen slightly bigger, leaving you (slightly!) less screen estate. This is expected. My opinion is, a notebook is a constrained environment by definition. Extra 15% won’t magically turn it into a huge comfortable desktop
For me a notebook is not a constrained environment, and screen scaling very much makes the difference between "a huge comfortable desktop" and not.
This post is from the author of Fira Code, a font with ligatures that is incredibly pleasant to use. Highly recommend to give it a try if you haven't already!
Fira Code is a big job (with so many ligatures) and working on it obviously takes great care, but let's not pretend that the same author is responsible for the font as a whole - unless you know something about Fira Mono that I don't?
Thanks for the heads-up. Personally, I'm not a huge ligature fan in my code as my eyes tend to go directly to them instead of the code I want to read. It's great that Fira Code comes with a non-ligature version.
You're welcome! I can see how that could be a distraction for some. Personally, I enjoy the aesthetics. And as a bonus, people are always intrigued when I show them code in this font they've never seen before.
Nonetheless, in my opinion it's still an awesome monospace font even without the ligatures.
I used to long for the days of 200dpi monitors when I was younger and we lived with 72dpi or less on-screen. Now, my vision has deteriorated to the point where I honestly can't see the difference despite being a type nerd. I'm increasingly using the zoom feature in MacOS to read smaller UI elements. I suspect my days of driving my MacBook display at "more space" (2048x1280) instead of "default" (1792x1120) are numbered.
It's frustrating that consumer Hi-DPI display development has completely stalled in the last five years.
As mentioned in the article, macOS works best using true 2x integer scaling. At 2x scaling, a 4k monitor results in a workspace equivalent to a 1080p display, which is unusable for productivity and development. A superior option is an iMac or LG 5k display, which nets a workspace equivalent to 1440p.
The only options for greater-than-4k displays on the market currently are the 27" LG 5K ($1200, 2014) and the Dell 8k ($3500, 2017). I'm convinced this is due to the manufacturers focusing their attention on the high-margin gaming market which has no need for resolutions higher than 4k.
I'm holding onto my 30" 2560x1600 Apple Cinema Display until I can get something that offers me the equivalent amount of screen real estate at 2x retina scaling.
> an equivalent workspace to a 1080p display—unusable for productivity and development.
I have to disagree here. I use (and have used) a 1080p display for years and don’t feel the need for anything larger. That might be an artifact of my early years on 80x25 character displays, though.
Not to take away from your point, but Doom was actually written on a NeXTstation and cross-compiled for PCs, so at a minimum they were using 1120x832 resolution (2 bit greyscale).
I use a Pro Display XDR for programming / design work; I think you forgot that one. It seems like it would fit your needs (with the obviously premium price) as it's 32".
It’s overkill for most, but still the best display I’ve ever used and I love it. I actually tried an Asus 4K 120hz for much less but the fan noise was a deal breaker even if the screen didn’t break on arrival (it did).
I think the Pro Display should be disqualified on price :). As with the new Mac Pro, Apple made the dream display/machine for professional video artists with high-paying customers. The price is that they completely neglected their power-user base, which needs something different from an iMac.
I can only hope that either Apple or Dell makes a more basic version of that screen based on the same panel. And of course, on a high end screen I would expect more than one input. You might want to connect your laptop to your screen too...
Alternatively, I am hoping for an iMac with a larger screen.
I'm impatiently waiting for a slightly less overkill 6K 32" option, it's obviously the best size + resolution for Mac.
My hope is that someone will take the panel and pair it with a ~700 nit backlight (vs 1,000 nits sustained / 1,600 nits peak for the XDR) and price it around $2k. Also interested to see what Apple has in store for the iMac; if the refresh comes with a new and improved display I may just go in that direction.
Color accuracy is far more important to me than 120Hz capability. The faster displays today tend to be subpar at color accuracy, illumination uniformity, etc.
As someone who also stares at a monitor for work, have you noticed any difference in eye strain or comfort? I have two 4K monitors but am considering going to 5K+. I liken it to shoes or a mattress: if we spend all day with an object, then quality should be a priority.
No difference to eye strain or comfort vs the LG 5k I used before, although a major improvement vs my old 1440p monitor. Width makes a good improvement in Xcode and the color is very accurate to the iPhone screen which helps. I prefer one big monitor now I have this.
I agree with your analogy, literally my livelihood feeds through this screen, and I find a good monitor lasts a long time.
Same here. I'm using my Dell 3008WFP (also 30" @ 2560x1600) until something genuinely bigger & better comes out. I bought it over a decade ago and it still works great for programming.
I still use mine at work, and am extremely happy with it. The font rendering on my iMac 5k is better of course, but for work, the additional screen estate (especially vertically) is hard to beat. I am of course contemplating the 38" Dell with 3840x1600 :).
On the other side of things, running dual 4k displays for anything remotely intensive (a browser, IDE, instant messaging, mail client, and terminal in my case) can bring some machines to a crawl or run the fans at high speeds indefinitely. I think graphics technology has to catch up to hiDPI before I'd go past 4k.
And let's not even get into the fact that 4k 60hz is a struggle and a half -- you need HDMI 2.0 or displayport to make it work at all, and many modern laptops lack either port, forcing you to use USB-C/thunderbolt with adapters that have poor spec support and even worse documentation.
Even if someone offered me dual 5k displays to replace my dual 4k ones, I would turn them down right now. Before we move on to 5k displays, we need to get 4k60 down. And while I'm on my soapbox... where the hell is 4k90 or 4k120? I only bought my 4k monitors last month, with virtually unlimited budget, but they just don't exist. I imagine they've fallen victim to the same issue: the cables to support it just barely exist, and certainly can't be doubled up for dual monitor setups without a mess of wires plugged directly into a laptop or desktop.
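Some rough numbers on why 4k60 is so tight over older links (the link rates below are the commonly quoted effective data rates; exact requirements vary with blanking and bit depth):

    # Back-of-the-envelope: active pixel data rate for 4K60 at 8 bits/channel.
    # Real links also carry blanking, so the required rate is somewhat higher
    # (with standard blanking, roughly 14.3 Gbit/s, which just fits HDMI 2.0).
    width, height, hz, bits_per_pixel = 3840, 2160, 60, 24
    active_gbps = width * height * hz * bits_per_pixel / 1e9
    print(f"active pixel data: ~{active_gbps:.1f} Gbit/s")   # ~11.9 Gbit/s

    # Commonly quoted effective data rates (approximate):
    links = {"HDMI 1.4": 8.16, "HDMI 2.0": 14.4, "DisplayPort 1.2": 17.28}
    for name, gbps in links.items():
        verdict = "enough" if gbps > active_gbps else "not enough"
        print(f"{name}: ~{gbps} Gbit/s -> {verdict} for 4K60 8-bit")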
If you use a Mac, a 5k monitor with integer scaling (2x) will take fewer GPU resources to run than a 4K with fractional scaling (1.5x). This is why I refuse to switch to a 4K monitor for now. I will make the leap to a 5k monitor, but the options are very slim right now.
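For context, the fractional-scaling cost comes from how macOS typically handles HiDPI: it renders a backing store at 2x the "looks like" resolution and then resamples that to the panel. A tiny sketch of what that means for a 4K vs a 5K panel, assuming that rendering model:

    # The desktop is drawn into a backing store at 2x the "looks like" size,
    # then resampled to the physical panel if the sizes don't match.
    looks_like = (2560, 1440)
    backing = (looks_like[0] * 2, looks_like[1] * 2)   # 5120 x 2880 either way

    panel_4k = (3840, 2160)
    panel_5k = (5120, 2880)

    def needs_resample(backing, panel):
        return backing != panel

    print("4K panel:", needs_resample(backing, panel_4k))  # True: extra downscale pass every frame
    print("5K panel:", needs_resample(backing, panel_5k))  # False: backing store maps 1:1 to the panel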
I'm also hanging on to my 30" Apple Cinema (and adapters to work with a new USB-C MBP). The LG 5K seems like a decent replacement though, I'm not totally clear why it seems so unpopular.
I've been using two LG 5k monitors for years and they are excellent. They got a bad reputation when they were released for build quality issues, but I haven't had a single issue.
Why is everyone so attached to scaling? What is this shit software that 1) everyone relies on and 2) whose UI and font size can't be adjusted like any normal software (or web page) since the 90s? Is this yet another brainfart from Apple?
Windows has great High DPI support. It's Windows applications that often haven't seen upgrades since before even 2012. Windows' commitment to backward compatibility is the struggle. Windows hasn't had the option to just change processor architectures every dozen years on a seeming whim and subsequently force all software to be rewritten or die.
Mac OS X was released in 2001 with the "Cocoa" application UI library. While "Carbon" helped older Mac OS applications run for a limited time on OS X, no applications today use any graphics stack older than Cocoa's 2001 release; the drop of all support for PPC-targeted apps in the switch to x86 ensured that. (Cocoa had High DPI support baked in from NeXTSTEP, even if it took ~11 years to be "needed".)
Win32 was first beta-tested by developers in 1992 and has had to remain stable and backwards compatible with those first early-90s versions. There are still 32-bit apps written in 1992 that are expected to run unmodified on Windows 10 today. The last processor-architecture-related API drop that public perception and corporate strategy allowed Windows was dropping Win16 support on 64-bit processors. (Hence why Windows on ARM has struggled, and why the current iteration of Windows on ARM includes a 32-bit x86 emulator as a core feature.)
Mac apps did have to be upgraded; it's just that Apple has been much better at requiring upgrades. There's really no comparison here. You will not find a version of macOS today that still supports classic Mac applications unmodified from the 90s (for instance, '94's Glider Pro v1) with High DPI support, yet Windows absolutely must run applications from Windows 95. Sure, Windows sometimes still stumbles in High DPI support for pixel-perfect applications written three decades ago, but it at least tries; macOS shrugged and gave up.
Scaling just works great as Apple implements it, especially if your screen has 200 ppi or more. Trying to change the font sizes across all applications is a Sisyphean task, and eventually you end up with one that doesn't behave well, especially if one of the applications is Linux running in a VM. I have a Dell 24" 4k - for 4k at 2x, the screen should really be about 22", but with macOS I can set it to a virtual resolution of about 2300 wide, which works great.
On my 15" MB Pro, I usually use the 1680 resolution, but if needed, I can quickly change it to 1920.
Been developing software for 30+ years, with up to 2 extra monitors, and I always go back to a single display. I'll put myself up against anybody with multiple displays. The thing is, I just alt-tab (or otherwise) to different things - it's not like I can actually perceive more than 1 window at a time, for the most part. The head twisting to see multiple displays bugs me, as do the eye movements across a huge display. Plus there are just more opportunities for distraction with more display area.
Finally, I work most effectively when I hold what I need in my head, and find that multiple displays fights against that need.
The only real exception to that is if I'm writing front end code of some sort, having the live updates helps a ton.
Lastly, if I could have a bigger display on my laptop, I'd be up for that!
I used to feel this way. I found it depends on the work.
A lot of my old work didn't depend heavily on looking at terminal output constantly.
These days, I have to have about 3-4 things running. The extra screen just becomes a log window.
You can argue the software should be redesigned, I do too, but the problem remains for me, and a second screen, while not solving it, alleviates it a lot.
It might have something to do with using a twm, might not.
All I know is that I don't miss the days of tab cycling and desktop flipping.
Tab cycling can be a productivity killer, especially when you need to cycle between more than 4–5 applications (which can happen quite often; but OP did write “alt-tab (or otherwise)”).
What works excellently (at least for me, not necessarily for you) is to set shortcuts for your most-used applications (be it the function-key row, or some hjkl combinations). With such a setup you can just press a key combination from muscle memory, instead of opening the tab-switcher overlay, parsing it, deciding how many times you need to press tab, and then actually switching to the other application window. With per-application shortcuts it feels like you only have to do the last step.
With a laptop, an extra screen does seem to help somewhat in situations like you describe, I agree. That's because a laptop screen isn't quite big enough to bother with tiling your windows. If I had a laptop with a 20" display, maybe a touch bigger, it would be plenty for 95% of my development needs.
But since those don't exist, my general preference is an available external screen that I can use if needed. I like to couch/chair code a lot with my legs up, so I don't want to rely on an external monitor, and I try not to get used to one full time.
He puts up some nice technical arguments, and I'm sure a 4k monitor is right for him, but really it comes down to what you're actually comfortable with. I stopped worrying about resolution the moment I got my first 19" (CRT) monitor. Resolution was high enough to fit as much text as I needed, and really that's all that ever mattered. I've used retina displays but I've never felt they make enough of a difference to matter. I've worked on 32" displays, and while they're nice, I've never felt like I'm missing real estate when I move back to a 1080p display.
Font smoothing has never bothered me, but what DOES bother me is this trend towards lower contrast, and "flat UIs" that lack any differentiation between sections in the UI. Win95, while ugly by today's standards, was easy to read and follow. And the black-on-white text is something I miss dearly.
The author says that you don't need to choose between 4K and 120Hz, because there are "reasonably priced" monitors on the market that have both. But their recommended monitor ($900) seems a lot more expensive than a 4K 60Hz monitor (which is around $300-$400 when I search on Amazon).
The author also says (they think) that you need the discrete graphics card in a MacBook to take advantage of a 120 Hz display, and that the integrated Intel Iris graphics won't do.
I was actually shopping around for a new MacBook earlier today, and noticed that the 13" MacBook Pros only come with integrated GPUs, and you have to get the 16" MBP if you want discrete graphics. So if you like the portability that a 13" laptop has, this might not be a good tradeoff (not to mention that the 16" one costs $400 to $800 more than the top-spec 13" one).
I totally agree with the stuff they say about rendering fonts on 4k screens, but I don't think I'd be willing to take the hit in portability, or shell out an extra $700+, to get 120 Hz.
For what it’s worth, the latest 13” MacBook Pro (with integrated Intel graphics) can power the 6016x3384 Pro Display XDR at 60Hz.
I am unsure what refresh rates it can achieve with a lower resolution.
But really, as someone who made the jump from 60Hz to 120Hz, I wouldn’t bother unless you’re gaming. I was accidentally running my monitor locked to 60Hz, and when I fixed it I barely noticed the difference - it was only really when playing FPS games that I noticed.
I disagree that all available resolution should go to increasing sharpness and detail of text, which, after all, can only improve so much. There's only one metric that really matters for text, and that's reading speed. Any beauty or detail which doesn't serve to differentiate characters (thereby increasing reading speed) is essentially pointless.
A better reason to get a better monitor is to fit more text on it. While coding or learning, there might be one or more editors, terminals, browsers, chats, etc. Can all this be behind tabs or in virtual desktops? Of course. But the value of having it all available at a flick of the eyes is so much higher! Even if you turn off all the animations, flipping virtual desktops or tabs requires a much more severe context switch than casting a glance to the right, left, or (slightly) up or down. I'm currently using a 5k2k monitor at its native resolution, and it's pretty awesome.
I have been hoping for two decades that VR would solve this, allowing me to have a truly vast workspace that moved and zoomed just via eye movement, but not only is that day still far away in eye-tracking terms, VR goggles have so far all been much worse for text than high DPI screens, in my experience. I appreciate that there are VR workspaces in development, though, like Immersed, for the day when the hardware is good enough.
Yeah, for some reason these are all coming from "gaming" monitor makers. I think the last one is trying to conjure up a "pro" vibe with the hood, but yes - these are definitely tacky.
Acer, Asus, et al. tend to just remarket OEM technology with relatively little value-add. I suspect these are all built on the same panel, but I could be wrong.
Oh! Is that what those funny looking "hoods" are about? I thought for a second they might be functional, but I guess they're just decorative? "Gaming" equipment (monitors, chairs, mice, keyboards, etc.) rarely seems worth it to me, simply because they spend too much effort and cost on things that are non-functional, but make the thing look attractive to the target audience. I am not a part of that audience, so I find the esthetic rather garish most of the time.
The one exception for me is that I do like a couple of gaming mice. It's not so much for the hotkey functionality, but because they have a really high resolution, which makes mousing easier and more precise to me. Other than that, my setup is all really basic, office-style equipment, and I love it.
Hoods are for reducing reflections on the screen. They are functional, and are not (at least originally) a “gaming” feature. Somewhat common on high-end color-accurate displays. How useful they are depends on the lighting.
The monitors linked are actually quite good, the gaming styling they have is rather unfortunate.
> Oh! Is that what those funny looking "hoods" are about? I thought for a second they might be functional, but I guess they're just decorative?
The main use of those monitor hoods is to remove glare caused by ambient lighting.
They're common (and often included) accessories for displays intended to be used for photo editing, color grading, and similar types of work.
I was kind of surprised to see one on a "gaming" monitor though.
4K at 200% scaling feels wrong for a big monitor like a 27". The desktop looks like a 1080 desktop. And if the monitor is to be used as a gaming monitor too then 4K is again wrong because even with the best graphics cards, 4K@120 is out of reach.
The perfect 27” monitor for me would be one that has 5K@120 because at 200% scale you’d get a desktop that looks “1440-size” and when gaming you can run half resolution at 1440@120 with a decent graphics card.
These don’t exist though, at least not as IPS I think, so I’ll stick with my 1440@144 for a while longer.
The LG UltraFine is a 27-inch 5K (5120 x 2880) IPS display. The maximum refresh rate is 60 Hz, so not the best for gaming; but it's designed for a Mac anyway. I've had a couple for over three years now and it's the best display for a Mac I know of, outside of Pro Display XDR, I suppose.
Yeah an alternative is always to just keep the screen I have now (1440@144) for gaming, and simply get a second screen for work with High DPI. Seems silly to get a 5K@120Hz mega expensive screen to play games at half resolution. I'll buy a bigger desk instead.
> Times of low-resolution displays are over. High-resolution displays are a commodity now.
I reject that premise. 4K is basically the maximum number of pixels you can get for less than $1K, and that's only a handful more pixels than a 16" MBP screen. It does not make sense for most people to shell out the $$$ just so that they can see the same number of pixels from slightly further away.
It is absolutely not the right time to upgrade your monitor. Wait until 5K+ becomes affordable again[1] so you can actually get the benefits of an external monitor (more usable screen space) without having to sacrifice the sharpness and quality of retina.
I have one, too. It's pretty sad that 16:10 has fallen out of style and that there is no 27" 2560x1600 display available.
Dell has the Ultrasharp UP3017 with 2560x1600 at 30", maybe that's something you can try if you want more pixels. It's just pretty expensive and has a big bezel since it was released in 2017.
I bemoan the fact that there are so few 16:10's on the market and nothing high resolution. 16:10 is such a great aspect-ratio and the extra vertical screen real-estate is very useful when programming.
It is interesting to see some laptops move off of 16:9 to 16:10 or 3:2. This gives me hope that we'll see new/better 16:10 panels.
The image quality upgrade from HD to 4K is pretty massive. I still have a few 1200 and 1080 displays around and they look like shit compared to my main display. You can't fight time and the market forever.
Do you find any problem with text being too small? I am typing this right now on a 30" 2560x1600 display, which is right at 100dpi and which I find very readable. In the past I tried 27" displays at 2560x1440, which have a dpi of 108, and that made things just small enough to be hard to read. Obviously I could scale stuff in the OS, but that typically left me with some UI elements looking fuzzy/distorted. My ideal I think would be a 29" 5k monitor, so that probably is never going to happen.
I want to thank the author for the article; it comes at the perfect time for me. Earlier today I had a main power line incident that burned out almost half of the electrical equipment in my house, so I need to buy at least one monitor immediately. I was in doubt about spending too much on it (I need to buy other stuff too) and I am a cheap bastard, but I am moving up a bit versus what I was planning to get.
I saw some discussions about multiple monitors; I am a heavy user of multiple monitors, 2 most of the time and 3 when really needed. I don't have space on my desks (home & office) for 3 monitors all the time, so it depends on need rather than coolness. What I plan to do now is get an asymmetrical configuration at home, with one regular (1080p) monitor for some work and gaming (the CPU is an i3 7100, so I don't do much gaming) and a larger, better one only for work cases where a lot of information needs to be on the screen at the same time. As the article did not mention multiple monitors, and buying 2-3 monitors that are 120 Hz/4k is extremely expensive for a home setup, I think it is worth mentioning this kind of compromise: mixing sizes to have one of each. Since I don't have OCD about it, the different sizes are not a big deal, while the extra functionality/productivity helps.
The short version of this is that TVs generally have framerates that are a multiple of 59.94, because NTSC is actually 59.94Hz (rather than 60), for various historical reasons -- basically, for color TV to become a thing, without obsoleting every single B&W TV in existence at the time, they needed to slow the framerate down by 0.06Hz[1]
There's still plenty of TV-targeted panels out there that only do 59.94Hz, and not 60Hz. Some of them are even in computer monitors! I imagine likewise that there are a lot of panels that do 119.88Hz (59.94Hz * 2) and not 120Hz.
And just to be confusing, a number of things (both hardware and software) tell you that you have a 60Hz signal when you actually have a 59.94Hz signal! A really good example is windows -- refresh rate settings in the OS get quite interesting, showing 59.94 in some places, and 60 in other places![2]
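The actual numbers, for anyone who wants to check (this is just the /1.001 arithmetic, nothing monitor-specific):

    # NTSC-derived rates are "nominal rate / 1.001".
    from fractions import Fraction

    for nominal in (24, 30, 60, 120):
        actual = Fraction(nominal * 1000, 1001)
        print(f"{nominal} -> {float(actual):.5f} Hz ({actual})")

    # 60 -> 59.94006 Hz and 120 -> 119.88012 Hz, which is where "119.88" comes from.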
Strongly disagree. Any subpixel rendering on low-resolution displays must be disabled. Otherwise you get color noise and blurry text. Pixelated fonts are not bad.
It is fine on HIDPI displays though because it isn't as prominent there and the effect actually works.
Instead of high resolution pick a display with a decent aspect ratio. 16:9 is a joke. 16:10 is superior but not exactly a game changer. 3:2 is very rare, especially for desktop displays, and 4:3 is pretty much dead. We do not talk about 5:4.
I have the same opinion as the author here. When I buy a 4k monitor, it's generally for the higher PPI. There's just less eye strain for me reading less pixelated fonts.
My personal sweet spot in terms of pleasant readability is: 4k 24" @ 200% scale (so the same screen real estate as 1080p) or 5k 27" @ 200% scale (the same real estate as 1440p).
This is a fantastic post, I just bought a 28", 158 dpi monitor recently and spent quite a bit of time figuring out how to maximize the font quality.
Since there weren't any suggestions for linux, this is what I did for Ubuntu 20:
* 2x scaling
* Downloaded libfreetype, enabled subpixel rendering option, replaced existing libfreetype.so
(I think Ubuntu 20 might already have this enabled, but just to be safe I did it manually)
* Turned off all hinting options in font configuration
* Turned on lcddefault filtering (a rough fontconfig sketch is below this list)
(necessary if you have subpixel rendering enabled)
* Specifically set 158 dpi in Alacritty terminal config
(Not entirely sure that's necessary?)
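For reference, the hinting/filtering settings above can also be expressed in a per-user fontconfig file rather than by patching libraries; this is a rough sketch using the standard fontconfig options (typically ~/.config/fontconfig/fonts.conf), so adjust the values to taste:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- Antialiasing on, hinting off, RGB subpixel rendering, default LCD filter -->
      <match target="font">
        <edit name="antialias" mode="assign"><bool>true</bool></edit>
        <edit name="hinting"   mode="assign"><bool>false</bool></edit>
        <edit name="hintstyle" mode="assign"><const>hintnone</const></edit>
        <edit name="rgba"      mode="assign"><const>rgb</const></edit>
        <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
      </match>
    </fontconfig>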
You're welcome, I hope it helps! To preface this, I think Ubuntu 20 comes with good font rendering options by default, so my suggestions are mostly unnecessary. I just wanted to experiment to see if I was missing anything. All the options I'm referencing are explained in detail in this blog post (e.g., subpixel rendering and lcd filtering), they're all just configuration options in the freetype2 font renderer.
I find it so strange that most people still talk about their monitors' resolution in terms of pixel counts. Oh, you have a 4k monitor? Well, if that's a 43" 16:9 4k monitor, then that's pretty close to 100 dpi, so you basically have a big-ass traditional ("lodpi") monitor. Whereas if it's a 24" monitor, then it's a smallish hidpi screen. Rather than pixel counts, I find it much more helpful if you tell me the size in inches and the DPI.
This is related to one reason why I'm not so crazy about 4k: At most of the usual screen sizes, they occupy an unpleasant middle ground between a traditional screen (~100 dpi) and a 2x hidpi/retina screen (~200 dpi).
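If you want to do that conversion yourself, the arithmetic is simple (the sizes below are just the examples from this comment):

    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    for w, h, diag in [(3840, 2160, 43), (3840, 2160, 27), (3840, 2160, 24),
                       (2560, 1440, 27), (1920, 1080, 24)]:
        print(f'{w}x{h} @ {diag}": {ppi(w, h, diag):.0f} ppi')

    # 43" 4K ~102 ppi (basically a big lodpi screen), 24" 4K ~184 ppi (hidpi),
    # 27" 4K ~163 ppi -- the awkward middle ground mentioned above.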
All 3 values count: beyond a certain dpi, extra pixels matter much less, but below it they matter a lot (they are the "shelves" for your characters and everything else, given that sub-pixel simulation is very poor).
Unfortunately most monitors around are still at a limited 1920x1080 or less.
For people with poor vision, a big size counts a lot; with good vision, smaller might be better, etc...
I’m not willing to give up my 3840x1600 ultrawide. That real estate is now non-negotiable to me. So to improve the dpi I’ll need 7680x3200, and at 120hz I don’t think that’s happening any time soon.
Same here. For coding work, I almost don't even see how it's worth discussing -- sheer real estate is going to win for me, every single time.
I say "almost" because I've been doing this for nearly 25 years and I've learned that other coders have shockingly diverse preferences and styles. But still. Give me IPS, give me bucketloads of real estate. Everything else is secondary.
Agree, I use a 3840x2160 43" 16:9, which gives a 'low' density of around 100ppi, but I have absolutely no trouble reading the smallest text. There is no way that giving up the real estate would be worth an arbitrary increase in density.
It might be sooner than you think. 8K screens are easily available, 8K 120Hz screens have had working development versions for a while, and slightly cutting the height to 3200 means that you can fit the data into a single displayport 1.4 connection, using graphics cards that are already out.
Yeah, I'm at 5120x1440 on a 49" ultrawide, and I'm not really interested in ever going back to a multi-monitor setup to get the same amount of real estate.
The "Antialiased text in low PPI is better with hinting" argument is unknowingly underlain by Eurocentric assumptions. Asian characters such as Chinese or Japanese have always looked significantly better on the macOS style of text rendering compared to Windows' ClearType, so much that there exists an app called MacType, created by an Asian developer, to force Windows into using Mac-style rendering.
Interesting bit of trivia on why the frame rate is '119.88' instead of 120: it's the NTSC/ATSC frame-rate divisor of 1.001.
Ever since old TV standards in the US moved from black and white to color, the timing has been "frames per one and one one-thousandth of a second". So 120/1.001 = 119.88.
It was a neat trick in the analog days to maintain compatibility with older black-and-white TV sets. Fast-forward to the digital age, and that fractional timing difference makes keeping audio and video in sync an absolute nightmare.
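To see why that fractional factor bites, here's the drift you get if something assumes exactly 120 fps while the display actually runs at 120/1.001 (pure arithmetic, no claims about any particular player):

    # Drift between "assumed exactly 120 fps" and actual 120/1.001 fps.
    nominal_fps = 120.0
    actual_fps = 120.0 / 1.001

    seconds = 3600                                   # one hour of playback
    drift_frames = (nominal_fps - actual_fps) * seconds
    drift_s = drift_frames / nominal_fps
    print(f"~{drift_frames:.0f} frames, ~{drift_s:.2f} s of drift per hour")   # ~432 frames, ~3.6 s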
> According to my research among programmers, 43% are still using monitors with pixel per inch density less than 150
OK...
> Let's take Consolas [...] at 14px, which is default in VS Code
A 14 pixel font at 150 DPI is roughly a 6.7 point type size in real-world units! Good grief, I can't read that anymore. Frankly I doubt I could have read it in my 20's. I certainly never would have wanted to code in it under any circumstances.
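The conversion, for the record (1 point = 1/72 inch; the 96 ppi line is just the classic lodpi reference point):

    def px_to_pt(px, ppi):
        return px / ppi * 72          # 1 pt = 1/72 inch

    print(f"{px_to_pt(14, 150):.2f} pt at 150 ppi")   # ~6.72 pt
    print(f"{px_to_pt(14, 96):.2f} pt at 96 ppi")     # ~10.5 pt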
This is a great article. Even if you disagree with the conclusion and recommendation, it shows what you need to look out for when buying a monitor.
I had to go through a similar exercise recently. The DisplayPort / USB-C / Thunderbolt story is just insane. I was looking for a monitor to use with both MacOS and Windows. HDMI couldn't do 4k @ 60Hz in my setup, and DisplayPort cables never tell you which version they support.
I ended up with the LG UHD 4K 27UL650, which I'm very happy with, but switching between Mac and Windows is still difficult because this monitor only has one DisplayPort input. I've settled on using HDMI for the Mac at 30Hz instead of switching cables all the time.
Improvement in experience is very subjective but I agree with the author on everything except maybe the 120Hz.
Have you considered a KVM switch? I really like mine. It allows up to four computers, leaving me with two spare inputs that sometimes come in handy (plugging in someone else's laptop, or sometimes connecting my Raspberry Pi).
It was at this point I realized the entire premise of the article was a very long-winded way of saying "I like using high DPI monitors when I code". And then tossing in a bunch of technical details to make it sound like a "fact".
At work (I remember the place well!) I have a pair of 1920x1024's. What size? Not sure. I rotated them 90 degrees, so there are two vertical panes of 1024x1920, side by side, making a 2048x1920 area.
Approximately square is the most productive size. When we're working with text, be it code or prose, the height of it is unlimited, but the width isn't. So it ends up in multiple columns.
Books are mostly oblong (taller than they are wide), but that's only if we think of them in their closed state. A typical open book is a bit wider than it is tall (but not greatly so), and you have two columns of text (the two pages).
That open book geometry is an awful lot like my rotated monitor setup, isn't it.
Until I rotated the monitors into this configuration, I was consistently just using one and ignoring the other. Moreover, ironically, using virtual desktops! I keep virtual desktops as large as 4x4. I would rather pan among virtual desktops with hot keys than turn my head between two excessively wide monitors placed side by side.
People made remarks about my disuse of the second monitor, which prompted me to come up with a good way of using it.
Yeah, IDK. As nice as "retina" displays are, I still use 1080p screens on my desktop. At my viewing distance, and with sub-pixel rendering, the smallest text on Hacker News is perfectly legible and even clean looking.
Sure, if I get within 12 inches of the screen I can start to make out the pixels. But even then due to sub-pixel rendering it's hard to make out individual pixels at a foot if the contrast isn't too great between the text and the background.
That being said, interestingly, one of my screens, a newer Asus that's precisely one model version higher than the other, handles high-contrast sub-pixel text rendering quite a bit worse than the other :|
Disappointed that he didn't mention pixel-perfect programming fonts (e.g. proggy) which offer high-density text on low-resolution monitors while not looking nearly as bad as his Consolas example. These fonts make a huge difference imo.
Yeah, I've started using terminus... probably 10 years ago now, and I still haven't found a programming font that I liked more. Tried using some retina macbooks with hyped-font-of-the-month, they just didn't work for me.
I do the same, but with a 36" screen to match the pixel pitch of standard size 1080p monitors. I can't imagine giving up 75% of my work area in exchange for prettier fonts.
Similar. My last screen upgrade is a 32" unit at 2560x1440. It has more-or-less the same pitch as my previous 23.5" 1920x1080 devices. With no scaling everything is the same size but I can have more visible. Sometimes I use the larger one effectively as two portrait screens instead of one landscape. One of the older monitors rotates, I have that set portrait too.
Couldn't agree more. My primary monitor is a 43" 4k LG at unscaled, native resolution. I can fit five 94 column wide terminals tiled across, each showing 140 lines of text.
Is the text super crisp? Nope. Would I give up all this space for smoother text? Not a chance. Would I replace it with a 43" 8k monitor? In a heartbeat.
Most software lets you increase text size. This lets you maximize functional resolution by letting ui elements remain small and text to remain legible. I don't need my window header to be twice as thick.
Only tangentially related, but encoding text into the subpixels themselves is still one of my favorite github repos I've stumbled across in a good long while :)
Ever since I saw that picture of Linus writing code on a shitty 90s laptop on his mom's kitchen table I feel like I don't deserve any trifle creature comforts.
I have the Samsung CHG90, which is 3840x1080. It's literally a 4k TV cut in half horizontally. It enables me to very comfortably display three windows side by side. More than anything I have ever bought, this monitor has improved my productivity. They now have a newer version that is 5120x1440.
The trade off is this monitor is very low res, about 100ppi. I don't know anything about how the various OSes render text. But I have found that by far, Ubuntu renders the text the best. Text in both OSX and Windows looks absolutely dreadful on this monitor, but Ubuntu is really quite pleasant. Which is not at all what I was expecting.
I'll upgrade when color E-Ink is ready, has a good spectrum, and comes in 20" sizes for a (near?) reasonable price. I'm not sure who needs a high DPI Lite Brite, but it's certainly not relevant at my age.
Those colorful upscaled images of ClearType text are not that helpful. I wrote a tool several years ago to illustrate the effect of subpixel font rendering: https://github.com/pvorb/subpixel-illustrator
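Not the same as that repo, but the core idea is small enough to sketch: render text at 3x horizontal resolution, then pack each run of three horizontal samples into the R, G and B channels of one output pixel (assumes Pillow is installed; real renderers also apply an LCD filter to tame the color fringing):

    from PIL import Image, ImageDraw, ImageFont

    W, H = 120, 24
    hi = Image.new("L", (W * 3, H), 255)   # 3x horizontal supersample, white background
    ImageDraw.Draw(hi).text((0, 0), "subpixel", font=ImageFont.load_default(), fill=0)

    out = Image.new("RGB", (W, H))
    for y in range(H):
        for x in range(W):
            r = hi.getpixel((3 * x,     y))
            g = hi.getpixel((3 * x + 1, y))
            b = hi.getpixel((3 * x + 2, y))
            out.putpixel((x, y), (r, g, b))   # each channel drives one LCD subpixel column
    out.save("subpixel-demo.png")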
My monitors usually last me for a long time, right now I'm using a Dell P2715Q 4K that I got 5 years ago and I'm staring at it every day for multiple hours.
It wasn't that expensive but a good monitor is definitely worth the money and is not something you should cheap out on if possible.
> $200 is quite cheap for a 27” monitor. A good non-gaming 1440p 27” costs about $300-400 not on sale.
I think grandparent post meant 1920x1080. Depending on your needs (no adaptive refresh rate, no color accuracy, only one HDMI input and one VGA input, no rotating stand, no vesa mount), you can get a 27" monitor for about USD 100. For most programmers, two 27" 1080p monitors ought to be sufficient.
I'm just amazed I never realized that a 27" 1080p monitor is barely over 80 pixels per inch. That climbs to a little under 110 PPI with a 27" 1440p panel, but I suspect it isn't worth paying more than twice the price in my context.
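The arithmetic is easy to sanity-check; a quick sketch in plain Python, using the sizes discussed in this thread:

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch from resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(1920, 1080, 27), 1))   # ~81.6  : 27" 1080p
    print(round(ppi(2560, 1440, 27), 1))   # ~108.8 : 27" 1440p
    print(round(ppi(2560, 1440, 25), 1))   # ~117.5 : 25" 1440p
    print(round(ppi(3840, 2160, 27), 1))   # ~163.2 : 27" 4K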
I have a single 1080p 13'' monitor (my old MacBook Pro) and that's sufficient for me. I seriously don't understand what people do with all that extra space. This is enough to see 3 horizontally split files open in Emacs. And you can double it (6) if you also split vertically, but then each pane becomes too small.
A 27'' 1080p monitor sure sounds nice, but why do you think we need two of them?
Yeah, I'm sure it's fine. But $200 isn't "pushing it", that's a very entry-level price. An objectively nice monitor is gonna be between $400-$700, usually.
The article is deliberately opinionated but I basically agree: the experience of editing text on a 4k/Retina screen is vastly superior to what it was like previously and I'd never want to go back.
It's clear that many of the people who think they disagree are Linux users who have (for good reasons) not yet had a period of time to try it out with good OS support. At least, not on their laptop screens. I really hope good Linux support for laptops with high DPI displays emerges, if it has not already (I've read mixed reports).
27" 4k is going to be pretty tough to read text at 1:1 scaling even with 20/20 vision.
IMO you're better off grabbing something like a 25" 2560x1440 monitor. You can comfortably read text at a few feet away, even small text with not the best contrast.
If you start scaling to 150% or 200% on your 4k display, you'll end up with the same screen real estate as a 2560x1440 or even a 1080p monitor (depending on how much you scale up), and you'll end up paying a lot more. For example, a really good 2560x1440 monitor will run you about $300-350 (IPS display, low input lag, good color accuracy, etc.). The first 2 monitors in OP's post are $1,500 to $1,730.
I did a huge write up on this at https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-fo... if anyone is curious. I evaluated a bunch of monitors and ultimately picked a 25" 2560x1440 since it's the sweet spot for PPI at normal viewing distances while having normal vision.
For programming, having a 2560x1440 1:1 scaled display is going to be a huge win. You can easily fit 4x 80 column windows side by side in most code editors. After using that set up for years, there's no way I would ever want to work on a display that's less than an effective 2560x1440 resolution again.
For me, 2560x1440@144 on a 27" panel is by far the Goldilocks of writing code. 1080p feels like it's not tall enough with fat editors like Visual Studio, and anything bigger in size or DPI is starting to get too difficult to keep in focus.
I do also have a separate workstation with a 4K 43" monitor that I use exclusively as a standing setup. The monitor is fairly far back, but still requires me to move my head slightly to view the extents. This is the workstation I typically use for daily stand-ups so that I can have a ton of relevant information up all at once. I will also do some large-scale codebase reorganization efforts on this setup as I can have 3-4 solution explorers side-by-side and still have 90% of my monitor remaining for code editor windows, debuggers, explorer windows, browsers, etc. That said, this setup is tiresome if you are trying to laser focus on a single area of the codebase for an extended duration.
I will typically alternate between these setups throughout the day as appropriate.
Also, I do appreciate the color accuracy of the 4k 43" monitor (IPS/8-bit). I don't do a lot of artistic work, but when I am trying to see if a certain shade of grey makes sense for an element relative to its contents, having an accurate monitor really helps pick a good color code. With TN you get much 'coarser' results and it's hard to find a good middle point.
"Now, it might be tempting to use, for example, 1.5× scaling. That would give you an equivalent of 2560×1440 logical pixels, which, you might think, is much better. This is not how you should use it! The idea of a 4k monitor is NOT to get more pixels but to get the pixel-perfect, high-density UI rendering. Otherwise, a normal 1440p display would work better. A simple rule to remember: pixel alignment outweighs everything else. 1440p display is better at displaying 1440p content than 2160p display is at it."
I like this idea in theory, but I disagree with it in practice.
I use two vertical 4k displays (specifically: Dell U2720Q) and I use them at the native resolution.
This is because I want IntelliJ to take up an entire vertical display where I can see a lot of code without having to scroll. I can also divide the other display into two halves (horizontal line, one window on top and one on bottom - I use a third party app from the macOS app store called magnet for this).
I can appreciate the smooth fonts, but the screen real estate is more valuable.
I guess all of this is to say that the best option for me would be an 8k display at 2x scaling. A 4k display at 1080p though isn't worth the trade-off (I'll take lower quality text for the additional space).
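For anyone weighing the same trade-off, the back-of-the-envelope version is just resolution divided by the UI scale factor (a sketch, nothing vendor-specific):

    def looks_like(width_px, height_px, scale):
        """Logical workspace a panel gives you at a given UI scale factor."""
        return int(width_px / scale), int(height_px / scale)

    print(looks_like(3840, 2160, 1.0))  # (3840, 2160): native 4K, maximum real estate
    print(looks_like(3840, 2160, 1.5))  # (2560, 1440): sharper than a 1440p panel, not pixel-perfect
    print(looks_like(3840, 2160, 2.0))  # (1920, 1080): pixel-perfect, least real estate
    print(looks_like(7680, 4320, 2.0))  # (3840, 2160): the hypothetical 8K-at-2x sweet spot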
They do say right after that that using the 4k display at native is also fine, so maybe it's not an issue?
I'm currently using a Sony 43" 4K TV [1] as my monitor. It supports full 4:4:4 chroma (no subsampling) [2], which makes a huge impact on visual quality. It's also quite inexpensive. I find that the height of the monitor is about as big as I can tolerate. I certainly wouldn't mind higher resolution, but that doesn't exist at this size.
LG now has a 48" OLED TV [3] that supports a 120Hz refresh rate. I'm looking forward to trying that out. Either that or the new Samsung Odyssey G9 ultrawide [4] which is about the same price. It's also 240Hz but with VA pixels (which apparently aren't that bad). The G9 will be better on the vertical axis (not too tall) given its size. The extreme curvature is also interesting - not sure about that yet.
- flatscreen, 16:9, normal monitor, no "gamer" or other weird stuff.
- reasonably priced
I don't live in a rich country so there's not much variety. I got tired of searching for it. All monitors with more than 1080p I've found are either "gamer" (which cost 5x as much because of the gamer tax and other features useless to me, like 144hz, gsync and "3ms response times") type of monitors, curved screens, ultra wide screens, etc.
The problem is that a little more can be waaay worse. You need to double your pixels to avoid turning everything into a blurry mess, which means 200+ PPI. But there are literally no monitors like that not made by Apple. A 4K monitor at 32 inches is 138 PPI. No thank you. Gross. There actually are some 4K 22-inch monitors, but I don't want a 22-inch monitor.
I'm just going to stick with my 1440p 27" monitor until someone can sell me something better.
I find your opinion very subjective. In my experience: I remember that in 2013-2014, when I switched from an MBA to a retina MBP, I experienced the wow effect. But recently the MBP fell and the screen broke. Now I'm using a 720p 20" monitor and haven't noticed a significant difference. Mostly programming, as in reading and writing text, be it in a text editor or shell. It is as good as before. Less screen real estate, sure, but it doesn't affect my workflow much.
1. Buying 4K TVs as large PC monitors is a dangerous game, despite all the sweet lies about "convergence" supposedly enabled by HDMI. You have to make sure the TV has a true "PC mode" with 1:1 pixel mapping and no chroma subsampling (if it exists, it is usually enabled by labelling the HDMI input as "PC"), that you weren't conned with an RGBW pixel format (or any non-RGB pixel format for that matter), and that the panel can display 6500K colour temperature in "PC mode". Also note that some TVs with IPS panels have a pronounced chevron pattern in their subpixel layout instead of the traditional 3 vertical stripes, which makes ClearType look slightly different even in true "PC mode" (I personally find it cute and inoffensive; your mileage may vary), and that all 4K VA panels have bad horizontal viewing angles.
2. Make sure your RGB levels match. That is, use "full range" for RGB 0-255 displays (PC monitors) and "limited range" for RGB 16-235 displays (TV monitors). Sometimes the drivers don't get this right. If it's wrong and "limited range" is sent to an RGB 0-255 display, you will get washed-out blacks instead of inky blacks.
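If you want to see why a range mismatch looks washed out, the mapping is simple arithmetic; a sketch of the standard 8-bit expansion:

    def limited_to_full(v):
        """Expand an 8-bit limited-range video level (16-235) to full range (0-255)."""
        return max(0, min(255, round((v - 16) * 255 / 219)))

    print(limited_to_full(16), limited_to_full(235))  # 0 255: correct expansion
    # The failure mode: the GPU sends limited range but the display expects full
    # range, so "black" arrives as level 16 and is shown as dark grey instead of
    # true black, which is exactly the washed-out look described above.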
More than 5 years ago, I bought two 24" 4k monitors from Dell (UP2414Q). At a PPI of ~180, and sitting over a foot away, they give me a retina experience on my desktop. They were an early product from Dell, so despite great picture quality, they are buggy and have large bezels.
This week, one of them failed. While looking for a replacement, I was expecting the range of 24" "retina" displays to have improved somewhat. Surprisingly, it seems the opposite has happened. Dell no longer make an Ultrasharp range 24" monitor with 4k, everything is now 27".
Although they do make a 'P' series equivalent, it's an old model released around the same time as my original ones, with similar bugs and early hardware (no 4k 60hz HDMI)
27" is sadly too large for me to fit two side-by-side on my desk (and even if they would fit, 27" is too large for my liking).
Now I'm left struggling to find a replacement, I may have to live with downgrading to something like 120 PPI with a 2k (1440p) 25" display.
It's sad that retina really hasn't caught on much in the desktop monitor space, we should really have 3k as a standard resolution for smaller desktop monitors.
FYI: the Dell P2415Q does have 4K 60Hz over HDMI as long as you get one manufactured recently. They did a silent spec bump in 2017 or 2018 or so. I use a number of these, and while they are still slightly buggy, I've found them much better in that department than the UP2414Q.
Yeah... I've got a small anecdote. I find that I write better code on lower-resolution screens. Right now, I'm on a 1440x900 display at 75 dpi. And it's annoying, for sure. But it keeps my code concise and well-organized. It forces you to develop your code around a mental model instead of a visual model.
I find that at 4K, my code starts spreading out -- horizontally and vertically. That's fine when you're writing it, but when you come back to it months later, it's so hard to comprehend that much information all at once IMO. It's weird. And I might not be explaining it properly.
If you were to write your code on an 800x600 display, you would make sure that every function did not go over 600 vertical pixels. It would probably give you a headache if you did.
I can look at code and almost immediately tell that it was written on a 4K display. Oftentimes, everything is spread out and chained obtusely.
Completely different game for web dev though -- having the browser side by side with your code helps.
...
Also, 4:3 is awesome for code. All I need is two, side-by-side panes of 80-column text :)
I think it's a long article just to say that a good, big screen (or two, or three) is a good idea. The biggerer, the betterer.
I wrote a lot of code on Mac classic format (512x342) using Monaco 9 -- it was great; next step was 640x480, and 800x600, and then (for a rather long time) 1024x768.
It was still perfectly fine, as (at the time) Apple screens were some of the best, nice crisp, clear with good contrast. I didn't feel particularly handicapped because of the screen real estate, you just adapt to what you have.
Now my main machine is Linux with two 32" 4K screens side by side, about 60cm from me. I use Liberation Mono 17pt as a terminal font, and 9pt is just microscopic on the screen.
And guess what? I still wish I had a bit more screen space sometime! :-)
One thing I always tell junior developers who ask what the most valuable thing I've learned about programming in my whole career is: get a good SCREEN, get a good KEYBOARD, and get a good CHAIR. The rest are just details.
Oh, also, make sure the screen(s) are perpendicular to any window, and watch that posture!
Talks about 4k at 120hz without mentioning that this usually sacrifices chroma, making text unreadable. Fine for games, but not for coding.
No mention of Display Stream Compression.
No mention of the importance of panel type (IPS vs VA vs TN).
No mention of the various HDR standards (and especially how HDR400 isn't really HDR).
Suggests getting a 5k, 6k, or 8k monitor while also insisting you get a 120hz display (it's going to be one or the other for the next couple of years).
Suggests monitors whose adaptive sync feature only works on Nvidia GPUs.
The author seems to know a lot about font rendering but he doesn't seem to know that much about monitors despite having VERY strong opinions about them.
I suggest people wait till the new year before a big monitor purchase. EVE Spectrum is slated for Q4 of this year. If it delivers on its promises, it'll be the best value for what you get. And if it doesn't, you can fall back on the ROG XG27UQ or its contemporaries - assuming your GPU can support DSC, 4K, and 120hz.
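On the chroma point specifically, the reason 4K at high refresh rates often ships with 4:2:2 or 4:2:0 is plain link arithmetic. A rough sketch (active pixels only, 8 bits per component, blanking overhead ignored; the HDMI 2.0 figure is the commonly quoted ~14.4 Gbps of usable video bandwidth):

    BITS_PER_PIXEL = {"4:4:4": 24, "4:2:2": 16, "4:2:0": 12}  # 8 bits per component

    def payload_gbps(w, h, hz, fmt):
        """Approximate active-pixel data rate in Gbit/s (blanking ignored)."""
        return w * h * hz * BITS_PER_PIXEL[fmt] / 1e9

    for fmt in BITS_PER_PIXEL:
        print(fmt, round(payload_gbps(3840, 2160, 120, fmt), 1), "Gbps")
    # 4:4:4 ~23.9, 4:2:2 ~15.9, 4:2:0 ~11.9
    # Only the 4:2:0 stream squeezes under HDMI 2.0's ~14.4 Gbps of video
    # bandwidth, and throwing away chroma is exactly what smears colored text.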
The other side to this is that it may also be time to upgrade your eyes (with glasses, by getting your eyes checked). It's more likely this is true if you don't think it's time to upgrade your HD monitor.
Somehow I went for years without realizing that I just needed glasses. I could read everything before, but after getting the glasses everything was suddenly unnaturally crisp.
I notice that the main topic for > 100Hz monitor owners is how smooth it is to drag windows about. When they show off the monitor, they grab a window and wiggle it really fast. Given no-one has yet come up with a more meaningful demo I'm starting to think it's the only benefit!
There is probably a term for this type of performance chasing. It's like car owners bragging about their top speeds despite only ever driving on roads with speed limits.
Only benefit I've noticed is it's easier to read text and smooth-scroll at the same time. Otherwise it just makes mouse movements and rapid console vomiting look nice. It's really unnecessary for programming in my opinion, use the budget for more pixels.
The hedonic treadmill is getting brutal. Going up to a 4k/34" monitor was nice, but it quickly became normal. The main effect was how bad it made 1440p/27" monitors look!
Luxury. Once you have gone so high you can't go back. I have a 144 Hz monitor now. Even when not playing games, scrolling, moving the mouse, moving windows. It's smooth and I never want to go lower again. Using the computer is fun again.
>You can read scrolling movie credits just fine at 24 FPS
Honestly I actually struggle with that. But working and gaming at 60hz is still perfectly fine with me. I even purposely run my Valve Index VR headset at 90hz instead of 120hz because I don't need the extra frames
You can't play a game at 15 FPS. It's a moving slideshow. And it was never fun. 30 fps looks pretty bad, and even 60 fps is decently choppy. I'm not sure what your exact point is, but once you try 144hz or higher, you'll never, ever, go back.
I have a 1440p 144Hz monitor and a 4k 60Hz monitor. I prefer the higher-DPI one. I personally don't feel the extra smoothness during web browsing or web development on the 144Hz one; maybe it feels a bit better, but I don't mind 60Hz.
I had to use 4k@30Hz for the past week (waiting on a USB-C to DisplayPort cable to replace HDMI) and actually got used to it. Turning off animations and smooth scrolling helps.
Well, I don't know what DPI you are using (perhaps the display is tiny?) but it sounds low. Of course, we all used low DPI displays for years and were happy with them because we did not know better. Now that I've become accustomed to higher DPI, low DPI screens look like crap.
It's the exact reason I am actively avoiding 120 or 144Hz screens. I don't want to get used to it and then need to ditch my current display. For several years I bounced back and forth between gaming on a PC and a PS4. The whiplash between 60 and 30 fps was absolutely brutal - but I never noticed it back when everything I played ran at 30 fps.
So in the end, I am not sure what advice, if any, I am trying to give you. I love my huge, high-DPI screen, but if you are happy with what you've got, you might seriously consider sticking with it for a while.
To reinforce your point: while I can appreciate more pixels, per display or per inch, I appreciate more being able to pick up a 1366x768 laptop and be productive without feeling like I need to whine, complain, or write supercilious blog posts.
I came to the comments expecting lots of (irrelevant, I know) complaints about the painfully bright yellow background of the article - but nobody has mentioned it yet. Is it serving different colours to different people, or am I unusually sensitive to this colour? I found the page unreadable without switching to reader mode.
This is actually a rather bad time to buy a high-end monitor because if you get one right now it will likely be outdated by the end of the year.
If you want a monitor that can do all of the cool things at once (2160p, hdr, 120Hz+, freesync) without using hacks like dual cables or realtime lossy compression, then you need both a video card and a monitor that support hdmi 2.1 or displayport 2.0.
Currently there are very few monitors that support the newer hdmi/dp standard, and there are no video cards that do. However both of the new consoles scheduled to be released later this year (xbox series x, ps5) will support hdmi 2.1. This also implies that upcoming discrete gpus will support this as well.
This means if you want a monitor that can do everything, right now there are only a tiny number of screens, and no video cards. Barring any delays, the situation should be very different in a few months.
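To put numbers on "all of the cool things at once": 4K at 120Hz with 10-bit HDR colour needs roughly 30 Gbps of pixel data before you even count blanking, which is why the older links need DSC or chroma subsampling. A rough sketch (the link figures are the usual effective rates after encoding overhead, so treat them as approximate):

    def rgb_payload_gbps(w, h, hz, bits_per_component=10):
        """Approximate full-chroma data rate in Gbit/s, blanking ignored."""
        return w * h * hz * bits_per_component * 3 / 1e9

    print(round(rgb_payload_gbps(3840, 2160, 120), 1), "Gbps needed")  # ~29.9
    print("DP 1.4   ~25.9 Gbps usable -> needs DSC or chroma subsampling")
    print("HDMI 2.1 ~42.7 Gbps usable, DP 2.0 ~77 Gbps usable -> fits uncompressed")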
Agreed. I expect more products released with new interfaces.
BTW, current GeForce and Radeon cards support DisplayPort DSC, which could theoretically enable higher resolutions / frame rates via compression, but I don't know of any existing displays that use it.
Does anybody make a reasonable 16:10 aspect ratio 4K monitor yet? Because I'd buy it, but I can also stick with 2K forever if nobody wants to shut up and take my money.
I'd possibly consider upgrading from my 2x Dell 2408WFP setup if modern IPS panels weren't such absolute garbage with respect to backlight bleed compared to the IPS panels made 15 years ago.
Even my old laptops with IPS panels look better than newer ones. I was in the market for a laptop last year and returned 3 different laptops with IPS panel displays from 3 different manufacturers because every single one of them had backlight bleed that was so bad that it was plainly visible even in a well lit room. The last company pushed back trying to claim that that level of backlight bleed was considered "normal" but eventually refunded my purchase once I threatened to do a chargeback through my credit card company.
Do there actually exist any companies anymore that produce decent monitors with minimal backlight bleed?
Some people have mentioned that once you use 4K, you can't go back. I guess I'm in the group that has used 4K (or 5K, in my case), and found it easy to go back.
At work, we have these 5K, 27" LG monitors: https://www.apple.com/shop/product/HMUB2LL/A/lg-ultrafine-5k... . Don't get me wrong, they're great, but... I don't feel like I'm missing anything when I go home and use my Dell U2412M (1920x1200, 24"). And working from home the past three months hasn't tempted me to upgrade my monitor, either. 1920x1200 gives me a good amount of real estate at 100% scaling, and that's the main thing I need.
Now it is true that if I look at the font in this text box in Vivaldi, the rendering kind of sucks; Firefox is a bit better. There was definitely a time, around when I switched to Windows 6.x from 5.x, when I was passionate about this, and dislike for the Windows 6 font rendering was one of the reasons I stuck with XP as long as I did. Maybe 4K would help with that, but by this point I've adjusted. And even so, spending $900 on the monitor the author suggests is not the most appealing option. That's a lot of GPU upgrade you could get relative to the $300 I spent on my current monitor in 2011. Or I could double the RAM in all my personal systems and get a nice microphone (which would have more of an impact on my working-from-home experience). Or I could build an entire Ryzen 7 3700 desktop for that price. There are just a lot of things I'd rather prioritize.
I also find the proposed productivity benefits dubious. As long as the fonts are acceptable enough that I don't find them actively distracting (and as mentioned, with bad enough font rendering that can be the case, so perhaps the author is simply more sensitive to that than I am), how much I can focus is going to have a lot more impact on my productivity.
I do somewhat fear that this might re-ignite my former font pickiness, though...
Wow. He was right. I just changed my 15 inch Macbook Pro scaling away from default and WOW the text is clear. I also just turned my second monitor onto 144hz and WOW the animations are fast. My work from home setup just got significantly better.
As a guy who does back- and front-end programming and gaming on the same PC, my subjective opinion is that having a decent color gamut is important. I've seen details that, on an old monitor with a worse color gamut, disappear or look worse.
Also, an ultra-wide screen gives you a lot of space to handle a code editor/IDE and a browser on the same screen at the same time.
At some point in the future, I'll replace my secondary monitor with these priorities:
* Correct color gamut.
* A minimum resolution of 1080p, better if it's 1440p.
* A minimum refresh rate of 60Hz, better if it's at least 75Hz.
This pushes me toward an LG IPS monitor like the ultra-wide main screen I have now.
Huh, 4k screens have come down. I could replace this eight-year-old 24" 93dpi Dell screen with a 27" 183dpi HP screen for $540 (plus shipping). I think that's around what I paid for this one when it was new.
(why that HP screen? First thing on Wirecutter's 'best monitors' list, which I think was how I chose the one I'm looking at right now.)
I guess it'd be nice to turn off Illustrator's glitchy anti-aliased previews for good. But this monitor still works perfectly fine. Call me when I can get a color e-paper display with a 60hz or better refresh rate, I'm increasingly tired of staring into a giant light all the time.
Wow, I just turned off the font smoothing setting on my 4K monitor and the text is dramatically easier to read. Really surprised that Apple doesn't provide a better explanation of this feature, or turn it off on HiDPI displays.
4k monitors are still so much more expensive and have a noticeable performance cost on laptops, at least. When I got a MacBook Air with USB-C, I was initially planning on getting two 4k monitors. However, that would mean spending ~$500 to get a 27" monitor, $120 for a Thunderbolt adapter that can drive both displays at their proper refresh rate, and >$200 if I wanted an adapter that can also charge the laptop.
All told, that's the total price of the laptop just in peripherals. I ended up going with a $30 USB-C adapter and some ~$100 Dell 1080p screens, and I'm happy as a clam.
Personally, my takeaway from this article is to disable font smoothing. I disabled it on my MBP and I like how the default fonts are rendered now. They appear less bright and crisper. I hadn't changed these settings on my MBP since I got it, so it was likely enabled by default. The rationale makes sense in low-resolution cases, but having it enabled by default on a rMBP doesn't make sense to me.
However the recommendation to reduce the resolution scaling option is a no for me. The text is wayyyy too big for me and I lose out on the screen real estate.
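For reference, the checkbox-less way to flip this setting is the widely documented AppleFontSmoothing defaults key; a sketch wrapping it in Python (0 turns smoothing off, 1-3 set the strength, and apps need to be restarted for it to take effect):

    import subprocess

    # Turn macOS font smoothing off for the current user.
    subprocess.run(
        ["defaults", "write", "-g", "AppleFontSmoothing", "-int", "0"],
        check=True,
    )

    # To restore the system default later, delete the override:
    # subprocess.run(["defaults", "delete", "-g", "AppleFontSmoothing"], check=True)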
That part of the article does not apply to my 13'' 2015 MBP... I checked and it already uses 2x resolution by default. Perhaps it applies to larger screens or newer models.
I was an early adopter. It didn't happen voluntarily though: my client wanted me to develop embedded Linux software that drives a 4K HDMI output, so I was kinda forced to upgrade. Never looked back. When that monitor broke just after its 3-year warranty expired, I bought a very similar one with the same specs from another brand, 27" 4k IPS.
However, I think 120Hz is overkill for programmers. The price/performance proportion is not great either: my current BenQ PD2700U is offered for just over $500 in the US, while the monitors recommended by OP are above 2 grand.
I use a 32" 1440p monitor and while I would have liked to get a 4k one instead, even with getting tax back by declaring it as a business expense (since I use it primarily for work), 4k still would have been more than I was willing to spend on it right now. I don't want to use a smaller monitor (might even like one size larger) and tend to use large fonts anyway since my eyesight isn't super great, so 1440p seemed like a much better fit for me, given the price of a good 4k one. Maybe in a year or two I can upgrade.
This is a fantastically well written blog article, and there needs to be more attention paid by the industry to issues such as these ones raised by the author.
I've generally found that all the IT vendors have an incredibly lazy and sloppy attitude to imaging of all types, not just display scaling or font rendering. "The pixels went on to the display! Good enough! Ship it!" is the attitude.
Compare this to televisions, where 120 Hz is common, 8K is a thing now, and wide gamuts and HDR are not only supported but consistent and even calibrated out of the box! Dolby and Netflix in particular have put significant effort into making sure their customers see the original artistic intent, not some faded or garish mess.
In the PC, Mac, and smartphone world it is literally impossible to send someone either a still image or video that'll look even vaguely correct. It'll be too bright, or too dim, faded, or worse: the colours will be stretched to the maximum display gamut. Photos of faces look like zombies or clowns in full makeup.
HDR is simply impossible to use for still image exchange. No matter what some random JPEG committee is working on, don't even think about such nonsense as sending someone a HDR file, it will practically never work, no matter what display the recipient is using. Or even just an SDR 10-bit file. You're wasting your time.
To give you an idea of how insanely bad this is, for years and years if you enabled HDR for your display under Windows 10 it would simultaneously wash out the desktop colours, dim it until it was way too dark, and then blur everything on top of that so that your 4K monitor is pointless.
Clearly nobody actually tested HDR support at Microsoft. After they received complaints, both to their tech support and in public... they did nothing. For years. YEARS.
I just find the whole thing incredibly sad. It's 2020. The future! The physical technology for nearly perfect display output has been available for years, but... no. You just can't have it, because the software won't allow you.
It doesn't matter what display you buy. You can't have HDR, or 10-bit, or colour management on the desktop, or your browser.
Firefox has had full ICC colour management for many years now. It's off by default. Internet Explorer truncates everything to sRGB even on a wide-gamut screen. Safari stretches colours for un-tagged images.
The only HDR and colour-managed thing you can reliably do on a PC is watch NetFlix. That's... it.
I had a 34" 4K monitor in 2019, and it was a mess in Windows 10. I sold it and picked up an ultra-wide, 38" 3840x1600 (~109 PPI) and I couldn't be happier. I agree that a "good" monitor is a necessity, but I disagree with OP that "good" = 4K 120 Hz.
With any legacy application, and even many modern ones, UIs were either fuzzy, or tiny, or a mix of the two, even with all of the Windows per-program custom high-DPI settings tweaked. The UX was not consistent enough to be enjoyable, so I ditched it.
For what it’s worth, the “fuzzy” version is usually about what you’d be getting on a display if it wasn’t HiDPI. Also don’t forget to exclude the rarer, but still happens scenario where the UI is scaled twice for some reason, so everything is huge.
I used non-antialiased bitmap fonts on a low-resolution display for terminal windows for the past 3 months, and it was great. Now I've switched back to 5K and it's fine, but there really was nothing wrong with bitmap fonts.
I also tried a vintage "mechanical" switch keyboard and mouse, just for fun. I think modern macOS doesn't work quite as well with a mouse because of the weird, narrow, disappearing scrollbars (though a mouse scroll wheel or scroll ball can help with that.)
Bitmap fonts look crisp and clear at small sizes. I find I enjoy them much more than a lot of the fonts in the article. Try out Terminus or Tewi at 9pt if you're curious. They make Fira Code and Inconsolata look blurry by comparison.
I recently changed my terminal emulator from Termite to Alacritty because Pango broke bitmap fonts, and Termite uses Pango. Unfortunately several other things I use are still using Pango, and I've had to switch to subpar fonts in waybar and a few other spots.
Having read the article, I'm still not able to fully grasp why the old techniques for improving text legibility can't be "scaled" with screen resolution. We had imprecise pixels before and we still do, in the sense that they are very finite. I totally get why smoothing and other tricks aren't as necessary as they used to be, but I don't get why they have to be detrimental on Retina displays. At what resolution does this reversal happen and why?
In 2018, as the article says, macOS stopped using subpixel antialiasing across all Macs. On low-DPI screens, the loss of subpixel antialiasing made fonts look blurrier, smaller, and less dark. This occurs on high-DPI screens as well, albeit to a lesser degree. To bring low-DPI screens up to the same level of "darkness", the behaviour of "Use font smoothing when available" was changed to simply make fonts bolder. This is a plus for people who have difficulty reading the screen.
Furthermore, the author seems to have an obsession with "pixel-crisp" fonts in the section about the correct resolution for your display. What macOS does in its scaled modes is render at 2x the logical resolution, then scale the image down to the monitor's native resolution. Looking back, this was an amazing choice by Apple, and it has allowed for a remarkably bug-free high-DPI experience. Even today, Linux and Windows have serious issues with high DPI. 99% of macOS users don't notice that their desktop is not running at native resolution, nor would they care.
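To make the cost of that pipeline concrete, here's the arithmetic for a 4K panel (a sketch; the "looks like" sizes are typical macOS scaled-resolution presets, treat them as examples rather than an exhaustive list):

    def backing_store(looks_like_w, looks_like_h, panel_w=3840, panel_h=2160):
        """Scaled modes render at 2x the logical size, then downsample to the panel."""
        render_w, render_h = looks_like_w * 2, looks_like_h * 2
        downsample = round(render_w / panel_w, 2)
        return render_w, render_h, downsample

    for mode in [(1920, 1080), (2560, 1440), (3008, 1692)]:
        print(mode, "->", backing_store(*mode))
    # (1920, 1080) -> (3840, 2160, 1.0)   exact 2x, no downsampling needed
    # (2560, 1440) -> (5120, 2880, 1.33)  rendered larger than the panel, then scaled down
    # (3008, 1692) -> (6016, 3384, 1.57)  even more pixels pushed per frame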
If the author just focused on monitors, this article would be a lot better.
Also, 120Hz might be nice for a desktop, but if you are plugging in a laptop, you will notice your power consumption double. For battery life, choose a 60Hz monitor.
It's interesting to see an argument against ClearType/font smoothing, but not anti-aliasing in general. They both change the font from what the designer intended, so why is font smoothing so much worse?
I actually disable anti-aliasing in games on my 4K display - not just for performance, but because anti-aliasing genuinely isn't needed at that density. I think the author could go all the way here too, since after all, AA is just another crutch for low-density displays.
I agree with the problem and the severity of it, but I disagree with the proposed solution. Don't get me wrong, the more screen real estate the better, but I recently switched to GNU Unifont as my primary font and wow is it nice and crystal-clear at 12pt. Bitmap fonts might be "ugly" and inflexible, but they do seem to entirely sidestep boneheaded antialiasing strategies (provided they're configured to not be needlessly antialiased).
On lower dpi monitors I use bitmap fonts with no smoothing for coding, they are crisp. See http://upperbounds.net/index.php?menu=download for a bunch of these fonts (I use Proggy Clean with slashed zeroes and bold punctuation). Note that you may have to play with your font size and turn off any antialiasing in your editor to get it looking right.
Has anyone tried using an LG OLED TV as a monitor? I do. I love OLED - however the LG TVs have 4 subpixels (RGBW) which messes with font rendering regardless of what settings you play with.
On Windows, Chrome (including Chromium Edge) seems to have the biggest issue. ClearType tuning fixes most of the native text elements, but some apps like Chrome look fuzzy no matter what.
Has anyone found a good solution for this or for similar displays with non standard subpixel layouts?
I have been using the same three Dell monitors since 2008. They still work perfectly (1680x1050), and in my opinion the text rendering with ClearType subpixel AA is excellent in both Sublime Text and VSCode.
Of course, on macOS it's another story. It didn't always have ugly fonts, but that's been the case for a very long time now, and it has prevented me from switching more than once (I still have to use macOS in a VM to build for iOS).
I don't know. Text is important to me, but high refresh rate is much more.
Even scrolling text or dragging windows around at >120 Hz is a game changer. Going back to the MBP display gives me instant headache until I re-adapt. It's hard to describe, it's like all of a sudden everything has a very distracting input lag.
Now I just need to wait for a 4k 144Hz IPS that can be driven via MBP (or just get what's on the market and run eGPU)
At the end of the article, the author mentions the Acer Nitro XV273K, which Macs can drive at 4K@120hz (though you have to jump through a hoop in the Displays prefs on each boot)
4k can be driven by Mac at 120Hz, but the mac has to be new enough (2018+ is my guess, based on thunderbolt 3 upgrade in 2018) and have a discrete graphics card.
I bought a 4k screen for same programming rationale. Deep regret.
From CRT days I know I can see flicker at 60Hz but not at 75Hz. I had stupidly assumed that a modern 60Hz panel with FreeSync would be OK. It's not. Especially on grey for some reason (Asus screen).
The fact that I was coming from a 120hz laptop screen didn't help the situation either.
Anyway...so I'm buying a 144hz 1440p gaming screen next. Screw this resolution game...I need flicker free for my sanity.
I have a Dell S2716DG [https://www.dell.com/lv/business/p/dell-s2716dg-monitor/pd] and I absolutely love it: 1440p, 27in, 144hz, 1ms response time (on fast mode), and g-sync. It's great for games and looks very classy. Also the price isn't bad at around ~$500. I would highly recommend it.
To his comment about truly disabling font smoothing on Windows: in addition to disabling ClearType, there is also a separate font smoothing setting to turn off (a scripted equivalent is sketched after the steps). You can disable it by:
* Open System in the Control Panel
* Click on Advanced System settings
* Go to the Advanced tab
* Click on Settings button under Performance.
* Uncheck the box for Smooth edges of screen font.
* Click on Apply
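If you'd rather script that checkbox (handy when Windows forgets the setting), it is commonly driven through the SystemParametersInfo API; a sketch using ctypes, assuming a standard Python install on Windows:

    import ctypes

    SPI_SETFONTSMOOTHING = 0x004B
    SPIF_UPDATEINIFILE = 0x01
    SPIF_SENDCHANGE = 0x02

    def set_font_smoothing(enabled: bool) -> bool:
        """Toggle 'Smooth edges of screen fonts' and persist the change."""
        return bool(ctypes.windll.user32.SystemParametersInfoW(
            SPI_SETFONTSMOOTHING, 1 if enabled else 0, None,
            SPIF_UPDATEINIFILE | SPIF_SENDCHANGE))

    # set_font_smoothing(False)  # same effect as unchecking the box above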
Conclusion after reading tens of comments here: I have no way to know whether I'm young enough (eyesight-wise) for 27" 4k, which is finally affordable in my country, or whether I should go with 2560x1440 at 24", which feels outdated in 2020...
And that isn't a great step up over my still-working 10-year-old 22" FHD Dell.
How long until we have affordable 'MacBook-like' crispness at 22" or 27"?
The guy needs to chill. Relax. He's got a point about PPI but don't get too hung up on pixels. Your eyes are your guide.
For my purposes a single 4k 43" monitor is fine. Its not ultrawide either. No need to upgrade. I have another identical one at the other place I "nest". Its like using 4x HD (1920x1080?) monitors but no bezel in between. Very comfortable and the text doesn't hide in obscurity. Plenty of space for everything and doubles as a excellent TV when needed.
Personally I think most of the trouble people have is due to staring at screens too long. You shouldn't be spending more than an hour staring at a screen continuously anyway. Even thats probably excessive. Take a break.
It's kind of like those people whose eyes are so tired they change their preferences to light text on a light background. Ummmm. No. Not good.
But I'm an old programmer, amongst other hats, so what would I know? I remember when 640x480 was crystal clear and so sharp and 1024x768 was "wow" extra real estate!
But yeah. Go higher res and bump up the physical size at the same time. Completely addictive if you have the desk space.
That's a lot of words to say that vector fonts suck at sizes under 24 px. Use bitmaps; there are lots of them to mitigate that problem. It was not that long ago that 640x480 was a whole screen, and it sucked, but things got done.
My last (almost) 2k CRT lasted just long enough that I could afford a 4k LCD when it died. I'm pleased with a pair of these, but once MOAR PIXELS become affordable I'll buy them.
Really the problem is they are using vector fonts; I use bitmap fonts in my xterm, and it is not as bad as the vector fonts used in Firefox. The text is not blurry if you are using bitmap fonts. How can I force Firefox and other programs to prefer bitmap fonts? (What I have tried, just results in no text being displayed at all.)
(High DPI is probably useful if you are doing a lot of print previewing, though.)
I did that. But I still can't seem to configure it to use bitmap fonts. I got it to use bitmap fonts for the tab titles, the location bar, and the status bar, but it won't use bitmap fonts for anything else.
The 4k 120Hz monitors he is recommending all have a fan noise issue, since they all use the same basic design with a dinky small fan in the back (and, I think, the same panel). I did a deep dive on it myself and decided in the end not to buy one because of the fan issue:
These monitors and most of the ones mentioned in the thread are tiny. Whatever happened to the dream of the full-wall display? I really like my 43" 4K LG monitor, but apparently this type of monitor is rarely produced these days. A quick Newegg search shows exactly 1 monitor with similar specs and it's more than twice the cost of what I paid for mine two years ago.
I use 232" 4K monitors side by side; Before that I had 230" 1920x1200 (not 1080p!) side by side too; to me it's sufficiently "a wall" that I put them about 80cm from me, each angled a bit inward.
Quite frankly anything bigger and you get a kink in your neck by having to move your head so much!
Having a large monitor (> 30") makes it harder to read your content all at once, and you constantly have light/information in your peripheral vision (which is not good for the eyes). Asuming that you keep them as the same distance as "normal" monitors (1m - 1.5m)
I find this article resonates with me on many points. Particularly, high DPI and high refresh rates are not gimmicks, they have meaningful impact in making the user experience better. It's something that Apple has been pushing for years, from Retina to ProMotion. Sure, not everyone will appreciate the difference (Some people are happy with Macbook Airs, some need Threadrippers), but that doesn't mean it's not a meaningful difference for many users.
One area that I disagree though:
> The idea of a 4k monitor is NOT to get more pixels but to get the pixel-perfect, high-density UI rendering. Otherwise, a normal 1440p display would work better.
Not in my experience. Sure, 2x scaling is ideal for sharpness. But there's a tradeoff with screen real estate. I regularly switch between a 1440p monitor at work to a 4k monitor @1.5x scaling at home. Fonts are still noticeably sharper on the 4k monitor.
4k @1.5x scaling is not pixel-perfect, but definitely sharper than 1440p.
3840 x 1600 is the perfect resolution for programming in my opinion, and is now starting to become more widely available in 38" wide form factor (equivalent to a "normal" 32" but wider). At this size, you can finally comfortably have 2-3 normal-size windows side by side: your editor/IDE, your browser/app, and one third for documentation, without having to go multi-monitor. The 1600p vertical resolution is fantastic for coding. If you're interested in this size, LG 38WN95C is a great one just starting to become available that hits all the marks: Thunderbolt, 144HZ refresh rate (including G-Sync and Adaptive Sync), IPS panel (must) and looks "normal" without any gamer aesthetics or RGB lighting.
Unfortunately it's going to be a very long time before we get high-dpi equivalents. 4K, which is frequently 1080p equivalent in terms of workspace, is just not doable after you've been using 1600p.
From my limited understanding, it's not the distance that's the problem but the constant focus at a point at the same distance. That's the reason they recommend looking at distant objects every 15 minutes or so. Staring at a tree for ten hours straight would have a similar effect, but woods are so complex that your eyes are constantly refocusing.
Developer text editors won't benefit from 120Hz monitors. Their scrolling is not smooth pixel-by-pixel, but line-by-line. Also most developers will navigate code with search, and won't need scrolling anyways. I would rather put my cash on a display that renders still images better, and is comfortable on the eyes.
I have been using 42" 4k monitor. It is suitable for some task that require 4 or more windows at same time. For example, development of Field Programmable Gate Array(FPGA) use more than 4 windows (FPGA CAD, Waveform, HDL editor, explorer, etc...). It is comfortable that use these windows without switching of windows.
This is going to sound like I want to eat the cake and still have it, but I wish there was some sort of balance between real estate, sharpness, and affordability. A 24in 4K monitor without scaling makes text too small, but at 200%, it's no different from a 1080p monitor real estate-wise (even though text is much crisper). My current monitor, a 1440p 24in screen at 100%, serves all of my real estate needs, but leaves something to be desired in terms of sharpness, so a 5K screen at 200% and similar physical size would be perfect; alas, there aren't any general-purpose 5K monitors at an affordable price. 8K screens at 300% would offer similar, albeit slightly lower, real estate at even better text sharpness, but there's just one such monitor from Dell and it's ridiculously expensive.
I hope 5K monitors become as cheap as 4K monitors over the next few years.
Have you experienced any of the reported OLED burn-in issues?
I would love to upgrade to an OLED but my screens are on at least 16 hours a day with semi-static content. That's a hefty price to take a chance on burn-in.
It's the C9, and I have all the anti-burn-in features turned on. I have no issues; you can see the TV moving the whole screen by a few pixels every so often.
The new CX generation from LG comes with a 48" model which may fit people better and supports variable rate refresh which is good for gaming as some cards and consoles can support it.
Living with a C9 65" as my main television, not computer monitor, while I acknowledge the picture quality is incomparable from my point of view I am in the camp of that for a general work computer screen it would be best to wait for micro LED technology to spread. OLED still suffers the risk of burn in but this is mostly from fixed elements where the screen is level on for days if not years. Still not worth the degradation that will come as most elements are fixed and always on display with Mac OS.
So for the most part a well lit IPS display will serve users just fine these days and reserve OLED for the living room.
Oled for a monitor is tough because burn in a very real for high contrast static images. They have done a good job but I watch a lot of high end oled tv reviews and it doesn’t take too long for menu bars and the like to start ghosting.
I just bought two new monitors this year, and went with 24 inch 1440p. My reasons were:
* At the distance I sit they look fine to me. In particular I don’t notice any jarring loss in resolution compared to my MacBook retina display which sits next to them.
* The “native” text size on a 24” 1440p monitor is perfect for me. Any smaller and I would struggle to read certain things.
* I was concerned about HiDPI scaling on linux. Fractional scaling was still “experimental” at the time I looked.
* I actually care more about other factors like the stand than 4k resolution, and would prefer to put my money towards that.
* When considering buying a single large monitor and running at native 4k resolution, I had concerns about screen sharing. Would people on smaller monitors have trouble with scaling? Also, having two separate monitors helps me with window management.
Echoing some of the comments here, part of this depends on whether you're using macOS or not and what else you're using the monitors for. I use 4K monitors for both Mac and Windows and I stick close to the Apple-recommended HiDPI approach, which means your 4K display isn't larger than 24" and your 5K display isn't larger than 27". YMMV, but most people who have a bad 4K experience are trying to do 4K at 27" or greater. Due to DisplayPort 1.2 bandwidth limitations, the 5K 27" monitor market never got a great range of options. You either have the Thunderbolt-based monitors that are mostly for use in the Mac ecosystem (and only some Mac models), or a few expensive dual-cable DisplayPort 1.2 options, or DisplayPort 1.4 options with an assortment of compatibility issues.
I have the ultrafine 5k. 27 glorious inches @ 218ppi. Not only does this provide a colossal amount of real estate for crisp text (I run with no scaling, which requires good eyesight), but combined with a macbook pro, provides a near perfect computing solution. One cable provides video, audio, usb hub, webcam and power. Two monitors might be better, but I find leaving the laptop screen open on a stand works well. One quarter of the ultrafine provides 2560x1440 resolution, so side by side, or corner layouts work very well.
> I run with no scaling, which requires good eyesight
I have been on the fence about upgrading to a large hi-res screen specifically with the aim of maximizing screen real estate, which effectively means little scaling. However I couldn't find much info about running hi-res monitors with little or no scaling. Could you share your experience and/or post a screenshot, just to get an idea of how big UIs end up being?
Judging by the resolution of my screenshot, it appears as though there is still some scaling going on. I can effectively reduce the scaling by zooming out in VSCode, but I'm limited by my eyes, not by the resolution of the monitor. In other words, the text becomes illegible far before it becomes pixel limited.
I am old old old, and I agree. This is probably something Mac users would notice once they are used to 'retina' resolution. At work we have two monitors on stalks, and work provides laptops to plug into them. The big (23"?) monitors are big, that's all. They hold less text than the laptop screen and are much worse to read. I just use the laptop as it's easier on the eyes.
At home I always had multiple monitors but once I got a 4K monitor, at most I have the laptop open for more screen, often CNBFed.
All this is probably more based on what you are used to. If you have big screens and nothing of 'retina' resolution you are not going to care until you do.
All my life I had an obsession with more pixels. As kids we always envied the classmates with the highest resolution graphics cards and monitors. The pinnacle of my life at one point was the unnamed best CRT ever made which got up to 1280x1024, but then LCDs happened.
My new pinnacle is a beautiful 32" IPS panel on a Benq 4k monitor. I don't care for the refresh rate jump from 60 to 120 as much as 30 to 60. But I absolutely insist on large panel area, and lots of pixels to fill it with, so I don't understand how the blog author can live with 27". This is basically programmer nirvana and I don't know what could make it better, maybe some kind of VR setup with similar PPI but I doubt it.
One key takeaway is that using external monitors connected to a Mac sucks.
My eyes are not so good and I like big text:
* On my Windows 10 laptop connected to a run of the mill 24" 1920x1080 Dell monitor, I put the scaling at 125% or 150% and everything is rendered bigger and sharp. Maybe one or two dinosaur apps are upscaled and blurry.
* On MacOS, the serious OS for graphics people, I can either render everything sharp at 1x (with tiny menu bars) or have everything rendered as a blurry mess.
On a side note... I'm liking my Mac less and less. The reasons to stick with it: no ads in my start menu or tracking in my calculator app like on Windows, and no need to install crappy third-party drivers to get peripherals to work.
> notebooks are not good for development. They are great in mobility and convenience, and this argument might outweigh everything else for some people. I accept that. But still, a desktop monitor + external keyboard are always better than a notebook.
This!
I can't understand how some folks can do dev work on their laptop display alone, without an external monitor. And then there's the trend of trying to set up the iPad Pro as the sole working environment, with an external keyboard and a trackpad. Watching those (YouTube stars) trying hard to make the setup work, looking down at a tiny screen, the angle alone! We have had much better setups for years; why give that up to work with an inferior one?
RE notebooks vs monitor+keyboard. I have a work macbook, a personal macbook, and a personal thinkpad. I keep work context on the work laptop, one personal project context (ui dev) on my macbook, and one personal project context (backend dev) on my thinkpad.
I really like that context switching is a physical experience, which feels much more intuitive, concrete and solid as in "it works". I don't have to open/close tabs, do cmd-tab or maybe ctrl-tab to find the right window, or switch to another digital workspace to switch to the other context.
Additionally, those devices being laptops allows me to use them almost anywhere on battery. With a sweater under them, I can get them into a comfortable position. Managing multiple tiled windows on one screen is overhead, so I am fine focusing on one window at a time, which means I can use a slightly larger font and don't have to squint or bend forward too much.
I do think it would be nice if I had slightly larger screens.
Indeed, and more screens is even better! Ideally I would never need alt-tab anymore. I'm still waiting for a VR workspace, screen everywhere, eyes are then focused on infinity, so that's also better.
Nice article. But for a post griping about illegibility of bad character rendering, why did they decide to make the background this egg-yolk yellow? I'd prefer a white background to that mess. Is there some reason why they chose yellow?
I'm typing this from a surface. And I have an iPad Air 3 too.
There are people who get eyestrain and headaches from using both devices, due to the font rendering system and the backlight flickering as it adjusts brightness.
I even installed something called Dithering Settings for Intel Graphics and bought/installed a program called Iris on my Surface Pro.
I searched this article for both "eyestrain" and "fatigue" with no results.
I think engineers and nerds should resolve problems regarding ergonomics and eyestrain.
I've felt cheated by the industry since my first Samsung SyncMaster 3, running at 60 Hz, with everyone saying it was safe and that the eye could see no more than 60 Hz (BS!!).
I wouldn’t get a 4K display since most of them are still only 60Hz. Yeah, you don’t need a higher refresh rate to program but it makes daily desktop use SO much more pleasant for me.
There are some high refresh rate 4K 27”s but those are pretty expensive.
I just wish there were more 5K displays out there. A few months ago I bought and returned a 4K display because I'm so used to 1440p that I missed the real estate I lost with 4K at 2x scaling.
At the moment I have to upscale the fonts on most websites (e.g. 130 % on HN). Fonts look very nice (although not as nice as on the 4K display) but most UI elements are very small. Fortunately that isn't a real problem since my workflow is keyboard-driven anyway, so I rarely have to click on any buttons.
In the end it's still better than the 27" 1080p display I have in my office where you could see every damn pixel. I really don't miss working there.
Based on the article, it looks like a poor man's decent option is a 1440p monitor at 60Hz that is running at 1:1 scaling.
Benefits:
* Cheap-ish
* Reasonable resolution with crisp text
* Small difference between 60 and 120Hz
Drawbacks:
* If you want crisp text and reasonable real estate, forget any other resolution than 1440p. It just won't align with the physical pixels and will look horrible.
I'd love to get a 4k monitor at some point, at least that would give me the scaling option in MacOS (1440p doesn't), but if I want really crisp text I have to render the equivalent of 1080, which isn't that much real estate for something like IntelliJ.
I have been using my current home PLP setup for 10 years. Dell U3011 (2560x1600) in middle and Dell 2007FP (1200x1600) on the sides. Off and on I have looked for high dpi replacements, but never found anything suitable.
At the beginning of the year, my U3011 started blanking out on me. As it became more frequent, I started shopping again. But wow, the monitor situation right now is sad. There are no high dpi monitors with enough vertical space. And since they're all at least as short as 16:9, they're unusable in portrait as well.
I ended up repairing the U3011. It's going to have to last me a while. There's just no upgrade path.
I adore my HP Z27. It’s 4K, usb-C, charges my 15” MBP and serves as a thunderbolt hub for other peripherals. The chrome or bezel around the screen is extremely minimal, and the only light that isn’t the backlight is a tiny little LED indicator dot that is very subtle and Apple-esque. From time to time I use it with an old PC and even RPi’s. Real intuitive mode switching and menus. Highly recommended along with a heavy duty monitor arm to keep it from shaking.
As much as I miss my 2012 MacBook Air and consider it the greatest computer I’ve ever owned... I definitely couldn’t go back to a non retina experience.
I agree with other commenters that monitors are subjective, but I find it surprising that FAANG companies are so stingy when it comes to monitors. We get top of the line desktops and laptops, but rather mediocre monitors.
I have an LG UltraFine 5k and a cheap BenQ 4k. The difference is night and day. I hate the BenQ* and will dump it as soon as I can.
My suggestion is to hold off until you can save up and buy one of the suggested displays when it's in your budget.
* The BenQ has a matte finish which effectively blurs everything, e.g. it's like it reduces the resolution by 10-20%. The colors are not accurate compared to my MacBook display, despite trying to color calibrate. The brightness is dimmer as well, or it loses contrast if I make it bright. The built-in speakers sound awful and the volume cannot be controlled in software. I could go on.
Interesting article, but I'll stick with my dual 144 hz 27" 1080p monitors for less than the cost of a single 4k 144 hz that I wouldn't be able to use for gaming without investing another $10k in my tower.
8K TVs are coming!!! I've been an enthusiastic user of 4K TVs as monitors for 40" TVs. They're only $200-$250 for a decent one! So much real estate.
4k at 40" is basically what the DPI of a 30" 2560x1600 was.
But a 80" monitor for 8k is ridiculous. So with 8K we can finally just pick the real estate you want and the DPI will be great.
It amuses me that the press always say "what will you watch on 8K!" ... this is just like 4K. The "content" on 4K isn't broadcast, streaming, or disc based. It's all generated content by game consoles and computer applications, and upscaling.
Another option for low DPI users is to use bitmap fonts instead of vector ones. I find blurriness of truetype fonts on low res displays to give greater visual fatigue than the jagged edges of misc-fixed or terminus.
Try experimenting with the font rendering settings in the prefs. There's a "middle" option of "greyscale-only" text aliasing in addition to the regular off or on settings.
Depending on your monitor, font choice, and personal preferences you might see an improvement by playing around with these.
Anecdotally, when I had an older MBP with a weak integrated GPU, lowering the anti-aliasing setting to greyscale (or off) seemed to increase the responsiveness and framerate on my 4K display in JetBrains IDEs. It was particularly noticeable when scrolling.
The only hi-dpi monitor I use with non-100% scaling is on my laptop, and to this day I still can't get over how bad raster images look when they're not shown 1:1 pixel-for-pixel (on web pages, mainly). It's just a blurry mess, integer scaling or not.
I'm aware this article is mainly about text rendering, just want to point out something I hate with hi-dpi + non 100% scaling.
Also, on Windows with 125%, I don't find text as blurry as the author showed on Mac. They look pretty crispy to me. I guess that "scaling twice" thing is a Mac only issue (at least for text)?
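Regarding the blurry raster images: here's a tiny illustration (mine, not from the article) of why non-integer scaling can't win for bitmaps. At 125%, mapping destination pixels back to source pixels has to duplicate some source columns and not others, so a pattern of 1px stripes comes out with uneven 1px and 2px stripes; the alternative is to interpolate, which turns the edges grey, i.e. blurry:

    # Illustrative only: a 1px black/white stripe pattern scaled to 125%.
    import math

    src = [i % 2 for i in range(8)]        # 0,1,0,1,... one stripe per pixel
    scale = 1.25
    dst_w = round(len(src) * scale)        # 10 destination pixels

    # Nearest-neighbour: each destination pixel maps back to one source pixel.
    nearest = [src[min(len(src) - 1, math.floor(x / scale))] for x in range(dst_w)]
    print("nearest:", nearest)             # some stripes end up 2px wide, others 1px

    # Linear interpolation: in-between samples become grey values (= blur).
    def linear(x):
        pos = min(x / scale, len(src) - 1.0)
        i = min(math.floor(pos), len(src) - 2)
        t = pos - i
        return round(src[i] * (1 - t) + src[i + 1] * t, 2)

    print("linear: ", [linear(x) for x in range(dst_w)])

Neither output looks like the original pattern, which is exactly the uneven-or-blurry choice you see on web page images at 125%.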
I have two monitors. A 27" Apple Cinema display from 2012 that I got used in 2015, and an LG Ultra HD (4K) that I picked up last year new.
Yes, the LG is newer, 4K, has a higher DPI, etc. But I often find myself putting my most important windows on the Apple Display. The color is just so much more beautiful and vibrant, the clarity and contrast are superior, it's just so much better.
Sure, I can't fit as much code on it as the LG, but boy is it pretty. Apple definitely puts quality in their monitors. It's a shame they left the consumer market in 2016.
I still use an old boxy CRT as my second monitor. It's running at a low resolution, even for a CRT, but it's primarily for Slack and video meetings. I code and web-browse on the main laptop screen using the OS X full-screen swipe mechanism.
I often reflect on how blurry and oddly colored the CRT is in comparison, and how back in the day I never noticed or cared.
It works fine for Slack, and I prefer bigger text for chat anyway. Because it is still working, I don't see a reason to put it in a landfill or have it "recycled".
I recently bought a Thunderbolt 3 monitor as what I have vowed is my 'Last Thunderbolt 3 purchase (except replacing broken things)'.
I'm holding out for Thunderbolt 4, when I can (hopefully) treat my wiring topology like USB instead of daisy chained all to hell and back. Because of this, I looked at good qualities in a secondary monitor that were survivable on a primary. For instance sound quality and real estate are less important, raising the priority of PPI, color gamut, and VESA mount.
Slightly off topic but I have the 16" Macbook Pro with maxed out graphics, and as soon as I connect my monitor, the fans start getting ready for lift-off (~5.6k RPM). It's a Samsung 4k 32" (DP 1.4 to usb-c).
I know this is a common issue, but has anyone else been able to resolve it yet? It drives me crazy that this powerful machine starts sounding like a jet just from connecting one monitor (Apple says it can support four 4k monitors).
The CPU and GPU don't look to be under any stress as such.
Anyone else get eye strain from retina MacBooks? Everything looks too sharp. The sharp contrast between the tiny pixels is visually perfect, but seemingly that perfection is what is causing my eye strain. Or perhaps it is some other effect.
It's just that when I use my retina MacBook for hours, my eyes hurt. In particular, it is a relief to go back to an older non-retina MacBook that I use for some minor projects. The non-retina MacBook seems less visually taxing on my eyes.
"This will make everything on the screen slightly bigger, leaving you (slightly!) less screen estate. This is expected. My opinion is, a notebook is a constrained environment by definition. Extra 15% won’t magically turn it into a huge comfortable desktop. But at least you can enjoy that gorgeous screen and pixel-crisp fonts. Otherwise, why would you buy a retina screen at all?"
LOL
Has the author considered that maybe some people prefer more screen space over "pixel-crisp fonts"?
I'm currently using a 27" external monitor at 2560x1440 ("QHD") and this is wide enough for five terminals side-by side at 83 columns each. (Monaco 10pt, anti-aliasing off, iterm2)
This presents itself as a treatise, but it very much is not. It totally ignores critical aspects that should influence monitor choice.
In decreasing importance (imo):
- cost
- contrast ratio / black level
- all aspects related to temporal feel that aren’t refresh rate (response/decay time, sample and hold, backlight strobing, etc.)
- color gamut
- color precision (nobody wants banding from 6 bpc + FRC)
You can’t honestly make recommendations to people for $1000 monitors without even addressing these points, even if you don’t think they’re important for your workflow. It is very poor form.
As a geezer with glasses, I find my laptop screen works better than a monitor. The laptop screen is down in the reading zone of my progressive lenses, yet I can still look up out the window. A big monitor requires me to use glasses without any distance vision, making me feel like I'm trapped in a fishbowl.
I also love the fact that my window layout is exactly the same whether I am working at my desk or a coffeeshop. Or at least when I used to work at coffeeshops...
It seems there's no shortage of gripes with Apple's display support.
I'm still incredibly frustrated with just how much of a mess multi-display docks are with MacOS and how it just doesn't support DisplayPort MST at all.
Nice writeup! I recently switched from a 32" gaming >4k to a 27" 4k on my macbook and it's incredible. Love the single thunderbolt cable and resolution <3
I'm still in love with the default sized bitmap font used by xterm, I'd never want to look at any other font if I had a choice (which I do for terminal programs)
For all of the space given to macOS "Font Smoothing", that setting really doesn't do much at all. I'm testing on macOS 10.15.5 and the setting seems to do nothing at all for web pages (viewed in Safari), or Terminal.app, or VSCode, or MacVim, or Xcode's editor. It does affect some of the UI chrome in Safari's toolbar, and Xcode's sidebar, but not the editor. So it really seems to have no effect at all on programming.
My 5-year-old 27" iMac 5K monitor is just fine, thank you. But lately I'm beginning to notice the effect of the brightness of modern displays on my eyes...
I want to get a 240hz ultrawide personally, both for games and instead of having 3 separate monitors (I hate the bezels between them). Samsung is making a good one, 49 inches, HDR 1000, 240hz, and adaptive sync [0]
I have an 8K monitor, but unfortunately it’s unusable with AMD graphics cards (amdgpu) under Linux. The NVIDIA proprietary drivers have worked like a champ.
I was used to HiDPI screens at home and at my previous gig. Now at work it's Linux and some cheap Dell 24"s: just zoom to something like 200%, until the text is huge. If you only fit 80 cols, like God intended, the type will be clear, you can keep your monitor further away, and you'll improve your posture and eye strain. Code might even get better.
I guess coders care little about typography, like everyone else, mostly out of laziness and lack of appreciation.
Great theory. I'd definitely love to have 4k 120hz displays, but until I'm not broke, I'll stick with my old low-res monitors I picked up for $20 apiece. They suck, but not everyone has the money for $500 panels, let alone the insanely-priced four-figure ones the author recommends. I'll probably just suffer through another decade of garbage text until they get cheap.
Yep. I've got my own little fleet of machines, mostly laptops with busted displays, that I run as servers for my own personal edification. Dirt cheap and useful. I've even got a pentium box that still works fine.
As far as I'm concerned, if I am going to buy a 4K monitor, I would probably go for a >40 inch one to be able to fit a lot of text on it rather than say a 32-inch Hi-DPI monitor. With subpixel rendering, fonts look good enough for me even on Lo-DPI displays. I'm just afraid that with Hi-DPI becoming more and more common, subpixel rendering will eventually disappear...
Yeah, but if you are going to use the same scaling for the UI at 24" as you would at 40", your eyes might start bleeding :)
...my Android smartphone also has the same resolution (actually even slightly higher) as the monitor I'm using to type this, but I wouldn't dream of using the same scaling for both.
I couldn't care less about resolution. I use three 1080p 27" Dell monitors and the only upgrade I did was moving to IPS, which slightly relieved my eye strain. Now if you can find me a monitor which will reduce eye strain, I'll just dump money on that. For someone who uses the screen for 10-12 hours on a normal day, headaches induced by the eyes are my biggest worry.
this all seems like a lot of hassle to go through for some slightly smoother font edges. i like high res and high refresh rates and all that other good stuff, but i'd rather have slightly uglier fonts and not have to do an arcane dance every time i plug in a screen - i'm pretty sure that erases any negligible productivity benefits you might get from a better display.
My current setup consists of three 24" 16:10 monitors.
I don't know why I should switch.
That is a total resolution of 5760x1200, which is plenty, with the added benefit that IDE, browser, terminal, mail, Slack and whatever else neatly snap into position and can be reached with the press of a button.
I don't see any reason to trade sharper text rendering for a worse aspect ratio.
16:9 just feels wrong to me; I am working, not sitting in a cinema.
I think a lot of people jumped straight to the comments here to discuss their personal opinion on ideal monitor setup.
This is a really well-written and deep article by someone who is clearly an expert. All points are well illustrated and I learned a lot about how my MacBook renders graphics from reading it.
And yeah at the end it gets into what monitor you should buy. But that wasn't the point.
We still have so far to go with screen resolution on desktops and laptops. While 200 dpi monitors are a huge improvement over the ridiculous 1080p or 1440p that most people still use, we won't hit the zone of diminishing returns until we get near 600 dpi. Yes, even at laptop viewing distance, 200 dpi vs 600 dpi is like night and day if you have good eyesight.
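Whether 600 dpi is really where returns diminish depends on viewing distance and on which acuity figure you trust. A rough check, assuming the common rule of thumb that 20/20 vision resolves about 60 pixels per degree (hyperacuity can pick up finer detail, which is exactly why people argue about this):

    # Rough angular-resolution check: pixels per degree at a given viewing distance.
    # Assumes ~60 px/deg corresponds to 20/20 acuity (a rule of thumb, not a hard limit).
    import math

    def pixels_per_degree(dpi, distance_inches):
        # pixels covered by one degree of visual angle at this distance
        return dpi * 2 * distance_inches * math.tan(math.radians(0.5))

    for dpi in (110, 200, 600):      # typical desktop monitor, "retina-class", hypothetical 600 dpi
        for dist in (20, 28):        # roughly laptop vs. desktop viewing distance, in inches
            ppd = pixels_per_degree(dpi, dist)
            verdict = "above" if ppd >= 60 else "below"
            print(f"{dpi:>3} dpi at {dist}\": {ppd:6.1f} px/deg ({verdict} ~60 px/deg)")

By that measure 200 dpi already clears the 20/20 threshold at normal distances, while 600 dpi lands well past it; whether the difference is "night and day" comes down to how much you trust the 60 px/deg figure.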
I concur with everything said, but it is still a bit unclear to me who this article is targeting. What is the takeaway? "Anyone who can afford a $1500 monitor should have one, so poor font designers don't have to break their backs manually hinting fonts anymore"? Isn't that a bit like solving poverty by moving to a better neighborhood?
I switched to a 34-inch 1440p ultrawide monitor after WFH began. My previous monitor was a 27-inch 4K. Since I put the monitor at the far edge of the desk, I don't notice a significant difference in text quality.
As I use both a work laptop and a personal desktop, switching to an ultrawide that supports a USB KVM has been a godsend. It removes all the clutter of switching between the PCs.
Or you can just use pixel fonts (which imo look better than subpixel-aa fonts on 4k anyway) and donate the difference in monitor costs to [optimal virtue signalling charity]. Gaming on a 4k monitor sucks also, I don't know what kind of supercomputers people are running but my meager 1080Ti can't peg any of the relevant games to 144Hz at 4k.
That's your problem right there. Most people, myself included, are perfectly fine at 60Hz since we don't do competitive FPS gaming. At that refresh rate, my 2070 is perfectly happy to render games in 4k and I really don't notice any significant difference compared to 120Hz. If > 60Hz is really important to you, it's going to be another generation of video cards before single-card gaming can handle it well (probably the 3000 series cards will be capable).
Offtopic, I really like how the twitter post has been embedded as a screenshot and not using twitter's scripts.
Even if that means that I can't click any link in a tweet; I probably wouldn't have noticed otherwise.
I prefer the UX (and especially: fast load times) of an image to the one of a script. Also, an image means fewer requests and fewer potential privacy issues.
The "120hz dance" reminds me of my "drag my 2 external monitors around in System Preferences every time I dock my MacBook, otherwise they switch places"-dance.
It's amazing that all these display bugs still exist when Apple has presumably invested so much into their $6000 Pro Display.
Dear Apple: Why not spend a little time making the software work, too?
Currently, I work with a 720p 12.5" display, and fixed point bitmap fonts. I'm moving like crazy these days, can only rely on laptops. And Retina Macs are quite expensive where I do live; plus, we are not allowed to import used goods from other countries, and importing a new Mac yields a customs tax that's very painful to pay.
I'm waiting for a 4K/32-inch/120Hz monitor without local dimming (because it's power hungry), but still no releases. Maybe it's because most manufacturers want to add HDR (10-bit color), but 4K/120Hz/10-bit exceeds DP 1.4 bandwidth, so you need to reduce chroma to 4:2:2 or use DP 2.0 or HDMI 2.1.
I really hope 120Hz becomes the standard for work monitors.
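For reference, the bandwidth arithmetic behind that (a sketch; the 5% blanking overhead is my assumption for reduced-blanking timings, exact figures vary a little but the conclusion doesn't): DP 1.4 HBR3 is 32.4 Gbit/s raw, about 25.9 Gbit/s after 8b/10b encoding. 4K 120Hz 10-bit RGB needs more than that, while 8-bit only just squeezes in:

    # Link-bandwidth sketch for 4K/120Hz over DisplayPort 1.4 (HBR3, 4 lanes).
    # The 5% blanking overhead is an assumption for reduced-blanking timings.
    DP14_EFFECTIVE_GBPS = 32.4 * 8 / 10      # 32.4 Gbit/s raw, 8b/10b encoding -> 25.92

    def required_gbps(width, height, hz, bits_per_channel, blanking_overhead=0.05):
        pixel_clock = width * height * hz * (1 + blanking_overhead)
        return pixel_clock * bits_per_channel * 3 / 1e9     # 3 channels (RGB, no subsampling)

    for bpc in (8, 10):
        need = required_gbps(3840, 2160, 120, bpc)
        verdict = "fits" if need <= DP14_EFFECTIVE_GBPS else "exceeds"
        print(f"4K 120Hz {bpc}-bit RGB: ~{need:.1f} Gbit/s -> {verdict} DP 1.4's ~{DP14_EFFECTIVE_GBPS:.1f} Gbit/s")

Which is why the current 4K/120Hz/10-bit options end up with DSC, 4:2:2 chroma, DP 2.0, or HDMI 2.1.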
This is all very interesting. I've recently, and very seriously considered getting a medium sized UHDTV (say, 40-50") to use as a display. The purpose of this would mostly be to render very high resolution photos while keeping more of the image visible at full resolution than is possible on a smaller screen.
Well timed article for me. I'm looking to buy an external monitor to go along with my older (mid 2015) 15" MBP.
I was searching hacker news and elsewhere for opinions last night. I got confused, frustrated and gave up. Today is no exception. The only thing I know is, I want a large hi-res monitor that is at least comparable to my laptop.
> It’s also possible to run a 4k display at native 3840×2160 pixels. It depends on the size of the display, of course, but in my experience, even 27” 4k displays are too small to run at 1×. The UI will be too tiny.
There is a really nice LG 43" monitor (not a TV, it uses displayport) which is optimal for using 1:1 pixel scaling.
For me, 1440p + a high refresh rate is the true sweet spot; the "affordable" monitor listed in this post is $900.
That's not a bad deal for a 4k high-refresh-rate monitor, but if you play any games you would need at least a 2080 or 2080 Ti, which is another $700-1200. 1440p high-refresh monitors go for around $300-400.
I'm reminded of the 90s, I landed a job using IRIX as my desktop. In the days of 72dpi monitors, the 144dpi 21:9 SGI display was a wonder! I think it also worked on the Mac, but there was no scaling like there is today, where there is a render pipeline and application pixels != screen pixels.
I inherited an old iMac with a 21" 4k monitor. There is simply no going back. I just ordered another one to have 2 of them. Everything looks better and even though almost no displays in a good price range have the same cd/m2 ("nits"), I am still looking forward to my new 2x4k life!
I hate the idea of scaling and blur, but I find I really like 6K-equivalent on my 27" 4K monitor at my desk (driven by a MBP below the monitor). And no, I don't have budget for a Pro Display XDR, as beautiful as it is. If I could get an IPS 6K panel for $1,500-2,500, I'd save my pennies.
> Well, the split does not exist anymore. Since not that long ago (yes, I’m too lazy to check) you can have both! You can have a 4k monitor that runs on 120 Hz. In fact, that discovery was the main motivation for this article.
With... good color gamut? Without ghosting? If so I want to buy that monitor today.
As a web developer, I have an Eizo EV2785 with 125% scaling and it has worked fine so far.
From a code maintenance perspective, I've noticed that if you feel there isn't enough space on your screen, it might be the right time to refactor and split things into smaller chunks: extract another view partial, class, etc.
For the past decade I’ve been holding out hope for a competitive e-ink monitor to use for coding, but it seems we’re still years away. Dasung and Onyx both offer (somewhat small) monitors but from what I can tell the reviews are less than stellar.
A great e-ink monitor - that would be my dream monitor.
>But even today you can peek into the future, if you have extra $4,000 to spare. This is Dell UP3218K, world’s first and only 8k monitor:
Note that the UP3218K isn't the brightest monitor in the world. I've read some reviews that claim you need a darkened room to see the full color gamut.
I have the same monitor as the author (Acer Nitro XV273K) and it's simply amazing with VSCode. It makes working from home a very pleasant experience, I'm not looking forward to going back to the office and a 1080p monitor... Not to mention how good games look at 4K 120 fps.
The author claims that high-resolution displays are a commodity now. That may be true, but they can be really expensive depending on where you live. In Brazil, most laptops are still (sadly) sold with 1366x768 displays. 1080p displays are already much more expensive, let alone 4K.
Eventually I want to go to 4k but price was a non-starter for me. I wanted 3 monitors and I was able to grab 3x2k (1440p) screens on black Friday last year for around $200/ea. I have them arranged in a "Tie-fighter" orientation (1 horizontal in the middle and 2 vertical on the sides) with an older 1080p screen above the center monitor (it's just for security cams/monitoring).
I split the 2 vertical monitors into 3rds (top, middle, bottom) and I have keyboard shortcuts to move/resize windows as well as snap every window to its designated space. I have been extremely happy with this setup so far.
I use a 32" monitor at 4K with 1x scaling. It is a productivity booster for me. More horizontal space is just much nicer to host debug sessions and jump between code sections. At the same time (of debugging), I can tile web browser to the other side and either do some lookups, or run a Jupyter notebook.
It has been years since I gamed regularly, but my brother and I were probably some of the best at a particular FPS. That has to be 15+ years ago. When we played in the physical presence of others, we noticed that we were some of the only ones to turn the display settings way down while keeping resolution at maximum. Probably 800x600 or 1280x1024 in those days. The game ran smoother, and the less complex textures made movement stand out.
People play CS: Global Offensive because it's smoother and has less complex textures compared to other modern games (e.g. Overwatch).
I'd dare say that the "serious" FPS players stick to Counterstrike, and other easier-to-render games with 200+ FPS. (even if the monitor doesn't support it, the higher FPS results in smoother gameplay and fewer hiccups).
There's of course a huge casual crowd playing Overwatch, PUBG and Fortnite. But CS tournaments remain a thing to this day.
No, but in 2002/3 we would play Quake during our CCNA classes every now and again instead of class work. The hardware was old so it was the original quake.
I would strongly advocate for a 1440p 34" 21:9 ultra widescreen monitor.
The 21:9 aspect ratio is perfect for having the standard IDE + chrome tab arrangement. It doesn't have the issues brought in by having multiple monitors and 1440p is a good middle-ground resolution.
The most important thing here: does your new monitor look good for you? Do you like how it renders graphics/text/whatever-you-use-it-for? If so, please disregard articles like TFA. They are trying to convince you of something you, by definition, don't need.
Yes, the monitor looks fine. It is connected to an old Dell Optiplex 390 with an i3-2100 and 8 GB of memory. I don't watch movies or play video games on it or anything. It is basically a glorified terminal for me to citrix/remote desktop to work.
I've found my off-brand (Aukey) "blue" mechanical keyboard has been a great improvement in my quality of life though (even though it is not very ergonomic).
It is funny how, when I started using Visual Studio around 2008, I didn't have a 1920x1080 monitor and I wanted to see all the panels, and it was so painful.
Right now, my biggest pain point is my horrible Internet (Wi-Fi) connection. It is especially painful because I move my mouse or type something and nothing appears on the display and I don't know whether the remote computer is slow or my network connection is crapping out again.
Almost feels like I am whining about a non-issue because even fifteen years ago, I was on a dial-up "soft" modem and it would have been unthinkable to get pretty much live full 1080p remote desktop.
Personally I can’t go back to a 60hz monitor at this point. So really can’t go to 4K until there is a reasonably priced 144hz monitor with low input lag and response times. I’m okay sacrificing color for speed (TN panels are okay with me).
I turn off any font/scroll smoothing and animation whenever possible. I use a good bitmap font for coding. It's readable and snappy. No hinting BS. There's no need for animation in Emacs or tmux after all.
Is it just me, or were CRT monitors much more soothing and softer on the eyes? I find LCD monitors to have a very synthetic contrast, brightness and colors, which I find harsh on my eyes.
You can pry my cheap A- panel Korean no-name brand 8+ year old monitor with one working button (thankfully the power button!) and a bunch of stuck pixels from my cold, dead, miserly hands.
I just recently went from 2 4k monitors to 3 2k 144Hz monitors. I'd argue that as long as the panel quality is decent, 144Hz is a much better quality-of-life improvement than going from 2k to 4k.
Those are the ugliest monitors I've ever seen. I thought all monitors were basically boring rectangles. No, apparently there are monitors with weird designs on them and shrouds.
For coding I just use bitmap fonts, and I don't have any blurring issues this blog describes. I can do that even on old IBM T41's screen, and text still looks just fine.
I use a 2k Dell monitor which I picked up for around $400 a few years ago. To me, this strikes a nice middle ground between high resolution and affordability/comparability.
Opinions or preferences aside, I thought this was very well written! Enjoyable to read, and I learned a lot about pixels, fonts, screens et-cetera that I did not know before!
More than resolution, it is aspect ratio that I want. I love my 1920x1200 monitor. I wouldn't trade it for a 2560×1440. The 16:9 ratio is simply too short.
I think the creator of this site should revisit its dark mode feature, because when I enable it I get black on black, which makes the text hard to read.
2k (2560 x 1440) seems to be ideal for 27 in. monitors. Does anyone really find 2k insufficient on a 27 in.? And I find 1080p fine on a 24 in. monitor.
I actually "downgraded" from a 43" 4K to a 30" (2560 x 1600). Trying to focus strained my eyes, and the brightness from the big monitor meant I had to position it as far back as I could on my desk, which made it even harder to focus and caused even more eye strain. I also realized a 16:10 ratio is more friendly for coding.
Frankly, 4K/5K monitors seem like a gimmick for most people. Especially puzzling is why you would pack so many pixels into smaller (~27") monitors and require more power and graphics muscle for imperceptibly "better" images.
It’s the default under both Windows 10 and Gnome 3 for hi-dpi displays. In fact, it’s near impossible to get fractional dpi working across the board on Gnome 3.
I'm not sure how, but GTK3 applications violate the layers and are not affected by the nVidia setting. I think it's because nVidia sets a fake Xorg resolution (e.g. twice your actual) but GTK3 sees through that and uses its own internal dpi setting. I also presume this would be the case regardless of whether Gnome 3 is your DE/WM or not.
i have a 4k 27" paired with a thinkpad. On ubuntu with gnome there is a huge performance dip on x11 with fractional scaling turned on. So i use this at 1x with font scaling of 1.25x.
This person recommends 3 monitors, NONE of which appear to offer adjustments that are critical for ergonomic health. I don't find this perspective trustworthy.
I'm not going to sacrifice my spine for slightly fancier characters.
Higher resolution (to an extent) allows me to see more stuff.
4K on a 27" doesn't work for my workflow.
My workflow on a laptop is to put two windows side by side so that I can compare things/read documents/code against documentation.
Unfortunately most websites nowadays are designed for widths of 1280px, and anything less than that is sometimes treated as mobile or gets a terrible responsive layout. It frustrates me endlessly.
On a 16" MBPr I can get 2048x1280, which allows me to put up two windows side by side at 1024px, which does work for most sites, but sometimes there are some sites which are just broken.
A big reason why I need larger windows is when doing side by side PR Reviews on GitHub, 1024px can only show 56 characters, 1280px can only show 78 characters. Personal opinion, GitHub has too much white space. I tend to have another window open to compare against documentation/etc.
I'm comfortable is 27" 2560x1440. Two windows side by side at 1280px. Pixel pitch is 0.2331mm
Using 4k at 27" is too small for me. Pixel pitch is 0.1554mm . Without the correct dongle/cable MacBooks can't even push at 4k 60hz via HDMI. Tip: use DisplayPort.
I've settled on 38" 3840x1600 because it allows me to setup three windows side by side at 1280. Which works great for my workflow.
Which sometime tends to be:
- 1 window for source material
- 1 window for my main focus
- 1 window for a comparison material
38" 3840x1600 pixel pitch is 0.229mm.
I settled on the Dell U3818DW because it can charge my MBP via USB-C. My only recommendation: if your laptop is asleep, don't leave it plugged in, because the monitor likes to wake it.
The 1600 also gives me some extra room vertically.
If you can afford it, I'd recommend giving 3840x1600 a try.
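Those pixel-pitch figures are easy to sanity-check, by the way; it's pure geometry, nothing monitor-specific (the quoted 0.229mm matches a 37.5" diagonal, which is what the "38-inch" ultrawides actually are, if memory serves):

    # Pixel pitch / PPI from resolution and diagonal (pure geometry).
    import math

    def ppi(width_px, height_px, diagonal_inches):
        return math.hypot(width_px, height_px) / diagonal_inches

    def pitch_mm(width_px, height_px, diagonal_inches):
        return 25.4 / ppi(width_px, height_px, diagonal_inches)

    for name, w, h, diag in [
        ('27" 2560x1440',        2560, 1440, 27.0),
        ('27" 3840x2160',        3840, 2160, 27.0),
        ('"38-inch" 3840x1600',  3840, 1600, 37.5),   # marketed as 38", panel is 37.5"
    ]:
        print(f"{name}: {ppi(w, h, diag):5.1f} ppi, {pitch_mm(w, h, diag):.4f} mm pitch")

The output lands within a fraction of a percent of the numbers quoted above; the tiny differences come from nominal vs. exact panel diagonals.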
As an early adopter to monitor technology, I have to say that I consistently regret it.
I got one of the first 4k mainstream monitors. I paid $3000 for it. This was back in the day when DisplayPort didn't really "do" 4k, so it was done by pretending it was two monitors internally. This broke EVERYTHING. For years, I struggled with Linux trying to treat two monitors as one big one (and putting new windows right on the border). Welp, they fixed that. But trying to treat two monitors as though it's one was completely impossible. I eventually got it to work by enabling a bunch of random features in the nVidia driver, that when enabled together triggered a bug that broke Xrandr, so everything thought I just had one monitor. (I could not, of course, add a second monitor.) Miraculously, they never fixed that bug. It worked for half a decade at least. (At some point in there I switched to Windows, which of course supported it perfectly because the driver was specifically hacked to detect that model number and do extra stuff.)
Several years later, I wanted to get a monitor that supported more colors than sRGB. Big mistake! While inexpensive, I learned that NOTHING supports color spaces correctly. The Adobe apps do, but that's about it. Online image sharing services go out of their way to MODIFY the color space that you tag an image with, so there is no hope of anything ever showing the right colors unless you manually clip them to sRGB. Things like the Win32 API, CSS, etc. have no way to say what color space a color is encoded in, so there is no way to make the operating system display the right color. ("background-color: #abcdef" just means "set the color on the user's display to #abcdef", which is a completely meaningless thing to do unless your working colorspace is sRGB, and the user's monitor works in sRGB. It worked for years, but was never correct.) The worst thing is, nobody appears to care. ("It just makes colors more vibrant!" they'll tell you) Big mistake. Do not buy unless you never want to see a color as the author intended ever again. (I solved my photography colorspace problem by switching to black and white film. Take that, colors! You can't display them incorrectly if there aren't any!)
The next thing I jumped on is high refresh rate. I waited until 144Hz IPS panels were affordable, and got one. It sure is better than 60Hz, which looks like a slideshow, but there are of course problems. The first is... it is pretty optimistic to think that an IPS panel will actually update at 144Hz. They do not. The result is blur. I run mine at 120Hz with ULMB (which basically strobes the backlight at the display update rate). That looks really good. There are some artifacts caused by the IPS display, and 120Hz is noticeably slow, but moving things sure are clear. You can pan a google map and read the labels as it moves. Try that right now on your 60Hz display, you can't do it!
But because of IPS, at 144Hz without ULMB, you get a smooth mush. At 120Hz with ULMB, you can read moving content (like player names attached to people in an FPS, it is trippy the first time you use it). Having said that, it's bad for anything that doesn't render at 120Hz. Web browsers, games, CAD... great! Videos... AWFUL, just awful. On a 30/24Hz video, frames get strobed 4 or 5 times, and this causes your brain to think "hey, a slide show". (You can record the display with a high speed camera, and it will look completely different from what you see in real life. Darn brain, always messing things up.) Things like pans skip and jerk, as your brain tries to interpret the video stream as a series of <image 1> <black screen> <image 1> <black screen> <image 2> <black screen> ... instead of a smooth blend of <image 1> <image 2> ... You can post-process the video to "invent" frames in the middle, so your monitor displays new image data each time it strobes the backlight. I do this with mpv and it looks great. But if you watch video in a browser, you are out of luck.
My TL;DR here is that buying any sort of fancy monitor is just going to make you very unhappy. You will learn everything in the world there is to know about color space math, pixel transition times, using high speed cameras to debug issues (what a time sink), how your brain processes moving images, etc. It won't make you any happier. It won't make you better at programming.
If you play competitive games, get a TN 1080p 240Hz monitor, simply because that's what everyone else uses. Don't use it for anything except the game, because every second that you use it it will make you unhappy. But it's absolutely a joy to play a game on it. (Why 1080p and not 1440p? Guess who bought a 1440p 165Hz monitor. Not anyone that has ever contributed code to the game or played the game at a professional level. But I did! Guess who gets to live with the bugs.)
If you are a programmer, just buy whatever. Every single monitor ever designed will make you unhappy.
If you are a programmer who works with color, get yourself a good therapist. You will be meeting with them on a daily basis, and even then, you'll still be scarred for life. It's all about damage control at this point.
> Text can’t be made look good on low-resolution displays.
BS. Maybe true on user-hostile OSX/Windows. But if LCD manufacturers provided the correct information in the EDID data, it would be trivial.
On Linux, with a little trial and error (you only have to do it once per monitor, ever), you can fine-tune the subpixel hinting. I use a photographer's loupe (magnifying glass) to look at a white region on my ancient LCDs to see the subpixel configuration, set it in my X config, and get perfectly antialiased text just fine. ...well, except in some GTK2 applications :) But if you work more than a few minutes in those you have other problems.
My take on having used quite a variety of displays: 1440p with a high (120, 144 etc) refresh rate is the way to go.
With 4K you can lose a lot of performance, especially since such displays are often paired with integrated graphics. Also, what's the point of a pixel density higher than you can resolve? If you have to scale things up, that's a sign you get nothing more from increasing the resolution. And despite claims to the contrary, a lot of software is still not well suited to 4k resolutions.
High refresh rates are completely underrated: they get dismissed as a meme, yet there's also a widespread false belief that human eyes have some kind of limit that is already satisfied by 60Hz displays. Scrolling code at 144Hz is smooth.
Bonus point that isn't mentioned in the article: HDR is a meme, don't buy into that crap.
Also: CRTs are criminally underrated and still unfairly judged. We lost something from that era. Colors, black levels, subjective image quality and input lag have still not recovered from the peak of display technology in the year 2000 or so.
>Bonus point that isn't mentioned in the article: HDR is a meme, don't buy into that crap.
I'd wager more people are going to notice the difference between HDR and SDR than they would 4k vs 1080p. It is by far the biggest picture quality upgrade I've seen since going from SD to HD.
The problem is HDR sucks on LCDs, so it's largely useless on PC. FALD isn't enough to limit the halos, even with a ton of zones. I own the PG27UQ mentioned in this article and I hate it.
I'll probably just buy an LG CX 48" for my next monitor.
I mean sure HDR is impressive, but for me it's always like 3D movies.. nice for 5 minutes and then you see the flaws. Dimming zones are a really crappy way to get around the fundamental display tech limitation like you said. But OLED has insane problems with burn-in. They officially don't exist but there are endless reports of this happening. Although I can't speak with high confidence about that. In general the major problem with non-OLED displays is the black levels. Going for HDR was a silly PR move.
> I mean sure HDR is impressive, but for me it's always like 3D movies.. nice for 5 minutes and then you see the flaws
I mean, it basically "just works" on OLEDs. I've watched thousands of hours of HDR content now and maybe 2-3 hours of it was poorly done and distracting. vs. 3D, which I can never sit through at home. (Though, I don't mind well done 3D in the theater)
>But OLED has insane problems with burn-in. They officially don't exist but there are endless reports of this happening. Although I can't speak with high confidence about that.
I've got two OLED TVs, one four years old, the other two. Neither has had any noticeable burn-in, though the older one does sometimes have temporary image retention if I've spent an extended amount of time playing a game, or watching a channel with a static logo, etc.
I am not one of those people that sits there with MSNBC or CNN or Fox or whatever on all day every day with the logo in the same place, though. My OLED usage goes Movies > TV shows > Twitch > Youtube > everything else.
Until a better tech comes out, I'm all in for OLED, even for computer monitors. You won't see me purchase another standalone LCD screen for anything that I'll be caring about picture quality on.
My productivity has skyrocketed after our company moved remote, I think in large part because my home monitor is good and my office monitor was garbage. Staring at blurry text all day messes with your brain a lot more than you realize.
If you're using a recent Ubuntu with a 4K display, the best way I have found to make things readable without making them huge is by enabling "Large text" under Universal Access. This sort of mimics the old Unity scaling, except you have to set your Chrome to 150% and your console font size separately now. Works for me on both Lenovo Carbon X1 and 32" Z32 desktop monitor.