In future, your OS will be an agentic LLM which runs software by YOLOing the binaries, and then continuously fixing and refining the environment until it runs without crashing.
Can confirm. I use a Dell 6K 32", and it's frankly amazing. I still use an older Dell 4K 24" (rotated 90°) off to one side for email/slack/music but I just use the single 32" for ~90% of what I do.
Yeah sure, as long as you have a lot of resources for testing widely.
Still, if you were to make an analogy, you should target a few devices that represent the "average", just as it's done for (most) pop music production.
Sure, but Macs are around 10% of general desktop computing. To a first approximation, they don't count. User communities vary widely. If you target Macs, then a high DPI screen is a must for testing. Otherwise, I dunno; ~100 DPI screens are way less expensive than ~200 DPI screens, so I'd expect the installed base is significantly higher for standard DPI. But there are probably enough high DPI users that it's worth giving it a look.
To address a question elsewhere: personally, I don't see the benefit of pushing 4x the pixels when ~100 DPI works fine for me. My eyes aren't what they were 20 years ago, and it's just extra expense at every level.
I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from - my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.
One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world, and >2K displays have been cheap on desktop for a really long time.
I believe there are a lot of people using 1080p monitors because they bought them a while ago and they're still working fine. There are also a lot of lower-end 1080p monitors still being sold today.
> One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world
I personally see a lot of 1080p screens on new gaming laptops too. Lots of people get those for work from what I see with my peers. When I sold my RTX 3060 laptop with a 1080p screen, most buyers wanted it for professional work, according to them.
> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from
If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older Thinkpad. :)
In part, though, that's not because all those users can't afford >1080p (some of them can); it's that insanely high refresh rate monitors are the draw: esports players often run 1080p at >300Hz, and even the ones without such monitors still use 1080p because driving up the frame rate drives down the input latency.
Whether it matters is a bigger question: going from 30 to 60Hz I notice a huge difference; from 60 to 144Hz at 4K I can't tell, but I'm old and don't play esports games.
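For scale, a quick sketch of the frame period arithmetic (frame time is only one piece of input latency, so treat this as illustrative, not a full latency model):

```python
# Frame period at common refresh rates. Going 30 -> 60 Hz saves ~16.7 ms per
# frame, while 144 -> 300 Hz only saves ~3.6 ms, which is part of why the jump
# is obvious at the low end and hard to notice at the high end.
for hz in (30, 60, 144, 240, 300):
    print(f"{hz:>3} Hz -> {1000 / hz:5.1f} ms per frame")
```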
I don't think this is contrary to my original point. Nearly 50% of all users are running at greater-than-1080p resolutions, and presumably power users are overrepresented in that group (and certainly, it's not just the ~2.5% of Mac users pushing the average up).
FWIW, I didn't mean to reply to you in an argumentative way. Just proposing an answer to this:
> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from
I still see 1080p fairly often on new setups/laptops, basically, although 1440p and 4K are becoming more common on higher-end desktops. Then again, 1440p at 27" or 32" isn't really high dpi.
If you have 20/20 vision, a 27" display at 1440p (~110 DPI) has a visual acuity distance of 79cm; i.e. at 79cm or further from the screen, the eye can't resolve any extra detail from a higher resolution. High refresh rate 1440p IPS screens are very widely available at good prices, so it isn't that crazy that people choose them.
Phone and laptop have higher DPI screens of course, but I'm not close enough to my desktop monitor for a higher DPI to matter.
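For reference, here's a minimal sketch of the 1-arcminute model behind that 79cm figure (the assumptions baked in are a 20/20 eye resolving about one arcminute, and a 2560x1440 panel at 27" as above):

```python
import math

def acuity_distance_cm(diag_inches, width_px, height_px, arcmin=1.0):
    """Distance beyond which an eye resolving `arcmin` arcminutes can no
    longer separate adjacent pixels."""
    ppi = math.hypot(width_px, height_px) / diag_inches   # pixels per inch
    pitch_cm = 2.54 / ppi                                 # one pixel, in cm
    return pitch_cm / math.tan(math.radians(arcmin / 60.0))

print(acuity_distance_cm(27, 2560, 1440))  # ~80 cm for a 27-inch 1440p panel
```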
That's a common misconception. That acuity calculation is based on knowing nothing about the image; imagine applying it to arbitrary noise. When you have things like lines and edges your eyes can pick out differences an order of magnitude finer.
It's always been a weird topic: the science (appears to, due to the aforementioned misconception) says one thing, and yet I have eyes and see a difference that the science says I shouldn't see.
Have you tested it in practice? High-DPI monitors make a very noticeable difference for text and user interface. That's the truth, even if the theory doesn't agree.
I can’t tell you how often I see this. Brand new designs or logos in 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.
Don't a bunch of the newer tools that wrap Virtualization.framework (which itself wraps/builds on the lower-level Hypervisor.framework) already support this?
I wonder about the same thing. I've come to the conclusion that it's driven a lot by the management-ideal definition of DevOps: developers who end up doing ops without sufficient knowledge or experience to do it well.
What company with enough infrastructure to warrant an IT department is only using certificates on its web servers, and thus doesn't have a standard tool for issuing/renewing/deploying certificates for *all* services that need them?
Given that Caddy has a history that includes choices like "refuse to start if LE cannot be contacted while a valid certificate exists on disk", I'm pretty happy to keep my certificate issuance separate from a web server.
I need a tool to issue certs for a bunch of other services anyway; I don't really see how it became such a thing for people to want it embedded in their web server.
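For what it's worth, a minimal sketch of what keeping issuance outside the web server can look like, assuming certbot manages the certs and example.com stands in for a real host; the same check-and-renew loop covers every service, not just the web server:

```python
import ssl, socket, subprocess, time

def days_until_expiry(host, port=443):
    """Connect over TLS and report how many days the served cert has left."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            not_after = tls.getpeercert()["notAfter"]
    return (ssl.cert_time_to_seconds(not_after) - time.time()) / 86400

if days_until_expiry("example.com") < 30:
    # certbot's deploy hooks (configured separately) reload whatever
    # services actually consume the renewed cert
    subprocess.run(["certbot", "renew", "--quiet"], check=True)
```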
As we repeat every time this comes up, this was literally 8 years ago when the project was in its infancy and the project author was in the middle of exams, and it has not been true since. Caddy has been rewritten from the ground up since then, and comparing it to those old versions is dishonest.
Top effort dispelling the claim that you make poor decisions mate.
Someone references when you made an ass-backwards decision and insisted you were correct; your immediate response is not any kind of explanation of how you learnt to trust other people's opinions, or even an acknowledgement that you got it wrong. Instead you resort to petty, childlike attempts at insult.
Not the parent commenter, but one major difference I see is that they provide an API, making it possible to integrate from code (e.g. I've seen some "View Helper" type libraries that integrate with it to allow inserting icons programmatically).
I mean, he's also the same guy who apparently thought "Unix ideas that have worked for literally decades, nah fuck that. I know better".
It took over a decade before the project made some improvements to how the default install path is handled.
To my knowledge it still has absolutely atrocious dependency resolution relative to things like DPKG.
Not hiring this guy is honestly like a fancy restaurant not hiring the guy who comes up with the new McDonalds obesity burger special menu. What he created is popular, but it's not good.
Google is not a fancy restaurant. A five-guys private consultancy is a fancy restaurant. Google is the McDonalds of all McDonaldses: it makes software that is used by everybody, whether they want it or not, and you can't turn a corner without hitting something they control.
I'm just starting for the day and misread that as "...used LLM to generate...", and I wondered what kind of crack you were smoking.