stephenr's comments | Hacker News

> copied all the Darwin libraries from the Darling project and used LLVM to generate all the appropriate dylibs

I'm just starting for the day and misread that as "...used LLM to generate...", and I wondered what kind of crack you were smoking.


In future, your OS will be an agentic LLM which runs software by YOLOing the binaries, and then continuously fixing and refining the environment until it runs without crashing.

Can confirm. I use a Dell 6K 32", and it's frankly amazing. I still use an older Dell 4K 24" (rotated 90°) off to one side for email/slack/music but I just use the single 32" for ~90% of what I do.


Conversely if you only use a ~110 DPI display you won't know how bad it looks on a ~220 DPI display.

The solution here is wide device testing, not artificially limiting individual developers to the lowest common denominator of shitty displays.


Yeah sure, as long as you have a lot of resources for testing widely.

Still, if you were to make an analogy, you should target a few devices that represent the "average", just as is done for (most) pop music production.


> if you were to make an analogy, you should target a few devices that represent the "average"

For Macs, 220DPI absolutely is the average.


Sure, but Macs are around 10% of general desktop computing. To a first approximation, they don't count. User communities vary widely. If you target Macs, then a high DPI screen is a must for testing. Otherwise, I dunno; ~ 100 DPI screens are way less expensive than ~ 200 DPI screens, so I'd expect that the installed base is significantly higher for standard DPI. But there are probably enough high DPI users that it's worth giving it a look.

To address a question elsewhere, personally, I don't see the benefit to pushing 4x the pixels when ~ 100 DPI works fine for me. My eyes aren't what they were 20 years ago, and it's just extra expense at every level.
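
Where the "4x the pixels" figure comes from: doubling the linear DPI quadruples the pixel count. A rough back-of-the-envelope sketch, assuming a 27" 16:9 panel at ~110 DPI vs ~220 DPI (the specific size is just for illustration):

    import math

    def pixels(diag_in, dpi, aspect=(16, 9)):
        # Convert a diagonal size plus DPI into an approximate pixel count.
        w, h = aspect
        width_in = diag_in * w / math.hypot(w, h)
        height_in = diag_in * h / math.hypot(w, h)
        return round(width_in * dpi) * round(height_in * dpi)

    low, high = pixels(27, 110), pixels(27, 220)
    print(low, high, high / low)  # the ratio comes out at almost exactly 4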


I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from - my former employers equipped all the software engineers with dual-4K displays nearly a decade ago.

One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world, and >2K displays have been cheap on desktop for a really long time.


I believe there are a lot of people using 1080p monitors because they bought them a while ago and they're still working fine. There are also a lot of lower-end 1080p monitors still being sold today.

> One is hard-put to buy a developer-power laptop with a sub-2K display these days, even in the Windows world

I personally see a lot of 1080p screens on new gaming laptops too. Lots of people get those for work from what I see with my peers. When I sold my RTX 3060 laptop with a 1080p screen, most buyers wanted it for professional work, according to them.

> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older Thinkpad. :)


If you look at the Steam hardware survey, most users (as in, > 50%) are still using 1080p or below.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Softw...


In part, though, that's not because all those users can't afford >1080p - some of them can. It's that insanely high refresh rate monitors exist and esports players often run 1080p at >300Hz, and even the ones without such monitors still use 1080p because driving up the frame rate drives down the input latency.

Whether it matters is a bigger question: from 30 to 60Hz I notice a huge difference; from 60 to 144Hz at 4K I can't tell, but I'm old and don't play esports games.


I don't think this is contrary to my original point. Nearly 50% of all users are running at greater-than-1080p resolutions, and presumably power users are overrepresented in the latter category (and certainly, it's not just the ~2.5% of Mac users pushing the average up).


FWIW, I didn't mean to reply to you in an argumentative way. Just proposing an answer to this:

> I'm honestly not sure where all these hackernews commenters with low-dpi displays are coming from

I still see 1080p fairly often on new setups/laptops, basically, although 1440p and 4K are becoming more common on higher-end desktops. Then again, 1440p at 27" or 32" isn't really high dpi.


Writing this on 1280x1024 because it still works fine

The 5:4 aspect ratio is weird, especially in this era of everything 16:9, but it's a second monitor so usually only has one thing open


If you have 20/20 vision, a 27" display at 1440p (~110 DPI) has a visual acuity distance of 79cm - i.e., if you are sat 79cm or further away from the screen, you are not capable of resolving any extra detail from a higher resolution. High refresh rate 1440p IPS screens are very widely available at good prices, so it isn't that crazy that people choose them.

Phone and laptop have higher DPI screens of course, but I'm not close enough to my desktop monitor for a higher DPI to matter.
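
A rough sketch of where that 79cm figure comes from, assuming a 27" 16:9 panel at 2560x1440 and the usual 20/20 benchmark of one arcminute per pixel (different rounding nudges the number slightly):

    import math

    diag_in = 27.0
    px_w, px_h = 2560, 1440

    # Horizontal panel size and pixel pitch (~0.233 mm for a 27" 1440p panel).
    width_in = diag_in * px_w / math.hypot(px_w, px_h)
    pitch_mm = width_in * 25.4 / px_w

    # Distance at which one pixel subtends one arcminute of visual angle.
    arcmin = math.radians(1 / 60)
    distance_cm = pitch_mm / math.tan(arcmin) / 10
    print(f"pitch: {pitch_mm:.3f} mm, acuity distance: {distance_cm:.0f} cm")  # ~80 cm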


That's a common misconception. That acuity calculation is based on knowing nothing about the image; imagine applying it to arbitrary noise. When you have things like lines and edges your eyes can pick out differences an order of magnitude finer.

https://en.wikipedia.org/wiki/Hyperacuity


Thanks for pointing this out.

It's always been a weird topic: the science (or what appears to be the science, due to the aforementioned misconception) says one thing, and yet I have eyes and see a difference that the science says I shouldn't see.


Have you tested it in practice? High-DPI monitors make a very noticeable difference for text and user interface. That's the truth, even if the theory doesn't agree.


I'm running a 32" display at 4k, which works out to about the same at 79cm. Apparently a bunch of people sit really close to their monitors :)


Absolutely everyone in my company uses 1080p monitors unless they got their own. That’s just “normal”.

It’s horrible.


Retina still isn't available for large monitors like 38" and above.


Retina is available for only a handful of 5K 27" monitors, most of which aren't great, and all of which are only 60Hz.

It's really hard to buy one given how expensive / badly specced they are compared to 4K monitors, even as someone who values the vertical pixels.


I can’t tell you how often I see this. Brand new designs or logos in 2024 or 2025 that look abysmal on a retina monitor because no one bothered to check.

Stands out like a sore thumb.


Don't a bunch of the newer tools that wrap Virtualization.framework (which itself wraps/builds on the lower-level Hypervisor.framework) already support this?

There's even an example project to do this in code: https://developer.apple.com/documentation/virtualization/run...


> It was common in the 00s in Britain, maybe still is, to serve pasta as a bowl of plain, dry boiled spaghetti with sauce poured on top.

Friends (the US show) had a scene where a supposed CHEF did this when cooking for her parents, in the mid-late 90s.


I wonder about the same thing. I've come to the conclusion that it's driven a lot by the management-ideal definition of DevOps: developers who end up doing ops without sufficient knowledge or experience to do it well.


What company that has enough infrastructure to warrant an IT department is also only using certificates on their web servers, and thus doesn't have a standard tool for issuing/renewing/deploying certificates for *all* services that need them?


Given that Caddy has a history that includes choices like "refuse to start if LE cannot be contacted while a valid certificate exists on disk" I'm pretty happy to keep my certificate issuance separate from a web server.

I need a tool to issue certs for a bunch of other services anyway, I don't really see how it became such a thing for people to want it embedded in their web server.


As we repeat every time this comes up, this was literally 8 years ago when the project was in its infancy and the project author was in the middle of exams, and it has not been true since. Caddy has been rewritten from the ground up since then, and comparing it to those old versions is dishonest.


The concern isn't that the same code exists, or even that it has odd unintended behaviour.

The concern is that the author failed to understand why his batshit-crazy intended behaviour was a bad design from the start.


So you've never made mistakes in your life? Do you think children are irredeemable if they get a B on their tests in school? What a ridiculous take.


Making a mistake is generally considered "acceptable" if you learn from it and acknowledge the mistake.

The author did neither - he was steadfast that his approach was correct, and everyone else was wrong.


I remember you. You're just grumpy because you didn't think of it first. ;)


Top effort dispelling the claim that you make poor decisions, mate.

Someone references a time you made an ass-backwards decision and insisted you were correct, and your immediate response is not any kind of explanation of how you learnt to trust other people's opinions, or even an acknowledgement that you got it wrong - you resort to petty, childlike attempts at insult.


Not the parent commenter, but one major difference I see is that they provide an API, making it possible to integrate from code (i.e. I've seen some "View Helper" type libraries that will integrate, to allow inserting icons programmatically).


I mean, he's also the same guy who apparently thought "Unix ideas that have worked for literally decades, nah fuck that. I know better".

It took over a decade before the project made some improvements to how the default install path is handled.

To my knowledge it still has absolutely atrocious dependency resolution relative to things like DPKG.

Not hiring this guy is honestly like a fancy restaurant not hiring the guy who comes up with the new McDonald's obesity-burger special menu. What he created is popular; it's not good.


Google is not a fancy restaurant. A five-person private consultancy is a fancy restaurant. Google is the McDonalds of all McDonaldses: it makes software that is used by everybody, whether they want it or not, and you can't turn a corner without hitting something they control.

