Hacker News

It was just one example.

Total resource demands are still way, way, way over 100x. Data speeds. Peripheral inputs. Storage. Basically everything.

And no, of course a computer doesn't boot at a different speed depending on display size. It's about the assets and code that fill displays of that size -- all the graphics you've got to load, all the code that has to draw the antialiasing and transparencies and shadows and subpixel font ligatures and everything else.

The same goes for the code dealing with storage: it's way more than 100x as complex. Likewise for peripherals, and so on.




If you don't fully understand the topic, the smart choice would be to re-examine your own assumptions.

Do you understand how VGA displays or graphics rendering work? Or how computers boot up?

A modern Linux system (RHEL, Debian, etc.) isn't going to try to load 4K graphics on a single connected VGA display, especially if you use it without any third-party video drivers or adaptors that support 4K video out.

Many motherboards, even in 2023, have a direct VGA-out port that the firmware reliably defaults to, which is what this typically refers to.

If you're still worried, there's always the option to manually verify the installed files and boot sequence to confirm that it isn't attempting to force it.


I think you're misunderstanding me. This has nothing to do with VGA. I was using 4K screens as just one example of the many, many dimensions of growth.

It's simply the point that there's so much more to load during booting. Your contention that computers use only 100x more resources than in 1984, and should therefore boot in "way under 2.2 seconds", is way, way off.

Computers use way, way, way more than 100x the resources they did in 1984. Hence, booting still takes a bit of time. It's pretty simple.
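As a rough back-of-the-envelope check of the growth factors quoted earlier in this thread (the icon and font figures come from the quoted comment; this is illustrative arithmetic, not a measurement):

```python
# Icon then: 32x32, 1-bit monochrome plus a 1-bit mask.
old_icon_bits = 32 * 32 * (1 + 1)

# Icon now: 512x512 at 48-bit color.
new_icon_bits = 512 * 512 * 48

icon_growth = new_icon_bits / old_icon_bits
print(f"icon data growth: {icon_growth:.0f}x")  # 6144x

# System fonts: ~200 glyphs then vs. tens of thousands now
# (30,000 is an assumed round figure for "tens of thousands").
font_growth = 30_000 / 200
print(f"font glyph growth: {font_growth:.0f}x")  # 150x
```

Even these two dimensions alone blow well past 100x, before counting code, storage, or peripherals.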


You appear to have lost track of the conversation?

> Icons used to be 32x32 monochrome with a mask. Now they're 512x512 in 48-bit color. System fonts used to have ~200 characters, now they have tens of thousands. Extrapolate to everything else and it becomes pretty clear. There's just so much more to load.

If so, let me spell it out step by step. That was the initial reply to me. Therefore...

> I was using 4K screens as just one example of the many, many dimensions of growth.

This example is likely close to meaningless, as elaborated on previously.

Hence my suggestion to review whether '32x32 monochrome' or '512x512 in 48-bit color', etc., has any observable effect: with the help of a VGA display showing, presumably, graphics roughly corresponding to the first, another display corresponding to the second, and so on.

If you want to discuss something further along, then it should be in your interest to resolve the first claim in your favour as soon as possible.

For example, if you disagree and still think resolution makes a noticeable difference, then show that convincingly, especially as it's a positive claim, which HN readers tend to treat more critically.

It really seems a bit odd to try to skip that discussion and then claim it 'has nothing to do with VGA', which can only reduce the possible avenues to prove your credibility.

I.e., you are the one who raised the possible "to do" regarding lower resolutions. The reason why I started discussing 'VGA' at all was because of that comment.



