
In a sane world, this would be a feature, not a bug.

Being able to get use out of older hardware is a feature. New hardware not providing any new possibilities is a bug, and that's where we've been for the last 5 years at least.

I think the core point of the article is valid. It's ridiculous that my $230 tablet has a higher resolution than nearly every "high end" laptop. And apparently nobody has any ideas for using our tremendously powerful multicore CPUs and near-teraflop GPUs other than rendering increasingly bloated websites.




My point is, there's only so much benefit that one person can get from exponentially increasing processing power. Web browsers and word processors gain new bells and whistles, but the basic functionality is the same as it ever was. You can run a bajillion windows at once, watch Netflix at HD resolution, and not slow down. Games are near-photorealistic. What more do we want a computer to do, exactly?

You mention resolution--once the pixels are smaller than the naked eye can distinguish at a typical viewing distance, everything else is polish and marketing. Yes, I know, retina displays are indefinably "crisper", or something. But you can't functionally cram any more useful information onto them, because the user won't be able to make it out.
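
For what it's worth, the "smaller than the eye can distinguish" threshold is easy to ballpark: 20/20 vision resolves roughly one arcminute, so the pixel density where individual pixels blur together depends almost entirely on viewing distance. A rough back-of-the-envelope sketch (the one-arcminute figure and the example distances are assumptions, not anyone's official spec):

    import math

    def retina_ppi(viewing_distance_in, acuity_arcmin=1.0):
        # Pixel density (pixels per inch) at which a single pixel subtends
        # `acuity_arcmin` arcminutes at the given viewing distance -- past
        # this, a typical eye can no longer resolve individual pixels.
        pixel_pitch_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
        return 1.0 / pixel_pitch_in

    for label, dist_in in [("tablet", 12), ("laptop", 20), ("desktop", 28)]:
        print(f"{label:>8} at {dist_in}in -> ~{retina_ppi(dist_in):.0f} PPI")

That works out to roughly 290 PPI at 12 inches, 170 PPI at 20 inches, and 120 PPI at 28 inches--which is why a tablet held close needs a much denser panel than a laptop to look "retina", and why there's still some headroom above the panels most laptops ship with.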

We're seeing features like this because we've reached the point of diminishing returns on desktops. There aren't going to be any more massive game-changing jumps in raw processing power. In the 90s, we went from Wolfenstein 3D to Doom in a year and a half, and from there to Quake in another two and a half years--all of them gigantic, envelope-pushing leaps in gaming technology. In this century, we've gone from Half-Life 2 in 2004 to...what? Crysis 2? The latest Call of Duty? When was the last time a PC game got the same kind of uproar over graphics that Doom did? The degree of change that used to come once a year is now coming every ten. That's not because everything got boring; it's because we're getting asymptotically close to perfection, at least from a practical home-user standpoint. That's bad if you're in the business of convincing people they need new computers every year, but it's decidedly good if you're a user.


Have you ever used a retina display on a laptop? Every reviewer I've seen comment on it has said that they cannot go back, and not because of indefinable crispness. Also, some people do run their screens with 2-4x as much info crammed onto them - this would be the main reason I would use one.


The 1920x1280 Nook HD+ is now available for $149. I checked NewEgg and only 7 laptops met that resolution, and 5 of them are from Apple - the cheapest is $1399.

The problem here is Microsoft, not Intel. They never figured out a way to make sure that things looked consistently good on a high-resolution display. When the OEMs all decided to piggyback on the TV industry and declared that 1920x1080 was "high end", that was the final nail in the coffin.



