My point is, there's only so much benefit that one person can get from exponentially increasing processing power. Web browsers and word processors gain new bells and whistles, but the basic functionality is the same as it ever was. You can run a bajillion windows at once, watch Netflix at HD resolution, and not slow down. Games are near-photorealistic. What more do we want a computer to do, exactly?
You mention resolution--once the pixels are smaller than the naked eye can distinguish at typical viewing distance, everything else is polish and marketing. Yes, I know, retina displays are indefinably "crisper", or something. But you can't functionally cram any more useful information onto them, because the user won't be able to make it out.
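For what it's worth, the "smaller than the eye can distinguish" threshold is easy to ballpark. Here's a rough sketch in Python, assuming the usual ~1 arcminute figure for normal visual acuity (the function name and the sample distances are just mine for illustration, not anything official):

```python
import math

# Rough check on the "pixels smaller than the eye can distinguish" claim.
# Assumes ~1 arcminute of visual acuity, the usual rule of thumb for
# 20/20 vision (not a hard limit; high-contrast patterns can beat it).

def indistinguishable_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """Pixel density beyond which one pixel subtends less than the
    assumed acuity angle at the given viewing distance (in inches)."""
    acuity_rad = math.radians(acuity_arcmin / 60.0)
    pixel_pitch_in = 2 * viewing_distance_in * math.tan(acuity_rad / 2)
    return 1.0 / pixel_pitch_in

for d in (12, 18, 24):  # phone-ish, laptop-ish, desktop-ish distances
    print(f'{d}" away: ~{indistinguishable_ppi(d):.0f} PPI')
```

At 12 inches that works out to roughly 290 PPI, which is about where the "retina" marketing threshold gets drawn for phones; at arm's-length desktop distances the bar drops to 150 PPI or so.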
We're seeing features like this because we've reached the point of diminishing returns on desktops. There aren't going to be any more massive game-changing jumps in raw processing power. In the 90s, we went from Wolfenstein 3D to Doom in a year and a half, and from there to Quake in another two and a half years--all of them gigantic, envelope-pushing leaps in gaming technology. In this century, we've gone from Half-Life 2 in 2004 to...what? Crysis 2? The latest Call of Duty? When was the last time a PC game got the same kind of uproar over graphics that Doom did? The degree of change that used to come once a year is now coming every ten. That's not because everything got boring; it's because we're getting asymptotically close to perfection, at least from a practical home-user standpoint. That's bad if you're in the business of convincing people they need new computers every year, but it's decidedly good if you're a user.
Have you ever used a retina display on a laptop? Every reviewer I've seen comment on it has said they cannot go back, and not because of indefinable crispness. Also, some people do run their screens with 2-4x as much info crammed onto them - that would be the main reason I would use one.