> but we've hit a point where even high-end chips from that generation aren't really wide enough for a lot of users; four cores isn't really enough for midrange consumer workloads.

So "and software demanding more resources for no good fucking reason.".

You're really just reinforcing the parent's point.

That's silly. Even if you set aside that people want computers to do more things at once, the basic act of decoding 4K video is beyond a Sandy Bridge chip without a discrete GPU: its Quick Sync block predates HEVC and VP9, and software-decoding 4K in those codecs buries its cores. The world has passed it by. Sorry that weirdo tech primordialism doesn't really work, but not that sorry.
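To make the decode point concrete, here's a rough sketch (mine, not anything from the thread; it assumes a local ffmpeg build, though hevc_qsv, hevc_vaapi, and hevc_cuvid are standard ffmpeg decoder names) that checks which hardware HEVC decode paths your ffmpeg even knows about. On a Sandy Bridge iGPU the practical answer is none.

    import shutil
    import subprocess

    # Standard ffmpeg hardware HEVC decoder names (Quick Sync, VA-API, NVDEC).
    HW_HEVC = ("hevc_qsv", "hevc_vaapi", "hevc_cuvid")

    def hw_hevc_decoders():
        """List the hardware HEVC decoders this ffmpeg build was compiled with.

        Caveat: this only shows what the build supports; whether the GPU can
        actually run them is a separate question (Sandy Bridge's can't).
        """
        if shutil.which("ffmpeg") is None:
            raise RuntimeError("ffmpeg not found on PATH")
        out = subprocess.run(
            ["ffmpeg", "-hide_banner", "-decoders"],
            capture_output=True, text=True, check=True,
        ).stdout
        return [d for d in HW_HEVC if d in out]

    if __name__ == "__main__":
        found = hw_hevc_decoders()
        print("hw HEVC decoders:", ", ".join(found) or "none (software decode only)")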