One thing that didn't seem to be addressed in the slides themselves is the reason GPUs have fixed-function texturing operations, a fixed pipeline stage configuration, and so on: it saves memory bandwidth by making memory accesses more coherent, which dramatically magnifies the usefulness of relatively small caches.
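To make that concrete, here's a toy sketch (all parameters invented for illustration) simulating a tiny direct-mapped cache. Sequential, coherent reads reuse each cache line many times; scattered reads over a large address range almost never do:

```python
import random

def cache_hits(addresses, num_lines=64, line_bytes=64):
    """Simulate a tiny direct-mapped cache; return the hit rate."""
    cache = [None] * num_lines
    hits = 0
    for addr in addresses:
        tag = addr // line_bytes       # which memory line this address falls in
        slot = tag % num_lines         # direct-mapped: one slot per line
        if cache[slot] == tag:
            hits += 1
        else:
            cache[slot] = tag          # evict whatever was there
    return hits / len(addresses)

random.seed(0)
coherent = list(range(0, 64 * 1024, 4))                    # sequential 4-byte reads
scattered = [random.randrange(16 * 1024 * 1024) for _ in coherent]  # random over 16 MB

print(f"coherent hit rate:  {cache_hits(coherent):.2f}")   # ~0.94 (15 of every 16 reads hit)
print(f"scattered hit rate: {cache_hits(scattered):.2f}")  # near zero
```

With only a 4 KB cache, the coherent stream hits ~94% of the time, while the scattered stream misses almost every access; covering the scattered case with caches alone requires far more on-chip SRAM.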
Basically, the claim is that a massive amount of graphical quality will be unleashed once we're unchained from the tyranny of the fixed-function GPU behavior that remains, and that this is worth whatever computational power we lose by enlarging caches to cover the less-coherent memory access.
I'm not sure I buy that. Games already look quite nice, and I'm not sure I could personally tell the difference between something like Crysis and something like Toy Story, which wasn't shackled to a fixed-function pipeline.
An easier explanation, and a long-standing pet theory of mine, is that Larrabee is essentially a thread-parallel, FP-heavy processor for the scientific market, motivated by GPGPU's encroachment on that segment, and necessarily labeled a "CPU/GPU" so as not to step on toes in the wrong places at Intel.
It's definitely going to be an interesting next couple of years in graphics, though.
Pixar has their own style that is intentionally not photorealistic. Also, Toy Story is from 1995. Try instead comparing Crysis to the effects from a modern blockbuster, which have to hold up well when compared to the live action shots.
Ditching the fixed-function pipeline isn't all about realism, though. More flexibility will allow more experimentation with non-photorealistic rendering. Maybe we'll see more games with graphics that look like they were painted or drawn.
Another benefit of more flexible rendering should be easier game development. Writing a complete renderer from scratch will be hard, but most games will use middleware for that. The most expensive part of a modern game is the art, and a flexible renderer could do a lot to make producing that art easier.
I do share your pet theory about Larrabee trying to avoid stepping on toes at Intel by labeling themselves a GPU. However, in order for Larrabee to be worthwhile for Intel it needs a mass market, and graphics is it. If Larrabee fails at graphics it will likely not survive.
I think he mostly hopes for something like REYES for games, which would improve quality somewhat (motion blur and depth of field look nice), but that's probably never going to become part of the fixed-function pipeline. That's why he's calling for a "back to software" approach.
WRT Larrabee, I agree with you, and I wonder whether the scientific market alone could recover its development costs.