Yet, from the point of view of someone sitting in the GDC Europe rooms back in 2009, hearing how Larrabee was going to change the world of game development, I think we can state that it did fail.
Read again: the claims were very specific with regard to a discrete GPU. Also, the author worked at Intel on Larrabee, so unless you have proof otherwise I think it's best to give them the benefit of the doubt as to the accuracy of the claim.
At GDCE it was being sold as a way of doing graphics, AI and vector optimizations of code in a more developer-friendly way than GPGPU.
Of course, framing this around the graphics feature is a nice way of sidelining the fact that it also didn't deliver those other features to the games development community.
> At GDCE it was being sold as a way of doing graphics, AI and vector optimizations of code in a more developer-friendly way than GPGPU.
> Of course, framing this around the graphics feature is a nice way of sidelining the fact that it also didn't deliver those other features to the games development community.
There is clear evidence that the high-end cards NVidia delivered at the time could outcompete (or at least keep pace with) Larrabee. So even if it had been released, Larrabee would not have been a strong contender against NVidia or AMD. In that sense I stand by my position that Larrabee failed as a GPU.
On the other hand, I see no evidence that the rival products from AMD and NVidia could keep pace with Larrabee for AI and vector optimizations, so Larrabee was not a failure there. Intel probably just concluded that there is much more money to be earned in HPC than in consumer devices, and so Larrabee was not released to consumers. And I can see good reasons: if game developers want to exploit the capabilities Larrabee has to offer, they have to depend on consumers having a Larrabee card in their computer. If Larrabee were an outstanding GPU, the probability that some enthusiasts would get one would be much higher than if Larrabee were just an add-on that some exotic applications/games additionally require.