
Yet, from the point of view of someone sitting in the GDC Europe rooms back in 2009, hearing how Larrabee was going to change the world of game development, I think we can state it did fail.



> I think we can state it did fail.

I think we can state it did fail as a graphics processor.


I think the point is that the author is being EXTREMELY disingenuous when he claims Intel didn't even want to make a GPU.


Read it again: the claims were very specific regarding a discrete GPU. Also, the author worked at Intel on Larrabee, so unless you have proof otherwise I think it's best to give them the benefit of the doubt as to the accuracy of the claim.



Read the original post again. I don't care what the news sites said at the time, the guy who wrote the post literally worked on it.

Read the post carefully.


Do you know who Michael Abrash is?

GDCE 2009, "Rasterization on Larrabee: A First Look at the Larrabee New Instructions (LRBni) in Action"

http://www.gdcvault.com/play/1402/Rasterization-on-Larrabee-...

There were other Larrabee talks, but only this one is online.

The marketing message was not only about graphics, but also about how the GPGPU features of Larrabee would revolutionize the way games are written.


Yes, I know who Abrash is -- I have one of his most famous books on my shelf. I'll say it again, the author's claims were very specific:

> in terms of engineering and focus, Larrabee was never primarily a graphics card

Note that the author says "primarily" and is specifically talking about it in terms of engineering and focus.


Programming games is much more than just graphics.

At GDCE it was being sold as a way of doing graphics, AI, and vector optimization of code in a more developer-friendly manner than GPGPU.

Of course, framing it around the graphics feature is a nice way of sidelining the issue that it also didn't deliver those other features to the game development community.


> At GDCE it was being sold as a way of doing graphics, AI, and vector optimization of code in a more developer-friendly manner than GPGPU.

> Of course, framing it around the graphics feature is a nice way of sidelining the issue that it also didn't deliver those other features to the game development community.

There is clear evidence that the high-end cards NVidia was delivering at the time could outcompete Larrabee, or at least keep pace with it. Thus, had it been released, Larrabee would not have been a strong contender against NVidia or AMD. In this sense I stand by my position that Larrabee failed as a GPU.

On the other hand, I see no evidence that the rival products by AMD and NVidia could keep pace with Larrabee for AI and vector workloads, so Larrabee was not a failure there. Intel probably just concluded that there is much more money to be earned in HPC than in consumer devices, and thus Larrabee was not released to consumers. And I can see good reasons: if game developers want to exploit the capabilities Larrabee has to offer, they depend on consumers having a Larrabee card in their computers. If Larrabee had been an outstanding GPU, the probability that enthusiasts would buy one would have been much higher than if Larrabee were just an add-on that a few exotic applications or games additionally require.



