Intel graphics chief Raja Koduri leaves after five years battling Nvidia and AMD (theverge.com)
79 points by oumua_don17 on March 21, 2023 | 55 comments



It wasn't much of a battle, as anyone who has been saddled with trying to run a game on their Intel graphics can attest.


Arguably we haven't yet seen the final outcome of his efforts, given the time frame to release a new card, unless Intel gives up on it.


raja cannot fail, only be failed.

like, the people actually in the room with him don't seem that impressed. see the testimonials about his service at intel from his boss:

https://i.imgur.com/hrVfr8G.png

https://i.imgur.com/LTfYvLi.png

list of good things about raja's time here: it has come to an end. also, we are excited to announce our cafeteria "bring your own fruit to work" program will be extended throughout the week.

dude was a fuckup at AMD too, and honestly Terascale was a mess in its early days despite eventually being cleaned up. Once AMD sidelined Raja onto Vega and refocused on RDNA, they started succeeding again. And all anyone ever says is "you didn't give him long enough".

https://www.pcgamesn.com/amd-sony-ps5-navi-affected-vega

He was at Intel for five and a half years. This was his baby; he ran it start to finish, and it's still a mess.

He may or may not have been a great engineer (maybe he wasn't the talent on those teams, and/or there used to be more people able to tell him no, or to tone his ideas down and help develop them), but either way he rose to the level of his incompetence and finally found someone who wouldn't let him fail upward on the management track.

MLID is MLID, but in this same video he basically dropped that his sources are now saying Battlemage and Celestial are cut down to a single small die per generation - basically a tech demonstrator to let them keep working on it. My guess is these will get all the fanfare of Cannon Lake or DG1: buried, serving as milestones rather than serious revenue products. Druid is still slated for a full launch, but that's a long way off (almost 5 years), and one can assume that if the GPU teams don't catch traction, the whole thing goes in the trash and Intel goes back to making display adapters. None of that is particularly surprising other than, again, that they're continuing at this at all.

https://www.youtube.com/watch?v=HzV1RS5Oc6I

having your division broken in two and reorg'd into more successful units with a track record of execution, then having the (full) revenue release of its products pushed back 5 years (during a company-wide financial crisis) so the whole thing can be done over by someone more competent, is not done lightly. this is really extraordinary, and it's remarkable the whole thing isn't canceled. frankly it's likely a testament to how much of this is Raja's fault that Intel is giving some other people a crack at this.

Charlie Demerjian thinks this was an "amicable split"; that's not remotely the tenor I get from that newsletter, but Charlie likes ATI, and Raja is an ATI guy. I think it's pretty telling that this dropped exactly 3 months to the day after AXG was broken apart. Raja's most likely been on garden leave for a contractually mandated 3 months while they try to clean up his mess. Again.



True. I have a cheap Ryzen laptop with no discrete GPU, and I can play a lot of AAA games at 30 FPS, do accelerated video encoding, and even GPU raytracing. Intel APUs got a bit better with the 11th generation, but Intel neglected the space for a decade... AMD earned its place while Intel survived off brand recognition for way too long.


I think recognition is all they had. Intel Graphics has always been a brand of slow.


Intel’s iGPUs are slow for gaming, but they are fine for desktop use, which is what most people use them for. If you are building a gaming machine, it wouldn’t really matter how good the iGPU is anyway, because you’ll be getting a dGPU.

And their Linux drivers are usually pretty solid.

They need to do better to compete on the high-end, but I love them for my daily driver work laptop.


Intel integrated graphics has been at least competitive with AMD's since Ice Lake. If you include the older parts with eDRAM, they've been pretty solid for a while.


The only slight advantage Intel GPUs have had is that they're somewhat better documented than AMD's or Nvidia's.


Not that he was successful at either company; actually, Radeon is much better now.

Anyone remember the flop that was Vega?

It was so hyped, and then it flopped. Let's hope Intel's entire GPU division doesn't flop too.


Wasn’t he an exec at ATI when they launched the 9800 Pro? Arguably one of their most successful GPUs, if not the most successful.

So he has had some successful launches, which is probably why he landed these gigs… however, yes, Vega wasn't much of a hit for gaming. It mainly excelled at compute.


As I understand it, the ATI days were his last successes. Since then, his career as an exec has involved presiding over products that are somewhere between terrible and mediocre.


Wait, are you talking about the same Vega that's in the AM4 APUs and Ryzen Embedded? If so how would that be a flop? It's been a huge success.


No, I meant the Radeon Vega 56 and Vega 64.


edit nvm, I’m the one who can’t read.


He worked at both Intel and AMD (at different times obviously, AMD first). This is what the original poster meant by “either company.”


Finally! This is great news for Intel. The last obstacle in Pat's way to turning Intel around (other than the board).

The next step should be to scale down GPU design and attract Nvidia and AMD to fab with Intel.

There are at least a few of us on HN who have been very critical of Raja since the beginning.


No, please keep working on that GPU.


He must have a charming personality and be great in interviews for companies to keep luring him in everywhere :)


In fairness, regardless of his actual competence, if you're a company looking to hire a VP with experience shipping powerful GPUs for graphics and general compute, then your pickings are very slim. The fingers on one hand are probably enough to count the candidates.


You mean coding puzzles aren't a thing?

I think what Intel needs is a culture of innovation, where things can be experimented with, allowed to fail, and evaluated with data-driven insights.

I actually think somebody who is creative, can articulate a vision, and can act effectively on criticism (e.g. change direction), while being just a little less inclusivity-building and more meritocracy-building, is all they need.

Somebody like Elon Musk.


I feel like you've got to pick from Nvidia if you're Intel, no? Picking an engineering director from Nvidia would be better than picking the VP who was responsible for AMD's Vega.


He used to work at ATI during the 9800 Pro days. That was pretty much their most successful GPU.


Seems like it might be a bad sign that so many execs are leaving Intel.


It's a GREAT sign - it means Gelsinger is serious about making progress and is willing to replace ineffective leaders.

I would be more concerned by the opposite - presuming to fix the company using the same people who were responsible for the decline


This sounds like you are trying to gaslight yourself. Gelsinger did not get rid of an "ineffective leader." Koduri left to go do something else.


I have heard from people who worked with him that Raja Koduri was not always the most effective leader, despite having cultivated a brand as a "celebrity" computer architect. From the outside, he has been responsible for a lot of let-downs from ATI and AMD's graphics divisions, and has otherwise managed somewhat uninspiring efforts. Intel's graphics effort has certainly been a disaster, and that flows from Koduri. This seems like standard corporate seppuku.


Likely not even close, as the endgame was a face-saving move for both Intel and Raja.

From the outside this looks like a classic case of giving someone a window seat and sending the message, if not outright telling them: "figure out your next career move". He botched Arc badly (in both execution and messaging) and was briefly promoted, then demoted. Had he tried to stick around, he most likely would have been let go, and soon.


if you tell your boss you are leaving and they say "good luck" and don't try to counter, they probably aren't that bothered that you are leaving

in this case, Intel's situation on GPUs is so dire that it makes total sense to clear the bench...the status quo was bleak


I have heard Arc is unstable


I've been using it as a daily driver. Lots of rough edges, especially with older software, but it's relatively fine if you're comfortable with tweaking. I'd say I've had a significantly smoother experience with it than I did gaming on Linux five years ago. I'd probably be a lot more frustrated with it if it weren't so inexpensive compared to Nvidia's offerings.


Ironically, on Linux older software works pretty well with Arc because of DXVK, but basically all new DirectX 12 games don't even launch because Intel's driver is missing some feature that VKD3D needs.
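
For the curious: a quick way to see what a translation layer like VKD3D has to work with is to enumerate what the driver actually advertises (the vulkaninfo tool does this and much more). To be clear, I don't know which specific feature Intel's driver is missing; this minimal C++ sketch just shows the kind of check involved:

    // Minimal sketch: list each Vulkan device and the extensions its
    // driver exposes. Layers like DXVK/VKD3D can only build on what
    // shows up here. Build with: g++ list_ext.cpp -lvulkan
    #include <vulkan/vulkan.h>
    #include <cstdio>
    #include <vector>

    int main() {
        VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
        app.apiVersion = VK_API_VERSION_1_1;
        VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
        info.pApplicationInfo = &app;
        VkInstance instance;
        if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS)
            return 1;

        uint32_t ndev = 0;
        vkEnumeratePhysicalDevices(instance, &ndev, nullptr);
        std::vector<VkPhysicalDevice> devices(ndev);
        vkEnumeratePhysicalDevices(instance, &ndev, devices.data());

        for (VkPhysicalDevice dev : devices) {
            VkPhysicalDeviceProperties props;
            vkGetPhysicalDeviceProperties(dev, &props);
            std::printf("%s (Vulkan %u.%u)\n", props.deviceName,
                        VK_VERSION_MAJOR(props.apiVersion),
                        VK_VERSION_MINOR(props.apiVersion));

            // Everything the driver claims to support, one line each
            uint32_t next = 0;
            vkEnumerateDeviceExtensionProperties(dev, nullptr, &next, nullptr);
            std::vector<VkExtensionProperties> exts(next);
            vkEnumerateDeviceExtensionProperties(dev, nullptr, &next, exts.data());
            for (const VkExtensionProperties& e : exts)
                std::printf("  %s\n", e.extensionName);
        }
        vkDestroyInstance(instance, nullptr);
        return 0;
    }

If an extension or feature a layer requires never shows up in that list, it has nothing to translate to, and the game simply refuses to start.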


Intel should scour Jensen Huang and Lisa Su's family tree to find more GPU Executives.

(Jensen is the CEO and founder of Nvidia and the uncle of the CEO of AMD; both are from Taiwan.)


>Jensen is the CEO and founder of Nvidia and the uncle of the CEO of AMD

IIRC this myth has been debunked. Jensen and Lisa are not related.


I thought it was more that Lisa Su's own grandfather is actually Jen-Hsun Huang's uncle or something. So related, just not in the way it was reported.


Indeed, the claim is that Su's grandfather is Huang's uncle: https://www.techtimes.com/articles/253736/20201030/fact-chec... The source is this video: https://www.youtube.com/watch?v=ovdss5CBrxU No idea if this was verified or debunked.


"battling" - what a failure Intel's graphics strategy has been


I take a very different impression. Is the latest Intel GPU the best in class? Nope, not even close. Does it show strong potential after so little time? Absolutely. While I have my grievances with Intel, I wish them nothing but success so we are not stuck with a GPU duopoly.


> Does it show strong potential after so little time?

(Checks watch) Uh... Intel has been working on graphics for over twenty-five years. https://en.wikipedia.org/wiki/List_of_Intel_graphics_process...

It seems like every 5 years, they say, "This is it, we're gonna be competitive this time," and ... they never are. Not even close.


>Intel has been working on graphics for over twenty-five years

Yeah but integrated graphics and dedicated desktop/server class GPUs for graphics, multimedia and general compute are two completely different beasts in terms of HW and SW.

Qualcomm, ARM, and Imagination have also been making GPU IP since forever, but I doubt any of them can jump in the ring with Nvidia and AMD. Do you think they wouldn't like a piece of that lucrative market? But they know that dislodging the Nvidia-AMD duopoly this late in the game is an exercise in futility.


They're not "completely different beasts", and one area Intel fell flat on their faces with was drivers.

A company that has been making GPU drivers for so long shouldn't be releasing the utter dumpster fire that was their discrete GPU drivers. Reviewers found features that were missing, didn't work, or were horribly unreliable.


Their integrated GPU drivers weren't any better throughout; I've had to reverse-engineer some in the past, and I can only imagine what the source must look like to generate some of the monstrosities I saw. They seriously should've just released the specifications for their GPUs and let the community write the drivers.


If you want to count iGPUs, Intel has been trouncing Nvidia and AMD in terms of units sold for most of those 25 years.


Wasn't the last time they went anywhere near this ~14 years ago?


Nope, Larrabee was more recent than that, and it stumbled along as Xeon Phi for a bit too. They've also been abjectly failing to deliver Aurora for about a decade.


Larrabee was canceled around 2010, so not massively far off.

Xeon Phi is also not a GPU product, surely


> Does it show strong potential after so little time? Absolutely.

GamersNexus laughed them out of the room and said the drivers were unreliable, the software featureless, and the performance beyond underwhelming.

As someone else pointed out: Intel has been in the GPU market for nearly three decades. They have a massive amount of in-house talent and fab ability.

They're also arriving right as the market has collapsed - it's completely saturated, and a huge amount of demand evaporated overnight. There's little to no market for current consumer or even workstation-class GPUs in terms of "AI" - growth in that sector is almost entirely in datacenter-tier products, for which Intel offers nothing.


Not from what I have seen. GN is pretty encouraging and frequently reports on Arc driver improvements.


It's almost as if it's incredibly difficult to come out of the gate with a competitive GPU.


It's really not that "difficult" to come out with a competitive GPU.

However, it is really expensive. Even worse, from Intel's point of view, that enormous expense is almost all in software--which must be maintained for an extended amount of time and looks like a pure expense that will never be recouped.

Even AMD is struggling with this.

If Intel wanted to win this space, they needed to embed a bunch of staff in every major gaming company to make sure their engine runs well on the card.

If Intel really wanted to displace Nvidia, they needed to completely open the specs and fund every single academic ML/AI/graphics research team that uses Intel cards for the next 10 years.


IMHO the real target is the enterprise market - no one really cares about the gaming market anymore; look at Nvidia and AMD, both of which pretty much gave gamers the middle finger with their high prices.

The cynic in me feels that Intel is targeting the gaming market now because they have no shot in the enterprise world and they need _someone_ to fund their GPU development. Still, it's welcome competition. Enjoy it while it lasts, I guess.


They'd maybe have a shot if they'd delivered anything in the enterprise market. Some of us are desperate for a decent alternative to NVIDIA (AMD's software story is... not there, but at least you can find some MI100 or MI250 gear if you're lucky), and Intel keeps not delivering the enterprise GPUs we're begging them for.

Not even asking for an A100/H100-level thing now, but please break into the A40 30-TFLOPS-and-lower market (I don't even mind putting in two Max 1100s for one A40...).

Just start selling some gear so we can benchmark and port stuff to SYCL, dammit. No big OEM (HPE, Dell, Lenovo) has any to put in their servers (or at least they're not selling them to me), so why make it so hard?
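
To be concrete about what "porting to SYCL" involves: it's mostly rewriting CUDA-style kernels as standard C++. A minimal vector-add sketch, assuming a SYCL 2020 compiler like Intel's DPC++ (icpx -fsycl); device selection is left to the runtime:

    // Minimal SYCL 2020 sketch: add two vectors on whatever
    // accelerator the runtime finds (Level Zero on Intel GPUs).
    #include <sycl/sycl.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        sycl::queue q; // default selector; prefers a usable GPU
        std::cout << "Device: "
                  << q.get_device().get_info<sycl::info::device::name>()
                  << "\n";

        const size_t n = 1 << 20;
        std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);
        {
            // Buffers wrap host memory for the device to access
            sycl::buffer<float> A(a.data(), sycl::range<1>(n));
            sycl::buffer<float> B(b.data(), sycl::range<1>(n));
            sycl::buffer<float> C(c.data(), sycl::range<1>(n));
            q.submit([&](sycl::handler& h) {
                sycl::accessor ra(A, h, sycl::read_only);
                sycl::accessor rb(B, h, sycl::read_only);
                sycl::accessor wc(C, h, sycl::write_only, sycl::no_init);
                h.parallel_for(sycl::range<1>(n),
                               [=](sycl::id<1> i) { wc[i] = ra[i] + rb[i]; });
            });
        } // buffers go out of scope here and write results back to c
        std::cout << "c[0] = " << c[0] << " (expect 3)\n";
        return 0;
    }

Same source, whichever backend the runtime finds - which is exactly why people want real gear in hand to benchmark and port against.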

Either Intel doesn't remember how to sell server gear, or they can't deliver anything but supercomputer stuff, and EVEN THERE it still seems spotty.


AMD's software stack is usable if you're willing to run a buggy distro like Ubuntu 22.04 LTS.

You know it's buggy when the Ubuntu installer gets on the network just fine, but misnames the Ethernet interface and chokes on the WiFi driver's firmware. Meanwhile, Debian 10 and 11 just work on the same hardware with zero futzing or debugging.


Which is good for benchmarking, good for AMD (and interesting to hear, thanks), but doesn't bode well for production use. But AT LEAST you can buy some gear and get some dev mindshare and perf numbers.


They gave NVIDIA and ARM many years of advantage. Now they are battling for their lives.



