list of good things about raja's time here: it has come to an end. also, we are excited to announce our cafeteria "bring your own fruit to work" program will be extended throughout the week.
dude was a fuckup at AMD too, and honestly Terascale was a mess in its early days as well, even if it eventually got cleaned up. Once AMD sidelined Raja onto Vega and refocused on RDNA, they started succeeding again. And all anyone ever says is "you didn't give him long enough".
He'd been at Intel for five and a half years, close to six. This was his baby; he ran it start to finish, and it's still a mess.
He may or may not have been a great engineer (maybe he wasn't the talent on those teams, and/or there were more people able to tell him no, or to tone things down and work with him to develop his ideas), but either way he rose to the level of his incompetence and finally found someone who wouldn't let him fail upwards on the management track.
MLID is MLID, but in this same video he basically dropped that his sources are now saying Battlemage and Celestial are cut down to a single small die each, basically a tech demonstrator to let them keep working on it. My guess is these will have all the fanfare of Cannon Lake or DG1: they'll be buried, serving as milestones rather than serious revenue products. And while Druid is still slated for a full launch, that's a long way off (almost 5 years), and one can assume that if the GPU teams don't catch traction, the whole thing goes in the trash and Intel goes back to making display adapters. None of that is particularly surprising, other than, again, that they're continuing at this at all.
having your division broken in two and reorg'd into more successful units with a track record of successful execution, and then having the (full) revenue release of its products pushed back 5 years (during a company-wide financial crisis) so that the whole thing can be done over by someone more competent, is not done lightly. this is really extraordinary, and it's remarkable that the whole thing isn't canceled. frankly, it's likely a testament to how much of this is Raja's fault that Intel is giving some other people a crack at it.
Charlie Demerjian thinks this was an "amicable split". That's not remotely the tenor I get from that newsletter, but Charlie likes ATI, and Raja is an ATI guy. I think it's pretty telling that this dropped exactly three months after AXG was broken apart. Raja has most likely been on garden leave for a contractually mandated three months while they try to clean up his mess. Again.
True. I have a cheap Ryzen laptop with no discrete GPU and I can play a lot of AAA games at 30FPS, do accelerated video encoding and even GPU raytracing. Intel APUs are getting a bit better with the 11th generation, but Intel neglected the space for a decade... AMD earned its place while Intel survived off brand recognition for way too long.
Intel’s iGPUs are slow for gaming, but they are fine for desktop use, which is what most people use them for. If you are building a gaming machine, it wouldn’t really matter how good the iGPU is anyway, because you’ll be getting a dGPU.
And their Linux drivers are pretty solid usually.
They need to do better to compete on the high-end, but I love them for my daily driver work laptop.
Intel's integrated graphics have been at least competitive with AMD's since Ice Lake. If you include the older parts with eDRAM, they've been pretty solid for a while.
Wasn’t he an exec at ATI when they launched the 9800 Pro? Arguably one of their most successful GPUs, if not the most successful.
So he has had some successful launches, which probably is why he landed these gigs… however yes, Vega wasn’t much of a hit for gaming. It mainly excelled at compute.
As I understand it, the ATI days were his last successes. Since then, his career as an exec has involved presiding over products that are somewhere between terrible and mediocre.
In fairness, regardless of his actual competence, if you're a company looking to hire a VP with experience shipping powerful GPUs for graphics and general compute, then your pickings are very slim in terms of candidates. The fingers on one hand are probably enough to count the list of candidates.
I think what Intel needs is a culture of innovation.
Where things can be experimented with, failed, evaluated with data-driven insights.
I actually think somebody who is creative, can articulate a vision, and can act effectively on criticism (e.g. change direction), while being just a little less focused on inclusivity-building and more on meritocracy-building -- is all they need.
I feel like you've got to pick from Nvidia if you're Intel, no? Picking an engineering director at Nvidia would be better than picking a VP that was responsible for AMD's Vega.
I have heard from people who worked with him that Raja Koduri was not always the most effective leader, despite having cultivated a brand as a "celebrity" computer architect. From the outside, he has been responsible for a lot of let-downs from ATI and AMD's graphics divisions, and has otherwise managed somewhat uninspiring efforts. Intel's graphics effort has certainly been a disaster, and that flows from Koduri. This seems like standard corporate seppuku.
Likely not even close, as the endgame was a face-saving move for both Intel and Raja.
From the outside, this looks like a classic case of giving someone a window seat and sending the message, if not outright telling them: "figure out your next career move." He botched Arc badly (in both execution and messaging) and was briefly promoted, then demoted. Had he tried to stick around, he most likely would have been let go, and soon.
I've been using it as a daily driver. Lots of rough edges, especially with older software, but it's relatively fine if you're comfortable with tweaking. I'd say I've had a significantly smoother experience with it than I did gaming on Linux five years ago. I'd probably be a lot more frustrated with it if it weren't so inexpensive compared to Nvidia's offerings.
Ironically, on Linux older software works pretty well with Arc because of DXVK, but basically all new DirectX 12 games don't even launch because Intel's driver is missing some feature VKD3D needs.
I get a very different impression. Is the latest Intel GPU the best in class? Nope, not even close. Does it show strong potential after so little time? Absolutely. While I have my grievances with Intel, I wish them nothing but success so we are not stuck with a two-party GPU duopoly.
>Intel has been working on graphics for over twenty-five years
Yeah but integrated graphics and dedicated desktop/server class GPUs for graphics, multimedia and general compute are two completely different beasts in terms of HW and SW.
Qualcomm, ARM, Imagination also have been making GPU IPs since forever, but I doubt any of them can jump in the ring with Nvidia and AMD. Do you think they wouldn't like a piece of that lucrative market share? But they know that dislodging the Nvidia-AMD duopoly so late in the game is an exercise in futility.
They're not "completely different beasts", and one area where Intel fell flat on its face was drivers.
A company that has been making GPU drivers for so long shouldn't be releasing the utter dumpster fire that was their discrete GPU drivers. Reviewers found features that were missing, didn't work, or were horribly unreliable.
Their integrated GPU drivers weren't any better throughout; I've had to reverse-engineer some in the past, and I can only imagine what the source that generated some of those monstrosities looked like. They seriously should've just released the specifications for their GPUs and let the community write the drivers.
Nope, Larrabee was more recent than that, and it stumbled along as Xeon Phi for a bit too. They've also been abjectly failing to deliver Aurora for about a decade.
> Does it show strong potential after so little time? Absolutely.
GamersNexus laughed them out of the room and said the drivers were unreliable, the software featureless, and performance was beyond underwhelming.
As someone else pointed out: Intel has been in the GPU market for nearly three decades. They have a massive amount of in-house talent and fab ability.
They're also arriving right as the market has collapsed - it's completely saturated and a huge amount of demand evaporated overnight. There's little to no market for current consumer or even workstation class GPUs in terms of "AI" - growth in that sector is almost entirely in datacenter-tier products, which Intel offers nothing for.
It's really not that "difficult" to come out with a competitive GPU.
However, it is really expensive. Even worse, from Intel's point of view, that enormous expense is almost all in software--which must be maintained for an extended amount of time and looks like a pure expense that will never be recouped.
Even AMD is struggling with this.
If Intel wanted to win this space, they needed to embed a bunch of staff in every major gaming company to make sure their engine runs well on the card.
If Intel really wanted to displace Nvidia, they needed to completely open the specs and fund every single academic ML/AI/Graphic research team for the next 10 years if they used Intel cards.
IMHO the real target is the enterprise market - and no one really cares about the gaming market anymore; look at Nvidia and AMD, both pretty much gave the middle finger to gamers with their high prices.
The cynic in me feels that Intel is targeting the gaming market now because they have no shot in the enterprise world and they need _someone_ to fund their GPU development. Still, it's welcome competition. Enjoy it while it lasts, I guess.
They'd maybe have a shot if they'd delivered anything in the enterprise market. Some of us are desperate to get a decent alternative to NVIDIA (AMD software story is... not there, but at least you can find some MI100 or MI250 gear if you're lucky) and Intel keeps not delivering the enterprise GPUs we're begging them for.
Not even asking for an A100/H100-level thing now, but please break into the A40 30-TFLOPS-and-lower market (I don't even mind putting two 'Max' 1100s in for one A40...).
Just start selling some gear so we can benchmark and port stuff to SYCL, dammit. No big OEM (HPE, Dell, Lenovo) has any to put in their servers (or at least they're not selling them to me), and why make it so hard?
Either Intel doesn't remember how to sell server gear or they can't deliver anything but supercomputer stuff and EVEN THERE it's still spotty it seems.
AMD's software stack is usable if you're willing to run a buggy distro like Ubuntu 22.04 LTS.
You know it's buggy when the Ubuntu installer gets on the network just fine but misnames the Ethernet interface and chokes on the WiFi driver firmware. Meanwhile, Debian 10 and 11 just work on the same hardware with zero futzing or debugging.
Which is good for benchmarking, good for AMD (and interesting to hear, thanks) but doesn't bode well for production use. But AT LEAST you can buy some gear and get some dev mindshare and perf numbers.