A console wants a competitive GPU and a competitive CPU. Nvidia has the first, Intel has the second, AMD has both. The GPU is the more important of the two, which is why the Switch went with Nvidia and why AMD could win console contracts even in the pre-Ryzen era. (The original Intel-CPU Xbox had an Nvidia GPU.) Console vendors are high-volume institutional buyers with aggressive price targets, so being able to get both from the same place is a big advantage.
For PCs, discrete GPUs are bought separately, so that doesn't apply. AMD does alright there, but they were historically the underdog and highly budget-constrained, so without some kind of separate advantage they were struggling.
Now they're making a lot of money from Ryzen/Epyc and GPUs, and reinvesting most of it, so it's plausible they'll be more competitive going forward as the fruits of those investments come to bear.
For gaming, AMD GPUs are generally better bang for buck than Nvidia - notably, they tend to be about the same bang for less buck, at least on the high end and some of the middle tier. The notable exception is ray tracing but that's still pretty niche.
If AMD gets their act together and makes the AI tooling for their GPUs as accessible as Nvidia's, they have a good chance of winning there too, since you can get more VRAM bang for, again, less buck.
In a market where they offer similar performance and every feature minus ray tracing at somewhere between 50% and 70% of the competitor's prices, choosing AMD GPUs will be pretty easy.
They already have the best CPUs for gaming and really are positioning themselves to have the best GPUs overall as well.
I also think something important here is that AMD's strategy with APUs has been small to large. Something that really stood out to me over the last few years is that Nvidia was capturing the AI market with big, powerful GPUs while AMD's efforts were all going into APU research at the low end. My belief is that they were preparing for a mobile-heavy future where small, capable all-purpose chips would have a big edge.
They might even be right. One of the potential advantages of the APU approach is that if the GPU can be absorbed into the CPU with shared memory, a lot of CUDA's memory management is obsoleted and CUDA itself becomes much less interesting. AMD are competent; they just have sucky, crash-prone GPU drivers.
When I run an LLM (llama.cpp, ROCm) or Stable Diffusion models (Automatic1111, ROCm) on my 7900 XTX under Linux and it runs out of VRAM, it messes up the driver or hardware so badly that all subsequent runs fail until I reboot.
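In case it helps anyone else hitting this wall: here's a minimal defensive sketch in HIP/C++ (assuming a working ROCm install and hipcc; the 4 GiB working-set figure is a made-up placeholder) that probes free VRAM with hipMemGetInfo before allocating, so the run can bail out cleanly instead of taking the driver down with it.

    // Sketch: check free VRAM up front and refuse to run rather than OOM.
    // Build (assumption): hipcc vram_check.cpp -o vram_check
    #include <hip/hip_runtime.h>
    #include <cstdio>
    #include <cstdlib>

    int main() {
        size_t free_bytes = 0, total_bytes = 0;
        if (hipMemGetInfo(&free_bytes, &total_bytes) != hipSuccess) {
            fprintf(stderr, "hipMemGetInfo failed\n");
            return EXIT_FAILURE;
        }
        printf("VRAM: %zu MiB free of %zu MiB\n",
               free_bytes >> 20, total_bytes >> 20);

        // Hypothetical working set for the model we're about to load.
        const size_t needed = 4ull << 30;  // 4 GiB placeholder
        if (needed > free_bytes) {
            fprintf(stderr, "Not enough VRAM, bailing out cleanly\n");
            return EXIT_FAILURE;  // much nicer than wedging the driver
        }

        void* buf = nullptr;
        if (hipMalloc(&buf, needed) != hipSuccess) {
            fprintf(stderr, "hipMalloc failed despite the check\n");
            return EXIT_FAILURE;
        }
        // ... actual workload would go here ...
        hipFree(buf);
        return EXIT_SUCCESS;
    }

It obviously doesn't fix the underlying driver bug, but it at least keeps a predictable OOM from turning into a reboot.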
You're probably using it for graphics though; the graphics drivers are great. I refuse to buy a Nvidia card just because I don't want to put up with closed source drivers.
The issue is when using ROCm. Or, more accurately, when preparing to crash the system by attempting to use ROCm. Although, in fairness, as the other commenter notes it's probably a VRAM issue, so I've started to suspect the real culprit might be X [0]. But it presumably doesn't happen with CUDA, and it's a major blocker to using their platform for casual things like multiplying matrices.
But if CPU and GPU share a memory space or it happens automatically behind the scenes, then the problem neatly disappears. I'd imagine that was what AMD was aiming for and why they tolerated the low quality of the experience to start with in ROCm.
[0] I really don't know, there is a lot going on and I'm not sure what tools I'm supposed to be using to debug that sort of locked system. Might be the drivers, might be X responding really badly to some sort of OOM. I lean towards it being a driver bug.
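To make the shared-memory point concrete, here's a rough HIP sketch (untested, just the general idea as I understand it, not anything AMD has actually announced). The discrete-GPU path has to choreograph copies in both directions; the managed path is one pointer both CPU and GPU can touch, which is roughly how everything would look on an APU with a truly unified memory space.

    // Sketch: explicit-copy style vs. shared/managed style in HIP.
    // Error checking omitted for brevity.
    #include <hip/hip_runtime.h>
    #include <cstdio>

    __global__ void scale(float* x, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) x[i] *= 2.0f;
    }

    int main() {
        const int n = 1 << 20;

        // Discrete-GPU style: stage on the host, copy over, copy back.
        float* host = new float[n];
        for (int i = 0; i < n; ++i) host[i] = 1.0f;
        float* dev = nullptr;
        hipMalloc((void**)&dev, n * sizeof(float));
        hipMemcpy(dev, host, n * sizeof(float), hipMemcpyHostToDevice);
        scale<<<(n + 255) / 256, 256>>>(dev, n);
        hipMemcpy(host, dev, n * sizeof(float), hipMemcpyDeviceToHost);
        hipFree(dev);

        // Shared/managed style: one allocation, no copy choreography.
        // On an APU with unified memory this is the natural default.
        float* shared = nullptr;
        hipMallocManaged((void**)&shared, n * sizeof(float));
        for (int i = 0; i < n; ++i) shared[i] = 1.0f;
        scale<<<(n + 255) / 256, 256>>>(shared, n);
        hipDeviceSynchronize();  // make the GPU's writes visible to the CPU
        printf("%f %f\n", host[0], shared[0]);
        hipFree(shared);
        delete[] host;
        return 0;
    }

If the second form becomes the default everywhere, most of what people mean by CUDA lock-in (the memory management choreography) stops mattering.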
I have had an RX 580, RX 590, 6600 XT, and 7900 XT on Linux with minimal issues. My partner has had an RX 590 and a 7700 XT on Windows, and she's had so many issues it's infuriating.
If you’re building a midlife crisis gaming PC the 4090 is the only good choice right now. 4K gaming with all the bells and whistles turned on is a reality with that GPU.
The 4090 is also upwards of $2k. That's more than what I spent on my entire computer, which is a few years old and still very powerful. We used to rag on people buying Titans for gaming, but since Nvidia did the whole marketing switcheroo, Titans are now just *090 cards and they pass as reasonable options.
My point is that Nvidia has the absolute highest end, but it's ridiculous to suggest that anyone with a budget less than the GDP of Botswana should consider the 4090 as an option at all. For actually reasonable offers, AMD delivers the most performance per dollar most of the time.
PSA: GeForce Now Ultimate is a great way to check this out. You get a 4080 equivalent that can stream 4K 120Hz. If you have good Internet in the continental US or most of Europe, it’s surprisingly lag-free.
Specifically modern "normal" gaming. Once you get outside that comfort zone AMD has problems again. My 7900XTX has noticeably worse performance than the 1070 I replaced when it comes to Yuzu and Xenia.
AMD tend to have better paper specs but worse drivers/software; it's been like that for decades. At least they're now acknowledging that the state of their software is the problem, but I haven't seen anything to give me confidence that they're actually going to fix it this time.
The original Xbox was supposed to use an AMD CPU as well, and parts of that remained in the architecture (for example, it uses the HyperTransport bus); backroom deals with Intel led to a last-minute replacement of the CPU with an Intel P6 variant.
Also, the security system assumes that the system has a general protection fault when the program counter wraps over the 4GB boundary. That only happens on AMD, not Intel.
Also, AMD is the obvious choice for Valve over something like Nvidia due to AMD having decent upstream Linux support for both CPU and GPU features. It's something Nvidia has never cared about.
NVidia has had more reliable and, at least until recently, more featureful drivers on Linux and FreeBSD for decades. They're just not open-source, which doesn't seem like it would matter for something like the Steam Deck.
Here's a secret: a lot of the featurefulness of AMD's current Linux drivers is not due to AMD putting in the work, but due to Valve paying people to work on GPU drivers and related stuff. Neither AMD nor Nvidia can be bothered to implement every random thing a customer who moves maybe a million low cost units per year wants. But with open source drivers, Valve can do it themselves.
Pierre-Loup Griffais recently retweeted NVK progress with enthusiasm. Given that he used to work at Nvidia on the proprietary driver NVK is replacing, I take that as a sign Valve wouldn't particularly want Nvidia's driver even given the choice; external bureaucracy is something people inside a company will avoid whenever possible.
Going open source also means their contributions can propagate through the whole Linux ecosystem. They want other vendors to use what they're making, because if they do they'll undoubtedly ship Steam.
Reliability is pretty moot when they only support what they care about and have ignored the rest for those same decades. Not being upstreamed and not using standard kernel interfaces only makes it worse.
So it's not something Valve wanted to deal with. They've commented on the benefits of working with upstream GPU drivers, so it clearly matters.
NVidia are the upstream for their drivers, and for a big client like Valve they would likely be willing to support the interfaces they need (and/or work with them to get them using the interfaces NVidia like). Being less coupled to the Linux kernel could go either way - yes they don't tend to support the most cutting-edge Linux features, but by the same token it's easier to get newer NVidia drivers running on old versions of Linux than it is with AMD.
(Does Valve keep the Steam Deck on a rolling/current Linux kernel? I'm honestly surprised if they do, because that seems like a lot of work and compatibility risk for minimal benefit)
> The decision to move from Debian to Arch Linux was based on the different update schedule for these distributions. Debian, geared for server configurations, has its core software update in one large release, with intermediate patches for known bugs and security fixes, while Arch uses a rolling update approach for all parts. Valve found that using Arch's rolling updates as a base would be better suited for the Steam Deck, allowing them to address issues and fixes much faster than Debian would allow.

SteamOS itself is not a rolling release.
Upstream is the kernel itself and its standard interfaces. Nvidia doing their own (non-upstream) thing is the main problem here. They didn't work with libdrm for years.
Being a client or even a partner doesn't guarantee good cooperation from Nvidia (EVGA could say a lot about that). As long as Nvidia is not a good citizen in working with the upstream Linux kernel, it's just not worth it for someone like Valve to invest effort in using them.
Stuff like HDR support is a major example of why it all matters.
Yes, but none of that is important for a console. You're talking about integration with the libraries of normal desktop distros, which isn't that important when the console can just ship compatible ones.
But Valve contributes directly to AMD drivers; they employ people working on Mesa, on the DX-to-Vulkan layers, and on the kernel. It's a very neatly integrated system.
The PS3 could do general purpose computing. Its built in operating system let you play games, watch movies, play music, browse the web, etc. At the beginning of the console's life you could even install Linux on it.
The hardware of consoles has been general purpose for decades.
The current and previous generations of Xbox and PlayStation do use x86 CPUs with integrated AMD GPUs. But they aren't just PCs in a small box. Using a unified pool of GDDR memory* is a substantial architectural difference.
*except for the Xbox One & One S, which had their own weird setup with unified DDR3 and a programmer-controlled ESRAM cache.
They're certainly good enough to be the integrated graphics solution for office work, school computers, media centres, casual gaming (the same as AMD's low end), and other day to day tasks.
That's most of the market, but I think it's a stretch to say that they're good enough for consoles.
The A770 seems like a solid midrange card, about on par with the 4060 Ti and 6700 XT, and not that far from the 4070 in some newer games that Intel's drivers are better optimized for. You also get 16 GB of memory.
Of course, it being Intel, the power usage is way too high. Also, AFAIK compatibility with pre-DX11 games is poor, and it's not even clear Intel is making any money on it.