Yes, nvidia (the proprietary GPU driver) was way better than fglrx (the proprietary amd driver) and even nouveau (the open source reverse engineered driver for Nvidia) worked better than radeon for me in those days. It was intel for smooth sailing or nvidia for acceptable performance.
I think their unpopularity these days comes from three factors:
1. For the gamer crowd, the RTX 2000 series was a price hike, and even the higher-end cards weren't that impressive performance-wise. It looks like they were resting on their lead over AMD.
2. People love to root for the underdog and AMD was behind for the longest time.
3. For the developer crowd, the closed-source Nvidia drivers were not great for Linux compatibility, and the Mac crowd had no reason to care because they couldn't get Nvidia hardware at all after Nvidia's fight with Apple.
I think for a sample of the general public who have opinions on Nvidia, it's 1, 2 and 3 in that order, while for HN users it's 3, 2 and 1.
> 2. People love to root for the underdog and AMD was behind for the longest time.
AMD has pretty much always had a solid mid-tier offering over the last five years, but people would still rather buy an nVidia card (like, say, a 1050) than the better AMD card at that price point because of the "1080 Ti" halo effect.
nVidia's software stack is pretty bad. Sure, the driver core works pretty well. But the nVidia control panel looks like it was last updated in 2004, which is actually true (go find screenshots of it running on XP 15 years ago; it looks the same). That alone doesn't make it bad (don't fix what isn't broken), but the NCP is genuinely clunky to use. Not to imply AMD's variant is necessarily better, but at least they're working on it.
The nVidia "value-adds" like shadow play and so on are all extremely buggy. And while their core driver may be good, you still get somewhat frequent driver resets and hangs with certain interactive GPU compute applications.
>AMD has pretty much always had a solid mid-tier offering over the last five years, but people would still rather buy an nVidia card (like, say, a 1050) than the better AMD card at that price point because of the "1080 Ti" halo effect.
Do people actually think this way? "I'll buy the Nvidia 2060 Super rather than the AMD 5700 XT (same price, and it benchmarks higher), because Nvidia has the 2080 Ti"?
The people who look at benchmarks aren't the ones deciding that way. It's more like "I heard Nvidia makes the fastest GPUs, I'll buy the Nvidia one I can afford."
People say that, but much of the time AMD has excluded themselves from consideration for other reasons. The 5000 series, 7000/200 series, and 480/Vega series all saw major cryptomining booms that raised prices far beyond those of the equivalent NVIDIA cards, so for much of the time they were simply not priced competitively. For almost a year you would have had to pay $1000+ for a Vega that barely matched a 1080, while you could get a 1080 Ti for $800 or so.
Also, there have been constant driver problems literally since the ATI days. Just recently, drivers basically crippled Navi for the entire duration of the product generation; drivers crippled Vega before that; Fiji was a driver mess; and Hawaii and Tahiti were not problem-free either. The famous "company A vs B vs C" article [0] notes that "company B's drivers are bad and are on a downtrend", and that was in 2014, during the heyday of GCN; those were the "stable drivers" everyone rhapsodizes about in comparison to the mess that is Navi/Vega. Terascale was a fucking mess too.
AMD products often have weaker feature sets. It took them years to catch up with G-Sync (offering low-quality products with poor quality control for years until NVIDIA cleaned things up with G-Sync Compatible certification). NVENC is far better than AMD's equivalent (Navi's H264 encoder is still broken entirely as far as I know, it provides less than realtime encoding speed and extremely poor quality), so if you want to stream with AMD you have to use your CPU and crush your framerate, or purchase a much more expensive CPU. AMD has no answer for DLSS, and is only implementing its first generation of ray tracing support with the next generation.
This is what I refer to as the "NVIDIA mind control field" theory: that consumers are just so wowed by the NVIDIA brand that they can't help themselves. The reality is that AMD simply has not been that compelling an offering for much of that time. There has been a lot of poor execution over the years from the Radeon team, and prices have often not been as good as people remember them to be.
And a few good products don't change that either. It took something like 8 years of AMD slacking off (Raja mentioned in a presentation that around 2012 AMD management thought "discrete GPUs were going away" [1] and pulled the plug on R&D - certainly an attractive idea given their budgetary problems at the time) for NVIDIA to reach their current level of dominance. AMD has never led the market for 8 years at a time; maybe a year tops, and usually NVIDIA has responded quickly with price cuts and new products. NVIDIA has never let off the gas the way AMD did (and yes, money was the reason, but that doesn't matter to consumers; they're buying products, not giving to charity).
I've heard everyone complain about drivers for amd, and (though I realize anecdotal evidence is no evidence at all) in my experience those people are full of crap.
I used the Fury (Fiji) and the VII (Vega 20) on Linux and Windows, and haven't experienced any of the crazy shit people have claimed. Such as...
>(Navi's H264 encoder is still broken entirely as far as I know, it provides less than realtime encoding speed and extremely poor quality), so if you want to stream with AMD you have to use your CPU and crush your framerate
Where did you hear that? Not even slightly true. I used GPU encoding on both cards (@1080p, 60fps, 5500kbps, 1 second keyframe interval, 'quality' preset, full deblocking filters, and pre-pass turned off because I'm not insane) with zero issues. I would have liked to see more of an improvement in the VII's quality vs the Fury, but I wouldn't go as far as calling it non-functional.
I don't know who managed to get any of those to encode at less than realtime. I've done 1440p at 15500kbps to another machine for re-encoding, and I can still play Star Citizen on High @ 60 fps with the same PC that's encoding.
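For what it's worth, you don't have to take anyone's word on what the encoder exposes; on Linux you can query it. Here's a minimal C sketch using VA-API (the vendor-neutral encode/decode API that AMD's Mesa driver implements). The render node path is an assumption and varies by machine; build with cc probe.c -lva -lva-drm:

    /* Probe whether the driver exposes an H.264 hardware encode
     * entrypoint via VA-API. Illustrative standalone example;
     * /dev/dri/renderD128 is a common default, not guaranteed. */
    #include <stdio.h>
    #include <fcntl.h>
    #include <va/va.h>
    #include <va/va_drm.h>

    int main(void) {
        int fd = open("/dev/dri/renderD128", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        VADisplay dpy = vaGetDisplayDRM(fd);
        int major, minor;
        if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS)
            return 1;

        /* List the entrypoints offered for the H.264 Main profile. */
        int num = vaMaxNumEntrypoints(dpy);
        VAEntrypoint eps[num];
        vaQueryConfigEntrypoints(dpy, VAProfileH264Main, eps, &num);

        for (int i = 0; i < num; i++)
            if (eps[i] == VAEntrypointEncSlice)
                printf("H.264 hardware encode is available\n");

        vaTerminate(dpy);
        return 0;
    }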
And for the record, I also have a GTX 1660, 1080, and had a 770 before the Fury. I'm not fanboying, just relaying my experience.
Edit: Just to be clear, I'm not saying Navi is good to go. I'm refuting the general statement "if you want to stream with AMD you have to use your CPU and crush your framerate"
edit3 (lol): I 100% have no way to refute anything Navi-related. It looks like they completely changed the encoding engine in Navi: https://en.wikipedia.org/wiki/Video_Coding_Engine#GPUs. That said, the chart incorrectly lists the VII as having version 4.0 instead of 4.1, so it may not be that trustworthy. I can't say until I have a Navi card to play with.
I had a 5770 and a 290x, and both had periods (2013 for the 5770, 2018 for the 290x) of about a year where the latest drivers would randomly BSOD on Windows if you played a game on one monitor and watched a hardware-accelerated video on the other.
It turns out anecdotes are different for everyone, that doesn't make them full of crap.
> I've heard everyone complain about drivers for amd, and (though I realize anecdotal evidence is no evidence at all) in my experience those people are full of crap.
> I used the Fury (Fiji) and the VII (Vega 20) on Linux and Windows, and haven't experienced any of the crazy shit people have claimed. Such as...
I don't know what kind of reply you're expecting. You've managed to use two AMD cards and not have driver crashes? Uh, good for you, but that doesn't mean the rest of us are "full of crap". I've also owned two AMD cards, and for both of them the drivers were flaky, on Linux and Windows. I'm sure they're not broken for everyone, but they were broken for me. Maybe if I bought another AMD card I'd get lucky this time around, but I'm not going to take the risk.
That was an intro to the second half of the comment, which focuses exclusively on video encode capability. I didn't directly refute the claim that driver quality sucks. From the original:
>I'm refuting the general statement "if you want to stream with AMD you have to use your CPU and crush your framerate"
"those people are full of crap" is a pretty direct insult. You shouldn't make that kind of accusation when you're not able to substantiate it or even willing to stand by it.
I still don't really understand the context around the Apple fight and it's a huge bummer, since Apple hardware with Nvidia GPUs would be the best combination.
When I used Linux the closed source Nvidia drivers were better than anything else and easily available, the complaints around them seemed mostly to be ideological?
The price complaints seemed mostly about 'value' since the performance was still better than the competition in absolute terms.
Nvidia had some GPUs that ran hot and had an above-average failure rate, so Apple was unhappy because it made a couple of Mac models look bad. Nvidia also had enough revenue of their own that they didn't care enough to invent custom SKUs for Apple so that people couldn't compare Macs to PC laptops.
The big issue with Nvidia GPUs on Linux these days is with Wayland. There are standard graphics APIs (GBM and friends) that are the current way to create contexts, manage GPU buffers, etc., but Nvidia went their own way (EGLStreams), which would require compositors to carry driver-specific code.
Many smaller compositors (such as the most popular tiling one for Wayland) don't want to write or support one implementation for Intel/AMD and another for Nvidia, so they either don't support Nvidia or require snarky-sounding CLI options to enable it at the cost of support.
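To make the split concrete, here's a rough sketch of the vendor-neutral buffer path (GBM) that compositors use with Mesa drivers. This is illustrative only, not any compositor's actual code, and the /dev/dri/card0 path is an assumption; the Nvidia path at the time required a completely separate EGLStreams setup in place of all of this:

    /* Allocate a scanout-capable buffer through GBM, the path every
     * vendor except Nvidia supported. Build: cc gbm_sketch.c -lgbm */
    #include <fcntl.h>
    #include <gbm.h>

    int main(void) {
        int fd = open("/dev/dri/card0", O_RDWR);  /* assumed device */
        struct gbm_device *gbm = gbm_create_device(fd);

        struct gbm_bo *bo = gbm_bo_create(gbm, 1920, 1080,
                                          GBM_FORMAT_XRGB8888,
                                          GBM_BO_USE_SCANOUT |
                                          GBM_BO_USE_RENDERING);

        /* ...bind to EGL, render, hand the buffer to KMS to flip... */

        gbm_bo_destroy(bo);
        gbm_device_destroy(gbm);
        return 0;
    }

Supporting Nvidia meant writing and maintaining a second version of this layer against EGLStreams, which is exactly the driver-specific code compositor authors refused to carry.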
Interesting - makes sense, thanks for the context.
I'd suspect part of the reason Nvidia went their own way is because their way is better? Is that the case - or is it more about just keeping things proprietary? Probably both?
If I had to guess, it's some mixture of being able to improve things faster through tighter integration, at the expense of an open standard (pretty much what has happened across the industry in most domains).
Though this often leads to faster iteration and better products (at least in the short term, probably not long term).
"When people complain to me about the lack of Nvidia support in Sway, I get really pissed off. It is not my fucking problem to support Nvidia, it’s Nvidia’s fucking problem to support me."
I'd suggest that his users asking for Nvidia support are evidence that this is wrong.
That aside though, it seems like Nvidia's proprietary driver doesn't support some kernel APIs that the other vendors (AMD, Intel) do?
I wonder why I've always had a better experience with basic OS usage using the Nvidia proprietary driver than with AMD on Linux. Maybe I just didn't use any applications relying on these APIs. Nouveau has never been good, though.
Not really a surprise, given the tone of that blog post, that Nvidia doesn't want to collaborate with the OSS community.
Don't people rely on Nvidia for deep learning workflows? I thought that stuff ran on Linux? Maybe this is just about different dev priorities for what the driver supports?
> Don't people rely on Nvidia for deep learning workflows? I thought that stuff ran on Linux?
It all comes down to there always being two ways to do things when interacting with a GPU under Linux: The FOSS/vendor-neutral way, and the Nvidia way.
The machine learning crowd has largely gone the Nvidia way. Good luck getting your CUDA codebase working on any other GPU.
The desktop Linux crowd has largely gone the FOSS route. They have software that works well with AMD, Intel, VIA, and other manufacturers. Nvidia is the only one that wants vendor-specific code.
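To illustrate the "two ways" with something concrete, here are two minimal device-discovery sketches in C; nothing about the workload is vendor-specific, only the API. The first links against libcuda, which only Nvidia ships:

    /* The Nvidia way: CUDA driver API. Requires Nvidia's libcuda,
     * so it cannot run on any other vendor's GPU. Build: cc -lcuda */
    #include <stdio.h>
    #include <cuda.h>

    int main(void) {
        int n = 0;
        cuInit(0);
        cuDeviceGetCount(&n);
        printf("CUDA devices: %d\n", n);
        return 0;
    }

The vendor-neutral route asks the same question through OpenCL, which AMD, Intel, and others all implement (and Nvidia too, grudgingly):

    /* The FOSS/vendor-neutral way: OpenCL. Build: cc -lOpenCL */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_uint n = 0;
        clGetPlatformIDs(0, NULL, &n);
        printf("OpenCL platforms: %u\n", n);
        return 0;
    }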
While AMD has been great at open source and contributing to the kernel, they have also (from what I can remember) been subpar on reliability (both the proprietary and the open-source drivers).
NVIDIA has been more or less good with desktop Linux + Xorg for the last 5-7 years (not accounting for the lack of support for hybrid graphics on Linux laptops).
I think you can quite easily use an NVIDIA GPU as a pure accelerator without it driving a display.
Well - they charged more at each performance tier because they were faster.
At some prices I think the performance wasn't enough to justify the extra cost from a price-to-performance standpoint, but that doesn't seem like a reason to think the cards are bad.
It's possible I'm a little out of date on this, I only keep up to date on the hardware when it's relevant for me to do a new build.
If vendor A's cards are $180, $280 and $380 and vendor B's cards are $150, $250 and $350, it's common practice to group them into three price points of $150-199, $250-299 and $350-399 so that each card gets compared to its nearest in price.
The prices are way closer together than that, because both companies sell way more than 3 cards. There are three variants of the 2080 priced differently, and two each of the 2070 and 2060. That's seven price points above $300 alone, without looking at the lower segment (2 of those cards are EOL but still available a bit cheaper from some vendors). nVidia and AMD have always had enough cards at the same MSRP.
Nouveau is only slow because about 5 years ago Nvidia started signing the card firmware, locking Nouveau out of features like setting appropriate clock speeds. They release some pared-down firmware that Nouveau is allowed to use, but not until years after release.
Before that mess (around the 900 series), nouveau was fast.
Can confirm; I had a laptop (Dell XPS 14) with a GT630m and Nouveau worked well enough that I never felt the need to install the proprietary drivers over 4 years of use.
I'm replying to the parent comment about AMD pre-open-source drivers, so we're already talking about 5 years ago.
I had my Linux install in a VM for a number of years because fglrx was an aggravating experience to use (unstable, often broke on kernel updates) and radeon had abysmal performance on my 290x. To the point that my laptop's 540m (on nouveau) would outperform the 290x under Linux.
I still have that 290x in one of my systems, and since amdgpu gained support for it, it has been a very pleasant experience. But even that took a while, as they started with the 3xx cards and supported the 4xx cards before going back for the 2xx cards.