GPU Hoarding: ‘I Felt Like I Was Buying Drugs’ (wsj.com)
39 points by uptown on Feb 13, 2018 | 60 comments



NVidia has announced they don't want to chase the mining market. They're worried that they'd build up capacity and the mining market would go bust.[1] But they may have to. High-end NVidia cards have doubled in price. The gaming industry is unhappy. Gamers are unhappy. NVidia is looking at a "mining version" of their next card so they can still sell to graphics users.[2]

[1] https://www.digitaltrends.com/computing/cryptocurrency-minin... [2] https://www.tweaktown.com/news/60839/nvidia-unveil-mining-sp...


NVIDIA may be publicly saying that, but I am sure they are absolutely thrilled with the way sales (and margins) have been growing. And it wasn't gaming, AI, or autonomous cars - it's mining.

The gamer problem is exaggerated. While GPUs are definitely hard to buy in quantities > 1, and the prices are definitely much higher than 6 months ago - pre-built gaming machines (Alienware, Lenovo Cube, iBuyPower, etc) are still affordable. It's almost absurd seeing how these computers are now sold for less than the sum of the parts (especially the RAM and the GPU).

So, one thing has definitely changed for gamers - it's actually much cheaper now to buy a whole gaming PC than to build one. So much so that "crypto enthusiasts" now buy these machines, swap out the GPUs for low-end versions, then put them on eBay.


The gamer problem is not being exaggerated. Performance per dollar has literally decreased over the last 3 years.

I'm not being hyperbolic, $300 in 2015 buys you a better graphics card than $300 in 2018.

In normal times, a $300 card from 2015 would cost $150-180 in 2018. It's easy to see, then, that the $300 card of 2018 must be generations ahead.

However, today, a nVidia GTX 1060 will run you around $350 if you can find one in stock, and its performance per dollar is literally inferior to, say, an AMD R9 390 from 2015, which could be found for under $300.

Very rarely in tech do we exist in a market where three-year-old performance costs more today than it did back then.

Would you be okay paying $600 or $700 for an iPhone 6 or LG G3? It's shocking when years and years go by and your dollar doesn't change what you get.


It really feels like computer advancement has stalled out when you bought a 290 in 2014 for $250 and a 580 in 2017 for $200. There is about a 10% performance difference there. I remember going from a 2007 8800 GT to a 2009 GTX 285 and quadrupling performance at the same price point.

It's like the trends flipped on their head. CPUs were barely improving from 2011 to 2016, but now Ryzen has forced Intel's hand and suddenly, in a year, the jump is substantial (20% per thread, with 50-100% more cores). That being said, now that we have the "new normal", besides the 7nm jump in late 2019 - 2020 I don't anticipate substantial gains in CPUs going forward. 6-8 cores should become the average high end, but business as usual continues, hopefully with Spectre and Meltdown mitigations.

Likewise, GPUs had explosive performance gains from 2012 to 2015, but now the 1000 series from Nvidia has been out for almost two years with no new high-end GPUs, and AMD's last full suite of GPUs was the 300 series about three years ago as well. My understanding is the Vega cards are barely a sidegrade to Fiji, which was the last substantial improvement over the previous 290X series (and those were a substantial improvement over the 7900 series).


I've been looking to upgrade my six-year-old graphics card for almost two years now, but the pricing has been consistently bad and has even worsened over that time span. I simply can't justify buying a new card:

- If I spend roughly the same amount of money I have spent before on my graphics cards (~200 €), I can't, because that price point does not exist. I could drop 100-150 € on something that's pretty much the same (±10 %) as my current card, though.

- If I want a significant upgrade, I'd have to spend at least twice as much as I want to, likely more (the 400-500 € range).

The price range that most gamers (at least those I know) buy their cards at simply does not exist for the time being.

That's true regardless of vendor. Even if I wanted to buy nVidia, the exact same thing applies to their lineup. There are cards <150 €, and then there are cards >>300 €.

I used to joke a few years back about why there were so many models of everything; why would they need a dozen or more models to cover the price range from 50-xxxx € in 30 € steps? That "complaint" seems ironic in retrospect...


As a gamer, my issue is that the same graphics cards are much more expensive than just a couple of months ago, if you even find them.

I live in Japan, and last year I bought a GTX 1060 (6GB) for around 32000 Yen (a little under $300). This past weekend I was browsing around some computer shops and I found one shop with the same card I bought less than 12 months ago for 55000 Yen (~$510). They also had a GTX 1070 Ti for 77000 Yen (~$710), which is significantly above MSRP.

Every other shop I visited was sold out of everything above the GTX 1050. Those cards haven't had their prices affected much during the last few months. There are also no AMD cards in sight besides the super low-end ones.


But this is just a blip. When Pascal GPUs first showed up, GTX 980 Tis were going on eBay for <$300.

Markets are markets.


You might want to proofread your writing a bit.


I read it over once before I submitted. I re-read it and give myself an A. Are we pedantically arguing over a comma splice here, or do you have constructive criticism for my writing?


NVidia's sales and margins haven't really been growing. They buy fab time way in advance and sell to third-party vendors at a fixed price. They aren't selling more units, and the third-party vendors are netting all the profits.

Worse yet, if nv does invest in more card production they run the risk of bringing that online just as the crypto market crashes, when there will be much less demand for cards and a flood of old mining cards on the used market.

Worse yet, if gamers can't buy your cards you lose loyalty (can't build it with newbies and old hands slowly forget). If you don't believe that gamers are unhappy then you simply haven't been paying attention.

This is an appalling situation for Nvidia (and AMD - all the same reasons apply to them too).


> if nv does invest in more card production they run the risk of bringing that online just as the crypto market crashes, when there will be much less demand for cards and a flood of old mining cards on the used market

Are you saying they shouldn't manufacture & sell more cards now because this will cost them future sales? Can't they just put that extra income in a bank and earn nice interest on the profits that are shifted from the future to the present due to the crypto craziness?


Chip fab has high fixed costs that take time to recoup. If they build more capacity now for demand that evaporates by the time it comes online, they are out a huge amount of $$.


> Chip fab has high fixed costs

How’s that relevant? nVidia has no chip fabs.

They use third-party services. Currently TSMC, but if TSMC is overbooked there are others; e.g. for the current-gen GPUs nVidia also considered Samsung but apparently got a better deal from TSMC.


Mining isn't the only issue, although it's the one that gets all the attention, it seems. From what I've heard, RAM prices are high and this is a large contributing factor to the price rises - Gamers Nexus[1] on YouTube mentioned that the _bill of materials cost_ for building a graphics card had gone up by something like $20-$30 because of RAM, and board partners are actually having to raise their official MSRPs.

[1] https://www.youtube.com/watch?v=neNHVosINro


Yeah... gamers as a rule don't want pre-built machines because the builds usually cut corners with cheap/poor-quality parts


That's because system vendors have fixed-price contracts. Their prices just have time lag.


Many gamers are also PC enthusiasts and buying prebuilt (and less upgrade-able) is not an attractive option.

As someone who’s trying to upgrade my gpu in the context of video editing, this situation sucks.


That’s just because of pricing contracts. The OEM is taking a bath.

My SSD and memory intensive workload infrastructure increased 60% in cost after our deal dissolved.


Well, even if they say it's against their EULA, it won't stop the pro miners one bit. The effort is not worth it.

Besides; some people who develop would probably use a 'scaled down' system to test it anyway.

Maybe NVIDIA will launch a gamer registration system; that would help gamers actually get their hands on some hardware.


The good news is Bitmain is rumored to be launching an Ethereum ASIC - 25x as fast as an RX 570 at 4x the efficiency (650 MH/s at 750W). Those are going to push gaming GPUs out of the ethereum market. Prices are going to come down quickly just like the last time ASICs hit the market.

(Ethereum is ASIC-resistant, meaning any ASIC needs to look a lot like a GPU in terms of a powerful memory subsystem, which is much more difficult to optimize. It has always been possible to improve the efficiency of the processing side of things - after all, a GPU is a kind of ASIC. You can think of these as Ethereum-optimized GPUs. The fact that these ASICs are only 4x as efficient as a general-purpose GPU says the ASIC-resistance is actually working; in comparison, the first Bitcoin ASICs were thousands of times as efficient as GPUs.)
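
A quick back-of-envelope check on those rumored numbers (the RX 570 figures below are my own assumptions, roughly 26 MH/s at 120 W for a tuned card; only the 650 MH/s at 750 W is from the rumor):

    # Rumored Ethereum ASIC vs. an RX 570 on Ethash -- all GPU numbers are assumptions.
    asic_hashrate, asic_power = 650.0, 750.0   # MH/s, W (rumored Bitmain figures)
    gpu_hashrate, gpu_power = 26.0, 120.0      # MH/s, W (assumed tuned RX 570)

    speedup = asic_hashrate / gpu_hashrate
    efficiency_gain = (asic_hashrate / asic_power) / (gpu_hashrate / gpu_power)
    print(f"~{speedup:.0f}x the hashrate, ~{efficiency_gain:.1f}x the efficiency")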


Why would Bitmain do this when all signs are pointing to Ethereum moving to Proof-of-Stake? I find it hard to imagine them risking the R&D cost that goes into developing an ASIC like this unless they think they're going to make it back very quickly.

That's assuming it's Ethash-specific, though. I guess you could also mine Ethereum Classic with it at least (are they planning to move to Proof-of-Stake? I haven't kept up).


I think Ethereum is just one of the many cryptocurrencies that are being mined with GPUs right now - so if this rumor is true, I don't expect it to have the same impact that ASICs had on Bitcoin mining.


Ethereum makes up the overwhelming majority of the hash power across all the networks (I'd guess about 75% of the total). If those GPUs go looking for other things to mine, they will crush the difficulty on the remaining networks and profit will go to zero.

Many Chinese miners are using electricity that's nearly or actually free; they will keep trying to recoup their investment even if it sends small miners into the red.

On top of that, most Ethereum people are mining using AMD GPUs - and AMD GPUs have garbage efficiency at all the other coins. Ethash is literally the only one they do with reasonable efficiency, NVIDIA is ~2x more efficient at everything else.

https://i.imgur.com/2u2HOw7.png

(numbers are from WhatToMine.com and include undervolting for all cards, and BIOS mods for AMD cards)


If I am reading your chart correctly - the 1080 is also very inefficient at Ethash?


Correct. The 1080 uses GDDR5X, which is quad-pumped and delivers chunks of RAM that are twice as big as Ethereum uses. This works fine for graphics, and for many other coins, but on Ethereum half of the bandwidth is wasted.

(bandwidth consumption is what makes Ethash memory-hard/ASIC-resistant - same principle as the tuning parameters on bcrypt/scrypt, designed to make it difficult for an attacker to scale their processing)

The 1070 Ti is basically a 1080 with GDDR5 (dual-pumped) instead of 5X (and minus one SM), and does much better, along with the 1070. The 1080 Ti has enough bandwidth to brute-force it even throwing half of it away.
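
A rough way to see the effect, treating effective Ethash bandwidth as raw bandwidth times the useful fraction of each burst (the bandwidth numbers and the 50% waste factor are assumptions from published specs, not measurements):

    # Effective Ethash bandwidth = raw bandwidth * fraction of each burst actually used.
    cards = {
        "GTX 1070 (GDDR5)":     (256, 1.0),  # assumed ~256 GB/s, full burst usable
        "GTX 1080 (GDDR5X)":    (320, 0.5),  # assumed ~320 GB/s, half of each burst wasted
        "GTX 1080 Ti (GDDR5X)": (484, 0.5),  # assumed ~484 GB/s, brute-forces it anyway
    }
    for name, (raw_gb_per_s, useful) in cards.items():
        print(f"{name}: ~{raw_gb_per_s * useful:.0f} GB/s usable for Ethash")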


Once Ethereum moves to proof-of-stake (supposedly this year) that would render any such effort pretty silly on their part.

There are other nascent cryptocurrencies that may make an effort worthwhile.


Ethereum is more than 18 months behind schedule on the PoS switchover; this time last year people were saying we would already be on PoS by now. Now the switchover is "maybe Q1 next year", and Vitalik has made some statements that sound an awful lot like it's going to be pushed back 3-5 years.

Vitalik can't actually deliver this product. He's a teenager who made a Bitcoin clone with a different hashing algorithm, not a super-genius software engineer. He's way out of his depth and he's not delivering.

It's even sillier to take him seriously at this point in time, and Bitmain is putting their money where their mouth is. The only real question is whether Vitalik will hard-fork and screw up their implementation... but then next time Bitmain will just keep it in-house and not tell anyone. They already pre-mine with their hardware before selling to customers; business is business.

(Remember, Bitcoin includes a scripting language for smart contracts too! It was just partially disabled due to bugs in the original implementation that nobody ever bothered to fix. And nowadays the development process has dragged to a halt from infighting between various stakeholders; they are effectively incapable of making significant decisions. https://en.bitcoin.it/wiki/Script )


I bought a GTX 1080 Founders Edition a few weeks after it came out, over a year ago, after I was finally able to find one at Best Buy (of all places). I felt bad for spending all that money. I'm seeing now that they are selling used on eBay for more than I bought mine for new. I can't remember a time in computer history when this has happened.


There's actually a well-known phenomenon called a chip famine - https://en.wikipedia.org/wiki/Chip_famine

Most of the time it's an earthquake in Asia that takes power plants offline and impacts supply.


It took me weeks of searching and refreshing newegg, amazon etc. just to buy a 16-port PoE unifi switch for a reasonable price recently. No idea why but apparently a lot of others had been having similar sourcing issues.


There is a shortage of silicon wafers right now as well, which is affecting all silicon production. I rather suspect this is another cartel that's dragging its feet on expanding production - like DRAM.

(DRAM/flash prices have been on a continual incline because capacity growth has not been matching expected demand growth, and this was expected to continue through next year. China told the DRAM cartel to knock it off or they'd bring state-sponsored fabs online to fill the gap... and a few weeks later Samsung signed a memorandum of understanding and now prices are expected to decline throughout this year. Just a stunning coincidence /s)

http://www.china.org.cn/business/2017-12/23/content_50157479...

https://www.theregister.co.uk/2017/12/21/china_memory_insour...

https://www.reuters.com/article/us-samsung-elec-chips-outloo...

https://seekingalpha.com/news/3327772-samsung-headed-china-a...

And fab time on cutting-edge nodes is expensive as well. For example, BitMain is buying up more 16nm fab time than NVIDIA right now. So certain classes of high-performance silicon like 100 GbE switches may be in shortage as well.


There was a manufacturing defect with the PoE system on those switches. Ubiquiti had to bring them all back and correct it. They are available again now.


That definitely happened with GPUs back when bitcoin could be mined with GPUs. That also has happened with RAM recently as well.

Also, after a while certain enterprise hardware ends up being worth a fortune as desperate IT guys try to buy spares for a critical system that hasn't been upgraded for whatever reason.

Then there's stuff like Raspberry Pis, though they aren't really a computer part per se. That being said, computer parts occasionally do sell used for more than their list price.


> has happened with RAM recently as well.

It's still happening, sadly... RAM prices are still stupid.


Oddly, used Jetson TK1s have always sold for higher prices on eBay than you could get them from Newegg or other suppliers, and I'm not clear on why.


I would not buy one used from eBay. You will never know just how much lifetime is left after a 'miner' burned it 24/7/365.

And I thought the prices were a joke, but an AMD RX Vega 64 was priced between $1,100 and $3,000 as of 2/13/18.


That depends on whether or not you can easily replace fans. Or if you are even going to use the fans (if watercooling GPUs, you don't care). Other than that, they were running 24/7, but usually focusing on efficiency (undervolted), many times without a case. This is usually better than "gaming" GPUs, which are overclocked and stuffed inside a case.


another "first" in computer history - it's now more expensive to build your own gaming or "machine learning" box from parts than to just buy a pre-made gaming computer.


I feel incredibly lucky. In late November (IIRC), I was mulling over whether I should upgrade from an R9 270 to a better card that could manage VR someday. I sat on the thought for a week, not wanting to buy a graphics card above MSRP. I finally pulled the trigger, as gaming is something I do a huge amount of, and I have disposable income that I really need to learn not to be so stingy with.

As of today, the card I bought costs over $720[0].

I paid just over $300.

[0]: https://www.newegg.com/Product/Product.aspx?Item=9SIA6V66483...


A lot of those third-party listings are way high, just on the off chance that somebody impulse-buys them or blindly follows some internet guide/link.

On eBay, 580s are going for about $350 at the moment. Still more than double what they were going for a year ago (you used to be able to reliably get the 8 GB version for $175), but it's not $720 either. That's 1080 money.


I got a Fury X for a couple hundred dollars before all this nonsense started; I was similarly happy with my decision.

edit: or incredulous at how much they cost now


Sensationalist title... but the article was a fun read nonetheless. For me, building mining rigs is a great way to teach my kids some DIY skills, electronics, thermodynamics, etc.

I've basically been using these as space heaters throughout the house, in the tool shed, etc:

https://www.alatortsev.com/2018/01/10/milk-crate-crypto-mine...

Now I just need to figure out what to do with all the excess heat in the summer. Might heat the swimming pool. :)


> “I’m using my electricity, my time and my effort to allow the cryptocurrency world to thrive.”

Come on, don't let him get away with that lie. More miners bring no benefits to the ecosystem.


The only computer parts I hoard are my Microsoft trackball and klackety keyboard.


Another negative point to throw on the pile. Cryptocurrency has been overhyped to an unbelievable degree. It is environmentally harmful and generally a nuisance. I like making money off of wildly fluctuating markets as much as the next guy, but crypto is, at least in its current form, pretty damn stupid. All this enthusiasm for something which, as of now, still doesn't really have a good use case, is crazy.


I think this is a short term thing and will ease as GPU supply catches up and competition reduces profit margins. Right now from my calculations, a GPU is looking at something like 10 months until break even based on current crypto and GPU prices.

That's not a great investment considering you could just trade crypto itself and be far more liquid.
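
For what it's worth, the break-even math is roughly the following; every input is a placeholder, so plug in current prices and your own electricity rate:

    # Months to recoup one GPU by mining -- every input here is an assumption.
    gpu_price = 550.0          # USD, inflated street price
    revenue_per_day = 2.50     # USD/day of mined coin at current difficulty and price
    power_w, usd_per_kwh = 150.0, 0.12
    electricity_per_day = power_w / 1000 * 24 * usd_per_kwh   # ~$0.43/day
    net_per_day = revenue_per_day - electricity_per_day
    print(f"Break-even in ~{gpu_price / net_per_day / 30:.0f} months")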


Heh, when? By the time you're able to get one, they'll have a new one out that you can't get either. I thought I would be able to get two AMD Vega 64s shortly after Computex. Then I saw the prices. WHOOF. Availability sucks, price sucks, and frankly I am still waiting.


At my local store (Toronto) pretty much all the Nvidia models are in stock. Prices are high though.

There is also stock available on the aftermarket, including whole rigs for sale.


BTW, I use a private VPN server which can be set up on AWS in a few minutes https://github.com/webdigi/AWS-VPN-Server-Setup

No logging on the server side guaranteed, as it is your own server. AWS could monitor, but I don't mind that.


Ignore above, meant for another post


It trickles down too. My friend was going to throw out an old computer.

I pulled out the ancient ATI video card and sold it for $50 on eBay.


Biggest takeaway for me is that the WSJ now uses Gawker level clickbait titles.


Yikes I know. What a juxtaposition between an "edgy" headline ("felt like I was buying drugs") and one of the most mundane opening sentences in journalism I've ever read ("James Liska finally spotted a reasonably priced computer-graphics card online. By the time he clicked “buy,” it was gone.").


FWIW the A-Hed is meant to be a humorous column.

https://www.wsj.com/articles/SB10001424052702303362404575580...


The entire industry is going down that path because it's the best way to make money. The WSJ has been doing this for a few years now.


I wonder if someone can write a script so that the submission form can automatically block titles like "The Weird Old Foo You Won't Believe is Bar!".
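
Something like this would be a start; it's a toy heuristic with made-up patterns, not anything HN actually runs:

    import re

    # A few clickbait patterns -- purely illustrative, extend to taste.
    CLICKBAIT = [
        r"\byou won'?t believe\b",
        r"\bthis one weird\b",
        r"\bwhat happened next\b",
        r"\bnumber \d+ will\b",
    ]

    def looks_clickbaity(title: str) -> bool:
        return any(re.search(p, title, re.IGNORECASE) for p in CLICKBAIT)

    print(looks_clickbaity("The Weird Old Foo You Won't Believe is Bar!"))  # True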


The WSJ is just another Fox outlet.




Oh, so the video card shortage is an actual thing and not something Micro Center sales staff ginned up in order to drive up prices.

Which means I got an absolute steal on my recent PC build, all things considered.



