
I think Ethereum is just one of the many cryptocurrencies that are being mined with GPUs right now - so if this rumor is true, I don't expect it to have the same impact that ASICs had on Bitcoin mining.



Ethereum makes up the overwhelming majority of the hash power across all the networks (I'd guess about 75% of the total). If those GPUs go looking for other things to mine, they will crush the difficulty on the remaining networks and profit will go to zero.
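The "profit will go to zero" claim follows from how proof-of-work rewards are split: block rewards per unit time are roughly fixed (difficulty retargets to hold block times constant), so each miner's revenue scales with their share of total network hashrate. A toy sketch of that dilution, with made-up illustrative numbers:

```python
# Toy model of hash power migrating from Ethereum to a smaller coin.
# Assumption: coin emission per day is fixed by the difficulty retarget,
# so revenue per miner is proportional to their share of network hashrate.
def daily_revenue(my_hashrate, network_hashrate, coins_per_day, coin_price):
    """Expected daily revenue for one miner, in currency units."""
    return coins_per_day * coin_price * my_hashrate / network_hashrate

# A small miner with 1 GH/s on a 100 GH/s network paying out $10k/day total
before = daily_revenue(1, 100, 10_000, 1.0)
# The same miner after migrating GPUs quadruple the network hashrate
after = daily_revenue(1, 400, 10_000, 1.0)
print(before, after)  # 100.0 25.0 -- same hardware, revenue quartered
```

Once per-miner revenue drops below electricity cost, mining at a loss only makes sense for operators whose power is effectively free, as the comment below notes.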

Many Chinese miners are using electricity that's nearly or actually free, so they will keep trying to recoup their investment even if it sends small miners into the red.

On top of that, most Ethereum people are mining on AMD GPUs, and AMD GPUs have garbage efficiency on all the other coins. Ethash is literally the only one they mine with reasonable efficiency; NVIDIA is ~2x more efficient at everything else.

https://i.imgur.com/2u2HOw7.png

(numbers from WhatToMine.com and include undervolting for all cards, and BIOS mods for AMD cards)
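The metric behind a chart like that is hashes per joule: hashrate divided by wall power. A minimal sketch of the comparison (the card names and numbers here are hypothetical placeholders, not values from WhatToMine or the linked chart):

```python
# Efficiency comparison in MH/J (megahashes per joule = MH/s per watt).
# All figures below are illustrative placeholders, not real benchmarks.
cards = {
    # name: (hashrate in MH/s, wall power draw in watts)
    "amd_card":    (30.0, 130.0),
    "nvidia_card": (30.0, 150.0),
}

for name, (mhs, watts) in cards.items():
    print(f"{name}: {mhs / watts:.3f} MH/J")
```

Equal hashrate at lower power wins: at a fixed electricity price, the card with more MH/J keeps a larger fraction of revenue as profit.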


If I'm reading your chart correctly, the 1080 is also very inefficient at Ethash?


Correct. The 1080 uses GDDR5X, which is quad-pumped and delivers chunks of RAM that are twice as big as Ethereum uses. This works fine for graphics, and for many other coins, but on Ethereum half of the bandwidth is wasted.

(bandwidth consumption is what makes Ethash memory-hard/ASIC-resistant - same principle as the tuning parameters on bcrypt/scrypt, designed to make it difficult for an attacker to scale their processing)
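The bcrypt/scrypt parallel can be made concrete with scrypt from Python's standard library: its cost parameters directly set how much memory an attacker must commit per password guess, the same lever Ethash pulls against ASICs. A minimal sketch (password, salt, and parameter choices are illustrative only):

```python
import hashlib

# scrypt's tuning parameters make brute force memory-hard:
#   n = CPU/memory cost (power of 2), r = block size, p = parallelism.
# Approximate working memory per guess is ~128 * n * r bytes.
weak = hashlib.scrypt(b"hunter2", salt=b"pepper", n=2**10, r=8, p=1)    # ~1 MiB/guess
strong = hashlib.scrypt(b"hunter2", salt=b"pepper", n=2**14, r=8, p=1)  # ~16 MiB/guess

# The digest size is the same; only the cost of computing it changes.
print(len(weak), len(strong))  # 64 64
```

Raising `n` doesn't change the output format at all; it just forces anyone checking a guess, legitimate or not, to spend that much RAM, which is exactly why specialized hardware gains so little.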

The 1070 Ti is basically a 1080 with GDDR5 (dual-pumped) instead of 5X (and minus one SM), and does much better, along with the 1070. The 1080 Ti has enough raw bandwidth to brute-force it even while throwing half of it away.
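A back-of-envelope sketch of why bandwidth dominates: one Ethash hash reads 64 DAG pages of 128 bytes each (8 KiB of essentially random memory traffic), so memory bandwidth divided by 8 KiB gives a hard ceiling on hashrate. Treating "half the bandwidth is wasted" from the comment above as a usable-fraction of 0.5 (the bandwidth specs are the cards' published figures; the wasted fraction is this thread's claim, not a measurement):

```python
# Ethash hashrate ceiling implied by memory bandwidth alone.
# Assumption: 64 DAG accesses of 128 bytes per hash, per the Ethash design.
BYTES_PER_HASH = 64 * 128  # 8192 bytes of DAG traffic per hash

def ceiling_mhs(bandwidth_gbps, useful_fraction=1.0):
    """Theoretical max hashrate in MH/s for a given memory bandwidth in GB/s."""
    return bandwidth_gbps * 1e9 * useful_fraction / BYTES_PER_HASH / 1e6

# GTX 1070, GDDR5 at 256 GB/s: every burst fully usable
print(ceiling_mhs(256))        # 31.25 MH/s ceiling
# GTX 1080, GDDR5X at 320 GB/s: half of each oversized burst wasted
print(ceiling_mhs(320, 0.5))   # 19.53125 MH/s ceiling
# GTX 1080 Ti, GDDR5X at 484 GB/s: enough raw bandwidth to power past the waste
print(ceiling_mhs(484, 0.5))   # 29.541015625 MH/s ceiling
```

These are ceilings, not achieved hashrates, but they reproduce the ordering in the thread: the 1080 lands below both the cheaper 1070 and the 1080 Ti despite having more raw bandwidth than the 1070.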



