With the AMD Ryzen CPU release tomorrow, you can now pair a latest-generation 8C/16T CPU with a 1080 for $1000, a combination that a week ago cost about $1700. The Ryzen boards should be cheaper too, bringing the total system price for a configuration like this down a lot. Pretty amazing.
Sort of. The Intel 8C/16T is crazy expensive because it's a server chip with four memory buses, ECC support, etc. $1000 CPUs don't make sense for gamers. The fastest "normal" i7 has plenty of cores and better single-threaded performance than the $1,000 Intel chip. Of course AMD wants to compare to the $1k chip, not the more competitive $350 chip.
The Ryzen is a desktop chip: no ECC, half the memory buses, good at cache-friendly stuff. No faster than last year's CPUs on anything memory intensive.
Of course AMD is going to cherry-pick the cache-friendly benchmarks to brag about.
The most recent AAA titles often favour more cores, since the consoles are 8C nowadays [1].
And if you do more than gaming, I'd always take the 8C/16T over a 4C/8T for the same amount of money, even if it's 10-15% slower in single-threaded workloads.
Of course AMD compares to the competing 8C/16T chip, because for enthusiasts it's just a much better deal. There are a lot of enthusiast gamers who stream or edit video, for example.
Mainstream buyers are probably better off waiting for the 4C/8T Ryzen models and pairing one of those with an RX 480 or a mid-range Vega.
> The most recent AAA titles often favour more cores, since the consoles are 8C nowadays [1]. And if you do more than gaming, I'd always take the 8C/16T over a 4C/8T for the same amount of money, even if it's 10-15% slower in single-threaded workloads.
Really? That contradicts everything I've ever heard and experienced myself.
At least until very recently, games were still massively dependent on single-threaded speed, since all the graphics submission happened on one thread (because of DX<12 or OpenGL). Has this already changed thanks to the newest graphics APIs (DX12/Vulkan)? Last time I checked, adoption was still pretty low, so it's really good news if the landscape is changing quickly. Or is it something else?
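For what it's worth, the change the new APIs make is structural: instead of funneling every draw call through one render thread, each thread can record its own command list and the main thread just submits them all. A rough conceptual sketch in plain C++ threads; recordDrawCalls here is a hypothetical stand-in for per-thread command-list recording, not a real DX12/Vulkan call:

    #include <cstdio>
    #include <thread>
    #include <vector>

    // Hypothetical stand-in for recording one command list's worth of
    // draw calls. In real DX12/Vulkan this would fill an
    // ID3D12GraphicsCommandList / VkCommandBuffer.
    void recordDrawCalls(int chunk) {
        std::printf("recorded command list for chunk %d\n", chunk);
    }

    int main() {
        const int numThreads = 4; // roughly one recording thread per core
        std::vector<std::thread> workers;

        // DX12/Vulkan style: record command lists in parallel...
        for (int i = 0; i < numThreads; ++i)
            workers.emplace_back(recordDrawCalls, i);
        for (auto& w : workers) w.join();

        // ...then submit them all from a single thread, in order.
        std::printf("submitting %d command lists to the GPU queue\n",
                    numThreads);
    }

Under DX11/OpenGL the recording loop itself was effectively serialized, which is why single-threaded CPU speed mattered so much.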
We do not yet know. Gigabyte specifically says ECC memory is compatible but runs without ECC on their boards. ASRock only says that ECC RAM is compatible.
It requires that the memory controller support ECC and that there are extra traces from the memory controller to the RAM.
This used to mean it was solely a motherboard feature, but starting about a decade ago, the memory controller was moved onto the CPU, so now it requires both CPU and motherboard support.
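If you want to verify that ECC is actually active (not just that the modules are "compatible") on a Linux box, the EDAC subsystem exposes error counters in sysfs; if the node exists, the memory controller is running in ECC mode. A minimal check, assuming a standard EDAC setup with at least one memory controller:

    #include <fstream>
    #include <iostream>
    #include <string>

    // Reads the corrected-error counter that Linux's EDAC subsystem
    // exposes for the first memory controller. If the file is missing,
    // EDAC isn't loaded or the platform isn't running ECC.
    int main() {
        std::ifstream ce("/sys/devices/system/edac/mc/mc0/ce_count");
        if (!ce) {
            std::cout << "No EDAC node: ECC likely inactive "
                         "or driver not loaded\n";
            return 1;
        }
        std::string count;
        ce >> count;
        std::cout << "ECC active, corrected errors so far: "
                  << count << "\n";
    }

This distinguishes "the board accepts ECC DIMMs" from "errors are actually being detected and corrected", which is exactly the gap in Gigabyte's wording above.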
Buying an Nvidia card is almost shooting yourself in the foot.
If all you want it for is gaming. If you also want to play around with machine learning or other GPGPU applications, then getting anything other than Nvidia is a bad idea, since that is what everyone uses and supports at the moment.
It's not just machine learning: 3D rendering is moving over to the GPU as well, and there is next to no support for anything other than CUDA in that space today.
Not everything starts and ends with gaming in the high end computer space.
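For context, "CUDA support" just means the heavy kernels are written against Nvidia's CUDA toolkit, which only runs on Nvidia hardware. A minimal sketch using the standard CUDA runtime API; the saxpy kernel is the usual toy example, not any renderer's actual code:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Classic saxpy: y = a*x + y, one GPU thread per element.
    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        // Unified memory keeps the example short; cudaMallocManaged is
        // available on any reasonably recent CUDA toolkit.
        cudaMallocManaged(&x, n * sizeof(float));
        cudaMallocManaged(&y, n * sizeof(float));
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // Launch enough 256-thread blocks to cover all n elements.
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        std::printf("y[0] = %f (expect 4.0)\n", y[0]);
        cudaFree(x);
        cudaFree(y);
    }

Renderers and ML frameworks built on this stack simply have no equivalent path on AMD cards today, which is the lock-in being described.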
Yeah, competition is nice. We can now choose to buy an 8-core Intel 6900K without a GPU, or an 8-core Ryzen R7 1700X together with an Nvidia GTX 1080 Ti for the same price.
While this used to be true (and mostly still is), the tide is shifting on this point. Games are using more and more cores. Overwatch, for example, uses 6. I believe DirectX 12 makes it easy/reasonable to use 4 cores, with some benefit to be gained with up to 6.
Not that it really helps with some games. Dota 2 is CPU bound for the moment, so if you play that a lot then maximizing single-threaded performance is probably the way to go.
Still, if I was buying a CPU today, I'd be cautious about going for less than quad core if I was interested in new AAA games.
Still, there are diminishing returns on more cores for such uses. A game will use 2 cores and might use 4, but having 6-8 cores will probably just produce excess heat whose thermal budget could instead be spent running 4 cores at a higher frequency.
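The diminishing returns fall straight out of Amdahl's law: if only a fraction p of the frame work parallelizes, the speedup on n cores is 1/((1-p) + p/n). A quick back-of-the-envelope, assuming p = 0.6 (a made-up but plausible figure for a game loop):

    #include <cstdio>

    // Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
    // where p is the parallelizable fraction of the work.
    double amdahl(double p, int n) {
        return 1.0 / ((1.0 - p) + p / n);
    }

    int main() {
        const double p = 0.6; // assumed parallel fraction, illustrative
        const int counts[] = {2, 4, 6, 8};
        for (int cores : counts)
            std::printf("%d cores -> %.2fx speedup\n",
                        cores, amdahl(p, cores));
        // Prints roughly 1.43x, 1.82x, 2.00x, 2.11x: most of the gain
        // is already there at 4 cores, which is the point above.
    }
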
This is how Intel should have dealt with Ryzen, instead of dismissing the competition. Smart move from NVIDIA, though I'm still going to keep an eye on Vega.