And the highest data rates to date, with the signal integrity requirements that accompany them. Got a piece of dust pushed down by your CPU cooler bridging two DIMM pins? Get ready for your machine to shred your data. And that's just a common, simple scenario. I'd be surprised if real-world error rates in nominal scenarios weren't higher than with DDR4.
And I'd again likely be right. Ten years ago consumer electronics marketing never included signal integrity material like eye diagrams, but now pretty much every NVIDIA announcement of a new memory standard does. We're pushing ever closer to channel bandwidth limits, and corners that could be cut in the past can no longer be cut. ECC is more important than ever.
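For intuition on what ECC actually buys you, here's a toy SECDED (single-error-correct, double-error-detect) sketch in Python. It works over 4 data bits rather than the 64-bit words real ECC DIMMs protect with 8 check bits, and it's purely illustrative: actual memory controllers do this in hardware, and the function names here are my own.

```python
def encode(data4):
    """Encode 4 data bits (list of 0/1) into an 8-bit Hamming SECDED codeword."""
    d1, d2, d3, d4 = data4
    p1 = d1 ^ d2 ^ d4                    # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                    # parity over codeword positions 2,3,6,7
    p4 = d2 ^ d3 ^ d4                    # parity over codeword positions 4,5,6,7
    word = [p1, p2, d1, p4, d2, d3, d4]  # positions 1..7
    p0 = sum(word) % 2                   # overall parity bit, for double-error detect
    return word + [p0]

def decode(word8):
    """Return (4 data bits or None, status) after checking/correcting the codeword."""
    w = list(word8)
    # The syndrome is the binary position of a single flipped bit (0 = no error).
    syndrome = (
        (w[0] ^ w[2] ^ w[4] ^ w[6])
        | (w[1] ^ w[2] ^ w[5] ^ w[6]) << 1
        | (w[3] ^ w[4] ^ w[5] ^ w[6]) << 2
    )
    overall = sum(w) % 2                 # odd total parity implies an odd number of flips
    if syndrome and overall == 1:        # single-bit error: locate and fix it
        w[syndrome - 1] ^= 1
        return [w[2], w[4], w[5], w[6]], "corrected single-bit error"
    if syndrome and overall == 0:        # two flips: detectable but not correctable
        return None, "uncorrectable double-bit error"
    # syndrome == 0: either no error, or only the overall parity bit flipped; data intact
    return [w[2], w[4], w[5], w[6]], "clean"

codeword = encode([1, 0, 1, 1])
codeword[4] ^= 1                         # a stray bit flip in flight
print(decode(codeword))                  # ([1, 0, 1, 1], 'corrected single-bit error')
```

The point of the sketch: one flipped bit gets silently repaired, two flipped bits get flagged instead of shredding your data. Without ECC, both cases hand corrupted bits straight to the OS.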