Based on publications from SSD controller vendors, current 3D TLC NAND appears to have raw bit error rates (RBER) on the order of 10^-3 or better when it's healthy. RBERs in the 1-2% range correspond to a drive that has either exhausted its write endurance, or has been sitting on a shelf in high temperatures for years and developed data retention problems. Most SSD controllers seem to be designed to maintain some degree of usability at ~1% RBER (albeit with performance penalties), but 2% RBER is pushing the limits of even the last-resort layer of ECC.
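
To get a feel for why 1% is survivable but 2% is not, here's a rough back-of-the-envelope sketch of expected raw bit errors per ECC codeword at each RBER. The 2 KiB codeword size and the ~250-bit correction budget are illustrative assumptions I picked for the example, not figures from any vendor's datasheet; real controllers vary in codeword geometry and in how much their LDPC decoders can correct.

```python
# Back-of-the-envelope: mean raw bit errors per codeword vs. RBER.
# Codeword size and correction budget are assumptions for illustration,
# not numbers from any specific SSD controller.

CODEWORD_BITS = 2048 * 8          # assumed 2 KiB of user data per ECC codeword
ASSUMED_CORRECTION_LIMIT = 250    # hypothetical per-codeword correction budget

for rber in (1e-3, 1e-2, 2e-2):
    expected_errors = rber * CODEWORD_BITS   # mean of a Binomial(n, p) error count
    status = ("within budget" if expected_errors <= ASSUMED_CORRECTION_LIMIT
              else "OVER budget")
    print(f"RBER {rber:.0e}: ~{expected_errors:.0f} bad bits per codeword ({status})")
```

Under these assumptions, a healthy 10^-3 RBER means ~16 bad bits per codeword, 1% means ~164 (stressful but inside the budget, hence the performance penalty from heavier decoding), and 2% means ~328, which blows past the assumed correction limit entirely.
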