It's not important to most people, because decimal floating point only helps if your UI precision is exactly the same as your internal precision, which almost never happens.
Seeing the occasional 0.300000000000004 is a good reminder that your 0.3858372895939229 isn't accurate either.
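For the curious, this is easy to reproduce in a Python session; a quick sketch (the exact digits are just the usual repr of an IEEE 754 double):

    >>> 0.1 + 0.2
    0.30000000000000004
    >>> (0.1 + 0.2) == 0.3
    False
    >>> round(0.1 + 0.2, 2) == 0.3    # rounding for display hides the error
    True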
One can argue that nothing is important to most people.
Correct calculations involving money, down to the last cent, are in fact important to the people who do them or who are supposed to rely on them. I've implemented them in software I wrote to perform some financial work even back in the eighties, in spite of all the software around that used binary floating point routines. And, of course, among the computer manufacturers, at least IBM cares too:
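To make the difference concrete, here is a minimal sketch in Python using the standard decimal module (an integer count of cents works just as well; the prices and the 7.5% tax rate are made up for illustration):

    from decimal import Decimal, ROUND_HALF_UP

    # Binary floating point drifts when cents are summed:
    float_total = 0.10 + 0.10 + 0.10
    print(float_total == 0.30)                      # False

    # Decimal keeps every cent exact:
    dec_total = Decimal("0.10") + Decimal("0.10") + Decimal("0.10")
    print(dec_total == Decimal("0.30"))             # True

    # Rounding to the cent is explicit and well defined:
    gross = Decimal("19.99") * Decimal("1.075")     # apply a 7.5% tax
    print(gross.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP))  # 21.49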