That works fine for addition and subtraction. But when you divide (or do percentage calculations), integer cents are no longer enough: you have to deal with rounding manually, or with some custom class, on every calculation.
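A minimal sketch of the problem (values are made up for illustration): splitting an integer-cent amount three ways forces you to pick a rounding rule and track the leftover cent yourself.

```python
subtotal = 1000                   # $10.00 stored as integer cents
share = round(subtotal / 3)       # 333 cents -- rounding is now your problem
remainder = subtotal - share * 3  # 1 cent left over that must be allocated somewhere

# The invariant you have to maintain by hand:
assert share * 3 + remainder == subtotal
```

With plain addition none of this bookkeeping exists; it appears the moment division or percentages enter the picture.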
Yeah, but that's rounding error, which isn't limited to money. There's no universal precision or representation shared by all currencies, and currencies change all the time, so it would be wildly inappropriate to bake one into a programming language standard.
Also, what's with the aversion to traits/types? This is exactly what operator overloading and user-defined types are for.
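To sketch what that trait/type approach looks like (the `Money` class here is hypothetical, not from any particular library): a small value type with overloaded operators lets the type system enforce the rules instead of leaving them to convention.

```python
from dataclasses import dataclass
from decimal import Decimal

@dataclass(frozen=True)
class Money:
    amount: Decimal
    currency: str

    def __add__(self, other: "Money") -> "Money":
        # The type, not the programmer's discipline, guards against
        # mixing currencies in arithmetic.
        if self.currency != other.currency:
            raise ValueError("cannot add different currencies")
        return Money(self.amount + other.amount, self.currency)

total = Money(Decimal("1.50"), "USD") + Money(Decimal("2.25"), "USD")
```

Rounding policy, display precision, and so on can hang off the same type, which is the usual argument for not baking any of it into the language itself.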
Correct. Precision must be parameterized for every variable.
2 is the most commonly used value (for money), but not the only reasonable one. And rounding needs to be done at certain steps but not at others. Typically internal calculations are expected to be done with maximum precision, and every time a value becomes visible to the customer you round, usually to full cents, though there are cases where you show a fixed number of decimal cents even to the customer. I doubt that cases where you show maximum precision to a customer are common.
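That "full precision internally, round at the boundary" pattern can be sketched with Python's `decimal` module (the prices and tax rate are made-up example values):

```python
from decimal import Decimal, ROUND_HALF_UP

price = Decimal("19.99")
rate = Decimal("0.0725")   # 7.25% tax

tax = price * rate         # internal value kept at full precision: 1.449275
total = price + tax        # 21.439275 -- still unrounded

# Round exactly once, at the customer-visible boundary, to 2 decimal places.
shown = total.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
```

Quantizing at every intermediate step instead would accumulate rounding error; doing it once at display time is the behavior the comment describes.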