> Outside of the academic world decimals are almost always a better solution
Is “academic world” now a shorthand for “all numerical computing”?
Decimals basically never make sense, except possibly in some calculations related to money. Those make up a minuscule part of modern computer use.
Maybe decimals are also better for homework assignments for schoolchildren?
The types of applications where decimals are useful are by and large insensitive to compute speed and need no special hardware support; you can easily implement decimal arithmetic in software on top of integer arithmetic hardware (a sketch of that follows below).
Those of us who need binary floating point for graphics, audio, games, engineering, science, and so on won't stop you.
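One readily available illustration of that point is Python's decimal module, which provides arbitrary-precision decimal arithmetic implemented entirely in software, with precision as a context setting rather than a hardware property. A minimal sketch:

```python
from decimal import Decimal, getcontext

# Binary floating point cannot represent 0.1 exactly, so the rounding shows through.
print(0.1 + 0.2)                         # 0.30000000000000004
print(Decimal("0.1") + Decimal("0.2"))   # 0.3 -- exact, done in software on integer hardware

# Precision is a context setting, not a hardware limit.
getcontext().prec = 50
print(Decimal(1) / Decimal(7))           # 0.14285714285714... to 50 significant digits
```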
Even with money I use integers. Instead of dollars (or the local currency), I store values internally as pennies (or the local equivalent, 1/100 of the main unit). Sometimes when working with interest I'll need to work with floats, and in some databases I have values stored as DECIMAL(8,2) instead of INT, but for the most part I've saved quite a few headaches by keeping my values in INTs.
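A minimal sketch of the integer-cents approach described in the comment above; the helper name and the 4.5% interest figure are made up for illustration, not taken from any particular codebase.

```python
# All monetary values are plain ints holding a number of cents.

def format_cents(cents: int) -> str:
    """Render an integer number of cents as a decimal currency string."""
    sign = "-" if cents < 0 else ""
    dollars, rem = divmod(abs(cents), 100)
    return f"{sign}{dollars}.{rem:02d}"

# Prices and totals stay exact because they never leave integer arithmetic.
price_cents = 1999                       # $19.99
total_cents = 3 * price_cents + 250      # three items plus $2.50 shipping
print(format_cents(total_cents))         # 62.47

# Interest is where it gets awkward: 4.5% of the total is not a whole number
# of cents, so a rounding rule has to be chosen explicitly (round-half-up here).
interest_cents = (total_cents * 45 + 500) // 1000
print(format_cents(total_cents + interest_cents))   # 65.28
```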
No need for snark. You may be right that, measured by computational volume, handling currency values is a niche; but even something like World of Warcraft has to handle money at some point.