
> Outside of the academic world decimals are almost always a better solution

Is “academic world” now a shorthand for “all numerical computing”?

Decimals basically never make sense, except possibly in some calculations related to money. Those make up a minuscule part of modern computer use.

Maybe decimals are also better for homework assignments for schoolchildren?

The types of applications where decimals are useful are by and large insensitive to compute speed and need no special hardware support. You can easily implement decimal arithmetic on top of integer arithmetic hardware.
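To illustrate (a toy `Dec` class, purely a sketch and not any particular library): store a value as an unscaled integer plus a power-of-ten exponent, and every operation reduces to plain integer arithmetic. Python's decimal module works along broadly similar lines, entirely in software.

    # Toy exact-decimal type built on plain integers (illustrative sketch).
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Dec:
        units: int   # unscaled integer, e.g. 1005
        exp: int     # power of ten,     e.g. -2  ->  10.05

        def _aligned(self, other):
            # Rescale both operands to the finer (smaller) exponent.
            e = min(self.exp, other.exp)
            return (self.units * 10 ** (self.exp - e),
                    other.units * 10 ** (other.exp - e), e)

        def __add__(self, other):
            a, b, e = self._aligned(other)
            return Dec(a + b, e)

        def __mul__(self, other):
            return Dec(self.units * other.units, self.exp + other.exp)

        def __str__(self):
            if self.exp >= 0:
                return str(self.units * 10 ** self.exp)
            digits = str(abs(self.units)).rjust(-self.exp + 1, "0")
            sign = "-" if self.units < 0 else ""
            return f"{sign}{digits[:self.exp]}.{digits[self.exp:]}"

    print(Dec(1, -1) + Dec(2, -1))    # 0.3 exactly (binary floats give 0.30000000000000004)
    print(Dec(105, -2) * Dec(3, 0))   # 3.15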

Those of us who need binary floating point for graphics, audio, games, engineering, science, .... won’t stop you.




Even with money I use integers. Instead of dollars (or local currency), I store values internally as pennies (or the local equivalent, 1/100 of the main unit). Sometimes when working with interest I'll need to work with floats, and in some databases I have values stored as DECIMAL(8,2) instead of INT, but for the most part I've saved quite a few headaches by keeping my values in INTs.
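For illustration only (made-up helpers, not the parent's actual code): keep the canonical value as an integer count of the smallest unit and convert to a display string only at the edges.

    # Sketch of the integer-cents approach (hypothetical helpers).
    def parse_dollars(text: str) -> int:
        """'19.99' -> 1999 cents (positive amounts only, for brevity)."""
        dollars, _, cents = text.partition(".")
        return int(dollars or "0") * 100 + int((cents + "00")[:2])

    def format_cents(cents: int) -> str:
        """1999 -> '$19.99'."""
        sign = "-" if cents < 0 else ""
        d, c = divmod(abs(cents), 100)
        return f"{sign}${d}.{c:02d}"

    total = parse_dollars("19.99") + parse_dollars("0.01")   # exactly 2000
    print(format_cents(total))                               # $20.00
    print(0.1 + 0.2 == 0.3)                                  # False -- the binary-float trap this sidesteps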


There are currencies where the lowest-value coin is 1/20 of the main unit.


That seems okay until you need to track sub-penny accuracy somewhere, at which point you have a big problem.

Lots of applications don't need that, but a surprising number do, so it's not a universal solution.
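A made-up example of where whole cents fall short: a small percentage fee lands between cents on every transaction, and rounding each fee to a cent drifts away from rounding the exact total once.

    # Made-up numbers: a 0.3% per-transaction fee tracked in whole cents vs. exactly.
    from fractions import Fraction

    fee_rate = Fraction(3, 1000)                   # 0.3%
    charges_cents = [199, 299, 149, 999] * 250     # 1000 small charges

    per_txn = sum(round(c * fee_rate) for c in charges_cents)   # rounded to a cent each time
    at_once = round(sum(c * fee_rate for c in charges_cents))   # kept exact, rounded once

    print(per_txn, at_once)   # 1250 vs. 1234 -- whole-cent storage drifts by 16 cents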


But there's still a minimum significant value that can be defined from the problem space. Use DECIMAL(16,8) or whatever.
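A sketch of that with Python's decimal module (made-up names; the 8-place quantum mirrors the DECIMAL(16,8) suggestion):

    from decimal import Decimal, ROUND_HALF_EVEN

    QUANTUM = Decimal("0.00000001")   # 8 decimal places, like DECIMAL(16,8)

    def to_stored(amount: Decimal) -> Decimal:
        """Snap a computed amount to the agreed minimum significant value."""
        return amount.quantize(QUANTUM, rounding=ROUND_HALF_EVEN)

    fee = to_stored(Decimal("19.99") * Decimal("0.003"))
    print(fee)   # 0.05997000 -- a sub-cent value that fits exactly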

In some situations, there may even be industry or legal standards as to what can be considered rounding noise.


No need for snark. You may be right that, by “computational volume”, handling currency values is a niche; but even something like World of Warcraft has to handle money at some point.



