Benford's Law applies because many wide natural distributions are roughly flat in log space. One example is the log-normal distribution, which is what you get when random variables compound multiplicatively. (When they compound additively, you get a tighter Gaussian instead.)
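A quick sketch of that contrast (Python/NumPy is just my choice here, and the growth factors and counts are made-up illustrative numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# 100k quantities, each built from 40 random "growth steps" between -25% and +25%.
factors = rng.uniform(0.75, 1.25, size=(100_000, 40))

multiplicative = factors.prod(axis=1)  # compounds multiplicatively -> roughly log-normal
additive = factors.sum(axis=1)         # compounds additively       -> roughly Gaussian

# The multiplicative results spread over a few orders of magnitude (bell-shaped
# in log10), while the additive results stay tightly bunched around 40.
print(f"multiplicative: min {multiplicative.min():.3f}, max {multiplicative.max():.3f}")
print(f"additive:       min {additive.min():.1f}, max {additive.max():.1f}")
print(f"std of log10(multiplicative): {np.log10(multiplicative).std():.3f}")
```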
So let's say that the values in bank accounts (just as an example) are $10^(N(3.5, 2)). In log space, this distribution is relatively flat on [2, 5], so let's focus on [3, 4]. The flatness at the top of the bell curve means there's just as much mass in [3, 3.3) as in [3.7, 4) (both intervals have width 0.3); converting back into dollar amounts, that means there are as many values in [1000, 2000) (10^3.3 is close enough to 2000) as in [5000, 10000). So you have as many leading 1's as you do of all the digits 5-9 combined.
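A rough way to check that numerically (a sketch; I'm reading the 2 as a standard deviation, and the sample size is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "account balances": 10**X with X ~ N(3.5, 2) (2 taken as a std. dev.).
balances = 10 ** rng.normal(loc=3.5, scale=2.0, size=1_000_000)

# Leading digit: scale each value into [1, 10) and truncate.
leading = (balances / 10 ** np.floor(np.log10(balances))).astype(int)

observed = np.bincount(leading, minlength=10)[1:] / len(balances)
benford = np.log10(1 + 1 / np.arange(1, 10))  # Benford's predicted frequencies

for d in range(1, 10):
    print(f"digit {d}: observed {observed[d - 1]:.3f}   Benford {benford[d - 1]:.3f}")
```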
I prefer to call it the Benford Effect. It's not a law: you don't get it for all distributions. It doesn't hold for human height, and IQs show a hyper-Benford effect (about 50% leading 1's) purely because of how the distribution is defined. You only get it when the distribution is flat in log space over at least one order of magnitude.
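The IQ case is easy to reproduce (again a sketch, taking IQ to be standardized to N(100, 15)):

```python
import numpy as np

rng = np.random.default_rng(0)

# IQ scores are standardized to mean 100, std 15 by construction.
iq = rng.normal(loc=100, scale=15, size=1_000_000)
iq = iq[iq > 0]  # guard against (vanishingly rare) non-positive samples

leading = (iq / 10 ** np.floor(np.log10(iq))).astype(int)
frac_ones = (leading == 1).mean()

print(f"fraction of leading 1's: {frac_ones:.3f}")  # ~0.5, vs. Benford's log10(2) ~ 0.301
```

Nearly everyone scores between about 55 and 200, i.e. well under one order of magnitude, so the leading-digit split just mirrors where 100 falls in the distribution.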