Because music theory doesn't know how numbers work. The interval from A(440Hz) to A(440Hz) is "1" in music theory, even though it's 0 in literally every other context in the history of everything. I mean, if they're going to borrow a bunch of mathematical terminology, the least they can do is count right.
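To make the off-by-one concrete, here's a rough Python sketch (my own illustration, not anything from the theory books): interval numbers behave like 1-indexed ordinals, so you can't just add them together.

```python
# Interval "numbers" are 1-indexed: a unison is "1" even though it spans
# zero scale steps. So stacking intervals means subtracting 1 from each
# name before adding, then adding 1 back.

def stack(interval_a: int, interval_b: int) -> int:
    """Combine two diatonic interval numbers (1 = unison, 3 = third, ...)."""
    return (interval_a - 1) + (interval_b - 1) + 1

print(stack(1, 1))  # unison + unison -> 1 (still a unison), not 2
print(stack(3, 3))  # third + third   -> 5 (a fifth),        not 6
print(stack(4, 5))  # fourth + fifth  -> 8 (an octave),       not 9
```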
It's not a "1", it's a "prime" and the root of that word is an ordinal (whence "first") not a cardinal. There is no zero'th ordinal. There is just "none", which implies that A to A is simply not an interval.
Most of it was built before people understood why and how some things work. Take scales, for example. In the beginning, you learn that over a C7 chord you can use the Mixolydian scale. Fine. But then you discover that the minor third (E-flat) can work too, and the major 7th can work as well (e.g. the bebop scale). Then you discover that the other notes can work too (yes, even the minor 2nd), and that it depends on other things, like strong-beat/weak-beat placement and what you're resolving to. Classical music theory has no framework to explain all the things that can work. (The reason all of the above works is still better explained by physics.)
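For reference, a quick sketch of the pitch sets being talked about, in my own pitch-class encoding (0 = C, 1 = C#/Db, ... 11 = B), not the commenter's:

```python
# Scales and chord mentioned above, as pitch classes relative to C.
C_MIXOLYDIAN   = [0, 2, 4, 5, 7, 9, 10]       # C D E F G A Bb
BEBOP_DOMINANT = [0, 2, 4, 5, 7, 9, 10, 11]   # Mixolydian plus the major 7th (B) as a passing tone
C7_CHORD       = [0, 4, 7, 10]                # C E G Bb

# The minor third (Eb = 3) and the minor 2nd (Db = 1) sit outside both
# scales, yet can still sound fine over C7 depending on beat placement
# and resolution -- the part classical scale theory doesn't capture.
OUTSIDE_BUT_USABLE = [3, 1]

for name, scale in [("C mixolydian", C_MIXOLYDIAN), ("C bebop dominant", BEBOP_DOMINANT)]:
    assert set(C7_CHORD) <= set(scale)  # every chord tone of C7 sits inside the scale
    print(name, scale)
```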
I think that's a mischaracterization, because it assumes that it was ever designed. At each point in time it was assembled from the notation conventions familiar from the previous generation of music to record the current generation. As you go back in time you find a nice, continuous progression back to about the 9th century.
Why do you say so?