In my view, numbers are numbers, and symbols are symbols. There's an agreement that a symbol represents a number, but there's not a one-to-one relationship between available symbols and available numbers. Normally this isn't a problem, and we treat them interchangeably. And indeed the distinction may only be a philosophical oddity or a matter for mathematicians. But I believe nonetheless that there is a distinction.
Now I was merely an undergrad math major, which means I topped out before learning this stuff in a formal way. But at my primitive level of understanding, I think of a number as something that behaves like a number within a given system. What I learned in my courses was how different kinds of numbers behaved: whole numbers, reals, complex numbers, vectors and tensors, and so on. I remember reading a definition of "tensor" to the effect that a tensor is something that behaves like a tensor; in other words, the important thing is the behavior.
Another post on this page urged particular caution when dealing with numbers, symbols, and IEEE floats, notably that IEEE floats and real numbers don't always behave the same. That topic was treated in one of my math courses, "Numerical Analysis." You could also get CS credit for that course, which suggests its practical importance.
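A few one-liners in Python (standard double-precision IEEE 754; a minimal illustration, not from the course) showing real-number identities that fail for floats:

    # 0.1, 0.2, and 0.3 have no exact binary representation:
    print(0.1 + 0.2 == 0.3)                        # False

    # Floating-point addition is not associative:
    print((0.1 + 0.2) + 0.3 == 0.1 + (0.2 + 0.3))  # False

    # Absorption: a small addend can vanish against a large one,
    # because 1.0 is below the rounding granularity at 1e16:
    print(1e16 + 1.0 == 1e16)                      # True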
I think the consequences of what you're saying make sense, and it would be neat to explore the idea further. I had recently started finding it better to think of numbers as symbols that follow operational rules. This view seems to run counter to that.
Yes, in a way. What distinguishes rational numbers from irrational numbers is that every rational number can be represented by a string from a regular language. For example, take the regular expression "-?\d+\.\d+(_\d+)?", where "_" marks the start of the repeating block of the decimal expansion, as in 1/6 = 0.1_6. Every string matching this expression denotes exactly one rational number, and every rational number is denoted by at least one matching string. Irrational numbers (and other kinds of "numbers," such as +inf and -inf) cannot be represented by any such regular language; indeed, there are uncountably many irrationals but only countably many finite strings, so no finite notation of any kind can name them all.
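As a quick sanity check, here's that expression at work in Python (a sketch; the variable name RATIONAL is mine):

    import re

    # The expression above; "_" marks the start of the repeating block.
    RATIONAL = re.compile(r"-?\d+\.\d+(_\d+)?")

    for s in ["0.1_6", "1.0", "-2.5", "0.9_99", "pi"]:
        print(s, bool(RATIONAL.fullmatch(s)))
    # 0.1_6 True, 1.0 True, -2.5 True, 0.9_99 True, pi False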
Correct. Given the regular language I specified, each rational has an infinite number of matching strings: 1.0 = 1.00 = 1.000 = 0.9_99 = 1.0_0 etc. The point is that for every rational number you can think of, I can show you at least one string in my regular language to represent that number.
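To make that concrete, here's a sketch in Python of how such a string can be produced for any rational (the function name to_repeating_decimal is mine): long division on p/q can only ever see q distinct remainders, so the digits must either terminate or cycle, and the cycle becomes the "_" block.

    from fractions import Fraction

    def to_repeating_decimal(x: Fraction) -> str:
        # Long division on the fractional part. Only q possible remainders
        # exist, so the remainder must eventually hit zero or repeat;
        # the digits between repeats form the "_" block.
        sign = "-" if x < 0 else ""
        x = abs(x)
        whole, rem = divmod(x.numerator, x.denominator)
        digits, seen = [], {}
        while rem and rem not in seen:
            seen[rem] = len(digits)
            digit, rem = divmod(rem * 10, x.denominator)
            digits.append(str(digit))
        if rem == 0:  # terminating expansion
            return f"{sign}{whole}." + ("".join(digits) or "0")
        i = seen[rem]  # index where the cycle starts
        prefix, cycle = "".join(digits[:i]), "".join(digits[i:])
        if not prefix:  # keep at least one digit before "_", e.g. 0.3_3
            prefix = cycle
        return f"{sign}{whole}.{prefix}_{cycle}"

    print(to_repeating_decimal(Fraction(1, 6)))   # 0.1_6
    print(to_repeating_decimal(Fraction(1, 3)))   # 0.3_3
    print(to_repeating_decimal(Fraction(22, 7)))  # 3.142857_142857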
According to the finitists, this is a defining feature of a "number": since the same can't be done for irrational numbers, they conclude that irrational "numbers" aren't numbers at all. You probably agree that all numbers are (or can be represented by) symbols, but that not all symbols are numbers. So how do we distinguish symbols from numbers?