
This explanation feels off to me. Aren't all numbers symbols? Pi, e, 10, 0xA, 9.99999...?



In my view, numbers are numbers, and symbols are symbols. There's an agreement that a symbol represents a number, but there's not a one-to-one relationship between available symbols and available numbers. Normally this isn't a problem, and we treat them interchangeably. And indeed the distinction may only be a philosophical oddity or a matter for mathematicians. But I believe nonetheless that there is a distinction.

Now I was merely an undergrad math major, which means I topped out before learning this stuff in a formal way. But at my primitive level of understanding, I think of a number as something that behaves like a number within a given system. What I learned in my courses was how different kinds of numbers behaved: Whole numbers, reals, complex, vectors and tensors, etc. I remember reading a definition of "tensor" that was to the effect of: A tensor is something that behaves like a tensor, meaning that the important thing is the behavior.

Another post on this page urged particular caution when dealing with numbers, symbols, and IEEE floats, notably warning that IEEE floats and real numbers don't always behave the same. That topic was covered in one of my math courses, "Numerical Analysis." You could also get CS credit for that course, which suggests its practical importance.
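
To make that concrete, here's a minimal Python sketch (my own illustration, not from that post or the course) of IEEE-754 doubles diverging from the reals they stand in for:

    # None of 0.1, 0.2, 0.3 is exactly representable in binary:
    print(0.1 + 0.2 == 0.3)        # False
    print(0.1 + 0.2)               # 0.30000000000000004

    # Real addition is associative; float addition is not:
    print((0.1 + 0.2) + 0.3)       # 0.6000000000000001
    print(0.1 + (0.2 + 0.3))       # 0.6

    # Rounding error accumulates across repeated operations:
    print(sum([0.1] * 10) == 1.0)  # False

Much of numerical analysis is about predicting and bounding exactly this kind of error.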


I think the consequences of what you're saying make sense. It would be neat to explore more of the idea. I'd recently started finding it better to think of numbers as symbols that follow operational rules; this view seems to run counter to that.


There is a view that math is "just" symbol manipulation. So I can't say your approach is wrong. Probably whatever works, works.

And you can usefully say things like "x is a number" without saying which particular number x is.


Yes, in a way. What distinguishes rational numbers from irrational numbers is that every rational number can be represented by a string drawn from a regular language. For example, take the regular expression "-?\d+\.\d+?_\d+" (where "_" marks the start of the repeating block of the decimal expansion, as in 1/6 = 0.1_6): each string it matches denotes exactly one rational number, and every rational number is denoted by at least one such string. Irrational numbers (and other types of "numbers" such as +inf and -inf) have no representation in any such regular language.
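
A rough Python sketch (my own illustration; the helper name to_fraction is made up) of how each string in that language pins down exactly one rational:

    import re
    from fractions import Fraction

    # Capture-grouped version of the parent's pattern: optional sign,
    # integer part, fixed fractional digits, "_", repeating block.
    RATIONAL = re.compile(r"(-?)(\d+)\.(\d+)_(\d+)")

    def to_fraction(s):
        """Map a string of the language to the rational it denotes."""
        m = RATIONAL.fullmatch(s)
        if not m:
            raise ValueError(f"not in the language: {s!r}")
        sign, int_part, fixed, rep = m.groups()
        k, n = len(fixed), len(rep)
        # a.f_r  =  a + f/10^k + r/(10^k * (10^n - 1))
        value = (Fraction(int(int_part))
                 + Fraction(int(fixed), 10**k)
                 + Fraction(int(rep), 10**k * (10**n - 1)))
        return -value if sign else value

    print(to_fraction("0.1_6"))   # 1/6
    print(to_fraction("0.3_3"))   # 1/3
    print(to_fraction("-2.5_0"))  # -5/2

Since every denominator has the form 10^k * (10^n - 1), only rationals can come out; an irrational's expansion never settles into a repeating block, so it gets no finite string in this scheme.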


I'm not entirely sure I follow. Aren't pi and e irrational numbers?

I also included 9.99999... since it's an easy trap: it shows that standard decimal notation gives us two ways of writing the same number (9.999... = 10, just as 0.999... = 1; the standard argument is sketched below).

Please read this whole post as a question. I'm genuinely not clear on the distinction.
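
(An aside of mine, not part of the poster's question: the standard plain-arithmetic argument behind that trap.)

    x = 0.999...
    10x = 9.999...
    10x - x = 9.999... - 0.999... = 9
    9x = 9, so x = 1

The same manipulation shows 9.999... = 10.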


Correct. Given the regular language I specified, each rational has an infinite number of matching strings: 1.0_0 = 1.00_0 = 1.000_0 = 0.9_9 = 0.9_99, etc. The point is that for every rational number you can think of, I can show you at least one string in my regular language representing that number.
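
Continuing my earlier sketch (reusing the hypothetical to_fraction from above), those spellings do all collapse to the same rational:

    for s in ["1.0_0", "1.00_0", "0.9_9", "0.9_99"]:
        print(s, "->", to_fraction(s))   # each line prints ... -> 1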

According to the finitists, this is a defining feature of a "number". Since the same can't be done for irrational numbers, finitists conclude that irrational "numbers" aren't numbers. You probably agree that all numbers are (or can be represented by) symbols, but that not all symbols are numbers. So how do we distinguish symbols from numbers?


But not all symbols are numbers.


Why not? One of the profound effects of computers is that we've extended computing to work with images, sounds, and so on.

I get that not all of them come with simple algebraic operations.

I get the impression I'm garbling parts of "I Am a Strange Loop." I'd love to read more on this.



