Hacker News | sykh's comments

If sqrt(2) is a normal number then its decimal expansion would contain infinite information.


It depends on how you define information, but I'd argue that it doesn't.

In particular, those digits are not random variables which are free to take on any value; they're fixed by the definition of sqrt(2). In terms of Shannon information, information is a measure of uncertainty in a message. If our protocol is that I'll send you digits of sqrt(2), then there's no information being transferred at all, since you could have worked them out for yourself. Alternatively, if our protocol is that I'll send you digits with uniform probability, then sending sqrt(2) would be "infinite information", but only because we're having to narrow it down out of infinitely many possibilities. I don't think this is the most useful definition though, since we can arrange for any amount of information we like: if the protocol is that I will either send "5" or sqrt(2), then it contains 1 bit; and so on.
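To make that last point concrete, here's a minimal sketch (my own illustration, not from the thread) of the 1-bit protocol: with two equally likely messages, the Shannon entropy is exactly one bit.

```python
from math import log2

# Protocol: the sender transmits either "5" or the digits of sqrt(2),
# each with probability 1/2. Entropy measures the uncertainty resolved.
probabilities = [0.5, 0.5]
entropy_bits = -sum(p * log2(p) for p in probabilities)
print(entropy_bits)  # 1.0
```

With three equally likely messages the same sum gives log2(3) ≈ 1.58 bits, so the "amount of information" is entirely a property of the protocol, not of sqrt(2) itself.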

Alternatively we could use algorithmic information theory, where the information content of a message is the length of the smallest tape for a universal Turing machine which outputs that message. sqrt(2) can be calculated by a very small program (cranking out digits forever), so it contains very little information. Yet even here, since it's constant, we could define our Turing machine such that it emits sqrt(2) when given an empty tape, and hence it again contains no information.
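As an illustration of "a very small program cranking out digits" (my own sketch, using Python's exact integer square root):

```python
from math import isqrt

def sqrt2_digits(n):
    """First n decimal digits of sqrt(2), leading '1' included.

    isqrt(2 * 10**(2*(n-1))) equals floor(sqrt(2) * 10**(n-1)), computed
    exactly in integer arithmetic, so every digit is correct.
    """
    return str(isqrt(2 * 10 ** (2 * (n - 1))))

print(sqrt2_digits(10))  # 1414213562
```

However many digits you ask for, the program itself stays a few lines long, which is the algorithmic-information sense in which sqrt(2) contains almost no information.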


Let S be an encoding of the works of Shakespeare in binary. A normal number contains S in its binary expansion. We can do this for all information. In this sense is it correct to say that a normal number contains infinite information in it?


Again, depends on your definition of information. For shannon information, you would need to define a protocol. If the protocol is "I will send you the entire works of Shakespeare" then I would be sending you no information. Likewise, if the protocol is to send you S, then S contains no information.


A normal number can be computable. It is not known if sqrt(2) is normal.


At no point was anybody talking about normal numbers. "Normal numbers" doesn't mean "numbers with random digits"; it has a much narrower meaning than that. There are non-computable numbers which are normal, and there are non-computable numbers which aren't. There are computable numbers that are normal that don't look random in the least: https://en.wikipedia.org/wiki/Champernowne_constant


The author's paper is about normal numbers (numbers that contain an infinite amount of information). If a normal number isn't considered to have random digits then what notion would you use?


You've misunderstood the definition of normal numbers. A "normal number" (loosely defined) is a number where the digit expansion of the number has all possible substrings of digits uniformly distributed in the limit as it goes to infinity.

Several things to note:

1. Whether or not a number is normal has nothing to do with the "randomness" of its digits or the "amount of information" in the digits. Champernowne's constant, 0.123456789101112... isn't "random" at all and contains very little information, yet it is known to be normal in base 10.
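For illustration (my own sketch, not part of the original comment), the digit sequence of Champernowne's constant is generated by a trivial program, which is one way of seeing that it carries very little information despite being normal:

```python
from itertools import count, islice

def champernowne_digits(n):
    """First n digits after the decimal point of Champernowne's constant:
    just the positive integers 1, 2, 3, ... written out in order."""
    digits = (d for i in count(1) for d in str(i))
    return "".join(islice(digits, n))

print(champernowne_digits(16))  # 1234567891011121
```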

2. A number can have totally super-random digits (as random as you want! it can even be non-computable! infinite information!) and not be normal at all. For instance, imagine you have a normal number that's as random as you want, and it starts like:

0.892345123402345671235....

And then goes on forever, no discernible pattern. Say you construct a new number from this number, with the only difference being that you remove the digit 7. Literally, every place that a 7 appears in your original number, you just remove it. This number would no longer be a normal number, because all the sequences with the number 7 in it would appear nowhere.
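The deletion construction can be sketched in a line or two (my own illustration, using the example prefix above):

```python
digits = "892345123402345671235"  # the example prefix from above
no_sevens = digits.replace("7", "")  # delete every occurrence of '7'
print(no_sevens)  # 89234512340234561235

# No string containing '7' can ever appear in the new expansion, so the
# limiting frequency of the digit 7 is 0 rather than 1/10: not normal.
assert "7" not in no_sevens
```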

The number would still have "infinite information" in the sense of the author of this paper. It would still be "just as random", it would still have "no pattern". But it would not be a normal number anymore.

Whether or not a number is "normal" has nothing to do with the issues raised in this paper. "Normality" is a different criterion entirely. When the author talks about numbers with "infinite information", he's not talking about normal numbers, he's talking about computable numbers, which is an entirely different concept: https://en.wikipedia.org/wiki/Computable_number


> Champernowne's constant, 0.123456789101112... isn't "random" at all and contains very little information, yet it is known to be normal in base 10.

Why is Champernowne's constant not random?

> The number would still have "infinite information" in the sense of the author of this paper. It would still be "just as random", it would still have "no pattern". But it would not be a normal number anymore.

That's not true. It would not be just as random. If the set you're sampling from includes a 7 (i.e. the set of digits which can be represented in any single place in the sequence), and you never see a 7 for an extremely long time, this is exceptionally good heuristic evidence that the number is not random. And if we know 7 never shows up, we also know that the number is not random, because we know it's not uniformly sampling from the set of base 10 digits.
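That heuristic can be quantified. Assuming (hypothetically) i.i.d. uniform base-10 digits, the probability of seeing no 7 among the first n digits is 0.9^n, which vanishes quickly:

```python
# Under i.i.d. uniform base-10 digits, P(no '7' in first n digits) = 0.9**n.
p_100 = 0.9 ** 100
p_1000 = 0.9 ** 1000
print(p_100, p_1000)  # roughly 2.7e-05 and 1.7e-46
```

So even a few hundred 7-free digits would be overwhelming evidence against uniformly random digits.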


I read through most of the paper we are all nominally talking about. He doesn't use normal numbers as you say. He is talking about computable numbers as you say. My apologies.

But a normal number has "random" digits in the sense that all finite sequences of digits (in a given base) occur uniformly in the expansion. What other notion of random can one meaningfully give for an expansion of a number's digits? Without getting into too much philosophy.

From [1]: “We call a real number b-normal if, qualitatively speaking, its base-b digits are ‘truly random’.”

My expertise is in commutative algebra so I'm outside of my comfort zone.

[1]: https://www.davidhbailey.com/dhbpapers/bcnormal.pdf


Something can be random and not have a uniform distribution. If I throw two dice, the sum will be random, but it will not be uniformly distributed. Uniform is just one distribution among many.
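A quick enumeration makes the point (my own sketch): the sum of two fair dice is random but far from uniform.

```python
from collections import Counter
from itertools import product

# Tally the sum over all 36 equally likely ordered pairs of dice.
sums = Counter(a + b for a, b in product(range(1, 7), repeat=2))
print(sums[7], sums[2])  # 6 1 -- a 7 is six times as likely as a 2
```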

The only definition of "randomness" in a sequence that holds any kind of water, philosophically or mathematically, is Kolmogorov complexity [0] (see specifically the section on "Kolmogorov randomness"). I don't know where you got the idea that "normalness" is the ultimate version of randomness, but it's not.

Forget the formal definition for a second, and just think of the intuitive notion of randomness: does the sequence 12345678.... look random to you? In random sequences there should be no patterns: do you see a pattern in this sequence? If you had a computer program with a random number generator that produced that sequence of digits, would you be happy with it? No, you wouldn't.

In the sense of Kolmogorov randomness, the sequence 1234567.... is obviously not random at all, since it's trivial to find a Turing machine to generate it. It matches up perfectly with our intuitive notion of what randomness is, and it quite correctly points out that the amount of information in the string is very low, even though it's infinitely long. That is my definition of randomness, and it's (more or less) the author of this paper's definition.

[0]: https://en.wikipedia.org/wiki/Kolmogorov_complexity


The cardinality of the continuum is a cardinal number. It’s one of the alephs. It is not known which aleph it is. So it’s not known what the cardinality of the powerset of the naturals is. It’s just known that it is the same cardinal number as that of the reals. Basically, we have two jars of marbles that contain the same number of marbles, but it’s not known how many marbles that is.


The continuum hypothesis being independent just means that it's an additional rule you can add or remove from the game you are trying to play. It doesn't mean we are lacking in knowledge and that if we were to work harder we would solve this problem. We do know which aleph c is: it's aleph_1 with CH and some other aleph without CH. Just take your pick which version you like better.

It's not like we don't know which one is the true model of military combat: chess or checkers. They're just two different games with two different rule sets, and you get to pick which one you like to play more.


The set theory that most working mathematicians deal with is ZFC. In ZFC it is not known what cardinal the continuum is. Hence the statement that I was responding to is incorrect. The person I responded to said that they do know how many reals there are.

The cardinality of the reals is called c. It is known to the be the same as the cardinality of the power set of the naturals. It is not known, in ZFC, which aleph this is. We just know that it is the same as the size of another set.

If you want to add an axiom and say that c is aleph_1 then you are free to do so. But if you don't have this axiom then you don't know which aleph it is. So in what sense can you say that you know how many reals there are? You only know it if you add an axiom that says, "It is aleph_1."

If I have a jar of pennies and I know it has the same number of pennies as the number of quarters in another jar that I have, does this mean I know how many pennies are in the jar?


Exactly right.


I think the issue is with the implicit claim that we don't "know" a cardinal until we know which aleph it corresponds to.


A computable number can be a normal number.

http://www.glyc.dc.uba.ar/santiago/papers/absnor.pdf


The complement of the set of normal numbers has measure zero. A normal number is what the author of the paper is talking about when referring to random numbers. I think most mathematicians, if they had to bet, would bet that sqrt(2) is normal. It is not known though. If it is normal then its digits are random.

I haven’t read the paper but I think the author is arguing that most real numbers don’t make sense physically. Since sqrt(2) is the hypotenuse of a right triangle with legs of length 1, I wonder what the author’s response would be if sqrt(2) were shown to be normal. Perhaps he’d say that such a triangle doesn’t exist physically.


> Perhaps he’d say that such a triangle doesn’t exist physically.

I am certain that this is the case given that he's referring to the maximum information density of space as one of the reasons why exact numbers make no physical sense. See https://en.wikipedia.org/wiki/Bekenstein_bound if you don't know about that limit.

More directly, the inability to represent exact lengths falls out of the Heisenberg Uncertainty principle. As an abstract mathematical concept, numbers could be anything. As a physically relevant concept, there are limits to how much precision can matter.


What does it mean to produce a physically realized triangle with length exactly 1? Some specific number of atoms in a specific place? How does that square with the uncertainty principle?


It’s a sign of how bad things are in the U.S. State something obvious, like you did, that jibes with someone’s attachment to a political party and you get irrational responses. The polarization is palpable. People don’t seem to rationally discuss policy. It’s about defending/justifying political parties. It’s the party that matters.


Mann and Ornstein predicted this and explain its causes.


It’s sort of the divide and conquer strategy. Pit groups against each other whilst the looting occurs. While people fight over who is the true snowflake and that kind of stuff, all sorts of shitty policies get enacted. Things didn’t just suddenly get unbearable because <person of opposite party> just got elected. If one thinks this, then the problem was already there and one’s lack of awareness of it is the real problem.


It's also convenient for maintaining a grip on power and something I'm certain both parties have internally considered. Think about the most recent election. How many people voted for Hillary thinking, "I think this person truly and accurately represents my views and beliefs." And similarly for Trump? By contrast how many voted for one or the other thinking, "This person is trash... but, my god, the alternative is just completely unacceptable!"

The same sort of attitude ensures that third parties will never be considered. When people are driven to literally fear the 'opposition' winning, it means that the powers that be have effectively sealed their grip on power tight. Also notice the media's focus on "electability." It's all rhetoric designed to get people not to vote for the candidate they want in power, but to 'strategically vote' and in the process completely undermine their own self interest.

Pair this with some institutional issues like ballot access and we have a system where the establishment has all but guaranteed their perpetual success.


Yes America would probably be better off with more than two parties. First you need to ditch the first past the post system.


This Chomsky quote comes to mind: “The smart way to keep people passive and obedient is to strictly limit the spectrum of acceptable opinion, but allow very lively debate within that spectrum....”


> It’s sort of the divide and conquer strategy.

Since we're discussing nothing short of the US government acting as an enemy to the citizens it is supposed to serve, I would suppose the next few prudent questions are:

1) What is the entity deploying this divide and conquer strategy?

2) Is this entity monolithic? Cellular? Organized or random?

3) What does this entity stand to gain and is it good for all?


> Since we're discussing nothing short of the US government acting as an enemy to the citizens it is supposed to serve....

Your premise is incorrect. The "sort of" part of my statement is important. There are lots of forces at play that cause this. I see it as sort of an emergent phenomenon and not the design of some clever genius or powerful organization.


> sort of divide and conquer

Your assertion is not strong enough and your self referential call out to authority is suspect or at the least misplaced.

It is absolutely divide and conquer and it absolutely is an organized effort.

My assertions were posed as questions only to guide the light.

Anyone with eyes to see will know this. The problem is most of those eyes lack the courage and faith to speak.


I didn’t make any self referential call out to authority. I provided no authority whatsoever. I posited beliefs and spoke in such terms. I just stated how I see things. As I said, I believe the problems are emergent phenomena. Of course I could be wrong. It could be an organized effort as you say. It seems reasonable that it would become organized even if the origin isn’t.


> ...pensions create a long term liability in a way that putting cash in employee 401k doesn’t.

Assuming we don’t want to live in a society in which vast numbers of elderly live in penury, this isn’t true. It’s well known that when retirement savings are the responsibility of the employee, they don’t save enough.

What you say about bad governance is certainly true and incompetent or badly incentivized leadership will exacerbate problems. I don’t know what the solution is for the U.S. but the current system is going to make us end up with a society with large numbers of destitute elderly. Shifting the onus to the employee via 401Ks is just kicking the can down the road so to speak. It also allows politicians and voters off the hook because the liabilities aren’t on the books. On paper it looks good but the reckoning will occur.


Not to mention shifting responsibility on the management side from professionals to citizens.

(There are a lot of arguments to be quibbled over here, but when the average American can't calculate interest I'm not going to point to them as the best steward of their retirement)


My wife is a psychiatrist and has dealt with people who survived suicide attempts, mostly people who tried to kill themselves by shooting themselves in the head. She says that her experience is that people do not immediately regret the decision, just the consequence of having survived severe damage to the head.

It seems, at first thought, unreasonable that people trying to kill themselves by jumping off a bridge would be more likely to regret the decision than people who shoot themselves in the head. I wonder if this is mostly a selective choosing of the survivors of jumping off the bridge to fit a narrative.


This is a famous article in the mathematics community. While her results were known to humans for several hundred years, she claimed they were not known to her. If this is correct then her derivation is an impressive feat, but certainly not worthy of publication. According to [1] the paper still receives citations, and some people refer to "Tai's model" instead of the Trapezoidal Rule.

You can read some comments to her article and her response in [2]. I think she should have acknowledged that her method is the Trapezoidal Rule. Maybe she has in the 20+ years since. I don't know.

The whole saga reminds me of a time a member of the biology department asked me, "Why does the TI-83 calculate scientific notation wrong?" I asked what she meant. She gave me this example:

Calculate (3.75 x 10^23)/(9.34 x 10^(-5))

She enters the problem into the calculator to show me that it does indeed calculate the wrong value. She entered

3.75 x 10^23/9.34 x 10^(-5)

I had to explain to her that the order of operations was important.
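In Python the same pitfall looks like this (my own illustration): multiplication and division are evaluated left to right, so the trailing "x 10^(-5)" multiplies the quotient instead of the divisor.

```python
# What the calculator actually evaluates, left to right:
left_to_right = 3.75 * 10 ** 23 / 9.34 * 10 ** -5
# What she intended:
intended = (3.75 * 10 ** 23) / (9.34 * 10 ** -5)
# The two answers differ by a factor of 10**10.
print(round(intended / left_to_right))  # 10000000000
```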

[1] http://johncanning.net/wp/?p=1863

[2] http://www.math.uconn.edu/~kconrad/math1132s14/handouts/taic...


.


Please don't do this here.


What the freak

