
1. There must be a limit; there are only around 10^80 atoms in our universe, so even a universe-sized supercomputer could not calculate an arbitrarily deep zoom that required 10^81 bits of precision. Right?

2. Renormalization just "moves the problem around" since you lose precision when you recalculate the image algorithm at a specific zoom level. This would create discrepancies as you zoom in and out.

3. You cannot, because of the fundamental limits on computing power. I think you cannot compute a mathematically accurate and perfect Mandelbrot set at an arbitrarily high zoom level, say 10^81, because we don't have enough compute or memory available for the required precision (see the sketch below).
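
[Not part of the original comment: a minimal sketch of the precision question in points 1–3, assuming Python with the third-party mpmath package, and assuming that the "renormalization" in point 2 refers to the perturbation trick deep-zoom renderers use — one reference orbit is iterated at high precision, and each nearby pixel only tracks a tiny offset in ordinary double precision. The reference point, iteration count and pixel offset below are made up for illustration.]

    # Sketch only: high-precision reference orbit + double-precision per-pixel deltas.
    # If z = Z + d and c = C + dc, then z^2 + c = (Z^2 + C) + (2*Z*d + d*d + dc),
    # so the pixel offset obeys d_{n+1} = 2*Z_n*d_n + d_n^2 + dc.
    from mpmath import mp, mpc

    mp.prec = 300                       # a few hundred bits covers a ~1e-81-sized frame

    def reference_orbit(C, max_iter):
        Z, orbit = mpc(0), []
        for _ in range(max_iter):
            orbit.append(complex(Z))    # double-precision copy for the delta loop
            Z = Z*Z + C
            if abs(Z) > 2:
                break
        return orbit

    def pixel_escape(orbit, dc, max_iter):
        d = 0j                          # pixel offset, plain 64-bit complex
        for n in range(min(len(orbit) - 1, max_iter)):
            d = 2*orbit[n]*d + d*d + dc
            if abs(orbit[n + 1] + d) > 2:
                return n + 1
        return max_iter

    C = mpc(-1, 0)                      # reference point inside the set, chosen for simplicity
    orbit = reference_orbit(C, 1000)
    print(pixel_escape(orbit, 1e-82 + 1e-82j, 1000))   # one "pixel" away at a ~10^81 zoom

As far as I understand, the catch worried about in point 2 is real: when a pixel's offset grows comparable to the reference orbit, the double-precision deltas "glitch" and the renderer has to rebase onto a new reference, which is where the feeling that the problem just gets moved around comes from.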



1. You asked about the fundamental limits, not the practical limits. Obviously practically you're limited by how much memory you have and how much time you're willing to let the computer run to draw the fractal.


1. The Mandelbrot set is infinite. The number pi is infinite too and contains more information than the universe.

2. I don't know what you mean or are looking for with renormalization, so I can't answer more.

3. It depends on what you mean by computing the Mandelbrot set. We are always making approximations for visualisation by humans; that's what we're talking about here. If you mean we will never discover more digits of pi than there are atoms in the universe, then yes, I agree, but that's a different problem.


Pi doesn't contain a lot of information since it can be computed with a reasonably small program. For numbers with high information content you want other examples like Chaitin's constant.
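
[For concreteness, a sketch of what "a reasonably small program" looks like — Gibbons' unbounded spigot algorithm in Python, which streams the decimal digits of pi using only exact integer arithmetic; the 20-digit demo at the end is just for illustration.]

    # Small program that yields the decimal digits of pi one at a time, forever.
    # Gibbons' unbounded spigot algorithm; exact integer arithmetic, no rounding.
    def pi_digits():
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4*q + r - t < n*t:
                yield n
                q, r, n = 10*q, 10*(r - n*t), 10*(3*q + r)//t - 10*n
            else:
                q, r, t, k, n, l = (q*k, (2*q + r)*l, t*l, k + 1,
                                    (q*(7*k + 2) + r*l)//(t*l), l + 2)

    gen = pi_digits()
    print([next(gen) for _ in range(20)])   # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, ...]

This is the sense in which pi's Kolmogorov complexity is small: the entire infinite digit stream is pinned down by a program a few lines long.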


> Pi doesn't contain a lot of information since it can be computed with a reasonably small program.

It can be described with a small program. But it contains more information than that. You can only compute finite approximations, but the quantity of information in pi is infinite.

The computation is fooling you because the digits of pi are not all equally significant. This is irrelevant to the information theory.


No, it does not contain more information than the smallest representation. This is fundamental, and follows from many arguments, e.g., Shannon information, compression, Chaitin's work, Kolmogorov complexity, entropy, and more.

The phrase “infinite number of 0’s” does not contain infinite information. It contains at most what it took to describe it.


Descriptions are not all equally informative. "Infinite number of 0s" will let you instantly know the value of any part of the string that you might want to know.

By that logic, the smallest representation of Chaitin's constant is "Ω", which matches the smallest representation of pi, "π", in length.


"Representation" has a formal definition in information theory that matches a small program that computes the number, but does not match "pi" or "omega".


No, it doesn't. That's just the error of achieving extreme compression by not counting the information you included in the decompressor. You can think about an algorithm in the abstract, but this is not possible for a program.


You seem wholly confused about the concept of information. Have you had a course on information theory? If not, you should not argue against those who've learned it much better. Cover's book "Elements of Information Theory" is a common text that would clear up all your confusion.

The "information" in a sequence of symbols is a measure of the "surprise" on obtaining the next symbol, and this is given a very precise mathematical definition, satisfying a few important properties. The resulting formula in many cases looks like the formula derived for entropy in statistical mechanics, so it is often called symbol entropy (and it leads to a lot of deep connections between information and reality, the whole "It from Bit" stuff…).
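
[A toy illustration of that definition, sketched in Python with only the standard library; the example strings are arbitrary.]

    # Empirical Shannon entropy H = -sum(p * log2(p)): the average "surprise" per symbol,
    # treating the symbols as draws from a fixed distribution.
    from collections import Counter
    from math import log2

    def entropy_per_symbol(seq):
        counts = Counter(seq)
        total = len(seq)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    print(entropy_per_symbol("0000000000"))   # 0.0 -- no surprise at all
    print(entropy_per_symbol("0110100110"))   # 1.0 bit/symbol for a balanced binary string
    print(entropy_per_symbol("abcdefgh"))     # 3.0 bits/symbol, all symbols distinct

Note that this per-symbol estimate is exactly what makes pi's digits look maximally surprising in isolation; the point above is that once the finite specification is known, the conditional surprise of every further digit is zero.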

For a sequence to have infinite information, it must provide nonzero “surprise” for infinitely many symbols. Pi does not do this, since it has a finite specification. After the specification is given, there is zero more surprise. For a sequence to have infinite information, it cannot have a finite specification. End of story.

The specification has the information, since during the specification one could change symbols (getting a different generated sequence). But once the specification is finished, that is it. No more information exists.

Information content also does not care about computational efficiency, otherwise the information in a sequence would vary as technology changes, which would be a poor definition. You keep confusing these different topics.

Now, if you've never studied this topic properly, stop arguing about things you don't understand with those who have. It's foolish. If you've studied information theory in depth, then you'd not keep doubling down on this claim. We've given you enough places to learn the relevant topics.


Actually it does, you can look it up. It's naturally a bit more involved than what I use in a casual HN comment.


> could not calculate an arbitrarily deep zoom that required 10^81 bits of precision. Right?

I’m here to nitpick.

The number of bits is not strictly 1:1 with the number of particles. I would propose using the distances between particles to encode information.


... and how would you decode that information? Heisenberg sends his regards.

EDIT: ... and of course the point isn't that it's 1:1 wrt. bits and atoms, but I think the point was that there is obviously some maximum information density -- too much information in "one place" leads to a black hole.


Fun fact: the maximum amount of information you can store in a place is the entropy of a black hole, and it's proportional to the surface area, not the volume.
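
[A back-of-the-envelope version of that bound, sketched in Python with standard constants; the 1-metre radius is an arbitrary example.]

    # Bekenstein-Hawking entropy of a region's bounding sphere, in bits:
    # S = A / (4 * l_p^2) in units of k_B; divide by ln(2) to convert nats to bits.
    import math

    l_p = 1.616255e-35                         # Planck length, metres
    r = 1.0                                    # radius of the region, metres
    area = 4 * math.pi * r**2
    bits = area / (4 * l_p**2 * math.log(2))
    print(f"{bits:.2e} bits")                  # ~1.7e70 bits for a 1 m sphere

Doubling the radius quadruples the limit rather than multiplying it by eight, which is the area-not-volume point.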


Yeah, I forgot to mention that in my edit. The area relation throws up so many weird things about what information and space even are, etc.


A 10^81 zoom is easy; resolving it needs only a few hundred bits of precision. You run out of bits at a zoom of 2^(10^81), i.e. 2 raised to a 1 followed by 81 zeros.
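
[The arithmetic behind that, for anyone checking; standard library only.]

    # Bits of precision needed to resolve a 10^81 zoom.
    import math
    print(math.log2(10**81))    # ~269 bits -- easy
    # running out of ~10^81 bits corresponds to a zoom of about 2**(10**81)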


We can create enough compute and SRAM memory for a few hundred million dollars. If we apply science, there are virtually no limits within a few years.

See my other post in this discussion.



