Shannon's theory of information doesn't offer a way to tell whether a channel is performing computation or merely transmitting information -- mutual information characterizes how much information goes across a channel, but it is insensitive to any change of representation.
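A toy sketch of that invariance claim (my own illustration, not from the thread): the empirical mutual information between a channel's input and output is unchanged if we relabel the output symbols with any bijection, so the measure can't distinguish "computed" output from "merely re-encoded" output.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """I(X;Y) in bits, from a list of (x, y) samples, using empirical frequencies."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

samples = [(0, 0), (0, 0), (1, 1), (1, 1), (0, 1)]  # a noisy identity channel
relabel = {0: "a", 1: "b"}                          # bijective recoding of the output
recoded = [(x, relabel[y]) for x, y in samples]

# Same joint statistics, same mutual information -- the recoding is invisible.
assert abs(mutual_information(samples) - mutual_information(recoded)) < 1e-12
```

The recoding here is trivial (a relabeling), but the same invariance holds for any invertible transformation of the output, which is the point of the objection above.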
OTOH, algorithmic complexity theory (à la Kolmogorov) doesn't really have the same generality as Shannon's theory. FLOPS, for instance, is not a well-defined measure of information-processing rate for the brain.
I got inspired by the "integrated information theory" folks -- they have this notion that combining information streams in a nontrivial* way is necessary and sufficient for consciousness. I disagree that it's sufficient for consciousness, but it might be sufficient for a definition of information processing or generalized computation.
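One way to see what "combining information streams in a nontrivial way" could mean (my own toy sketch, not IIT proper, and with "nontrivial" deliberately left open): an XOR gate's output carries zero information about either input stream alone but a full bit about the pair, whereas a copy gate just transmits one stream untouched.

```python
from collections import Counter
from math import log2

def mi(pairs):
    """Empirical mutual information (bits) from (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2(n * c / (px[x] * py[y])) for (x, y), c in pxy.items())

streams = [(a, b) for a in (0, 1) for b in (0, 1)]  # uniform pair of input bits

xor_out  = [((a, b), a ^ b) for a, b in streams]    # "integrates" the two streams
copy_out = [((a, b), a)     for a, b in streams]    # merely transmits stream A

# XOR: no information about either input alone, one full bit about the pair.
assert mi([(a, o) for (a, b), o in xor_out]) == 0
assert mi([(b, o) for (a, b), o in xor_out]) == 0
assert abs(mi(xor_out) - 1.0) < 1e-12

# Copy: everything the output "knows" comes from stream A by itself.
assert abs(mi([(a, o) for (a, b), o in copy_out]) - 1.0) < 1e-12
```

On this reading, the copy gate is a pure channel while the XOR gate does something no single-stream channel can, which is at least a candidate criterion for "information processing" as distinct from transmission.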
> You can't have the latter without the former.
Agreed.
> That doesn't explain meaning in the case of imaginary or abstract details, or the system's conception of the meaning.
The mapping is in our heads. I don't know what you mean by "the system's conception of meaning" -- which system, and what is a conception of meaning?
> Shannon's theory of information doesn't offer a way to tell whether a channel is performing computation or merely transmitting information -- mutual information characterizes how much information goes across a channel, but it is insensitive to any change of representation.
In my view, it is more than a change of representation. The computation uses information that might be true of something to produce new information that might be true of something else. I don't see how anything like Shannon's theory could explain how it is able to do this.
> I got inspired by the "integrated information theory" folks -- they have this notion that combining information streams in a nontrivial way is necessary and sufficient for consciousness. I disagree that it's sufficient for consciousness, but it might be sufficient for a definition of information processing or generalized computation.
OK, I share that view: it isn't sufficient for consciousness.
> I don't know what you mean by "the system's conception of meaning" -- which system, and what is a conception of meaning?
The computational system. Consider the case of the human brain, which may be computational. People can understand that some information is about X (say, a particular tree, or the notion of Justice). But it's not just that they know what the information is about; they also understand something of the character of that thing -- of the tree, or of what Justice is like. If the brain is computational, then such an understanding would be computational (or computational plus bodily interactions with the environment, etc). But that doesn't tell us how computation is able to "embody" an understanding of the character of something. That needs to be explained.
> Here's a link to the paper
Thank you.
Is that the same paper? I notice it has a different title to the one you mentioned above.
Here's a link to the paper: https://www.pnas.org/content/pnas/79/8/2554.full.pdf
edit: *how to define 'nontrivial' is very much up for debate. edit2: formatting