I believe that computation, mathematics, information and semantics all share the same set of simple foundations, and that the way to understand these foundations is to look at how physical computational processes use information from their environment, and produce information that is used to make "real world" outcomes happen.
I'm working on this at the moment, and no, I don't expect that what I write here will convince anyone. I don't yet have any summaries of the work.
We're in need of an information-theoretic definition of computation or information processing, in analogy to Shannon's definition of communication. I'm trying to work it out.
It's clear that there is a relationship between computation and information via Landauer's principle. It's also clear that it's got to do with the nonlinearity of dynamical systems: "The essence of computation is nonlinear logical operations." J. Hopfield, PNAS 79, 2554 (1982).
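To put a number on Landauer's principle: erasing one bit dissipates at least k_B T ln 2 of energy. A minimal back-of-the-envelope in Python (the 300 K figure is just an illustrative room-temperature choice):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # illustrative room temperature, K

# Landauer bound: minimum dissipation for erasing one bit of information.
E_min = k_B * T * math.log(2)
print(f"Minimum energy to erase one bit at {T} K: {E_min:.3e} J")
# -> about 2.87e-21 J
```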
Shannon's theory of information doesn't offer a way to tell whether a channel is doing computation or merely transmitting the information -- the mutual information merely characterizes how much information goes across a channel, but is insensitive to any changes to the representation.
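A toy illustration of that insensitivity, sketched in Python under the assumption of uniformly distributed two-bit inputs (the "re-encoding" channel is just an arbitrary bijection I made up): the identity channel and a channel that rewrites the representation carry exactly the same mutual information.

```python
import math
from collections import Counter
from itertools import product

def mutual_information(pairs):
    """I(X;Y) in bits, from the empirical distribution of (x, y) samples."""
    n = len(pairs)
    p_xy = Counter(pairs)
    p_x = Counter(x for x, _ in pairs)
    p_y = Counter(y for _, y in pairs)
    return sum(
        (c / n) * math.log2((c / n) / ((p_x[x] / n) * (p_y[y] / n)))
        for (x, y), c in p_xy.items()
    )

xs = list(product([0, 1], repeat=2))  # uniform two-bit inputs

identity = [(x, x) for x in xs]                   # pure transmission
recode = [(x, (1 - x[1], 1 - x[0])) for x in xs]  # bijective re-encoding

print(mutual_information(identity))  # 2.0 bits
print(mutual_information(recode))    # 2.0 bits -- indistinguishable by I(X;Y)
```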
OTOH, algorithmic complexity theory (a la Kolmogorov) doesn't really have the same generality as Shannon's theory. Flops are not a well-defined measure of the information-processing rate of the brain, for instance.
I got inspired by the "integrated information theory" folks -- they have this notion that combining information streams in a nontrivial way is necessary and sufficient for consciousness. I disagree that it's sufficient for consciousness, but it might be sufficient for a definition of information processing or generalized computation.
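To make "combining information streams in a nontrivial way" concrete, XOR is the standard toy case: the output shares no information with either input stream on its own, yet carries a full bit about the two jointly. A sketch, assuming uniform independent input bits (same plug-in estimator as above):

```python
import math
from collections import Counter
from itertools import product

def mi(pairs):
    """I(A;B) in bits, from the empirical distribution of (a, b) samples."""
    n = len(pairs)
    p_ab = Counter(pairs)
    p_a = Counter(a for a, _ in pairs)
    p_b = Counter(b for _, b in pairs)
    return sum(
        (c / n) * math.log2((c / n) / ((p_a[a] / n) * (p_b[b] / n)))
        for (a, b), c in p_ab.items()
    )

# Z = X XOR Y, over all four equally likely input combinations.
samples = [(x, y, x ^ y) for x, y in product([0, 1], repeat=2)]

print(mi([(x, z) for x, y, z in samples]))       # I(Z;X)   = 0.0
print(mi([(y, z) for x, y, z in samples]))       # I(Z;Y)   = 0.0
print(mi([((x, y), z) for x, y, z in samples]))  # I(Z;X,Y) = 1.0
```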
> You can't have the latter without the former.
Agreed.
>That doesn't explain meaning in the case of imaginary or abstract details, or the system's conception of the meaning.
The mapping is in our heads. I don't know what you mean by "the system's conception of meaning" -- which system, and what is a conception of meaning?
> Shannon's theory of information doesn't offer a way to tell whether a channel is doing computation or merely transmitting the information -- the mutual information merely characterizes how much information goes across a channel, but is insensitive to any changes to the representation.
In my view, it is more than a change of representation. The computation is using information that might be true of something to produce new information that might be true of something else. I don't see how anything like Shannon's theory could explain how it is able to do this.
> I got inspired by the "integrated information theory" folks -- they have this notion that combining information streams in a nontrivial way is necessary and sufficient for consciousness. I disagree that it's sufficient for consciousness, but it might be sufficient for a definition of information processing or generalized computation.
Ok. I share the same view, that it isn't sufficient for consciousness.
> I don't know what you mean by "the system's conception of meaning" -- which system, and what is a conception of meaning?
The computational system. Consider the case of the human brain, which may be computational. People can understand that some information is about X (say, a particular tree, or the notion of Justice). But it's not just that they know what the information is about; they also understand something of the character of that thing -- of the tree, or of what Justice is like. If the brain is computational, then that would mean that such an understanding was computational (or computational plus bodily interactions with the environment, etc.). But that doesn't tell us how it is that computation is able to "embody" an understanding of the character of something. That needs to be explained.
> Here's a link to the paper
Thank you.
Is that the same paper? I notice it has a different title to the one you mentioned above.
I don't think so, in the sense that it's not concerned with the same kinds of details as information theory (which is not to say it's incompatible with it), and it provides a set of foundations that are common to information, semantics, mathematics and computation.
As is well known, information theory doesn't deal with the meaning of the messages. As far as information goes, I am primarily concerned with information in the sense of such "messages", and their meaning.
Computation is considered by many not to involve semantics, because it processes information in a "blind, pattern matching" fashion, without regard to its potential meaning. But the sense in which it is semantic does not have to do with its intrinsic characteristics. It is a matter of its "extrinsic" details -- how the information states in it relate to details that are (typically) outside of the computation. Seeing the computation as a physical process, processing information "about" details in its environment, highlights this.
The key to all this is appreciating that, once you see semantics and information processing as a matter of physical processes, you can see that the correct semantics can be necessary for producing a physical outcome. So you can analyse how that physical outcome was produced, in order to understand the semantics.
Ok, so less about the flow and nature of information and more about the semantics.
> Seeing the computation as a physical process, processing information "about" details in its environment, highlights this.
Are you familiar with Karl Friston's work on the Free Energy Principle? [0] He touches on how the nature of a living being (a computational agent) is to model its external environment and act upon that model, and on how making decisions based on internal logic requires external stimuli, with semantics being analyzed and augmented in a sort of feedback loop.
> The key to all this is appreciating that, once you see semantics and information processing as a matter of physical processes, you can see that the correct semantics can be necessary for producing a physical outcome.
I think I understand what you're saying, and it touches on some of my own inquiries. A more concrete example, if you have one, might help align me with where your thoughts are on this.
> Ok, so less about the flow and nature of information and more about the semantics.
I'm working on explaining the fundamental nature of information, and arguing that it is fundamentally semantic. I think there's a lot of confusion about what information theory actually tells us. I don't think it actually tells us about the fundamental nature of information. This isn't to detract from its importance, or to say the theory itself is wrong -- I'm saying that the usual interpretation of what it means, regarding information, is flawed.
> Are you familiar with Karl Friston's work on the Free Energy Principle?
Not terribly familiar with it. I watched the video.
I don't think it gets down to a precise understanding of the fundamentals. I don't think there is a good understanding of the concepts, like information and modelling, that it uses. It doesn't seem to provide specific mechanistic explanations of how the phenomena work and how they have their apparent properties.
As a couple of examples: what explanation does it have of how a system may have an understanding of the character of some entity, where that entity might be something apparently "abstract" or "imaginary"? If the semantics are about mathematical details, what exactly are they about? What are those mathematical details?
But I don't think I can really hope to explain my position here. It requires a lot of explanation, and I'm still working on the explanation.
>> The key to all this is appreciating that, once you see semantics and information processing as a matter of physical processes, you can see that the correct semantics can be necessary for producing a physical outcome.
> I think I understand what you're saying, and it touches on some of my own inquiries. A more concrete example, if you have one, might help align me with where your thoughts are on this.
Imagine a robotic arm that, when an electrical current is sent to it, reaches out and grabs at an area in front of it. If that current happens when there's an object in that position, then the robotic arm will end up picking up the object. If the current happens when there isn't an object in that position, the robotic arm will have picked up nothing. Thus, the semantics of the electrical current has a necessary causal role in the arm picking up the object. And so we can analyse the causal details to get a concrete understanding of the semantics and its role.
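Here's a minimal sketch of that scenario in code (the two-state "world" and the function name are my own simplification): the same signal leads to the "picked up the object" outcome only when it arrives under the condition it is, in effect, about.

```python
def arm(current_sent, object_present):
    """The arm grabs at the zone iff the current arrives; it ends up holding
    the object only when the grab coincides with the object being there."""
    if current_sent and object_present:
        return "object"
    return None

# The outcome depends on whether the current's implicit claim about
# the world -- "there is something to grab" -- happens to be correct:
print(arm(current_sent=True, object_present=True))    # 'object'
print(arm(current_sent=True, object_present=False))   # None (grabbed at air)
print(arm(current_sent=False, object_present=True))   # None (never grabbed)
```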