
Occam's razor?



Occam's razor cuts both ways :-)

Some people, when they hear that MWI implies many worlds and that the wave function branches, think that MWI is "adding" something, so Occam's razor should work against it.

OTOH, what MWI postulates is that the Schrodinger equation is all there is, and that the collapse of the wave function is observed because the observer is entangled with the system it measured.

In order to make intuitive sense of what's going on, MWI invokes the "worlds" and the "branching", but that's just a way for us to grasp how it would feel if we ourselves were part of the entangled system. But like many analogies meant to tickle the intuition, it doesn't work for everybody and may be confusing the discourse more than helping it.
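
To put a toy calculation behind that picture (my own sketch, nothing beyond the standard reduced-density-matrix exercise): "measure" a qubit in superposition by entangling it with an "observer" qubit, then look at the system on its own.

  import numpy as np

  # System qubit in superposition, "observer"/apparatus qubit in a ready state.
  psi_sys = np.array([1, 1]) / np.sqrt(2)   # (|0> + |1>) / sqrt(2)
  psi_app = np.array([1, 0])                # |ready>

  # A CNOT models the measurement interaction: the apparatus records the
  # system's state, giving the entangled state (|0,saw0> + |1,saw1>) / sqrt(2).
  CNOT = np.array([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 1, 0]])
  psi = CNOT @ np.kron(psi_sys, psi_app)

  # Reduced density matrix of the system alone (trace out the apparatus).
  rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
  rho_sys = np.trace(rho, axis1=1, axis2=3)
  print(rho_sys)    # [[0.5 0. ] [0.  0.5]]

The off-diagonal terms vanish: to anyone entangled with the apparatus the qubit behaves like a classical 50/50 mixture, i.e. it looks collapsed even though the global evolution stayed unitary.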

But the thing is: the Everett interpretation is simpler and requires fewer additional rules and mechanisms to explain what we observe. So Occam's razor should favor it.


Simplicity in terms of weighing theory cost isn't about rules, but about explanatory posits. MWI posits exponential growth in discernible state about which sufficient information must be carried by whatever grounds the wavefunction. But you don't get this multiplicity of state/information for free. This is a theory cost that is as profligate as one can imagine. It is magical thinking to imagine that the universe gets this for free.


I don't know where the expression comes from, but "the only numbers that need no justification are zero and infinity". Think of it this way, if you found a coordinate system within the universe that goes from 0,0 to 100,100, you'd have to ask "but why exactly 100?" whereas it would be less surprising if the system had no apparent limit and just could go to infinity on any axis.

The multiplicity of MWI is like that: just a maximally extended coordinate system.


A coordinate system is an abstraction; whether anything has some large coordinate is irrelevant to the coordinate system itself. But my understanding of MWI is that there is discernible state being described by the Schrodinger equation that is taken as "real" or "concrete" by MWI. If we considered the non-actual branches as merely abstract coordinates, then there would be no problem with profligate state and its associated theory cost. But then it wouldn't be many worlds, but rather some kind of collapse theory.


Wavefunction state is just as “real” in the Copenhagen interpretation. There cannot be any hidden variables; it is not merely a mathematical abstraction. You then must posit this additional process of collapse.

Where MWI takes on additional theory burden is in explaining how the Born rule translates to our lived experience - i.e. why we only experience one path of the wavefunction.


I have no clue what you mean by “theory cost”, but MWI is just describing the existing evolution of the wave function.

I think the measurement problem is a much bigger ‘theory cost’ to Copenhagen than just assuming wavefunction mechanics continues even after measurements are made.


Theory cost just means some quantity by which we compare theories and judge one to be more parsimonious. A theoretical posit has some cost associated with it. MWI posits actual discernible states, each of which is a unique posit and thus a unique contribution to the theoretical cost of the whole. This state grows exponentially which implies an exponentially growing cost. This is as profligate as one can imagine. Copenhagen does not have an exponential cost as the non-collapse branches do not have a concrete reality under Copenhagen. The cost associated with the new collapse construct pales in comparison.


Occam's razor is an abstract principle, so it is hard to really quantify what counts as additional 'theory burden'. I simply don't agree that "actual discernible states, each of which is a unique posit and thus a unique contribution to the theoretical cost of the whole" is the right way of judging theory burden, versus just "the rules we've already discovered continue to apply to large-scale and chaotic systems", which I would say is very low theory burden.

Copenhagen isn't even clear on whether the states have a concrete reality or not - there certainly are no hidden variables, at least.


There's a certain amount of intuition and taste involved when assigning theoretical costs, but that's no reason not to do it. We have general principles with some rational justification for preferring certain features of theories over others. Simplicity in terms of "not multiplying entities without need" is one such principle.

A discernible state is an entity. More discernible states require more entities or more discernible features of an entity. This is the only way to make the concept of discernible state intelligible; to deny this is just to engage in magical thinking. But each posit has a cost associated with it. We don't know what that cost is exactly (relative to other theories with different features). But we do know that an exponential growth in discernible states dominates the theoretical cost of the whole, regardless of how we quantify the cost of any individual posit. MWI is almost maximally profligate; any empirically adequate theory without this cost should be preferred.


To me, this is like saying the germ theory of disease has a higher theoretical burden than the miasma theory of disease, because it posits all of these individual viruses and bacteria around us everywhere.

"sure, we can observe bacteria and viruses in the lab individually - but when we move to the physical world disease spread is described by miasma which is simpler and thus this is the more parsimonious theory." there are fewer entities in miasma - but you still have to posit this additional thing (the miasma) on top of the already observed viruses and bacteria.

Similarly, I don't agree that 'count of entities' is the correct criterion for what makes a theory more or less parsimonious/Occam's razor preferred. It is about the number of rules, even if a single rule creates exponentially many entities. Copenhagen would also predict this 'exponential branching' in the system under observation - it just proposes that it collapses when measured by the apparatus: an additional rule that makes it less parsimonious than many worlds.


It's hard to compare theories that don't have well-developed mechanisms. If we're naive theorists comparing miasma vs. tiny organisms, but have no sense of how those organisms operate, it's functionally just a comparison between one miasma and many miasmas, i.e. one vs. many nebulous non-mechanistic effects. In that case one miasma would be preferable. But that's not a demerit to the concept of minimizing entities as a theoretical virtue.

>I don't agree that 'count of entities' is the correct criterion for what makes a theory more or less parsimonious/Occam's razor preferred.

I mean, it's literally in the original formulation of Occam's razor: "plurality should not be posited without necessity". But there are many ways to justify this. My preferred argument is that fewer entities means fewer resources with which to "bake in" the explananda into the theory. The more knobs to turn to generate your empirically adequate theory, the less likely your theory will capture reality.

>[Copenhagen] just proposes that it collapses when measured by the apparatus

Measurement isn't limited to an "apparatus". Collapse of superposition is a feature of large scale interactions. While there is branching, in practice it is bounded because interactions tend to cause collapse.
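
Reading "interactions tend to cause collapse" as ordinary decoherence, a back-of-the-envelope sketch (my own numbers, purely illustrative): every environment particle that weakly interacts with the system suppresses the interference term of the system's reduced state by the overlap of the two environment states it can end up in.

  import numpy as np

  theta = 0.1                     # hypothetical per-interaction coupling angle
  overlap = np.cos(theta / 2)     # overlap of the two conditional environment states

  # Each interaction multiplies the off-diagonal (interference) term of the
  # system's reduced density matrix by this overlap.
  for n in (10, 1_000, 100_000):
      print(f"after {n:>6} interactions, remaining interference ~ {overlap**n:.3e}")

Even a very weak coupling drives the interference term to essentially zero after macroscopically many interactions, which is why superpositions of large-scale alternatives are never seen directly, whatever one then says about collapse vs. branching.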


Well. They're not discernible states if they are in another world. That's the very definition of "another world".

For an outside viewer who hasn't been entangled with you, no branching happens; they see the overall state evolve according to the Schrodinger equation.

Imagine that we create very small machines that could perform some observations and record them into some internal log.

Now imagine creating a particle in some state and having that machine measure that state and record it internally. Provided that the particle+machine system is isolated from the environment (and us), you'd probably agree that the particle and machine are entangled and that the "log" is not in a defined state until we measure it.

Now, that information-processing machine is not a human, but I think it may be useful to describe how the external world would look from its point of view. For example, if that machine performed several experiments on that test particle or on other particles, it would note data in its log confirming that it obeys the Born rule.
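
To make that concrete, here's a rough sketch (my own, with made-up numbers) of what the machine's log looks like if each trial prepares alpha|0> + beta|1> and the machine simply records outcomes: after N trials the joint state is a superposition over all 2^N possible log strings, and the Born weight concentrates on logs whose recorded frequencies match |beta|^2.

  import numpy as np
  from itertools import product

  # Hypothetical test state prepared on each trial: alpha|0> + beta|1>.
  alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
  N = 16   # number of recorded trials

  # After N recordings, the branch containing a given log string has
  # amplitude alpha^(#zeros) * beta^(#ones).  Sum the squared amplitudes
  # by the frequency of "1" outcomes recorded in the log.
  weight_by_freq = {}
  for log in product([0, 1], repeat=N):
      ones = sum(log)
      amp = alpha ** (N - ones) * beta ** ones
      freq = ones / N
      weight_by_freq[freq] = weight_by_freq.get(freq, 0.0) + abs(amp) ** 2

  for freq, w in sorted(weight_by_freq.items()):
      print(f"freq of 1s = {freq:.2f}   total branch weight = {w:.3f}")

Almost all of the weight sits on logs whose statistics look Born-rule-like (frequency of 1s near 0.7 here), which is the sense in which the machine would find data in its log confirming the Born rule.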

In that sense, it's "useful" to imagine how things look from the "point of view" of that machine and its possible outcomes in the many worlds, although these worlds might not really matter to us, since we can describe the system that includes that machine as having been in a mixed state all along until we humans actually observe its "log".

The likelihood that people start to object to this thought experiment gets higher and higher as soon as one starts to imply that we humans ourselves are exactly like that machine.

For me, the observation that our ability to accept or reject this interpretation is so much tied to our intuitions about our own _self_ is a strong hint that actually favours the theory. We humans have been known to be trapped by our point of view.


>Well. They're not discernible states if they are in another world. That's the very definition of "another world".

Discernible in principle. Either the states are distinguishable from within the system (e.g. two branches that disagree on the state of the world), from an outside observer, or from a God's-eye view. If none of these is true, then there is no other world.

>Now imagine creating a particle in some state and having that machine measure that state and record it internally. Provided that the particle+machine system is isolated from the environment (and us), you'd probably agree that the particle and machine are entangled and that the "log" is not in a defined state until we measure it.

Let's further imagine a large collection of these machines connected in series. Each machine can perform one of two experiments at any given trial, and which experiment it runs depends on the most recent outcome of the prior machine in the series (e.g. the direction of a particle's spin). The number of scenarios grows exponentially with the number of machines in series. If we imagine this system entangled until we observe it, and we expect the definite state to be consistent in terms of which experiment each machine ran and the corresponding prior outcome in the series, then the entangled system just has to carry discernible state in proportion to the exponential state space of the system. If not, then the system isn't in an indefinite state until observed. The concerns about exponential theory cost remain. Unless the state space is bounded by collapse or some other means, the theoretical cost to MWI, or to any theory based on Schrodinger evolution without additional posits, is supreme.
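
For a sense of scale, under the assumptions of this thought experiment (N two-outcome machines, each choosing its experiment from the previous machine's result), the classical bookkeeping grows like this:

  for n in (10, 20, 40, 80):
      amplitudes = 2 ** n                 # one per consistent outcome history
      bytes_needed = amplitudes * 16      # complex128 storage, as a classical stand-in
      print(f"N={n:>2}: {amplitudes:.3e} branches/amplitudes, "
            f"~{bytes_needed:.3e} bytes to write the state down classically")

Whether this classical bookkeeping count is the right measure of "theory cost" is, of course, exactly what's in dispute here.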

>In that sense, it's "useful" to imagine how things look from the "point of view" of that machine and its possible outcomes in the many worlds, although these worlds might not really matter to us, since we can describe the system that includes that machine as having been in a mixed state all along until we humans actually observe its "log".

There's a tension here that needs to be released somehow. On the one hand, you have an external observer determining the system is in a superposition of states. On the other hand, you have the perspective from within the system of everything being definite. You can't just carry both forward simultaneously without addressing the disagreement on the state of reality. Either there are really an unbounded number of branches, the branches are bounded by some mechanism, or there are no branches and the evolution of reality happens asynchronously.


I'm not sure what you're actually suggesting.

Are you saying that under this scenario of N such machines in series, MWI would make different predictions than just solving the Schrodinger equation?


No, I'm saying that under the assumption of no collapse, the scenario with N machines after a long period of entanglement requires that nature "stores" (has discernible state for) information proportional to e^N bits about this region of space. The prediction is the same; the question is what this says about how nature is constructed, and how that relates to the credence we give the theory.


I think you just highlighted the fundamental difference between a bit and a qubit


Do you speak Ithkuil?


The sibling reply is correct, but I'll give a different answer connected to the analogy I gave that you are replying to. Occam's razor prefers simpler theories not smaller universes.

The idea that the universe consists of seemingly uncountable numbers of stars, each of which is another Sun and perhaps has planets of its own, is vastly more detail than the old heliocentric model of one Sun, a fixed number of planets, and stars being pinpricks in the tapestry surrounding the solar system. Nonetheless Occam's razor suggests that we accept the view that the stars are suns and we are just one planet among many billions upon billions, because although that universe requires vastly more detail to describe, it is governed by a simpler set of rules. Unifying heaven and earth under a single, simple set of physical laws, discovered by Newton, makes for a simpler scientific model, even if it introduces the possibility of the universe being vastly larger than we originally thought.

Similar situation with many-worlds. If MWI is right, then the "universe", if interpreted to include all reachable branches of the so-called multiverse, is vastly larger than we previously believed. But it is also simpler, in a strict Occam's razor sense, because it does away with the concept of collapse. Wave function collapse is an additional physical rule in the Copenhagen interpretation, whereas it's just an illusion and not an enumerated part of physical law in MWI.


Occam's razor is not just about simple rules. The original formulation is: "plurality should not be posited without necessity". Necessity here is empirical adequacy. A theory that posits an unbounded, exponential growth in states fails the simplicity test against an empirically adequate theory that posits bounded state.


The philosophy of science has advanced significantly since the 14th century, and scientists don't generally use the original definition anymore.

I have never, ever heard it expressed in this statistical mechanics sense, and it doesn't line up with most advances in physics (my field), which have been accompanied by an increased number of states but fewer rules.



