The sleeping beauty paradox (stats.stackexchange.com)
77 points by xtacy on Aug 31, 2015 | 80 comments



I'm having trouble finding the paradox here. As with many probability puzzles, the problem seems to be a hidden conditional in the probabilities.

There's no change in belief happening, only two different beliefs. The first is "1/2 of the times it is flipped, the coin will be heads". The second is "1/3 of the times I am awakened, the coin will be heads". The coin flip is fair, but the decision to ask the question is biased.

This is extremely simple to demonstrate by resorting to the absurd case. If we change the awakening ratio away from 1:2, the absurdity of saying "1/2" becomes increasingly clear. At 1:9, tails will be the correct guess 90% of the time the question is asked. At 0:1, heads isn't even a possible outcome on awakening.
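
A minimal simulation makes the two frequencies concrete (a Python sketch of my own, assuming the standard protocol of one awakening on heads and several on tails; the function name is just illustrative):

    import random

    def run(trials=100_000, tails_wakes=2):
        heads_flips = heads_awakenings = awakenings = 0
        for _ in range(trials):
            heads = random.random() < 0.5
            wakes = 1 if heads else tails_wakes
            heads_flips += heads                       # per-flip count of heads
            awakenings += wakes
            heads_awakenings += wakes if heads else 0  # awakenings where the coin was heads
        return heads_flips / trials, heads_awakenings / awakenings

    print(run())               # ~(0.5, 0.333): per-flip vs per-awakening frequency
    print(run(tails_wakes=9))  # ~(0.5, 0.1): the 1:9 case above

Both beliefs come out of the same run; they just count different things.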


Both answers are valid because the question is ambiguous. Asking "what is your belief" doesn't work because there are multiple ways to interpret that. You can clarify such situations by putting them in a game. Probabilities are about how much you would bet on it.

Game 1: You get $10 every time you guess correctly. Best strategy: guess tails, because if it's correct you get $20, while if you guess heads correctly you get only $10. If you got $3 for guessing tails correctly and $6 for guessing heads correctly, then it would not matter which strategy you used.

Game 2: If you guess correctly then you get $10 by the end of the experiment. Best strategy: it doesn't matter what you guess, if you guess tails correctly you get the $10, and if you guess heads correctly you also get $10.

Thirders think the question is like game 1: every time they guess correctly they win correctness points. Halfers think the question is like game 2: they either guess correctly or they don't; you don't get additional correctness points by guessing right twice.
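
A quick sketch of the two payoff schemes (my own Python, not from the comment; the function names and the $10 stake are just illustrative):

    import random

    def game1(guess, trials=100_000):
        # $10 per correct answer; asked once on heads, twice on tails
        total = 0
        for _ in range(trials):
            coin = random.choice("HT")
            asks = 1 if coin == "H" else 2
            total += 10 * asks * (guess == coin)
        return total / trials

    def game2(guess, trials=100_000):
        # $10 if the (possibly repeated) answer was correct, paid once at the end
        total = 0
        for _ in range(trials):
            coin = random.choice("HT")
            total += 10 * (guess == coin)
        return total / trials

    print(game1("H"), game1("T"))  # ~5.0 vs ~10.0: tails wins game 1
    print(game2("H"), game2("T"))  # ~5.0 vs ~5.0: indifferent in game 2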


> Game 2: If you guess correctly then you get $10 by the end of the experiment.

The question is clear: you get a reward each time you awaken, not at the end of the game.

The reward is correctness. (Humans treat being correct as a reward; otherwise notions like 'should believe' become meaningless, and hence the question becomes meaningless. The question is not meaningless, so correctness is a reward. Q.E.D.)

"When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?"


This fits with my understanding of what's happening here. The Stack Exchange discussion gets a bit unclear as you go down the page, but the initial statement of the problem up top clearly states that you're questioned each time you awaken.

I think you're quite right to point out that being correct is the only fair heuristic for this question. Several of the defenses of a 'halfer' belief amount to "you ought to believe this so that your beliefs obey this type of decision theory". Actively believing something with a (reliably) worse payout to conform to a rule about belief seems like a terrible form of epistemology.

On a related note, are you familiar with Newcomb's Paradox and the LessWrong debates over it? They get to a similar idea of "correctness above all".


Your Game 2 is poorly defined. What does it mean to "guess correctly" when you're asked to guess twice? If you systematically take the first (or the last) answer, then it's equivalent to being awakened only once on Tails. If you take the logical OR of the correctness of the two guesses, then Sleeping Beauty can win 5/8 of the time by answering randomly, which is better than the 1/2 from guessing a fixed answer (the only strategy you analyzed).


Right, but I didn't want to complicate the explanation. Assume only deterministic strategies are allowed, or require both answers to be correct in order to get the reward.


For game 2, flip a coin and guess what it shows.

If the original coin was heads, your chance to win is still 50%.

But if the original was tails, and you only need to guess correctly once, your chance of guessing wrong twice is only 25%.

So a 50% chance of winning half the time and a 75% chance the other half gives you an overall better chance (62.5%) than the 50% you get from sticking to one answer (where it doesn't matter which one you pick).

It seems a random guess is the best solution for game 2.
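
For what it's worth, that arithmetic checks out (a trivial Python check, assuming the "win if either guess is right" reading):

    # One random guess on heads, two independent random guesses on tails.
    p_heads_win = 0.5            # single 50/50 guess
    p_tails_win = 1 - 0.5 ** 2   # lose only if both guesses are wrong
    print(0.5 * p_heads_win + 0.5 * p_tails_win)  # 0.625 == 5/8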


For 1-9, tails being the correct guess 90% of the time still makes sense to me.

If you run the experiment twice, typically one run comes up heads: you are woken once, so heads is the correct answer once. The other run comes up tails, but you are woken 9 times, so tails is the correct answer 9 times.

Even from the experimenter's frame of reference, heads and tails are equally likely, but if you guess heads incorrectly, you will be wrong 9 times, or 9 times as wrong ;).

Basically it depends on what you consider to be a "trial" of the experiment. If a coin toss is one trial, the wakings are a red herring and the probability is the same as the coin toss. If one waking is one trial, 90% of wakings are going to be tails.


The phrasing of the question is "When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?"

It's not actually asking for a strategy about guessing the correct answer, or how many times you'll be right or wrong.

I think whether you are a halfer or a thirder is going to depend on how you interpret the question.

If you see it as asking about the frequency at which "Heads" will be the right answer, then it's clearly 1/3. But that's not what the question actually asks (probably - all language is interpreted).


"1/3 of the times I am awakened" never happens though. You're either awakened once or twice, not 3 or 6 or infinite times. I think the thirders get stuck on this "if we do it infinite times" or "in the limit case" thing, which is different from doing a single coin flip. http://stats.stackexchange.com/a/169582/87304


I think the problem is in the phrasing of the question: "When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?"

That's not a precise probability question.

More significantly the question (whatever it might be) is potentially asked more than once.

So, if the question is "what is the probability that the coin flip was Heads?" (which I think is the most likely interpretation), then it's 1/2.

But if the question is "what is the probability that you are being woken up because the coin was Heads?", then it's 1/3.

Asking me what I "should believe about the outcome" is (intentionally, I suppose) blurring the question so that it disguises the fact that there are actually 2 different questions in play.


The event isn't just "there was a coin flip". The event was "there is a coin flip and I have been woken up". For this event, two thirds of the time the coin flip will have been tails.

Halfers forget that there is an extra piece of information available: "I have been woken up". SB knows that one third of the times she is woken up it is because the coin flip was heads, and two thirds of the times it was tails.

Anyway, this is a stupid "paradox". It is extremely simple to build a simulation of SB being woken up and coin tosses, and if you do so, you get the clear answer - 1/3. There is no sampling issue or other trick that makes this simulation hard to write correctly...
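
Something along these lines (a Python sketch; the comment doesn't give code, so this is my own minimal version):

    import random

    heads_awakenings = 0
    total_awakenings = 0
    for _ in range(100_000):
        heads = random.random() < 0.5
        for _day in range(1 if heads else 2):  # woken once on heads, twice on tails
            total_awakenings += 1
            heads_awakenings += heads
    print(heads_awakenings / total_awakenings)  # ~0.333

The only modeling choice is that each awakening counts as one sample, which is exactly the point of contention.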


No, those questions are equivalent. In the single experiment case, the answer to both is 1/2, and in the infinite case, it's 1/3. There's no "3" to even enter into the equation in the single experiment case.


Uh? The outcome probability is the same if you do it once or if you do it infinitely, it's an independent event.


Which answer is "correct" depends on your interpretation of the question, as you say, and that's exactly the point. Like many paradoxes of the kind, it illustrates the ambiguity of the question and, by extension, the lack of nuance in the way we describe and reason about probabilities (in general, that is; of course one who has learned some probability theory is likely to be able to understand and communicate the difference).

Implicit in any answer is a way of valuing the choices made. In the 1/3 case, it is assumed that being right twice is better than being right once - for example, you get $100 each time you answer correctly. In the 1/2 case, it's assumed that answering the same question correctly twice isn't inherently better than answering it correctly once - for example, you have to debate an opponent at each awakening. You're either right or wrong and you don't get extra credit for winning the same debate with the same opponent twice.


Exactly this. I came to make the same comment, but unsurprisingly someone has beat me to it. In particular, the answer to the "paradox" in a nutshell is this:

> There's no change in belief happening, only two different beliefs. The first is "1/2 of the times it is flipped, the coin will be heads". The second is "1/3 of the times I am awakened, the coin will be heads".


Yep, it's not a paradox; it's actually just two separate questions.

1. What is the probability that the coin toss was heads? (as posed in the question)

    set of possibilities: { H, T }
2. What is the probability that you have awoken due to the coin coming up heads? (implied by question, but not explicitly asked)

    set of possibilities: { H, T, T }


The question is actually: "given that I have woken up, what is the probability of heads?"

This is the question that is disputed by sources in OP. Both sides are addressing the same question.


I agree on the 0:1 case, but say it's even 1:100000000. Before you go to sleep, you're asked "what do you think the result of the coin flip will be?", and then you wake up and are asked "what do you think the coin flip was?" What new information have you received that changed your mind?


The fact that no new information was gained does not mean that your answer to "what do you think the coin flip was?" will necessarily be equal to "what do you think the result of the coin flip will be?"

All it means is that you can already, before going to sleep, make an unconditional strategy: "when woken up, I'll say this". And your knowledge can easily lead you to a statement like "right now, I believe the coin flip will be 50/50, but when I wake up I'll bet on tails".

The coin flip's probabilities do change significantly once you are awakened - e.g. in the 0:1 case you know that being awakened implies that particular result, and the 1:100000000 scenario implies that if you're awakened then either it was tails or something extremely unlikely happened. The "no new information" clause simply means that you are informed of these probabilities already before going to sleep, not only after the coin flip.


The new 'information' is the number of times you're being woken up. Even though this information is not being told to you, you can implicitly use it by using the same strategy every time.

You should see it from the perspective of every awakening.

Imagine this. The experiment is repeated 10 times. On average the coin will have landed 5 times on heads, and 5 times on tails. This means that you'll be woken up 5 times while the coin landed on heads. However, you'll be woken up 500,000,000 (!) times while the coin landed on tails.

Knowing this, when you're woken up, the coin landed on heads in only 5 out of 500,000,005 of those awakenings.


Can someone explain the half position clearly? I don't understand how anyone could think half is the correct answer.

Let me frame the question a different way. You are one of three volunteers in separate rooms. I flip a coin and if it's heads I ask one volunteer (at random) to guess the outcome. If I flip tails I ask two of the volunteers (again, at random) to guess the outcome.

You know the rules I will follow, but you cannot tell if anyone else has been asked before you. I open the door and ask you to guess the outcome of the flip. What do you guess?


Your reframing of the question changes the problem. Let A be the event of Heads (probability 1/2) and B the event that you are asked (probability 1/2); then P(B|A) is 1/3. So the probability the coin came up heads given that you're asked is P(A|B) = P(B|A) * P(A) / P(B) = 1/3. Which is what you expected.

In the original problem the probability of getting woken up is 1. So we end up with the probability of the coin being heads given that I was just woken up is (1 * 1/2) /1 = 1/2.
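
A quick Bayes check of both framings (my own sketch; the numbers are taken straight from the two comments above):

    p_h = 0.5
    # Reframed problem: 3 volunteers, 1 asked on heads, 2 asked on tails.
    p_asked_given_h = 1 / 3
    p_asked_given_t = 2 / 3
    p_asked = p_h * p_asked_given_h + (1 - p_h) * p_asked_given_t  # = 1/2
    print(p_asked_given_h * p_h / p_asked)  # 1/3
    # Original problem: you are woken (and asked) with probability 1 either way.
    print(1 * p_h / 1)                      # 1/2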


Suppose I am going to flip a coin and give you money if you guess correctly. On heads I hand you $1; on tails I hand you $1 and add $1 to your bank account without telling you. Now, after flipping the coin, I ask you: heads or tails?

Clearly you're going to pick tails. Why? Because it's the more rational choice based on the stated rules.

Going to sleep and losing memory is really just window dressing. The simple truth is that before you went to sleep you should already be able to say there are 1/3 odds of waking up to heads.


But that's still not the question.

The stated question is "When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?"

Not, Should you _guess_ Heads or Tails?

The probability of the coin toss _does not change_. The probability that "the reason they are waking you up is because the coin was Heads" does change, but that's not the question.


Clearly there is a 50:50 shot you get heads or tails from the coin. In your model, if it's a 50:50 coin then you would have 50:25:25. The problem is that waking up the second time does not reduce the odds of waking up the first time, thus 50:50:50.


I think you changed the question. Before you go to sleep you know you have a 50% chance of waking up to heads and a 50% chance of waking up to tails twice. When you wake up you haven't learned anything so those probabilities can't have changed.


You have arguably less information when you wake up in that you don't know the time. Are you waking up because it's tails? because it's heads? or is it tails the second time?


This is an incredibly powerful rephrasing, kudos. It's perfect, because for all intents and purposes you are "a[nother] person, who cannot tell if anyone else has been asked before", even though that other person is also you.

Very good! The half position is clearly, simply wrong.


I think the 1/2 argument makes essential use of the fact that the "2 people" are the same person at different times. Ultimately it comes down to a different idea of how "being right twice" is valued. In the 1/3 position, the model is that each time you answer the question you get something good if you're right. In the 1/2 perspective, the idea is that after the whole experiment is over you either were simply correct or incorrect, and being correct twice doesn't count any extra.


You don't get to be "right twice"

The question is "When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?"

It's not a guessing competition, it's asking you to assess the probability of a past event that you have no new information about.


[deleted]


I think the issue here is that the two of you are answering subtly different questions, which the statement of the problem encourages in order to arrive at a 'paradox'. In particular, we need to agree that Sleeping Beauty is questioned both times on a result of tails to make progress.

The coin is fair, so the chance of getting heads is obviously 1/2. That's the answer to "What are the odds that a given coin flip came up heads?", but it's not what Sleeping Beauty is being asked when she wakes up.

What SB is being asked is "Given that you have just been awoken, what are the odds that you were awoken as a result of the coin landing heads?" There's an implied conditional present. The coin flip was fair, but the questioning is biased by result - if SB is awake and being asked, the odds are 2/3 that she's in a world where the coin came up tails.

To demonstrate, resort to extreme cases, as you did. If SB isn't woken at all for heads, then the answer to the question "you're awake, how did the coin land?" is tails with 100% certainty. If she's woken up once for heads and 99 times for tails, then the answer to "you're awake, how did the coin land?" is tails with 99% certainty, and so on.

This all hinges on the assertion that SB is questioned on every awakening. If she's only asked on the first awakening (or is asked about the coin flip as a general concept) then the answer is of course 1/2, as you assert.


And herein lies the problem. You say "Given that you have just been awoken, what are the odds that you were awoken as a result of the coin landing heads?"

But the linked post says (ambiguously) "When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?"

I don't think your interpretation of the post's question is accurate, but it's worded in vague enough terms that in effect we're arguing about the writer's intent rather than stats.


> the question asked about the odds of the coin toss

No, the question asked about the outcome of the coin toss, which is much less likely to be Heads (1 wake) than Tails (2 wakes).


I like this rephrasing, but I think you only need n=2 subjects not n=3. To see why you should guess tails, set n=100.


The fact is, there aren't three volunteers, there's only one. Upon awakening, she has no information she didn't have when she went to sleep, which at first seems to be an unassailable argument in favor of the "halfers."

To me, the problem with the "no new information" argument is that there's another popular paradox in which it also seems to apply, but fails spectacularly. In the Monty Hall paradox (https://en.wikipedia.org/wiki/Monty_Hall_problem), you know in advance that the host is going to open a door and offer you a chance to change your initial guess. You're aware that the host knows what's behind the doors. You know that no matter what door you pick, one of the remaining ones has a goat behind it. And you know that the omniscient host will pick that door. So what new information do you gain when the host does what you knew he was going to do?

One way to resolve the Monty Hall paradox is to rephrase it such that there are a million doors and the host opens all of the remaining ones except one. Of course it makes sense to switch doors in that scenario, since you knew that the host would avoid opening your initially-chosen door as well as the one with the car behind it. Your initial door has 1:1000000 odds of winning, while the only other closed door has odds of 999999/1000000.

So if you could find a way to transform the Sleeping Beauty problem into a Monty Hall problem, that could be a good way to settle the argument in favor of the "thirders."

Or, if with some logical sleight-of-hand, you could show that SB is not isomorphic to MH, that could potentially be used to strengthen the no-new-information argument made by the "halfer" side. I'm leaning that way, because SB does not know upon awakening, and will not remember, whether the experiment will result in her being awakened twice. For all she knows, she's playing the same Monty Hall game for the second time. It's this lack of an unbroken timeline of conscious awareness that settles the question in favor of the halfers.

To answer the thirders' rebuttal that SB would lose money if she were gambling on the coin toss with any odds other than 1:3, I'd say that's a completely different experiment. We aren't being asked to speculate on the set of events behind multiple awakenings, just one in isolation. The lack-of-new-information argument carries the day. Any arguments that involve repetition, clones, or gambling fail because they are responding to a different question.

TL,DR: Don't gamble if the game involves erasing the players' memories. You will lose a lot of money in a hurry, even if your strategy is mathematically optimal.


Here's an amplification that might clarify it. If you flip heads, I'll wake you up zero times. If you flip tails, I'll wake you up once. Now what's the probability that you flipped tails? 100%, right? Why? The coin still has a 50% chance of giving either heads or tails.

Here's another -- if you flip heads, I'll wake you up once. If you flip tails, I'll wake you up a million times. When I wake you up, is it more likely that you flipped tails or heads, remembering that you don't know which day it is?

You are right, by the way, that there is no new information -- the tendency to answer "50%" the night before is the logical error. You are conflating the odds of flipping heads or tails with the odds of WHETHER you flipped heads or tails in light of the known outcome (which is not 1:1 with flipping heads or tails).

Also, if you believe in statistics, the right answer in the continuous case (infinite repeats) is also the right answer in the singular case (one repeat). Your point about "there aren't three volunteers, only one" is fallacious.


> Here's an amplification that might clarify it. If you flip heads, I'll wake you up zero times. If you flip tails, I'll wake you up once. Now what's the probability that you flipped tails? 100%, right? Why? The coin still has a 50% chance of giving either heads or tails.

I do like that, it's a strong argument. But it still relies on monkeying with the participant's awareness in a way that is likely to (or in this case, will) keep them from making the right decision over time. You use the term "known outcome," but there isn't a known outcome if you're dead or asleep forever.

> Your point about "there aren't three volunteers, only one" is fallacious.

But so is treating it as any other problem (like gambling) that only converges on an answer after multiple trials. The strength of your argument is that it doesn't rely on answering a different question, just changing a parameter in the question that was asked. But it's an awfully important parameter. Implicit in the question is the premise that you will be alive/aware at some point in the future, so that you can be asked what you think.

Edit: another way to rebut your particular argument is by sticking with the original terms of the problem, except that if the coin lands on heads, they shoot you after you answer. That doesn't help you answer the question.


I think you're starting to grasp it. The key here is that if you are dead or asleep forever, then you are not waking up to answer the question. Therefore, the simple fact that you are waking up brings new information to the situation -- you know for sure that you didn't flip heads.


But in the original question, you were already guaranteed to wake up, so how does it give you any information?


I don't see the problem in their problem:

> Whenever SB awakens, she has learned absolutely nothing she did not know Sunday night. What rational argument can she give, then, for stating that her belief in heads is now one-third and not one-half?

The argument that she can give is clear: she might have learned something, but the memory was taken from her. And she knew that they would take the memory from her. So yes, she hasn't learned anything new, but that's a cop-out—the problem explicitly prevents it.

I mean, from that perspective, she doesn't even need to go to sleep! They could ask her, "when you wake up, how likely will you think that the coin was heads?" and get the same (correct) response of 1/3. That result is not based on her "learning new information" (since the situation forbids it), it's based purely on the situation as described.


> They could ask her, "when you wake up, how likely will you think that the coin was heads?" and get the same (correct) response of 1/3.

Exactly. Let's stretch the example to the extreme: if you get heads you get woken once; if you get tails you get woken 1 million times. Now say you run this experiment 10 times and you happen to get 5 heads and 5 tails. You'd be woken up 5,000,005 times. And with each wakening, the chance of the coin having been heads isn't 50%: if you'd guessed heads every time you wouldn't get half of them right; you'd be wrong about 5 million times out of the 5,000,005 awakenings. The chance of heads per awakening is roughly 1 in a million, and if you'd guessed tails on every awakening instead you'd be wrong on average only about once in a million.

It's this chance of being correct or incorrect in your answer on waking that informs the question that was posed, "When you are awakened, to what degree should you believe that the outcome of the coin toss was Heads?", which would then be 1/3.


First of all, how often you're correct doesn't matter; say the tails option wasn't "you'll be woken up a million times", but instead "you'll be asked the same stupid question a million times" (and have to answer as if you'd forgotten the previous answer). Of course you'll be wrong more often in the tails situation.

Second, the experiment isn't run 10 times. It's run once. A single coin flip. With a multitude of flips, you've got no idea where you are in the sequence so it pushes your averages toward the 1:1000000. But a single coin flip is a single coin flip, 50/50.


> "and get the same (correct) response of 1/3"

Are you sure it's the correct response?

The way I see it, the correct response is 'neither', i.e. neither 1/3 nor 1/2. Without having any indication of how many times you have played the game, there is no way of deciding between the two odds; it could be your first go, or it could be your millionth.

Without any knowledge of time spent, there is no correct answer other than 'neither'.


Here is a surefire way to win the lottery.

Start by picking numbers. Your chances of winning naturally are very small, so I will make an arrangement with you. You go to sleep before the winning numbers are read. Afterwards, if you have won, I will wake you up N times, administering our trusty forget potion. But if you have not won, I will wake you up only once.

As we have established in this thread, by increasing N, your chances of having guessed the lotto numbers correctly upon awakening approach 100%.

I wake you up, you apply Bayes' Theorem, and then rejoice, for you almost certainly have won the lottery!
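
For what it's worth, the per-awakening arithmetic behind this reductio does behave as described under the thirder counting (a small sketch of my own; the function name is hypothetical):

    # Posterior that you won, given that you are awake, counting per awakening.
    def p_won_given_awake(p_win, n_wakes_if_won):
        return (n_wakes_if_won * p_win) / (n_wakes_if_won * p_win + 1 * (1 - p_win))

    print(p_won_given_awake(1e-8, 1))       # ~1e-08: no memory trick, no change
    print(p_won_given_awake(1e-8, 10**12))  # ~0.9999: approaches 1 as N grows

Whether that number is the right thing to call "your chance of having won" is, of course, exactly what the thread is arguing about.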


"[…] to what degree should you believe that the outcome of the coin toss was Heads?" is a terribly unrigorous way to phrase this question. I suspect it's where all the confusion sets in.

Are we adjusting the natural prior of a coin toss based on information we now have? We have no information; 50%.

Are you placing a bet each time you are awake? You get to bet twice if the coin comes up Tails; 2:1 (33%) odds against Heads balances the tables.

Are we judging how often we'd be correct if we guessed Heads every time we were awoken? Depends on the number of trials. With only 1 trial; 50%. With 2 trials; 42%. With infinite trials; 33%.

Nothing begets a paradox like an ill-posed question.
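
The "2 trials; 42%" figure above checks out under the "fraction of awakenings answered correctly" reading (a quick exact enumeration, my own sketch):

    from itertools import product

    # Guess Heads at every awakening. For each of the 4 equally likely outcomes of
    # two runs, pool all awakenings and compute the fraction answered correctly.
    fractions = []
    for flips in product("HT", repeat=2):
        wakes = sum(1 if c == "H" else 2 for c in flips)  # 1 awakening on H, 2 on T
        correct = sum(1 for c in flips if c == "H")       # right only on H awakenings
        fractions.append(correct / wakes)
    print(sum(fractions) / len(fractions))  # 0.41666... ~= 42%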


>We have no information

We do: we know that we are now in the experiment. In the case where you don't wake up on heads at all, have you learned new information upon waking? If yes, why don't you learn something any time the relative probabilities differ?


That's not new information. You know that you will be in the experiment before the coin toss. Information does not flow acausally.

To contrast with Monty Hall, after the MC opens the first door, you do have new information: you glean information from the MC's choice, which he made based on his knowledge of where the prize is.

In this problem, the researchers take no action visible to you after the coin flip, and your memory is wiped before each new observation you can make. So your knowledge about the flip's outcome is exactly the same after it occurs as before: 50% chance.


If that doesn't count as new information, does waking up if you were to only wake on tails count as new information?

If yes, what's the justification for distinguishing the two?

What happens if you are told it's your first awakening as soon as you get up? Bayes theorem would imply you can't think the chances of heads stay the same before and after hearing this, so at least one must not be 50%.


I might be misunderstanding your conditions, but I would say no, waking up only on tails would not count, unless of course you got to make your choice after the researcher indicates they're about to put you back to sleep. Then you obviously know that the coin flip was tails! But in any case where you have just awoken after having had your memory wiped, and you have not been told anything by the researcher, you have been given no new info.

I think, if you are told it's your first awakening, then you still must assume 50% heads or tails: in both cases, you will always have a first awakening, so there is no new info there. However, since you will not be told this in your second awakening, if that occurs, you know immediately the coin must have come up tails.


>I think, if you are told it's your first awakening, then you still must assume 50% heads or tails: in both cases, you will always have a first awakening, so there is no new info there.

How is this not a direct violation of Bayes theorem? The probability of learning that it is the first awakening differs based on what the coin flip was, so it requires an update.


Once you know it is the first awakening, the probability of it being the first awakening is 100%.


When you wake up, you don't know which awakening it is. You claim the probability of heads is 50%.

Now, you are told it is the first awakening. The probability of learning that, if the coin was heads, is twice the probability of learning it if the coin was tails, so your Bayes factor is 2:1. You therefore cannot still believe that the probability of heads is 50%.
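
Here is that update worked out for both priors (a minimal sketch of my own, using P(first | heads) = 1 and P(first | tails) = 1/2):

    def posterior_heads(prior_heads):
        # Bayes update on being told "this is your first awakening"
        num = 1.0 * prior_heads                      # P(first | heads) * prior
        return num / (num + 0.5 * (1 - prior_heads))

    print(posterior_heads(1 / 3))  # 0.5:   the thirder prior updates to 1/2
    print(posterior_heads(1 / 2))  # 0.667: the halfer prior updates to 2/3, not 1/2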


Maybe it's really an argument about this:

Upon awakening:

Halfer:

    P(monday,heads)=0.5
    P(monday,tails)=0.25
    P(tuesday,tails)=0.25
Thirder:

    P(monday,heads)=0.33
    P(monday,tails)=0.33
    P(tuesday,tails)=0.33


I think it boils down to the thirder position.

P(monday,heads)=0.33 since P(heads)=0.5 and P(monday)=0.67.

This is because we are sampling awakenings, not coin tosses or beauties.

It's like betting on a coin toss, but with a side twist that the bet is evaluated twice (no new toss) when the coin is tails. Hence a majority of evaluations will be on tails:

  Guess, Coin, Profit
  H,     H,    +1
  H,     T,    -2
  T,     H,    -1
  T,     T,    +2
Analysis for betting heads: average is (+1) + (-2) = -1. Betting tails: average is (-1) + (+2) = +1 (summing over the two equally likely coin outcomes).

I'd bet tails!
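
A sketch of that side bet (my own Python; $1 staked per evaluation as in the table, reported per toss, so the results are half of the summed figures above):

    import random

    def expected_profit(guess, trials=100_000):
        profit = 0
        for _ in range(trials):
            coin = random.choice("HT")
            evaluations = 1 if coin == "H" else 2   # the bet is evaluated twice on tails
            profit += evaluations * (1 if guess == coin else -1)
        return profit / trials

    print(expected_profit("H"))  # ~ -0.5 per toss
    print(expected_profit("T"))  # ~ +0.5 per toss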


A similar problem is God's Coin Toss as described in Scott Aaronson's excellent lecture series (and book) "Quantum Computing Since Democritus":

http://www.scottaaronson.com/democritus/lec17.html


Not a paradox...here is a proof that it is 1/3

Let P(T) = probability that coin landed on tails, P(H) = probability that coin landed on heads.

Let "1st" denote the event that it is the first time you are awaken, "2nd" the second time.

Note that P(T|1st) = 1/2 (if you were told that this is the first time you were woken up, it's equally likely that the coin landed heads or tails). And of course, P(T|2nd) = 1

By the definition of conditional probability, 1/2 = P(T|1st) = P(1st|T)P(T)/P(1st) = (1/2)P(T)/P(1st), hence P(T) = P(1st). By total probability, P(1st) = P(1st|T)P(T) + P(1st|H)P(H) = (1/2)P(T) + (1 - P(T)) = 1 - P(T)/2. Combining the two, P(T) = 1 - P(T)/2, so P(T) = 2/3.


Both are right: the two camps posit completely different things.

Halfers stipulate a single experiment. Thirders stipulate infinite experiments.

It is pretty straightforward math to show these are not inconsistent, and there are even options in between. https://stats.stackexchange.com/questions/41208/the-sleeping...


From my understanding, the answer is 1/2.

Let us assert that the probability of the coin flip being heads is 1/2.

Now, you have awoken. Regardless of whether you've awoken to the Heads flip, or the first time to the Tails flip, or the second time to the Tails flip, the original flip's chance was still 1/2.

The possible misunderstanding comes from the fact that you will awake TWICE to Tails, which is more than to Heads! But the thing is this is irrelevant to the question, because you don't know whether this is the first or the second time, and you only need to wonder whether the original flip (recall it is probability 1/2) was heads or not. Potentially you could think that you'd be "wrong" more often, but we are only looking at one specific instance of you waking up in isolation, for which you have no additional information.

For example, consider waking up 1000 times if the flip is tails. Upon waking up, do you think the probability heads becomes 1/1001? I think regardless of waking up the first time or the 1000th time, you have no more information so it might as well have been the first time, and hence the probability is 1/2.


You are not being asked the probability of a coin flip. You are being asked the probability of a coin flip given that it woke you up. Simple Bayesian statistics. You don't know which of the three scenarios you're in (Heads: 1st wake up, Tails: 1st wake up, or Tails: 2nd wake up). Given that there are three possible scenarios, when finding myself in one, I would give each scenario equal weight.

There's my devil's advocate view. I can see what you're saying though.

Edit: Additionally, even though the awakenings are not in the same 'stream of consciousness', if one tails waking happens, they both happen. They are linked even if you're trying to view them in isolation.


Let me try to explain why this is the wrong answer.

Let's assume there are 100 people undergoing this experiment. Half of them will flip heads, half tails. The half that flip heads will be woken up a total of 50 times. The half that flip tails will be woken up a total of 100 times.

So, we have a total of 150 wake-ups. 100 of those came from tails, and 50 from heads. So, if you're woken up and have no prior knowledge, you have a 100/150 = 2/3 chance that you flipped tails, and a 50/150 = 1/3 chance that you flipped heads.

Put another way, when you are woken up, it could be one of three cases:

-- Awakened on the only time for heads

-- Awakened on the first time for tails

-- Awakened on the second time for tails

Two of those cases correspond to tails, one to heads. So, it's twice as likely that you are being awakened due to a tails flip.


> ...Upon waking up, do you think the probability heads becomes 1/1001?

Of course, do you really think otherwise? If I promised to put a pound in your moneybox every time you correctly guessed the coin state on each awakening, would you really fall asleep confident in a 50/50 guess strategy in your 1000-awakenings-to-1 scenario?


The betting argument is not the same as what is being discussed.

To a halfer, the likelihood of heads and tails is still 1/2 each, but the expected winnings of betting tails would of course be larger.

It would be like saying I'll flip a coin, and if you correctly guess that it's tails I'll give you a million dollars. Obviously I will guess that it's tails, even though I still believe each outcome is equally likely.


It seems to me that the paradox/confusion here comes from asking someone to consider a distribution over a bound variable - i.e. you are asked to examine the odds of X happening in a situation where there are already side effects of whether or not X happened.

In this sense, it reminds me of the puzzle where a man gives you a choice between two envelopes, one of which is specified to contain twice as much money as the other, with the paradox centering on a bystander's argument that you should then switch envelopes, since the other one must have either half or twice your value, giving you a 1.25x higher expected value for switching.

As I understand it, in both cases the "traditional" solution to the problem is to recognize that probability doesn't work that way, and you can't consider distributions over bound variables, but the more interesting solution is to rephrase things in Bayesian terms, in which case the analysis is reasonably straightforward.

I'm a dabbler though; experts please tell me if I'm spouting gibberish.


You're flipping a fair coin again and again. Every time you flip heads, you add 1 black ball to an (initially empty) bag. Every time you flip tails, you add 2 red balls to the same bag.

After a large number of flips, you pick a ball randomly from the bag.

What are the odds that the ball was added when a heads was flipped?

What are the odds that the ball is black?


The question specified one coin flip. In your thought experiment, after one flip, the answer to both your questions is 50%.


I'm a layman in probability but isn't it the same thing as the Inspection paradox? http://allendowney.blogspot.be/2015/08/the-inspection-parado...


I remember reading rec.puzzles occasionally back in the day when the Sleeping Beauty question was first posed. Coming back after not reading the group for a couple of weeks, I was incredibly confused because it was full of minor variants of this single question, constructed by people to support their positions. I'd not only missed the original question and was thus lacking the context completely, but I had also missed the megathread that followed and didn't understand the political undercurrents, which would have been obvious if I'd only known who was a halfer and who a thirder. It was just crazy.

Here's a great account of how it unfolded: http://www.maproom.co.uk/sb.html


This reminds me of the Monty Hall problem, but in reverse - you were playing the Monty Hall game and just won the car! You forgot whether or not you switched doors during the second step - what is the probability you switched doors?


There's actually a strong analogy to an extended Monty Hall here. Specifically:

1. Monty offers the challenge, and I pick door B.

2. Monty opens door C, showing a goat.

3. Monty asks if I want to switch. My odds of winning are now P(A) = 2/3, P(B) = 1/3.

4. My mom flips on the TV to see me play, but has missed my answers and only sees the state of the doors. What are her odds of guessing right?

Result: my mom's odds of winning are 1/2, even though the odds of a given door winning are not. There's a bias between the two doors, but her perspective is neutral - she's guessing which door has better odds, and that guess is unbiased.

As you say, this question is the reverse: the odds on the coin flip are unbiased, but my odds of being asked the question are biased towards one of the two outcomes.


See my other post; I actually think that a Monty Hall comparison argues in favor of an unbiased 1:2 guess, because of how the terms of the two problems differ.

In the SB question, your odds of being asked the question are 1:1. You just don't know how many times you'll be asked. You didn't know the night before (and neither did the researchers), and you still don't know, even though the researchers now do.

In the MH game, you give the host some new information when you make your initial choice. He already knew what door not to open, but now he knows what door he must open, and the rules require him to communicate that to you. That's when you receive the new information (as you point out with the example of your mom walking into the room).

In the SB problem, you don't get any new information before the question is asked, including whether or not you're going to be put back to sleep. So the only answer you can rationally give is 1:2.

If the researchers used a d20 instead of a coin to determine how many times to wake you up, you could safely guess that any given awakening wasn't your first or your last. But you still can't give any answer about the number on the die, other than a random guess from 1-20. You need to store some information for later recall, and they're not letting you do that.


I quite like your phrasing because it highlights that probability is a model of uncertainty for an observer.

So the mother has no information on the door, so she is 'neutral' (p=1/2 of guessing right); you have a small amount of information (p=2/3); you can even include Monty, who of course has total information (p=1). You can't ask what the probabilities of finding goats behind each door are without specifying an observer and the available information.


The Monty Hall problem is actually much better for educating people, as it highlights the bias hidden behind the "50/50" oversimplification. This Sleeping Beauty problem, on the other hand, is phrased to confuse you, so that arguing between the "halfers" and "thirders" boils down to their different assumptions on what is asked, I think.

Similarly, the question "if a tree falls in a forest and nobody's there, does it make a sound?" has two valid answers for two meanings of "sound", an objectivistic "pressure wave" vs. a (somewhat?)solipsistic "consciousness' hearing".


[deleted]


I also related it to Monty Hall in that one option is two options bundled together.

The options are not [Heads||Tails] but rather [Heads||Tails+Tails]. The Tails+Tails is like having the Car+Goat bundled as a single option.

Sleeping Beauty knows the coin is 50/50 but that if she is woken there is more opportunity to be woken during a Tails flip.

To show that it is biased in favor of Tails - the experiment must be repeated. It shows more readily if you increase the number of times she is woken if the coin lands on Tails, similar to how the Monty Hall problem becomes more intuitive when you increase the number of doors.

Instead of being woken 2 times - let's have her woken 1,000 times. Every time she is woken she is asked if the coin was Heads or Tails, with no recollection of her previous answers. She is given a dollar for every time she is correct. Should she guess Heads or Tails?

If she guesses Heads every time - she can only win $1. If she guesses Tails every time - she can win $1,000. If she alternates answers she can only win $500.
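
A quick expected-value sketch of that payoff (my own code; $1 per correct answer as described, and these are expectations rather than the maximum figures quoted):

    def expected_dollars(strategy, tails_wakes=1000):
        # strategy(day) returns "H" or "T"; day counts awakenings from 0
        win_if_heads = 1 if strategy(0) == "H" else 0
        win_if_tails = sum(1 for day in range(tails_wakes) if strategy(day) == "T")
        return 0.5 * win_if_heads + 0.5 * win_if_tails

    print(expected_dollars(lambda day: "H"))                           # 0.5
    print(expected_dollars(lambda day: "T"))                           # 500.0
    print(expected_dollars(lambda day: "H" if day % 2 == 0 else "T"))  # 250.5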


I'm not sure this is a paradox. There are 2 expected outcomes. All the added outcomes rely on the experimenter to disrespect the set rules.

This is a conditional probability with a hard to predict condition (ie. human factor) portrayed as a non-conditional probability. This looks more like a bad representation of a problem rather than a paradox.

This reminds me of my youth. Regardless of the project I was trying to accomplish, as soon as my little brother got involved, all bets were off. He was a hard-to-predict little bugger.


Even if you get awakened 99 times with Tails, you have equal probability going down the Heads (1 awakening) or Tails (1/99 awakenings) paths.

So if you awaken with no knowledge of other awakenings, you are equally likely to be on either path, with the coin being Heads or Tails.

You are NOT equally likely to guess that your awakening was due to Heads or Tails however, and that's the paradox.


This is basically just asserting SSA.

One problem with the halfer claim is that if you learn it is the first awakening, you then place much more probability on it landing heads (basic Bayes theorem application), even when the coin is yet to be flipped. How would you resolve that?


I'm not sure I understand this given the way the question is posed.

We're doing a single experiment, and I am put to sleep without remembering either once or twice, and then awaken (potentially a third time) after that?

Or could this keep going indefinitely until heads comes up?

And am I guessing each time I'm woken, or only after the final time?


How do the researchers get you back to sleep for the second (tails) awakening?

You're woken once or twice and given the potion. Or you're woken and given the potion each time. This riddle wants it both ways, and that is part of the problem with it.


You're only given the potion right before going to sleep, after you need to state your probability estimates.





