Hacker News

Which of course leads directly to Pascal's Mugging: I can simply say "I'm a god; give me $10000 or you will burn in hell for all eternity." Now, if you follow Pascal's Wager or GP's logic, you have to give me the money: I'm probably lying, but the potential downside is too great to risk upsetting me.



There's actually a rational explanation for that: humans don't care very much about burning in hell for all eternity, when it comes down to it.

There's actually a similar thought experiment that might seem even more bizarre: I could tell you "give me $100 or I will kill you tomorrow" and you probably wouldn't give me the $100. That's because, when it comes down to it, humans don't treat the loss of their own life as quite as big a deal as one might think. It's a big deal, of course, but combined with the low likelihood of the threat being carried out, still not big enough to forgo the $100.
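The comparison above is just an expected-value calculation. A minimal sketch, where the dollar value placed on one's life and the probability of the threat being carried out are entirely hypothetical numbers chosen for illustration:

```python
# Toy expected-value comparison for the "pay $100 or be killed" threat.
# VALUE_OF_LIFE and P_THREAT_REAL are assumed numbers, not from the thread.

VALUE_OF_LIFE = 10_000_000   # hypothetical dollar valuation of one's own life
P_THREAT_REAL = 1e-6         # hypothetical chance the stranger follows through
COST_OF_PAYING = 100

# Refusing costs this much in expectation; paying costs $100 for sure.
expected_loss_if_refuse = P_THREAT_REAL * VALUE_OF_LIFE

print(expected_loss_if_refuse)                   # 10.0
print(expected_loss_if_refuse < COST_OF_PAYING)  # True: refusing is cheaper
```

With those (made-up) numbers, refusing "costs" only $10 in expectation, so keeping the $100 is the rational move even though the threatened loss is enormous.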


Here's the thing: if you have just killed five people, "give me $100 or I will kill you tomorrow" becomes a much more effective threat.

One-time games and repeated games have different strategies.


It becomes more effective only because it changes your probability estimate of the outcome.
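That shift can be made concrete: holding everything else fixed, moving only the probability estimate flips the decision. A sketch with the same hypothetical valuation as before (all numbers are assumptions for illustration):

```python
# Only the probability estimate changes between the two scenarios.
# value_of_life and the probabilities are hypothetical assumptions.

def pay_up(p_threat: float, value_of_life: float = 10_000_000,
           price: float = 100) -> bool:
    """Pay iff the expected loss from refusing exceeds the price demanded."""
    return p_threat * value_of_life > price

print(pay_up(1e-6))  # False: a stranger's threat, negligible probability
print(pay_up(1e-3))  # True: he just killed five people, estimate jumps
```

Same payoffs, same threat; only the credibility term moves, and with it the rational answer.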

Life is a repeated game of decisions that compound on each other, so that difference is irrelevant.





