
So what if the "certain topic" is something with no immediately obvious thing that "works", e.g. climate change?



I'm afraid I don't know enough to talk about it authoritatively, but to me it seems more a collection of observations than a "theory" in the speculative sense. As far as I'm aware, though, we have observed that cities with lower greenhouse gas emissions tend to be cooler, so in that sense lowering CO2 levels "works" to reduce temperature, though we can't speak of global "correctness" until we manage the same with the global temperature.

This is a special case, however, in that we have to assume correctness, because otherwise we'll all be much worse off, and the possible costs of reducing pollution are slim in comparison. But if anything, I think this supports my point that correctness itself doesn't matter; only the material consequences do.


But if only potential consequences matter, regardless of the odds, then you run into the problem of Pascal's Wager [0]: it's best to assume God exists, because the cost of not believing if you're wrong (going to hell) is far higher than the cost of believing if you're wrong (some wasted piety). There's a rough expected-value sketch after the footnote.

[0] https://en.wikipedia.org/wiki/Pascal%27s_Wager
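To make the payoff structure concrete, here's a minimal expected-value sketch in Python. Every number is an invented placeholder (Pascal never assigned a prior); the only thing that matters is that the prior is nonzero and one payoff is infinite.

    # Hypothetical payoff matrix for Pascal's Wager.
    # All figures are illustrative assumptions, not real probabilities.
    from math import inf

    p_god = 1e-9  # any nonzero prior works; that's the whole trick

    # Expected value of each choice:
    ev_believe     = p_god * inf  + (1 - p_god) * -1  # -1 = wasted piety
    ev_not_believe = p_god * -inf + (1 - p_god) * 0   # -inf = hell

    print(ev_believe, ev_not_believe)  # inf -inf

Belief "wins" for literally any p_god > 0, which is exactly why the wager feels like cheating: the conclusion is completely insensitive to the evidence.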


Pascal's wager is mainly faulty because it relies on a complete lack of information, unlike climate change, where we have some. If we had any clues at all that an existing god was benevolent, believing would most definitely be the right choice from a pragmatic perspective. Circling back to climate change: it's hard to be fully certain, but we still have a lot of clues telling us we should do something.

I assume you're not headed into religious debate territory, but I don't imagine pragmatists focus much on metaphysical matters anyway (I know I don't).


Right, so how about AI then? There's a very small risk that AI becomes extremely malicious and wipes out the entire human race. In other words, the potential cost is effectively infinite. Since there is some (albeit very little) information suggesting this might happen, should we spend all our resources on preventing it? (Toy calculation below.)
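For what it's worth, this is the same arithmetic as the wager, sometimes called Pascal's mugging: once one outcome has unbounded cost, naive expected value justifies any finite expenditure. A toy sketch, with every figure invented:

    # Toy expected-cost comparison; all numbers are made-up assumptions.
    p_doom = 1e-12             # vanishingly small probability of AI doom
    cost_doom = float("inf")   # treat extinction as an unbounded loss
    cost_prevention = 1e13     # "all our resources", in arbitrary units

    ev_do_nothing = p_doom * cost_doom  # inf
    ev_prevent    = cost_prevention     # finite, however large

    print(ev_do_nothing > ev_prevent)   # True for ANY p_doom > 0

Shrink p_doom as much as you like; as long as it's nonzero and the loss is infinite, prevention always "wins".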


I don't follow?



