
This comment seems like a fine example of the very phenomenon I am trying to draw people's attention to.

I will try to explain...

> It's unpopular because it's naive.

You have no way of knowing whether this is naive.

> Nobody has the ability to prove everything to themselves and everyone else from first principles.

In no way whatsoever did I state this as a goal or requirement.

> Everyone takes shortcuts by repeating things they were told by others

Agreed. But my point is that people do not realize (in real time) that they are doing this - and that can have severe consequences.

> thus "lying" is occasionally the 5-year old type of lying where someone just makes something up on the spot and tries to avoid looking guilty, but these are not the meaningful or common types of lies.

I am not discussing lies (lying requires conscious intent) - I am discussing the mind's default inability to reliably distinguish between virtual reality and physical reality. This phenomenon can be observed frequently on HN, but only in certain types of threads (culture-war topics) - on other topics (computing, physics, etc.), one finds relatively little flawed logic and few flawed assertions (compared with other social media sites).

Even if one doesn't think this is worth worrying about, it should at least be considered potentially interesting, considering the consequences of this phenomenon if it exists at scale. Take the climate change debate for example, or 'masks for covid' as a simpler scenario: what is the ACTUAL reason(s) our societies can't sort this shit out?

Should we care about such things, or should we not? I'm getting very mixed messages on this from every social media site and organization I belong to. There seems to be a general consensus that we should care (as a boolean) - but on the degree to which we should care, some people seem opposed to caring too much, at least when caring extends into thinking deeply and precisely, from first principles (free of axioms and premises) and with sound epistemology.

If you are debugging a complex system, would you use skills like precise observation, logic, and systems thinking, or would you just make a few wild guesses and throw up your hands in defeat when your guesses turn out to be ineffective?

> A good way to see this is by looking at what news organisations call "fact checks". Almost always, these are simply repeating the claims of some random academics or government institutions, which are taken to be true by default.

Agreed. This is an example of how 'that which is not true' in physical reality can become "true" in virtual reality.

> A lot of people really do assign a very high prior to "member of the establishment saying something makes it true", but a whole lot of other people do not.

Subconscious Bayesian reasoning is often converted to binary when passed to the conscious mind (or so it seems).
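A toy sketch of what I mean, in Python - all names, numbers, and the 0.5 threshold are purely illustrative, not a claim about any actual cognitive model:

```python
# Illustration: a graded (Bayesian) degree of belief collapses into a
# binary verdict once it crosses a threshold. Numbers are made up.

def posterior(prior: float, likelihood_true: float, likelihood_false: float) -> float:
    """Bayes' rule for a binary hypothesis: P(H | E)."""
    numerator = likelihood_true * prior
    return numerator / (numerator + likelihood_false * (1 - prior))

def conscious_verdict(p: float, threshold: float = 0.5) -> bool:
    """The lossy step: a probability becomes a yes/no belief."""
    return p >= threshold

# High prior on "an establishment source saying X makes X true":
p = posterior(prior=0.9, likelihood_true=0.8, likelihood_false=0.4)
print(round(p, 3))           # 0.947
print(conscious_verdict(p))  # True - the nuance is discarded
```

The point of the sketch is the second function: whatever graded credence the subconscious computes, what surfaces in conversation is usually the boolean.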

> If the latter go and dig in and discover the underlying claim seems false, and start saying so, then the news orgs will claim it's "disinformation" and the others will say it's the "lying media" and yet nobody is literally making things up maliciously, even if a claim is objectively true or false.

Mostly agree, except for the "nobody is literally making things up maliciously" part - sometimes people actually do make things up with "malicious" intent.

