Hacker News

The singularity ain't real, buddy.


What is singularity anyway? I thought it was just some hypothetical point in time where making accurate predictions about the future becomes infeasible.


The singularity is a sci-fi end-of-days story. It speculates that at some point we'll invent an AI that's about as smart as a human, and smart enough to design a better version of itself.

Then a week later it'll come up with a version of itself that's twice as smart. A week later that version will come up with a version that's twice as smart again, making it 4x as smart as a human. A week after that, it'll come up with a version that's twice as smart again, 8x as smart as a human. And so on. A year later, after 52 doublings, it'll be roughly 2^52, some 4.5 quadrillion, times smarter than a human.
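The compounding in that story is easy to check in a couple of lines. The weekly cadence and the 2x factor are the comment's hypothetical assumptions, not a forecast:

```python
def smartness_multiplier(weeks: int, factor: float = 2.0) -> float:
    """Relative capability after `weeks` rounds of self-improvement,
    each multiplying capability by `factor` (the story's assumption)."""
    return factor ** weeks

# After a year of weekly doublings:
print(smartness_multiplier(52))  # 2**52, about 4.5e15: quadrillions, not mere trillions
```

The point is only that any fixed per-cycle improvement ratio gives exponential growth, which is the entire load-bearing assumption of the story.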

The AI will then either solve all our problems, invent fusion power and faster-than-light travel, and usher in a Star Trek style post-scarcity world of infinite resources and luxury, or it'll take over and create a Terminator style oppressive dystopia where the few remaining humans survive on rats roasted over a barrel in underground bunkers.


At geological or biological timescales, that’s what the emergence of language did. All of AI can be viewed as part of the accelerating returns of language…


You mean slowing it down, by producing large amounts of meaningless text?


> At geological or biological timescales, that’s what the emergence of language did

Elaborate on this idea please. Are you saying that human languages are self-replicating abstract entities?


There's a dystopian definition in the other comment, but usually, when people say they're looking forward to the singularity, they're referring to a version of the future where robots and computers are able to do most if not all jobs, so humans no longer have to work and can spend all their time doing whatever they like, usually phrased as painting, singing, dancing, and other leisure-type activities. Bettering themselves and helping humanity. The utopian, post-scarcity Federation from Star Trek, if you've seen it.


But without offering an objective model for how that could happen, it’s just wish-fulfillment fantasy.

The Singularity is the rapture of the nerds, and is about as likely to happen as the second coming of Jesus.


That was not a comment on the plausibility of it. Offering up a definition is still useful so people actually have a conversation instead of just talking past each other.


There are many competing definitions, which is why these comment threads devolve.


The technological singularity was first described by John von Neumann. Are you suggesting you understand something about technology that one of the most intelligent technologists to ever walk the earth didn't?


Does it matter how smart someone is when they make things up?

We see this in AI talk all the time: "Dude is really good at coding and math, he's smart. We should totally listen to him when he makes things up like a 12-year-old who just watched Terminator."


Great question, and what you're referring to is an appeal to authority, which is a common logical fallacy. It absolutely warrants further examination.

In the case of von Neumann, I mention him because he is the one who introduced the concept to the public. It is something he spent a great deal of time thinking about. He spent his life working around technology and had some incredible insights. Many consider him the smartest person to ever live, though in my opinion there are plenty of contenders. He made key contributions to the development of computers, nuclear tech, quantum mechanics, mathematics and more. [0]

So I believe all of this truly does lend credence to the idea, at least enough to warrant sufficient research before dismissal. It's not his authority I appeal to, it's his experience. He didn't obsess over this idea for no reason, and his well-documented achievements aren't comparable to those of random flash-in-the-pan tech executives.

[0] https://www.britannica.com/biography/John-von-Neumann


It’s still an appeal to authority, not an analysis of the idea itself.


My question to you was whether you know something about technology that von Neumann, the guy behind the idea, misunderstood, such that you would so flippantly dismiss it.

The ball is in your court to prove why the singularity is not real, because many experienced, decorated technologists disagree and have already laid out their arguments. If you can't prove that, then there's no argument for us to have in the first place.


Beyond the (absolutely correct) sibling comment about von Neumann's dated expertise, you are experiencing selection bias. Technologists who believe in the Singularity (and it really is belief, not a rigorous technical argument) are very vocal about it. Those who don't have faith don't bother speaking about it.

There are a lot of people out there who believe in technological progressivism and continued advancement of socially beneficial technologies. They just don't speak about a "singularity" beyond which prediction is impossible, because the idea isn't worth spending time on as it isn't even self-consistent.

It's not our job to show why the Singularity won't happen. It's the nutjobs who believe in the Singularity who have the responsibility of showing why it will. In the 70 years in which people have been bitten by this idea, nobody has. I'll wait.


> It's not our job to show why the Singularity won't happen

No, but shallow dismissal of a subject that many intelligent technologists have spent their life considering just comes off as arrogant.

Predicting technological timelines is a largely pointless exercise, but I wouldn't be hard pressed to imagine a future N decades or centuries from now in which, assuming we do not destroy ourselves, human affairs become overshadowed by machine intelligence or technological complexity, and an increasingly complex technosocial system becomes increasingly difficult to predict until some limit of understanding is crossed.


This is an epistemological problem: you think von Neumann's expertise on 1950s technology gives him insight into the technology of, let's say, the 2040s. You muddle the issue by calling him an expert on "technology" rather than an expert on "technology as of when he died in the 1950s".

If we don't grant that expertise on 1950s technology gives insight into 2040s technology, then there's little reason to consider his writing as something other than a cool story.


> You muddle the issue by calling him an expert on "technology" rather than an expert on "technology as of when he died in the 1950s".

That's not my intention. It's more that he simply looked at the historical progress of technology, recognized an acceleration curve, and pondered the end game.

My argument is that it warrants consideration and not shallow dismissal. Plenty of opponents have criticized the concept, and some decent arguments revolve around rising complexity slowing down technological advancement. That is possible. But we can't just dismiss this concept as some fleeting story made up "like a 12 year old who just watched Terminator".



