
>even if you don't believe in the Singularity

This makes "the singularity" sound very much like a religion.




It's faith-based handwaving bafflegab. So, yes, only without the intellectual credibility.


The Rapture for nerds, as Ken MacLeod delightfully put it.


That's my favourite. I've also seen Kurzweil called DeepakChopra++, which I like as well.


Is that an insult towards the C programming language? ;-)


Anyone whose thinking is wishful can safely be disregarded, of course.

But this utopian resurrection and transcendence story is just one version of the singularity. There are many people who think AI is not physically impossible, nanotech is not physically impossible, and so recursively self-improving AI with strong abilities to act in the physical world is a possibility. Many of those people think that is a very dangerous possibility.

You can agree or disagree with the detailed arguments, but you cannot accuse these people of allowing wishful thinking to cloud their judgements.

I like MacLeod as a writer, but that slogan is damaging because many people hear it, laugh, and stop thinking.


There are many forms of wishful thinking. Apocalyptic disaster is also one.

You seem nice; forgive me if I get too dismissive here.

As a software practitioner, it seems obvious to me that we are so many light years away from the kind of software the Singularity people are talking about that the whole thing is a fantasy club, and a little embarrassing. It's like the detailed debates 19th century radicals used to have about society after the Revolution.

Also, the Singularity people always seem to do that moving target thing where as soon as you say one thing, they go: But that's not the Singularity, that's a misunderstanding of the Singularity. Leaves me thinking: it must be awfully subtle.


A software practitioner, you say? As a researcher in machine learning, I think you're wrong. I agree that recursively self-improving AI is not around the corner, but I think it could happen in a few decades.

Even if it has a 1% chance of happening in the next century, I would rather allocate resources to thinking about how to mitigate the risks (and maybe look foolish if it never happens) than leave things to chance and end up regretting it. By "regretting it" I mean a runaway AI destroying humanity, or other bad outcomes.
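To put toy numbers on that reasoning (a minimal expected-value sketch in Python; every figure here is an illustrative assumption, not an estimate):

    # Toy expected-value comparison; all numbers are assumptions for illustration.
    p_catastrophe = 0.01    # assumed chance of runaway AI this century
    loss = 1e6              # catastrophic loss, in arbitrary units of value
    mitigation_cost = 10    # cost of safety research, same units
    risk_reduction = 0.5    # assumed fraction of the risk the research removes

    expected_loss_doing_nothing = p_catastrophe * loss
    expected_loss_with_research = (
        mitigation_cost + p_catastrophe * (1 - risk_reduction) * loss
    )
    print(expected_loss_doing_nothing)  # 10000.0
    print(expected_loss_with_research)  # 5010.0

    # Research pays off whenever
    # mitigation_cost < p_catastrophe * risk_reduction * loss,
    # which is why a 1% chance of a huge loss can justify spending now.

The exact numbers don't matter; the point is that a small probability multiplied by an enormous loss can dominate the comparison.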

As for the moving target -- since a singularity, if it happens, would be very important, it definitely makes sense to talk about lots of different scenarios. Anyone who dismisses a plausible scenario just because that's not what Vernor Vinge said, or what Ray Kurzweil said -- well, they can safely be dismissed.


> This makes "the singularity" sound very much like a religion.

I think what most people mean is, "even if you don't believe that anything other than squishy brains can ever recursively do what the brain does".


It's better to distinguish the singularity (runaway, accelerating change caused by recursively self-improving AI) from AI itself.


The singularity (as conceived pre-Kurzweil) is the horizon beyond which we can't make any reasonable predictions about the future. Recursively self-improving AI is merely one potential path to that. But GP is right that the meme that the singularity is ridiculous tends to go along with the meme that intelligence is ineffable and can't be reduced to an algorithm in a computer.


Thanks! This conflation keeps getting made in every thread on the singularity, and the two are not at all the same thing.



