Comparison to space travel is silly. I'm your age, and I remember clearly what happened. Don't you? We kids were so excited about getting closer and closer to a man walking on the moon. The closer we got, the more exciting. Finally, we watched it happen. Woohoo! We did it! And then we all lost interest. It was done. It wasn't entertaining anymore and it wasn't worth any money to anybody, so we got on with our real lives and did things that WERE fun and profitable.
It was a government program built for non-economic reasons, so it didn't grow organically as an economic system serving and adapting to the needs of customers. When the non-economic reason went away, the growth rate dropped to near zero.
Compare that to information technology. Is it one centralized federal program divorced from economic reality? No, it's growing organically, serving countless different markets that demand countless different things. Machines can do more every year, and the more they can do, the more diverse the demands for them to do still more, the more people are willing to pay, the more people work on better IT, the more investment flows in, and so on.
Will there be a singularity? No, we're not all converging on a single point in technology or time. We already have domains where machines are orders of magnitude better than people, and others where the reverse is true, and the machines are sliding past us gradually. There isn't one "general intelligence" model, either. People perform some cognitive activities so poorly that the performance wouldn't be considered intelligent at all if we didn't use ourselves as the benchmark for general intelligence. We'll find ways to improve machines in specific domains and other ways to generalize their intelligence across whole categories of domains, one step at a time, without ever crossing any visible "general intelligence" threshold.
The process has been underway for a long time. We'll eventually reproduce whatever Nature has done, and much that it hasn't, if we don't kill ourselves first.
How soon? How soon what? A gradual change is already underway and will just continue. Maybe at some point augmented humans will notice that they can't think of any specific thing an unaugmented human can do better than a pure machine, but others will debate it, and the augmented people and machines will keep improving until they all agree the crossover happened sometime in the past.
Okay, how soon will it be possible to upload my mind to a long-lived substrate so that I won't disappear like an unsaved term paper when the power goes off? Probably not long after my power goes off....