> I don't think there's any harm in believing in the singularity for entertainment value, but as for providing a door to immortality, I think the main danger is simply distraction. While all the effort and discussion and research is spent on something there is no way of proving -- in the meantime, 20,000 children die each day from preventable diseases.
> I invite anyone who seriously considers a singularity to set it down, and put some effort into the relief of suffering. When we have preventable diseases and poverty figured out, then let's revisit the singularity. I'd enjoy working on it then.
The amount of money currently being spent on singularitarian pursuits is negligible compared to what is spent on development aid, or healthcare.
Furthermore, one of the necessary enablers of the singularity is faster computers. Regardless of how long Moore's law holds, it is economic competition between chip-makers that has driven the continuous exponential increase in computing power over the last few decades. As long as it remains physically possible, and as long as the chip market is not a monopoly, that increase is going to happen anyway; the singularity would be nothing but a byproduct of these market forces, and would not come at the cost of disinvestment from charitable pursuits.
As a species, it's nice to have a portfolio of possible futures - and it's nice to know that the singularity road is being probed by some people.
I think the most precious thing we have is time, not money, because our time is finite. When you consider the potential of the human race and of computing power to help prevent this daily genocide, there is clearly an opportunity to do better. If a person is driven to spend ever more time pursuing the innovations that could theoretically lead to a singularity, what would that person say to a stadium full of children who are going to die the next day? I wouldn't know what to say. I'm not sure they'd be comforted by the assurance that the pursuit of the singularity produces, as a byproduct, economic momentum that employs people, as good as that may be.
Perhaps, though, the attendant momentum of projects like DeepMind might result in AI or II being put to the task of humanitarian relief. But my guess is that any such system would ask, "what should the priority be for preserving life?" -- and if that prioritization were done by consensus, I'm guessing most people would prioritize preserving life over pursuing the singularity. For example, I think most people would want a self-driving AI government to prioritize their own survival over the singularity. And if that were the case, and AI were tasked with helping to "load balance" priorities of ethics, efforts, solutions and systems, I wonder if it might not admonish us to make changes in our lives.
I'm not self-righteous, I'm self-unrighteous. I'm just beginning to question my general acceptance of wherever technology goes, and of wherever I spend my time. I don't presume to tell people what to do, but I do suggest that the ethics of choosing whether to relieve a suffering child right in front of you may be self-evident to some people -- and if so, realizing that these children are effectively at arm's length, and that we can do something about it, may convince some people, myself included, to think about "re-balancing" priorities.
I read fiction regularly. I would be a hypocrite to judge anyone for "not spending enough time relieving suffering". Yet overall I question how content and comfortable I am with life. I guess, in a portfolio of effort, I hope the allocation of assets in my own life and in others' would place a primary emphasis on a sustainability that includes the relief of suffering. I think that's what 20,000 kids dying each day says to me. I think that's what they'd post to Hacker News, if they could.