Hacker News

> You are basically limiting "produce" to mean produce basic subsistence items like food, clothing, and shelter.

I would also include the arts, entertainment, service work, craftsmanship, etc. But going on...

> Yes, of course the fraction of people that need to work to produce those things will decrease with automation--that's been happening for most of human history, and has been happening a lot faster since the Industrial Revolution. In the US in the late 19th century, IIRC, something like 19 out of 20 people worked on farms. Now it's about 1 in 20.

I feel like you're conflating labor specialization with automation. Specialization is what happened for most of human history: machine augmentation created more and more specific jobs (instead of "farmer" you had farm hands, farm maintenance technicians, agricultural technicians, etc.), all of which were new, different, and usually better jobs with more education and more pay.

Automation on the other hand eliminates jobs as it goes; instead of a new tractor that enables one worker to plow many fields, an automated tractor plows fields 24 hours a day, without breaks, food, or complaints. The guy who drove the tractor has been automated away; his labor hasn't been specialized, it no longer exists. So now he has to find an entirely new job, which is not at all the same as retraining for a more specialized version of the job he already had.

As to this:

> The question is, which of the two possibilities above is the default outcome of mass automation...?

The default is certainly #2. My argument is that paying people to sit at home and watch soap operas is preferable to paying them to sit in jail cells. It's also often cheaper, and more humane. But the first step to getting to that kind of place is decoupling work from life, because right now the idea that someone could live off of the excess production of our society is offensive to huge numbers of people, largely because of societal norms that we are in control of, and because of leftover Red Scare sentiment.




> I would also include the arts, entertainment, service work, craftsmanship, etc.

But that destroys your argument, because these things are not automated and won't be any time soon. Automation certainly does not remove the need for humans to create works of art and entertainment, to perform services, to provide craftsmanship, etc. So if all those things are included in "produce", then there are lots of ways for humans to produce that won't be displaced by automation any time soon.

> Automation on the other hand eliminates jobs as it goes

So does labor specialization, as you call it. Machine augmentation doesn't just create "more and more specific jobs"--it eliminates the old jobs that didn't require the knowledge of how to use the machines.

Also, even if a tractor isn't automated (it requires a human driver), it still enables one human to do work that it took many humans (about 20, with the present state of farm automation in the US) to do before. All those other jobs are indeed eliminated. Historically, farm laborers whose jobs disappeared went to the cities and started working in factories or learned a trade.

> My argument is that paying people to sit at home and watch soap operas is preferable to paying them to sit in jail cells.

Agreed, but my argument is that if those are the only two choices, we're doomed. Basically, you're saying society will, in the fairly near future, end up stagnant--all important jobs will be automated and all of the humans will just sit around watching soap operas (except, perhaps, for the small number of humans who oversee the automation). And I'm saying that such a society will collapse, because the rest of the universe is not stagnant. Things are always changing, and only humans--human thought and human ingenuity--can cope with change. (And if your answer is that we'll just invent AI and automate that too, that just means we'll be at the mercy of the AI, which I don't see as a good situation.)

> the idea that someone could live off of the excess production of our society is offensive to huge numbers of people

I would put this differently. I would say that the idea that someone could receive a basic living without producing anything that improves anyone's lives is offensive to huge numbers of people. And it should be.


What's offensive is the thought that the only good in a life is working like a slave to please an intellectual's ideal.

It's the old Protestant work ethic, which served us for a time but is rapidly becoming obsolete.

In the 1950s we wrote about how robots would lift the yoke of labor from our shoulders and let us go on to create and enjoy leisure. Now that it's happening, folks are panicking. Cries of "But people must work; it's offensive if they have leisure!" go up on every discussion site.

Try to think outside the box. What might people accomplish if they're not required to labor at McD's for 8 hours a day?


> What's offensive is the thought that the only good in a life is working like a slave to please an intellectual's ideal.

Which is not at all what I said.

> What might people accomplish if they're not required to labor at McD's for 8 hours a day?

Lots of things, I'm sure. And as long as those things improve people's lives in some way, I'm all for it.


How about, leave that up to them to determine? Again, intellectual ideals work as long as we all think alike. Which we don't.


> How about, leave that up to them to determine?

Up to whom? Yes, you get to decide what improves your own life. But you don't get to decide what improves someone else's life.


> But that destroys your argument, because these things are not automated and won't be any time soon.

After the limited time I've spent in my life paying attention to this, I'm not assuming any job, including my own, is safe from automation.

> Historically, farm laborers whose jobs disappeared went to the cities and started working in factories or learned a trade.

And that is the difference: this time the city jobs are being automated too, and there's nowhere left to go, since even the farms don't need nearly so many people. And yes, there's certainly an argument to be made for the trades, which I'm sure are (mostly) safe from automation for a while, but not forever. But there's a limited pool of people skilled enough to do those jobs, and beyond that a limited number of those jobs available, and beyond even that, as more and more former factory workers, truck drivers, and coal miners sign on to be plumbers, electricians, etc., the supply of those workers will explode and the wages they earn will shrink accordingly.

> Agreed, but my argument is that if those are the only two choices, we're doomed. Basically, you're saying society will, in the fairly near future, end up stagnant--all important jobs will be automated and all of the humans will just sit around watching soap operas (except, perhaps, for the small number of humans who oversee the automation).

That's not at all what I said. I said we should have a system in place for that certain group of people who have nothing to contribute, because there's nothing wrong with that. The alternative to feeding and clothing them in their homes is to feed and clothe them in prison; the former doesn't enrich private prisons, doesn't require paying for guards, and avoids the ethical cost of the current situation, which is mass incarceration.

I'm saying that, by virtue of being born, you should be fed, clothed, and sheltered, because we can do that, and the notion that we simply shouldn't so that Mark Zuckerberg can have a fifth house is, I'm sorry, a DISGUSTING idea to me.

> I would put this differently. I would say that the idea that someone could receive a basic living without producing anything that improves anyone's lives is offensive to huge numbers of people. And it should be.

Why is that offensive? With what we currently spend on defense, we could feed, house, and clothe every homeless person in the United States with Gucci bags, mansions and caviar. Why is it so important that we not? And if you cannot answer without citing tradition or work ethic or something equally arbitrary and archaic, then I'm sorry but I'm not convinced.


> After the limited time I've spent in my life paying attention to this, I'm not assuming any job, including my own, is safe from automation.

So you think things like artistic creativity, service work, craftsmanship, etc. are going to be automated soon? That there will literally be no productive things that humans can do, sometime in the not too distant future?

If that is indeed what you think, then I can see why you want some rule put in place that ensures that all humans get a basic living, uncoupled from work. But I still think that, if that's the case, humanity is doomed. See below.

> With what we currently spend on defense, we could feed, house, and clothe every homeless person in the United States with Gucci bags, mansions and caviar.

But that's not what you're talking about. You're not just talking about homeless people. You're talking about, eventually, all people. You're saying that there will come a time in the not too distant future where every single human on the planet has effectively zero productivity. And your way of preparing for this is to put a rule in place that says that humans don't have to produce anything in order to get a basic living.

One problem with this is, as I said before, change. If you haven't automated dealing with change, then any state where, for the moment, humans don't have to produce anything won't be stable.

But if you have automated dealing with change, then you're up against a bigger problem: you have basically invented AI, and there is no guarantee that the AI will care about humans. We will be at the mercy of the AI, just as other life forms on Earth today are at the mercy of humans. We will basically be zoo animals, preserved only at the sufferance of the AI. And in that situation, humanity is doomed.


You're making several deep errors regarding what I'm saying.

First, you're responding as though, at some mysterious date in the future, all people are getting fired and replaced with a robot. It's not going to happen that way; it's going to be a gradual process, though I think not as gradual as a lot of people want to think, and once it begins I think it will accelerate exponentially as the technology matures.

Not to mention:

> So you think things like artistic creativity, service work, craftsmanship, etc. are going to be automated soon? That there will literally be no productive things that humans can do, sometime in the not too distant future?

You're still hung up on this idea that productive things are all people can do. The whole point, as another poster noted, is that people were supposed to not work anymore, at least in the traditional sense: humans would hand the work to the robots and then go find other, more enriching things to do, maybe something with art, or just really learning a shitload about medieval poetry. The fact that someone chooses something that isn't necessarily productive doesn't matter in the world I'm envisioning, where scarcity is no longer a thing.

> But that's not what you're talking about. You're not just talking about homeless people. You're talking about, eventually, all people. You're saying that there will come a time in the not too distant future where every single human on the planet has effectively zero productivity. And your way of preparing for this is to put a rule in place that says that humans don't have to produce anything in order to get a basic living.

I wouldn't say it's going to hit zero. There will always be something for us to do, even if it's just heading to the stars to find new things (à la Star Trek's utopian future).

What I am saying is: We already are seeing the beginnings of a population that can't really produce anything, and we're seeing a new phenomenon: we don't really need them to. How much of the population will end up like that isn't the point; the point is that some already have, and we have no mechanism to deal with that other than starving people to death, which I don't think is a long-term solution.

> But if you have automated dealing with change, then you're up against a bigger problem: you have basically invented AI, and there is no guarantee that the AI will care about humans. We will be at the mercy of the AI, just as other life forms on Earth today are at the mercy of humans. We will basically be zoo animals, preserved only at the sufferance of the AI. And in that situation, humanity is doomed.

This is a whole different discussion, but this doomsday view also relies on the AI being entirely unaware of empathy, which, as a core component of humanity, would likely make its way into the artificial minds humanity creates. It's a base function of animal and human brains; it's what stops us from killing each other over basic impulses.


> you're responding as though, at some mysterious date in the future, all people are getting fired and replaced with a robot

I have said no such thing. I am responding directly to your own statements. You said, in effect, that nobody's job is safe from automation. I just assumed you meant what you said.

> You're still hung up on this idea that productive things are all people can do.

All of the things in question are productive things, by the definition I thought we were using. A "productive" thing is a thing that improves people's lives. Are you saying that artistic creativity, service work, craftsmanship, etc. don't improve people's lives?

> There will always be something for us to do

That contradicts "no one's job is safe from automation", doesn't it?

> We already are seeing the beginnings of a population that can't really produce anything

Again, you're using a much too restrictive definition of "produce". But there's another point here as well.

The basic vision I see you describing is, nobody has to produce anything because all of the necessary production will be done by machines. But who owns the machines?

If the machines are owned by a few centralized entities, then we have the same problem of distribution that we had before--who gets the stuff the machines make? If you think there is any way of setting up a society that will guarantee that such a distribution is going to be fair to the people who don't own the machines, I have some beachfront property in Nebraska I'd like to sell you.

The obvious alternative is to make those machines cheap and small and plentiful enough that we can all have one. In other words, instead of universal basic income, we all are able to buy, at some reasonable price, machines that meet all of our basic needs. Then it will indeed be up to each individual person to decide what, if anything, they want to produce, because they will own all the resources necessary to meet their basic needs.

The difference between these two scenarios is that the first (yours) requires a fundamental change to human nature. I don't think that's going to happen. The second (mine) only requires a change in the kinds of technologies we try to build. That seems like a much better bet.

> this doomsday view also relies on the AI being entirely unaware of empathy, which, as a core component of humanity, would likely make its way into the artificial minds humanity creates

This strikes me as overly optimistic handwaving. We have no idea how human brains implement empathy, and none of the artificial devices we have built so far have it. Yet somehow we're going to magically be able to build it into an AI?

> It's a base function of animal and human brains

Which evolved over hundreds of millions of years. I think you're vastly underestimating the amount of complexity that process created.



