Hacker News

It doesn't have to be infallible. It only has to be a single order of magnitude better than any human driver, and most people will start using it.

How many other sectors have abandoned human intervention after computers surpassed human performance?




The important question then becomes - is society OK with bugs and shortcomings in software and hardware killing people? (this is based on the assumption that even driverless cars will not be perfect, some people will still die on the road)

So far, society seems not to be OK with this (as in, we'd rather a person do the killing, even if we think that killing was wrongful).

We aren't OK with autonomous robots having weapons, even though they might be objectively better at guarding prisoners, military bases, killing "bad guys" in bank robberies, etc. We freak out when a fatality occurs at an automotive plant, and those robots only pivot in place!

If society is going to agree we're all OK with a bug left by some short-sighted engineer being responsible for people's deaths - then OK. However, I wager people aren't really OK with this, most just haven't really considered this aspect yet.


A lot of the backlash against autonomous weapon systems is fed by the last 50 years of sci-fi movies showing what might happen (however unrealistic). Self-driving cars are a different thing, and there isn't really an equivalence.

Sure, there will be legal issues (in a crash, who is responsible: the driver, the manufacturer, or the programmers?), but they will get resolved with time and case law.

The economic advantages of self-driving cars are huge (unless you drive for a living, but then progress is what it is). About 35,000 people a year die on American roads; an order of magnitude improvement would save ~31,500 lives a year (and that's just the accidents resulting in fatalities; many, many more people suffer life-changing injuries). This generation of drivers might not like it, but as the cars get better and better at driving themselves, the next generation will hand over more and more of the responsibility, until a human driving a car manually on the road looks like an anachronism.
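The back-of-the-envelope arithmetic above can be sketched as follows (the 35,000 annual fatality figure and a clean 10x reduction are rough assumptions, not official statistics):

```python
# Rough estimate of lives saved if road deaths dropped by an order of magnitude.
# Both input figures are assumptions for illustration only.
annual_road_deaths = 35_000   # approximate US road fatalities per year
improvement_factor = 10       # "an order of magnitude" safer

deaths_with_autonomy = annual_road_deaths / improvement_factor
lives_saved = annual_road_deaths - deaths_with_autonomy

print(f"Remaining deaths per year: {deaths_with_autonomy:,.0f}")
print(f"Lives saved per year: {lives_saved:,.0f}")
```

With these inputs, the remaining deaths come to 3,500 and the lives saved to ~31,500 per year.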

Also, people are never going to be happy with a bug in hardware or software killing someone, but we are currently 'happy' to allow tens of thousands of people to die in car accidents. If the motorcar had been invented in 2000, many people would have wanted to ban it immediately.

"You want to operate a 2,500 kg metal box at 40 mph in proximity to people!? Oh hell no!"


There is no reasonable argument for preferring that more people should die as long as the agents of their deaths are the kinds of biological organisms we're used to. What we happen to be already accustomed to has no relevance in determining what we ought to do in the future, except in trivial cases where the different alternatives don't lead to widely distinct numbers of casualties.



