I'm not sure if you meant it, but you essentially answered my question with a "yes." It's to absolve individuals of responsibility in killing people.

If that's the case then I'm not sure how it is a benefit to society.

Responsibility isn't being absolved; it's being shifted to the party that actually operates the vehicle - probably either the car vendor (which manufactured or licensed the self-driving software) or the fleet operator, depending on what sales and operations model self-driving cars adopt. When human escalator operators were replaced with automatic escalators, was there a problem with shifting responsibility over to Otis and other escalator manufacturers?

As with automatic escalators, there are two primary benefits to society: greater aggregate safety[1] and freeing up people's time. You point to the lack of a human operator to hold responsible as a detriment, but the statistics point the other way: human drivers are horribly irresponsible, and even when we can punish them for doing wrong, that doesn't substantially change their behavior. Freeing up people's time is self-explanatory.


So I think that's a great analogy, and I generally like your perspective. The problem is that this accident makes the claim that AVs would be safer than human drivers suspect. A post elsewhere compared the deaths per mile of human drivers to that of this pilot program, and the pilot is worse by orders of magnitude.[0]

That's my point. I agree 100%: if self-driving cars did make things better, then the safety aspect would be a good thing.

I think freeing up people's time is fine too, but only if all of society has access to AVs. However, like many innovations of the past, advancements that aren't accessible to everyone just free up the time of some people while offsetting burdens onto others (i.e., those who can't afford AVs). But that is a separate issue.

[0] It is just one data point, but killing a pedestrian this soon out of the gate is a bad sign.


Worse by an order of magnitude only holds if you exclusively measure fatalities - and, as you point out, there is exactly one data point and not enough miles logged to support any meaningful conclusion. The data on non-fatal accidents for Waymo indicates that self-driving cars are an order of magnitude less likely than human drivers to be at fault in an accident: https://www.huffingtonpost.com/entry/how-safe-are-self-drivi...
