People die in car crashes every day. Self-driving cars just need to improve on that by a significant amount. I predict that despite not being perfect (which might not even be a reachable bar), they will become mandatory for all cars in the near future because they are that much better than humans.



Humans are deeply emotional creatures, and this is most evident when you observe group-level decision making.

Individually, you and I can both agree that the bar is quite low for self-driving cars: they just need to do better than current drivers to be safer. We should adopt them immediately if we can prove that they crash less.

Society doesn't see it that way. 1 crash is horrific. 1 accident is proof the technology is flawed. 1 accident is 1 too many.

It's stupid, but that's how it is.


The metric shouldn't be the number of crashes but the number of innocent lives lost. I've never seen a study that accounts for all the confounding influences and proves that Autopilot and co. are better in that regard.
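To make that concrete, here's a rough sketch of the normalization a fair study would need. All numbers are made up for illustration, and the confounders listed are only a sample:

    # Hypothetical figures for illustration only, not real data.
    # Comparing raw crash counts is meaningless; you have to
    # normalize deaths by exposure (vehicle-miles driven).
    human_deaths = 36_000        # assumed road deaths per year
    human_miles  = 3.2e12        # assumed vehicle-miles per year
    av_deaths    = 5             # assumed deaths per year
    av_miles     = 1.0e9         # assumed vehicle-miles per year

    human_rate = human_deaths / (human_miles / 1e8)  # deaths per 100M miles
    av_rate    = av_deaths / (av_miles / 1e8)

    print(f"humans: {human_rate:.2f} deaths per 100M miles")  # 1.12
    print(f"AVs:    {av_rate:.2f} deaths per 100M miles")     # 0.50

    # Even this rate hides confounders: AVs log mostly easy highway
    # miles, in good weather, often with a safety driver ready to
    # take over. Those are exactly the influences a real study
    # would have to account for.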


Who's going to be liable for those deaths? The manufacturer? I don't think they'll be so hot on that. So really, they need to be orders of magnitude safer than humans, to the point that they are close to flawless.


I don't think that is the blocker. You can pass a law that says "manufacturers that meet XYZ self-driving certification are exempt from liability".

The question of liability is always tough to resolve. Is Boeing responsible for the 737 MAX deaths because they wrote the faulty software? Is the airline responsible for poor training or buying the version without the "disagree" light? Or is the FAA liable for their poor oversight of the whole program? Arguments can be made for all three.


Self-driving... oh wait, this was in the air, where there's no traffic and none of these complicated situations. And it wasn't even an autopilot. Even that was too hard to implement.

Lane assist on the highway? Sure. Full autopilot? Not any time soon, and probably not within 20 years.


Exactly, and it's also a legal problem. I'm curious what happens when we get human-driven cars crashing into autopiloted ones. Are we going to have a presumption of guilt on the human or on the autopilot, or will it actually be balanced?


I agree, but the optics are awful. It's not a technical problem, it's a psychological one.

I don't have any sources to back this up, but I'd wager that people, on average, would prefer to take a risk an order of magnitude greater if they believed they had some degree of control over the outcome.

For example, let's say 10% of conventional cars experienced a fatal accident, and 1% of self-driving cars experienced a fatal accident. I think most people would still choose to be in the 10% fatality group because they'd feel like they were in control.
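A quick back-of-the-envelope with those made-up numbers shows just how large the premium on perceived control would be:

    # Hypothetical rates from the example above, not real data.
    p_fatal_human = 0.10   # fatality risk when you drive yourself
    p_fatal_av    = 0.01   # fatality risk in a self-driving car

    relative_risk = p_fatal_human / p_fatal_av
    print(f"Driving yourself is {relative_risk:.0f}x riskier")  # 10x

    # Preferring the 10% option means valuing the feeling of control
    # at roughly a 9-percentage-point increase in fatality risk.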

Anecdotally, I have a co-worker who hates flying because she feels like she isn't in control, even though aviation deaths are extremely rare.



