
I think it depends on who was legally in control of the vehicle. I assume (and I realise it could be different, but this is just creating a model) that the safety driver would legally be considered to be in control of the vehicle, and as such responsible for the crash. She did, after all, have the ability to prevent the crash had she not been negligent.

I assume, for instance, that if I were to use Tesla Autopilot on the road in the UK (I don't have a Tesla so I haven't looked into this) and my car crashed into someone while a) it's enabled and b) I'm not paying attention, I would still be 100% at fault.

Until self-driving cars can legally be in control of themselves, absolving occupants of any responsibility, I'd assume that this is, or at least should be, the case.

To be clear, I don't think Uber is clean in this. I suspect they were cutting corners to stay competitive, and I just don't trust them at all to make decisions that are in the interest of the general public. But the direct criminal responsibility seems to lie with the safety driver, even though it seems that Uber should be liable for something.



"legally be in control of themselves"

Never. They have no skin in the game. If they did, locking them inside a car for their life would be illegal.


At some point I suspect laws will change, if self-driving cars become good enough. I don't know where the liability will lie; perhaps with the car manufacturers, or the insurance companies.



