Here's a simple rule that could be enacted for companies seeking to do work in the autonomous driving space: if your car kills a person, your company loses its license to work on autonomous vehicles. Forever. That would dictate an appropriate pace for achieving these goals safely. Then we'll see what's really possible with this technology.
Having cars that can drive themselves just doesn't seem like a particularly high priority for society at large in the face of other looming issues. Why allow it to proceed in such a dangerous fashion at all?
But human-driven cars kill tens of thousands of people a year in the US. I think improving that is really important for our society.
It is very sad that this pedestrian died and companies that kill people through avoidable accidents like this should be punished. Uber specifically seems like a trash tier self-driving car research program and I wouldn’t mind if they just stopped.
But one day, even a system that's far safer than human drivers is going to kill someone accidentally. It goes too far to say that any method of driving cars that ever kills somebody should be abandoned.
What if the rate of deaths for a given company is not zero, but below the rate for human drivers? Is it ok to have additional, unnecessary pedestrian deaths by NOT allowing that company to deploy their technology?
(That said, the negligence in the Uber case makes it pretty clear they are likely far from reaching that level of competency)
So you mean being okay with a multibillion-dollar corporation harming people so it can make even more profits, all the while telling you that it's "good for society"?
Good for the people once the system is perfected. Not so much for the ones who get run over while the system is being perfected.
The "greater good" utilitarian argument has been the basis of some of the worst policies and politics in the world.
I'm not saying that self driving cars fall into the same category, but how many deaths are you okay with until Uber/Waymo perfect their algorithms (and later, charge you for it)? 1? 10? 100?
Given how many people die per year in traffic accidents with non-autonomous vehicles, shouldn't this rule apply too? Humans have shown that they can't drive cars safely and should no longer be trusted to drive.
More seriously though, it would be more interesting to compare the number of kilometers driven without a fatal accident against national statistics, and to have something like the NTSB do thorough investigations when accidents do happen. Ultimately there is a need for autonomous driving because it should eventually cause fewer deaths.
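As a rough illustration of that comparison, here is a minimal sketch in Python. The ~1.2 deaths per 100 million vehicle-miles human baseline is the commonly cited US ballpark; the autonomous-fleet numbers are made-up placeholders, not real data.

    # Normalize fatalities by distance driven so a small test fleet can be
    # compared against the national baseline on the same scale.
    HUMAN_DEATHS_PER_100M_MILES = 1.2  # approximate US national rate (assumed)

    def deaths_per_100m_miles(deaths: int, miles_driven: float) -> float:
        """Convert a raw death count into deaths per 100 million miles."""
        return deaths / miles_driven * 100_000_000

    # Hypothetical fleet: 1 fatality over 10 million autonomous test miles.
    fleet_rate = deaths_per_100m_miles(deaths=1, miles_driven=10_000_000)

    print(f"Fleet rate: {fleet_rate:.1f} deaths per 100M miles")   # 10.0
    print(f"Human rate: {HUMAN_DEATHS_PER_100M_MILES:.1f} deaths per 100M miles")

The point of the sketch is that with so few test miles, a single fatality puts the fleet far above the human baseline, which is why mileage-normalized rates, not raw death counts, are the numbers worth arguing about.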
If a fault with the vehicle's parts causes an accident in a human-driven car, then the manufacturer absolutely should be held accountable.
Also, if you believe Uber or any other company is pursuing autonomous driving with the goal of decreasing the overall number of automobile deaths, I daresay you're extremely naive. It's probably not possible to achieve anyway on a road shared with human drivers.
I don't believe that Uber's goal (or any other company's) is to decrease the overall number of automobile deaths. However, I believe that the widespread adoption of automated driving will eventually result in a decrease in automobile deaths.
I'm pretty skeptical we'll see autonomous vehicles--at least outside of limited access highways or other relatively less difficult scenarios--sooner than decades from now. But your suggestion is an impossibly high bar. There will always be failures because of debris on roads, unpredictable actions by human drivers/cyclists/pedestrians, weather problems (e.g. black ice), mechanical failure, etc. that will result in some level of fatalities.