
> when you look into the testing and QA setups for something like Boeing

Boeing has a corporate lineage that extends back more than a century, and for most of that time it did not have the level of engineering safety excellence it manages today. The culture that achieves near-perfect performance every flight is a different culture to the one that got planes off the ground back in the day.

And that goes to what I'm trying to communicate in this thread - people are bringing up examples of people deviating from standard practice in mature, well-developed industries where highly safe alternatives already exist.

This is a different industry. Today, in 2019, mankind knows how to fly a plane safely but does not know how to drive safely. I've worked in a high-safety environment where speeds on public roads were notched down from 100 km/h to 30 or 40 km/h because the risk of moving any faster just wasn't acceptable; people were substantially more likely to die on the way to work than at it. They'd probably have brought it down to 20 km/h if the workers would reliably follow it. Driving is the single highest-risk activity we participate in. Developing self-driving car technology has an obvious and imminent potential to save a lot of lives. We aren't about to suspend the normal legal processes, but anyone contributing to the effort is probably reducing the overall body count, even if the codebase is buggy today and even if there are a few casualties along the way.

What matters is the speed with which we get safe self-driving cars. Speed of improvement, and all that jazz. Making mistakes and learning from them quickly is just too effective; that is how all the high safety systems we enjoy today started out.

It is unfortunate if people don't like weighing up pros and cons, but slow-and-steady every step of the way is going to have a higher body count than taking a few risks and learning quickly from a few deaths. We should minimise total deaths, not just deaths caused by Uber cars. Real-world experience with level 3/4 autonomous technology is probably worth more than a few tens of human lives, because it will very quickly save hundreds if not thousands of people once it is proven and deployed widely.



> The culture that achieves near-perfect performance every flight is a different culture to the one that got planes off the ground back in the day.

But those lessons were learned. We know how to do it now. Just like...

> Today in 2019 mankind ... does not know how to drive safely

Yes we do, and by and large we do it correctly. It's easy to think nobody knows how to drive if you spend 5 minutes on r/idiotsincars, but that's selection bias. For every moron getting into an avoidable accident each day, there are millions of drivers who leave home and return completely safely.

You can make the argument that people sometimes engage in too-risky behaviors while driving, and I'd agree with that, but people know how to drive. Just as people know how to develop these systems without compromising safety, even when they choose not to, as I believe happened here.

> Making mistakes and learning from them quickly is just too effective; that is how all the high safety systems we enjoy today started out.

But again, we know how to do this already. And again, my issue isn't even that someone got hurt while we perfected the tech; all of our safe transit systems today are built on top of a pile of bodies, because that's how we learned. My issue is that the person hurt was not an engineer, and was not even a test subject. Uber jumped the gun. They have autonomous vehicles out in the wild, with human moderators who are not paying attention. That is unacceptable.

There was a whole chain of errors here:

* Ugly hacks in safety control software

* Lack of alerts passed to the moderator of the vehicle

* The moderator not paying attention

All of these are varying levels and kinds of negligence. And someone got hurt, not because the technology isn't perfect, but because Uber isn't taking it seriously. The way you hear them talk, these cars are like a month away from being a thing, and have been for years. It's the type of talk you expect from a Move Fast and Break Things company, and that kind of attitude has NO BUSINESS in the realm of safety, full stop.


> The culture that achieves near-perfect performance every flight is a different culture to the one that got planes off the ground back in the day.

This is true, but those lessons that got them to that culture are written in blood.

Like the old saying goes: "Experience is learning from one's own mistakes. Wisdom is learning from others' mistakes." I don't think re-learning the same mistakes (process mistakes, not technological ones) is something that a mature engineering organization does.

One of my worries is that SV brings a very different "move fast and break things" mindset that doesn't translate well to safety-critical systems.

As for the rest of your post, what you're talking about is assessment of risk. Expecting higher levels of risk from an experimental system is fine, but there's a difference when the person assuming that risk (in this case, the bicyclist) doesn't have a say in whether that level of risk is acceptable.



