Hacker News

Isn't the bigger problem that it did not classify the object as being on a collision path? It doesn't matter what the object is: brake if you are going to hit it.


It's astonishing to me that emergency braking was off. The investigation report clearly shows that the pedestrian was classified incorrectly, alternating multiple times between wrong classifications, and that the path prediction was off. Despite all of those failures, the system still correctly determined, 1.2 seconds before impact, that a crash was imminent, and a second later recognized that avoidance had failed - at around 1.2 seconds before the crash, AEB should have engaged.

Instead, as the report says:

> The vehicle was factory-equipped by Volvo with several advanced driver assistance systems (ADAS), including forward collision warning (FCW) system and automatic emergency braking (AEB) system. However, Volvo collision avoidance ADAS were not active at the time of the crash; the interaction of the Volvo ADAS and ATG ADS is further explored in section 1.9.

It was apparently disabled by Uber because it was difficult to run the Volvo systems alongside their own, which strikes me as highly irresponsible. Autonomous driving software relies on multiple redundancy, and AEB is the last-resort system for when everything else, from automatic maneuvers to alerting the driver, has already failed.
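The class-agnostic last-resort logic argued for above can be sketched in a few lines. This is purely illustrative, not anything from the report or from Uber's or Volvo's actual code; the function names and the 1.2-second threshold (borrowed from the timing in the report) are assumptions.

```python
# Hypothetical sketch of a class-agnostic last-resort braking check:
# regardless of what the object is classified as, engage AEB once the
# predicted time-to-collision drops below a reaction threshold.
# All names and thresholds are illustrative, not from the report.

def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact at the current closing speed (inf if opening)."""
    if closing_speed_mps <= 0:
        return float("inf")
    return distance_m / closing_speed_mps

def should_engage_aeb(distance_m: float, closing_speed_mps: float,
                      ttc_threshold_s: float = 1.2) -> bool:
    """Engage emergency braking when impact is imminent, whatever the object is."""
    return time_to_collision(distance_m, closing_speed_mps) <= ttc_threshold_s

# At roughly 20 m/s, a 1.2 s time-to-collision corresponds to about 24 m
# of remaining distance (hypothetical numbers for illustration).
print(should_engage_aeb(24.0, 20.0))   # -> True
print(should_engage_aeb(100.0, 20.0))  # -> False
```

The point of the sketch is that nothing in it depends on the object's class: the trigger needs only a range and a closing speed, which is exactly why a flapping classifier is no excuse for not braking.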


> It doesn't matter what the object is, brake if you are going to hit it.

The built-in emergency braking system from Volvo does this [0] but Uber deliberately disabled it (presumably because it conflicted with their self-driving rig).

[0] https://www.media.volvocars.com/global/en-gb/media/pressrele...


At least the Volvo system's output could have been used as a sanity check or something.

If you hit the accelerator harder when the Volvo system brakes, you override it. It should be fairly easy to integrate as a backup.

However, I guess the Volvo system might not look far enough ahead for those speeds?


From what I read in some of the earlier reports on this, the car didn't have the ability to emergency brake in autonomous mode. It was disabled at that time, so it could only brake in regular traffic, not for obstacles that appear suddenly.


That is right - so now we have two major errors.

The justification made for disabling the emergency braking - that it would interfere with data gathering - might appear reasonable at first sight, but it does not stand up to scrutiny: if the emergency braking is triggered, the driving system has already made a mistake, and you already have the data on that malfunction.


Yes, but accidental emergency braking can cause an accident just as bad as this one; it all depends on the scenario.

They had hired a driver to sit behind the wheel to monitor the road and the car for exactly this reason.

The driver they hired decided to watch a movie on their phone instead of paying attention to the road.


> Yes, but accidental emergency braking can cause an accident just as bad as this one; it all depends on the scenario.

If Volvo's system is that dangerous, then it should not be on the road at all - but there is no evidence that it is, you are just making a speculative argument.

> They had hired a driver to sit behind the wheel to monitor the road and the car for exactly this reason.

That is no reason to disable a safety feature that would add safety in depth.

> The driver they hired decided to watch a movie on their phone instead of paying attention to the road.

That was a major error - a crime, in fact - but, unfortunately, also an entirely predictable scenario that cannot be dismissed on the grounds that dealing with it would make testing more difficult or expensive. So now we have three errors.


Does not compute. I regularly hit objects while driving my car. Such as: potholes, twigs, paper bags, plastic bottles.

The deadly assumption here is "doesn't clearly match a category" => "safe to ignore"


Emergency braking was disabled because it behaved erratically. So I'm not sure what it would have done even if it had ID'd her correctly; it was still relying on the witless driver behind the wheel. What a shit show.


I doubt that the emergency braking system was behaving erratically in itself, to any significant extent - if that were the case, it would do so for human drivers, and so should not be on the road at all. What I suspect is more likely is that Uber's system was behaving erratically, triggering the emergency response with some frequency.


Yeah, I should have put "behaving erratically" in quotes - I think you were on the money that it was in response to the AI/AD system in this case. It probably behaved like antilock brakes as the classification flipped every few milliseconds.
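One common mitigation for that kind of label flapping is to debounce the classification before anything downstream reacts to it. A minimal sketch, assuming nothing about Uber's actual pipeline (the class names, frame counts, and labels below are all made up for illustration):

```python
# Illustrative debounce (not Uber's code): keep a track's committed label
# stable unless the classifier reports the same new label for several
# consecutive frames, so a flapping classification doesn't reset
# downstream path prediction on every frame.

class DebouncedLabel:
    def __init__(self, initial: str, hold_frames: int = 5):
        self.label = initial        # currently committed label
        self._candidate = initial   # most recent differing raw label
        self._streak = 0            # consecutive frames of the candidate
        self.hold_frames = hold_frames

    def update(self, raw_label: str) -> str:
        if raw_label == self.label:
            self._streak = 0        # raw agrees; reset any pending change
        elif raw_label == self._candidate:
            self._streak += 1       # same new label again; build confidence
            if self._streak >= self.hold_frames:
                self.label = raw_label
                self._streak = 0
        else:
            self._candidate = raw_label  # a different new label; start over
            self._streak = 1
        return self.label

track = DebouncedLabel("unknown")
for raw in ["vehicle", "unknown", "bicycle", "bicycle",
            "bicycle", "bicycle", "bicycle"]:
    committed = track.update(raw)
print(committed)  # -> bicycle
```

The trade-off is latency: holding a label for several frames delays recognizing a genuine change, which is exactly why a class-agnostic braking check still has to run underneath it.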



