
With a disabled emergency braking system and an inability to handle rather common situations like people jaywalking, these cars really shouldn't have been driving in public. That is the kind of thing I'd expect to be worked out and tested on private grounds.

Additionally, reducing the number of people in the car to one, when the car is by design not capable of handling emergency situations on its own, is quite reckless.




Additionally, permitting a test plan that effectively required all operators to break the law by operating a computer when they were supposed to be keeping their eyes on the road is quite reckless on the part of the State of Arizona.

I really don't think we should be so eager to damn Uber that we forget that there were more than just Uber employees asleep at the wheel here: Arizona has a duty to protect public safety. By permitting a self-driving car test program on the public roadways without doing even basic due diligence in vetting the program first, the State was grossly negligent in that duty.


> there were more than just Uber employees asleep at the wheel here

You couldn't be more right. The zeitgeist in 2015 was full-steam ahead on the autonomous future and woe to anything that stood in its way.

The framing of Arizona's embrace of autonomous testing four years ago [0,1] contrasted with California's caution [2,3,4] couldn't have been starker. One was branded as enabling innovation; the other was dismissed as bureaucratic red tape holding up progress.

It's too easy to get wrapped up in these narratives.

[0] https://azgovernor.gov/governor/news/2015/08/governor-doug-d...
[1] https://www.uber.com/blog/tucson/driving-innovation-in-arizo...
[2] https://fortune.com/2015/12/16/google-california-rules-self-...
[3] https://www.theverge.com/2015/12/16/10325672/california-dmv-...
[4] https://www.nytimes.com/2015/12/17/technology/california-dmv...


They shouldn't have been driving in public without someone paying attention 100% of the time, as any regular driver would. The longer quote:

> Also, don't forget: the SUV's emergency braking system was deliberately disabled because when it was switched on, the vehicle would act erratically, according to Uber. The software biz previously said "the vehicle operator is relied on to intervene and take action" in an emergency.

The question then is whether there was proper training and communication to the test drivers that it's never okay to look down at your phone. Or whether that was simply an unrealistic expectation. Or whether the hours were too long, or the night testing, etc.

The report said there were about 5 seconds, which should have been more than enough for a human test driver to hit the brakes, which was their stated job.
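
For a rough sanity check (these numbers are my own assumptions, not from the report), here's the stopping math at roughly the speed the car was reported to be going:

    # Back-of-the-envelope stopping calculation. Assumed figures:
    # ~40 mph, a generous 1.5 s perception-reaction time, and hard
    # braking at 0.7 g on dry pavement.
    v = 40 * 0.44704                # mph -> m/s, ~17.9 m/s
    reaction_time = 1.5             # s before braking starts
    decel = 0.7 * 9.81              # m/s^2
    stop_time = reaction_time + v / decel               # ~4.1 s
    stop_dist = v * reaction_time + v**2 / (2 * decel)  # ~50 m
    print(f"time to stop: {stop_time:.1f} s, distance: {stop_dist:.0f} m")

Even with that generous reaction time the car stops in about 4 seconds and 50 meters, so 5 seconds of warning should indeed have been enough for an attentive driver.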


In this case the safety driver was watching their phone, which was playing a movie, so a bit beyond the normal level of attention wandering you'd expect. But even for people trying their best, paying attention to the road for hours without having to provide any input isn't something you can reasonably expect them to do. NHTSA level 3 autonomy is just a bad idea; we need to go straight from 2 to 4.


I think so, too. It's just asking for something humans weren't designed to do.


> The question then is whether there was proper training and communication to the test drivers that it's never okay to look down at your phone.

If I remember right, Uber cut costs by removing the safety engineer and expected the driver to fill both roles simultaneously.


> They shouldn't have been driving in public without someone paying attention 100% of the time, as any regular driver would.

It's pretty well known that human minds wander, that this happens more when monitoring reliable systems for rare problems, and that it makes operators less responsive and lowers their error detection rate [1], as anyone who's attended a boring meeting or lecture can attest!

I'm not sure that anyone informed would imagine a worker spending 40 hours a week monitoring a self-driving car would be able to watch it with 100% attention.

The truth is nobody realistically expects the safety driver to respond reliably enough to prevent an accident like this; they're there for slower-developing problems, resetting false-alarm stops, and taking the blame.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5633607/


That's why two test drivers make sense in the early days, along with not keeping extended hours.

Companies like https://comma.ai/ are taking a much better approach IMO: keeping it simple by first perfecting lane assist/highway driving, plus building a driver-monitoring device that alerts drivers when they stop paying attention for x amount of time. That's something Uber should have been investing in for its test drivers.
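
The core alert logic for such a monitor is simple enough to sketch. Everything below (names, thresholds, the gaze_on_road() signal) is hypothetical, not comma.ai's actual implementation:

    import time

    # Hypothetical sketch: escalate alerts the longer the driver's
    # gaze has been off the road. gaze_on_road() stands in for a
    # camera-based attention model; thresholds are made up.
    DISTRACTED_WARN_S = 2.0    # soft chime
    DISTRACTED_ALARM_S = 4.0   # loud alarm / disengage assist

    def monitor(gaze_on_road, beep, alarm, poll_hz=10):
        distracted_since = None
        while True:
            if gaze_on_road():
                distracted_since = None   # attention restored, reset timer
            else:
                now = time.monotonic()
                if distracted_since is None:
                    distracted_since = now
                elapsed = now - distracted_since
                if elapsed >= DISTRACTED_ALARM_S:
                    alarm()
                elif elapsed >= DISTRACTED_WARN_S:
                    beep()
            time.sleep(1.0 / poll_hz)

The important design choice is the escalation: a soft chime first, then a hard alarm, so brief glances away don't train drivers to ignore the alerts.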

Another important thing is being realistic about expectations: of course paying 100% attention is unrealistic even for normal drivers, and accidents will happen regardless. Hitting a jaywalker on a dark, multi-lane, high-speed road is a lot less bad than other possible scenarios, and there really haven't been that many accidents yet.


> The question then is whether there was proper training and communication to the test drivers that it's never okay to look down at your phone.

I would be surprised if it involved anything more than an easily-slept-through safety video. It should have come with periodic checks for distraction.


This makes me wonder how Uber got clearance in the first place. Did a regulatory body, like the NHTSA, need to sign off on these self-driving cars before they could be driven on public roadways? If so, how did they miss this during that certification process? Surely emergency situations are at least discussed, tested, and reviewed?



