
When automation does a 95% job, sometimes it isn't worthwhile using it because of the overrides required for the remaining 5%. If using driving assist requires your full concentration, it might actually be easier to just drive the car yourself; otherwise you'll struggle to maintain the ability to intervene immediately when required.



When you deliberately rig your car so the driver's seat can be empty, yeah, that extra 5% suddenly becomes a very tall mountain.

Sensible people can work with the software so the two complement each other. People make mistakes, the software makes mistakes; together they make fewer mistakes.

If you start watching shows or playing games on your phone, or sleeping, that won't happen.


Let's say your car has a problem with a slip road on a particular stretch you are about to hit in about 30 minutes. On autopilot, it will start to take the slip road but confuse the hard shoulder with a lane, crashing into the barrier. If you have spent the last 30 minutes trying to stay awake because you have had practically no input into the driving, you might not be alert enough to avert the accident. If you had been driving, not only would you be alert enough, but you also wouldn't have been in that situation in the first place.


So basically like how a rich parent supporting their children often leads to them being unsuccessful in life, becoming drug addicts, and so on. I know, bit of a jump in argument there. But that's where my train-of-thought autopilot brought me.


Yes, that's probably a good analogy. I think the problem is that the current systems require being able to take over the controls immediately. If this is required on more than an insignificant number of occasions, with potentially fatal outcomes, it may be an unreasonable thing to ask of a human who hasn't been driving recently, is not engaged with the road situation, and is likely to be distracted, not concentrating, or getting sleepy.


Maybe, but let's be clear that right now investigators think this driver was in the back seat. These are interesting conjectures, but not at all related to the crash at hand.


Oh sure, if that's true, then this case doesn't really show anything. But I was talking more generally about the term 'autopilot' and its benefits. My example used a 2018 crash where the autopilot half-took a slip road [1]

[1] https://www.carthrottle.com/post/tesla-defends-autopilot-aga...



