Hacker News

If Tesla autopilot relies on good quality road markings, then it's not yet usable in 80% of the world.



It's not usable anywhere. What if I draw a fake line on the road? Will it follow it? What if the paint is wet and another vehicle smudges it diagonally across the lane?


When I was a kid there was a cartoon where a truck painting the white line in the middle of the road drove off a cliff for some reason (and therefore, so did the line).

The punchline was that all the other cars also fell off the cliff because they were following the line instead of looking at the road.

It's crazy to think that's where we are now with so-called self-driving cars.


When I was a kid there was a computer game called Lemmings...

What if all the connected, auto-updating self-driving cars suddenly learn to follow each other off a cliff in pursuit of an optimisation gone wrong?

It would be ironic if the Butlerian Jihad[1] were kicked off as a result of Elon Musk's machinations.

1. https://en.wikipedia.org/wiki/Butlerian_Jihad


Exactly. This is what I’ve been saying. It is so dangerous. I love Tesla but I think autopilot is currently a death trap.


Naive centering certainly is. That's exactly what caused this behavior. Just about any scenario where the car tries to split the difference between widening lane lines has the potential of causing this problem.

To make matters worse, it's trivial to run across a case where the road curves AND splits, causing the 'neutral' path to go straight into a barrier… which is exactly what's seen in the video. The straight path would be toward the split-off road, but it veers slightly to the right before splitting.

Makes me wonder if GPS can play a role in determining whether the lane markings are liable to seduce the autopilot into splitting the difference. Given established roads, the car should have some clue this scenario is approaching. If we can see a 'V' approaching on a simple freaking Garmin, the car ought to be able to have that information.
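None of this code is from the thread, but the failure mode described above (naive centering between diverging lane lines, and a map hint that could override it) can be sketched in a few lines of Python. The numbers and the `split_ahead` flag are hypothetical, not anything Tesla actually does:

```python
LANE_WIDTH = 3.7  # typical freeway lane width in metres (assumed)

def naive_center(left, right):
    """Split the difference between the detected lane lines."""
    return (left + right) / 2.0

def center_with_map(left, right, split_ahead):
    """If map data says a lane split is ahead, hug the through-lane
    line instead of centering between diverging lines."""
    if split_ahead:
        return left + LANE_WIDTH / 2.0
    return naive_center(left, right)

# Lateral position (m) of the lane lines at successive points along the
# road: the left line stays put, the right line veers off at the split.
left_line  = [0.0, 0.0, 0.0, 0.0, 0.0]
right_line = [3.7, 3.7, 5.0, 7.0, 9.0]

naive  = [naive_center(l, r) for l, r in zip(left_line, right_line)]
mapped = [center_with_map(l, r, r - l > LANE_WIDTH)
          for l, r in zip(left_line, right_line)]
# naive drifts from 1.85 m out to 4.5 m, i.e. straight at the gore point;
# mapped stays at 1.85 m once the split is flagged
```

The point is only that the "split ahead" bit is cheap to derive from map data the car already has, exactly as the Garmin comparison suggests.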


> I think autopilot is currently a death trap.

I think this is a bit of an exaggeration. As long as the driver keeps paying attention and uses it as a driving aid, not a driver replacement, everything is fine. It's the moment people start relying on it to do something it wasn't built for that the problems arise.


> I think this is a bit of an exaggeration. As long as the driver keeps paying attention and uses it as a driving aid, not a driver replacement, everything is fine. It's the moment people start relying on it to do something it wasn't built for that the problems arise.

I think this is a bit too optimistic. People will start relying on it to be an autopilot. I think most people see this as the desired result: a car that can drive itself. Do you really think anyone wants "a car that will drive itself while you are paying full attention to every detail"? Otherwise, what is a driver really gaining from this? They're still expected to pay just as much attention (if not more), and I'd bet it's even more boring than regular driving (no interaction from the driver means it's like the world's most boring VR movie).

Humans are not machines; we love to find the lazy/easy way, and we would rather do things than stare at the road. Eventually people will grow complacent (hopefully not before the tech is up to snuff).


> Do you really think anyone wants "a car that will drive itself while you are paying full attention to every detail"? Otherwise, what is a driver really gaining from this? They're still expected to pay just as much attention (if not more), and I'd bet it's even more boring than regular driving (no interaction from the driver means it's like the world's most boring VR movie).

EXACTLY. They're at even more risk of an accident due to inattention, because it's so hard to focus on doing nothing.


This puts Tesla at odds with itself.

The feature is named Autopilot after the flight systems in which the pilot doesn't need to be constantly hands-on. It is marketed as an autonomous driving system that allows the car to drive safely without human intervention.

In the same breath, Tesla says that it isn't really an autopilot system, and that even though they market it as autonomous driving, it is still essential that you act as if you were driving.

As you've stated, it's a tool for assisting drivers. So why do they market it as fire and forget?


They shouldn't market it as "autopilot".


Volvo does it too, but nobody seems to be complaining about Volvo.


Volvo calls it Pilot Assist when you actually have the feature on your car.


Volvo's tech hasn't killed anyone yet.


I've seen construction crews paint new lines on the road e.g. to reroute traffic due to one lane being closed for expansion. A human driver can easily tell the difference between the fresh new line they should follow and the worn out old one. Can a Tesla?


> Can a Tesla?

My 2015 S 70D can't, but then again the problem only arises if I ignore the roadworks signs and keep driving on autosteer in circumstances where the UI has clearly told me not to.


I have a pretty simple lane keeping aid in my car that keeps me within the lines. Yesterday it suddenly steered towards a wall. I think it may have been confused by either glare (low sun right in front of me) or a shadow line on the road.


I have it too; it makes some jolts from time to time. It will also start blaring about ten seconds after I stop providing any steering input, though. It's lane ASSIST, not automatic driving.


What make & model?


Opel Astra. It also has a 'bug', where at a certain location the lines on the road trigger the emergency braking feature, every single time.


> It also has a 'bug', where at a certain location the lines on the road trigger the emergency braking feature, every single time.

Ouch. That's really harsh on following traffic. Now, obviously they shouldn't be following closely enough for that to be a problem, but in practice I'm pretty sure that if I stomped hard on the brake a couple of times per day with regular traffic following, it wouldn't take long before someone rammed into the back of my car, and if not mine then into the car behind it.

It's also a great way to start traffic jams.


If you draw fake markings on the road, people will follow them, especially if the sun rises parallel to the road.

I have personally watched this happen. Last year (I think?) they were repainting the lines on Interstate 64 near Short Pump, and for about a week that summer my commute there was insane: just a bunch of cars going 70 MPH with no real lanes.


This isn't new, but what if it snows? I live in Calgary and snow covers our roads for about 30-40% of the year.


Your question would be answered if they called it driving assist and not Autopilot. Then you'd understand that you would not turn it on in snow, just like cruise control.


If my 2015 S 70D can't see enough to tell that there is a lane, then it won't allow you to turn on autosteer. Generally, if the road is covered in snow it won't turn on, and it will also turn off if the lane markings disappear.


I wonder how it works when there is some road work, and there are multiple overlapping lines at once.


IMO it should not: it should sound an alarm, have the driver take over, and shut down.
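That fail-safe isn't anything any manufacturer is confirmed to ship, but the alarm-then-handover behaviour being proposed can be sketched as a tiny state machine; the confidence threshold and grace period below are made-up numbers:

```python
from dataclasses import dataclass

ALARM_THRESHOLD = 0.5   # hypothetical lane-detection confidence floor
GRACE_FRAMES = 10       # consecutive low-confidence frames before handover

@dataclass
class LaneKeeper:
    low_frames: int = 0
    engaged: bool = True

    def update(self, lane_confidence: float) -> str:
        """Return the action for this frame of sensor input."""
        if not self.engaged:
            return "disengaged"
        if lane_confidence < ALARM_THRESHOLD:
            self.low_frames += 1
        else:
            self.low_frames = 0  # good markings reset the counter
        if self.low_frames >= GRACE_FRAMES:
            self.engaged = False
            return "alarm_and_handover"
        return "steer"
```

Feeding it a run of low-confidence frames (e.g. overlapping roadwork lines) makes it steer for a short grace period, then alarm and refuse to steer until re-engaged.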


I don't think fake lines are a genuine concern. What if somebody plants anti-personnel mines under the road? What if your Tesla just assumes it's a normal road and drives over them?


I agree. I think this technology needs smart roads to succeed, whatever that entails.



