Hacker News
[flagged] (Tesla) has twice attempted to drive directly into a train while in FSD mode (twitter.com/artemr)
78 points by Animats on May 20, 2024 | 83 comments


When Ken Klippenstein obtained footage of an FSD Tesla causing a major accident [1] and wrote about it, he got blocked from being found through X/Twitter search [2]. I remember verifying the search block on my phone at the time.

If I were a US lawmaker, I'd take a moment to reflect on whether a company such as X/Twitter, which actively censors journalists to protect the financial interests of its owners, should really be granted Section 230 protection.

[1] https://x.com/kenklippenstein/status/1612848872061128704 [2] https://x.com/kenklippenstein/status/1604574747945275396


The whole point of section 230 is to let sites moderate user generated content without becoming responsible for what is in that content. What Twitter is doing falls squarely within the letter and the spirit of section 230.


I'm not arguing that X/Twitter is doing something that prevents it from being protected by section 230 as it exists. But, when Congress granted these protections, I think they had in mind things like forum moderation to remove hate speech, not selectively censoring journalists that write critically of a company. So one could rethink whether companies that censor journalists should be protected.


I don’t understand why self-driving systems don’t have any kind of fallback protections. Like however hard full self driving is to get right, surely a hard rule for “don’t drive at full speed into solid objects, no matter how good an idea it seems at the time to the AI” is not that hard to implement?


Tesla uses only cameras for FSD, not lidar, so it can't tell what's a solid object and what's not. Driving in foggy conditions where everything is fuzzy to the camera isn't helping.

I'm surprised FSD stayed engaged, and I'm also surprised the driver thought it was a good idea to use FSD in this weather.


These types of comments are unhelpful. The expectations on the driver are getting vaguer and vaguer. Can't drive in damp weather? Yet it's Full Self Driving? Tesla wants to market it as powerful AI but leave all responsibility with the driver; don't encourage that narrative.


I'm not a fan of Elon's branding either, but if you've ever actually driven a Tesla, it's very obvious very quickly that "FSD" is not fully autonomous driving, and Tesla itself festoons "Beta" all over the place. So, yes, the driver deserves some responsibility for relying on it in manifestly unsuitable conditions.


100%. Whatever the actual limits, flaws, and failings of FSD, the outsize criticism of the branding is crazymaking. As I've commented in many of these threads, no one who actually uses the product for even a short period of time is deluded by the labeling into thinking it is fully autonomous.

My pet theory for some of these dramatic fails: the driver is not risk-averse, and knowingly uses FSD at some limit "to see what happens" (I know I do it from time to time, though not in anything truly risky).


It turns out that if you loudly proclaim something is "Fully Self Driving", with a tiny asterisk, people assume it's fully self driving and engage it in all sorts of places where they shouldn't.

And that fault lies squarely with marketing, and much less with individual drivers.


Note that "not disengaging" is one of the critical differences between L2 and L3/L4 systems. The latter are required to disengage when they're outside their safe operating domain like this.


In foggy and rainy conditions, only radar will help you.


Great, sensor fusion should work well then. I’m not sure I see the reasoning behind avoiding lidar and other sensors now.


I guess Elon is just hell bent on making it work with a bunch of cameras, because that's the closest to what humans have.

That logic is flawed and will not get us cars which are an order of magnitude safer than driving manually.


When lidar/radar gets to commodity pricing, I'm certain Tesla will reverse course on "vision only". As it is today, FSD is very impressive, though not autonomous in all conditions.


Why does everyone parrot Musk? What he says may not be the real reason for things.

> I guess Elon is just hell bent on making it work with a bunch of cameras, because that's the closest to what humans have.

A smarter guess is that he doesn't want expensive lidar systems in his cars, so Tesla can make more money. Stop taking Musk's words at face value.


The reasoning is that Tesla wanted to spend less money so they could make a higher profit.


Lidar does not work in fog. Did you mean radar?


Camera-only perception systems don't have any indication of what is solid vs. what is not. This is why Cruise and Waymo use LiDAR. They have exactly what you're talking about. Even then it's hard -- consider soft vegetation or trash.

Furthermore: tailgater collisions are a thing. You can't just stop on a dime, especially on fast roads (even just 45mph roads).


I’m always amazed when people talk about using Tesla self driving features. Every time I get in a Tesla the driver has the screen showing this little block version of how it perceives what’s around it. It always misses stuff, cars jump in and out, things randomly vanish. Maybe that UI is buggy or doesn’t really show what the self driving would use, but I can’t imagine seeing that and then wanting to let the car drive itself.


There's a probabilistic nature to how the image sensor data is interpreted.

The neural network processes data to classify objects and predict what each object is. It doesn't always have a perfect static representation of each object. It's constantly updating its understanding based on the image sensor data. That jiggly jitter you see is basically the system refining its prediction in real-time.

Or at least that's how I understand it...


Other companies have put quite a lot of effort into perception stability because it has a large effect on the downstream quality. It's hard to estimate higher order derivatives like velocity and acceleration well if your position estimates are unstable.
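
As a minimal sketch of why that matters (hypothetical numbers, plain NumPy, nothing from any actual stack): naively differencing a jittery position track amplifies the noise, so the velocity estimate swings far more than the positions ever did, while even crude smoothing calms it down at the cost of lag.

    import numpy as np

    rng = np.random.default_rng(0)

    dt = 0.1                     # camera frame interval in seconds (assumed)
    t = np.arange(0, 5, dt)
    true_pos = 20.0 * t          # object receding at a constant 20 m/s
    noisy_pos = true_pos + rng.normal(0, 0.5, t.size)  # +/-0.5 m detection jitter

    # Naive velocity: finite difference of the jittery positions
    naive_vel = np.diff(noisy_pos) / dt

    # Smoothed velocity: run a simple exponential moving average on position first
    alpha = 0.2
    smooth_pos = np.empty_like(noisy_pos)
    smooth_pos[0] = noisy_pos[0]
    for i in range(1, noisy_pos.size):
        smooth_pos[i] = alpha * noisy_pos[i] + (1 - alpha) * smooth_pos[i - 1]
    smooth_vel = np.diff(smooth_pos) / dt

    steady = slice(15, None)  # skip the EMA warm-up transient
    print("true velocity:             20.0 m/s")
    print(f"naive velocity std dev:    {naive_vel[steady].std():.1f} m/s")   # several m/s of jitter
    print(f"smoothed velocity std dev: {smooth_vel[steady].std():.1f} m/s")  # much calmer, but lags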


> That jiggly jitter you see is basically the system refining its prediction in real-time.

GP is saying that a "car that jumps in and out" and "things that randomly vanish" do not look very refined. Just like missing a freaking moving train doesn't look very refined.


Lidar has the same problems with foggy conditions.


Actually, no. Look up videos of "range-gated imagers". With multi-hit data for each scan point and appropriate processing, you can see through fog. This technology has been around for years but not talked about much outside military applications.


Thanks for that, it makes sense. Same techniques are available in radar. This will give you a higher noise floor and also lower signal if fog sits between you and the subject. It probably does improve the situation over visible light, but only up to a point.


Agreed, thus the need for radar. Thermal imaging can also see past fog fwiw, though it has other tradeoffs.


The biggest challenge is identifying said solid object. It is only within the last five years that cars with radar have been able to detect stationary objects with some level of reliability; before that, if it wasn't moving, there was a good chance your Adaptive Cruise Control wouldn't react to it, because otherwise these systems had a tendency to stop for the bin on the side of the road, or for parked cars.

A train is a particularly challenging problem: since it is moving at high speed, even with two cameras it is hard to really figure out how far away it is. Just because it is easy for humans doesn't mean it is easy for computers, and vice versa.


If only there was some sort of technology that could directly measure the distance to objects. Perhaps as some sort of point cloud.

Guess we'll never know.


Here’s the problem: Tesla only uses camera input - no lidar or other sensor tech. So it’s a visual model ensemble that needs to decide what a solid object is.

Models don't reason. They classify and predict. If the prediction is wrong, explaining why is pretty hard, or maybe impossible. So you just have to iterate on the data over and over to try and fix it. It might work, and it might not.


My understanding is that it's actually quite difficult to implement a rule for "don't drive at full speed into solid objects" because solid objects are somewhat difficult to reliably identify.


Actually Tesla has solved the identification problem. “If we collide with it, it was solid” Has been working flawlessly for them.


That video also makes it look like they were driving insanely fast on a foggy day, quite possibly with the speed offset set way higher than the speed limit. A human could easily have missed the train at that speed as well.

During fog your speed limit should be determined by your braking distance and visibility. Period.

I wonder whether the autopilot would have performed better if travelling at a more reasonable speed.


He doesn't seem to be driving that fast; he drove at a constant speed and only stopped at the very last second. Had he been driving too fast, he would have gone right into the train and probably wouldn't be here to post on Twitter.

The car didn't slow down even as the train was clearly visible. An attentive human would not only have seen the train but also the very obvious flashing lights, and would have had absolutely no problem stopping in time.

EDIT: From the link, it seems FSD was set by the car to about 61~63 MPH on a 55 MPH road, which is the speed most people drive at ( https://teslamotorsclub.com/tmc/threads/ap-fsd-related-crash... ). Considering the conditions, that is a little fast, but not unreasonably so. It took about 5 seconds from the train becoming visible to the crash, most of it at constant speed, so a distance of about 140 m. At that speed, stopping distance is about 100 m, enough to stop in time. And that's for the unlit train; the lights can be seen from much further away, as they are already visible at the start of the video.
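
A back-of-the-envelope check of those numbers (a sketch only; the ~1.5 s reaction time and ~0.7 g wet-road braking are illustrative assumptions, not measured values):

    MPH_TO_MS = 0.44704

    speed = 62 * MPH_TO_MS        # ~27.7 m/s, roughly the reported set speed
    reaction_time = 1.5           # s, assumed perception/reaction delay
    decel = 0.7 * 9.81            # m/s^2, assumed braking on a wet road

    reaction_dist = speed * reaction_time           # ~42 m travelled before braking starts
    braking_dist = speed**2 / (2 * decel)           # ~56 m to scrub off the speed
    stopping_dist = reaction_dist + braking_dist    # ~97 m total

    sight_dist = speed * 5                          # ~139 m: train visible ~5 s before impact

    print(f"stopping ~{stopping_dist:.0f} m vs. ~{sight_dist:.0f} m of sight distance")
    # -> stopping ~97 m vs. ~139 m of sight distance, so braking in time was possible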


You can see the flashing light from way back. You would also probably hear the train.


It's possible, maybe even likely, that if such a safeguard were implemented, the self-driving system would be unusable, as it would trip the safeguard either way too frequently or at unacceptable times.


I think the problem is those hard rules end up causing problems in more cases than they solve problems.

E.g., driving into an apparently 'solid' cloud of smoke from the truck in front might be a good idea.


Obviously they have that. The issue is that they don't recognize solid, stationary objects all the time. Which you'd think would be easy, but turns out it isn't.


I think that's a good idea in theory, but it still relies on the car knowing when it's looking at a solid object with 100% accuracy.


The fallback is the driver doesn't let it drive at 60 mph up to 50 feet from a train blocking the road.


I predict Tesla will claim that FSD disconnected 5-seconds before impact, therefore it is the driver's fault entirely. Or some vague thing about FSD giving an attention warning within the 15-minutes before impact.


FSD? More like Full Self Blaming!


I don't get why people even consider driving hands-off on any of the autopilot-like technologies around. Maybe in the cleanest possible conditions on a highway in slow moving rush hour traffic; and even then hands on the wheel expecting the worst. Do folks really have that much faith in technology? :-O


Isn't it because it is advertised as "Full Self Driving"? Not everyone is an expert on AI or follows auto news in detail.


The name "Autopilot" for their lane-keeping feature clearly relies on most people not knowing how autopilot works in planes, because it's heavily implied to be autonomous driving and it's not. It's just a Tesla scam.


THIS. If you asked a bunch of pilots about the capability level they'd expect from something called "autopilot", then compared with the answers you'd get from a bunch of Joe Averages...yeah. When people don't know squat, and they hear some Marketing-speak, and they'd like to believe - the next thing you know, they've convinced themselves that Santa's magic elves are making it all work, and nothing bad could ever possibly happen.


I don't think it's clear they are relying on people not knowing how autopilot works on planes, because I don't think most people even know autopilot on planes exists. 10-15% of the US population has never been on a plane.


AFAIK pilots don't need to hold the yoke for autopilot to work on planes, but this is required in Tesla.


I have been doing an experiment for a while now where I rent different EVs for a day from Turo to see if there is one special one that will make me give up my favorite car that I have had for years (06 Mazda 3)

Rented a Tesla for an entire day yesterday and I see the appeal. I only tried autopilot on the highway with straight or curvy lanes. Handled it perfectly and really helped cover like 70% of my road trip (the part with straight roads for miles)

Before this, I rented a Ford Mach-E about a month ago with the latest BlueCruise and did the same test; it was constantly "ping-ponging" back and forth in the lane and essentially gave me no confidence whatsoever in its abilities.

I then tested a Bolt EUV after the Ford but before the Tesla. Unfortunately I wasn't able to get a Bolt EUV with Super Cruise, but at this point it doesn't matter: Super Cruise is on the Bolt EUV, which isn't comparable to the Tesla in other areas, and all the other GM models that have it blow past my budget.

Autopilot is really a relaxing system when you have been driving for hours. It politely nags you what seems like every 5 minutes, so it's not like I can fall asleep (at least I wouldn't dare do that), but it really takes some cognitive load off.

Was also raining heavily for some of the day and Autopilot detected this and made it clear on the screen to be extra cautious. Seemed to handle it perfectly fine but I was overly cautious anyway.

My assessment at the end of the day is that given Tesla's excellent software compared to Ford and Chevy, their amazing charging experience compared to EVgo/ChargePoint, and the overall Autopilot system, their cars are in a league of their own compared to other EVs, and it's just a waste of time to even consider the others.

I am still not sold on EVs vs getting another gas car, but Tesla jumped to the top of my list for an EV. Like I mentioned, I decided I'm likely to not even waste my time and money trying VW and Hyundai (they were next on my list to test). Not sure yet though.

Wish we could get at least the basic highway version on all cars. I think it would be a big positive for so many people. I never really believed until I finally tried it out for an entire day. People NEED to give it a day of actual testing before making an assessment.


I've been pretty happy with the ADAS system on my Mercedes E-Class. Over time I've learned what it does well and what it doesn't do well. But yes, of course, hands on the wheel 100% of the time, same amount of attention as if I were driving fully manually.

I see it as mainly a way to avoid leg/foot fatigue, especially in heavier traffic, and also as a sort of backup. I think of myself still as the "primary system", but the ADAS will help me stay in lane if I get distracted on a long drive, or if I don't notice the car in front of me braking hard. I haven't needed it to save me from anything (yet, knock on wood), but it's nice to have it there.

It is absolutely bonkers to me that some people think these sorts of systems allow them to not pay attention, take their hands off the wheel, do things on their phones, etc. Even Tesla's so-called "FSD" (with the scariest of scare quotes) isn't an excuse for that sort of thing.


I would not trust something that drives as fast as that car does in that type of fog.

But I do trust my openpilot device, though a large part of that is that I can actually see on the screen what it intends to do.


As far as I'm concerned (and I know it's not entirely feasible), either all cars on the road should be on autopilot or none should be. And until then, if someone is driving on autopilot, there should be a big-ass flashing indicator light that I can see, to warn me that extra caution needs to be taken around that car.


The Mercedes-Benz Drive Pilot system has special warning lights when operating in autonomous mode. It is generally considered to be technically superior to Tesla's system in most respects.

https://group.mercedes-benz.com/innovation/product-innovatio...


Definitely a step in the right direction, but it needs more flashiness.


I'll use the autopilot in regular traffic on highways only. But I'm also aware that glare will mess with the cameras in my Mach-E.

Also, the driver attention system warns me if I'm not paying attention to traffic.

I.e. know when to use it and be aware of its limitations.


I use it to take over while I take my eyes off the road for 1-3 seconds to adjust the GPS navigation / radio, or plug my phone in for charging when it's otherwise safe-ish.


99% reliable looks very convincingly like 100% reliable. People get comfortable with things pretty quickly.


Yes, and every halving of that 1% error rate is more than twice as hard to achieve with neural net training. At some point, more principled control algorithms are needed.


Because for all the angst and outrage at HN, ask yourself how many people do you personally know that have been killed by Autopilot?


"It hasn't killed any one I know" is a deeply inadequate standard. There's a lot of monstrously deadly things that haven't killed anyone I know.


I mean I don’t know anyone who’s been killed by a THERAC-25, either, but I wouldn’t be in a hurry to go under one.


I've been reflecting on Musk's "roads are designed for two eyes" quote. I'm not sure of the exact wording, but he used it to justify dropping lidar and focusing on front-facing cameras.

What got missed, I think, is that there are many cases where having only two eyes fails drivers. The most dangerous driving most of us will ever do is because of some quirk of the landscape, or some long-forgotten legal battle over where the road should be, or because people are cramming more and more in to very limited space.

An excellent example that comes to mind is a steep hill leading up to a dense downtown. At the very top of the hill is an intersection with a two-way stop sign. You can't see to the left because of the edge of a building, and the hill is steep enough that you have to kind of stomp on the gas to get up it after you stop. "Two eyes" utterly fails us here. If there were cameras facing left and right on each corner of the vehicle, the problem would be solved, as you could watch the oncoming lanes while stopped.

Here too, two eyes fail us. You have to already know about the train crossing, or hear it, or intuit its presence some other way, because by the time you see it, it's so late that you need to slam the brakes to avoid hitting it.


The red lights at the crossing are very clear, even in the fog.

In any case, one should never drive so fast they can't stop within the visible distance.
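
To put a rough number on that rule, you can solve reaction distance plus braking distance equals visible distance for speed (a sketch assuming ~1.5 s reaction time and ~0.7 g braking, both illustrative guesses, not measured figures):

    import math

    def max_safe_speed(visibility_m, reaction_s=1.5, decel=0.7 * 9.81):
        """Largest v such that v*reaction_s + v^2/(2*decel) <= visibility_m.

        Solves v^2/(2*decel) + v*reaction_s - visibility_m = 0 for the positive root.
        """
        a, b, c = 1 / (2 * decel), reaction_s, -visibility_m
        return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # m/s

    for vis in (50, 100, 150):   # metres of visibility in fog
        v = max_safe_speed(vis)
        print(f"{vis:3d} m visibility -> at most ~{v * 2.237:.0f} mph")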


> "I have owned my Tesla for less than a year, and within the last six months, it has twice attempted to drive directly into a passing train while in FSD mode.

After the first time, I would hope the victim of this reckless product&marketing would go to a lawyer. (Maybe in a cab.)


Or at least stop using the ‘feature’, bloody hell.


“no way can this happen twice”


Totally crazy that someone drives that fast with fog like this, knowing that there is a rail crossing there.

No matter what kind of autopilot I have, I will be always careful around those places.

It also does not make any sense that Tesla does not mark rail crossings on the map as places of extreme danger that make the vehicle slow down automatically. Rail crossings, like road crossings, are death traps. They should not exist; they are bugs to be solved with roundabouts, bridges, or underpasses.

It also does not make sense that Tesla could not detect the extremely bright lights indicating danger.


I wouldn't be surprised if their working culture got rid of their most talented people. It is literally a burnout factory. That can easily be the source of all downstream problems.


Surprised it allows FSD in such poor visibility conditions. Surely they need to do some work on a visibility confidence measure, so that FSD disables itself when it is not confident in how far it can see.


If it knew that it would also be able to drive by itself.


Autosteer (lane following) will disengage if it can't see lane markings. I'm genuinely surprised FSD was active in these weather conditions (and as other posters have suggested, it may not be FSD at all).


The problem AIs have is that they are often confidently incorrect.


That sounds definitely untrue.


All the recent discussions and controversy around Autopilot made me wonder if there is anything to be learned from the OG self-driving car: the Amish horse and buggy. I grew up in a community where the Amish were quite prevalent, and there was an old and popular saying when an accident involving a car and a buggy happened: "it was the horse's fault". Not sure how often horses walk into passing trains, but I'd assume it's not very frequent.


The speed is an advantage there from a safety point of view. You can get away with a lot with slow-moving vehicles.


This happened once before, and the driver still wasn't paying attention at as obvious a hazard as a train crossing? Not sure I trust them to distinguish between Autopilot and FSD. Although of course FSD is not actually capable of driving without supervision, so it could easily be true.


Interesting how anything related to Tesla never lasts more than a few minutes on the front page.


Tesla is definitely at fault here, but in my experience, when camera visibility is very poor (only a few meters ahead), the driver's visibility is even worse. I've been in situations with heavy rain where I couldn't see more than a few meters ahead, yet the front camera's recording showed more than what I could see with my own eyes.


Perhaps it's time they consider lidar? Elon hasn't seen the WHY yet?

Tesla CEO Elon Musk has stated that cameras are the only sensors self-driving vehicles need, and that Tesla's electric vehicles do not use lidar. Musk has previously called lidar "expensive appendices" and "a fool's errand". He has also said that any company that relies on lidar for its autonomous capabilities is "doomed".


Now, try being an engineer who attempts to "correct" him. Musk's ego is so fat that if it walked in front of your TV, you would miss the whole season.


A lot of these "FSD" failures end up being Autopilot w/ lanekeeping.


Discussed in many places. This is the original video.


Fool me once, shame on you. Fool me twice, shame on me.

Cause maybe the third time, someone's going to be a railroad statistic.


lol, you trust Elon Musk and his P.T. Barnum salesmanship with your life? Really, any self-driving technology for that matter... I've recently seen two Waymos driving on the wrong side of the road, in San Francisco (https://www.reddit.com/r/waymo/comments/1c9oefx/waymo_going_...) and in Phoenix (https://www.12news.com/article/traffic/wrong-way-waymo-arizo...).

I'm not against the technology; it's just in its infancy, and those using it are paying tons of money to be guinea pigs so progress can be made.



