Tesla drivers say new FSD update is repeatedly running red lights (futurism.com)
74 points by ra7 on Aug 28, 2024 | 105 comments


Anybody buying FSD is a fool, and you know what they say about fools and their money. It blows my mind that some consumers would spend $8K on a feature without doing the 5 minutes of research it would take to determine that FSD isn't here, and won't be here for years.

I own a Tesla, and I chose not to buy FSD because in 2019 when I bought it, it simply didn't work based on what I had read. Earlier this year, when Tesla gave a trial to all Tesla owners, I tried it out and was surprised at how good it was, but it still made bad mistakes, one time even curbing the rear wheel while navigating a curve to the right [0], despite knowing exactly where the curb was.

The problem is that you still have to babysit it. And if you have to babysit a self-driving feature, then it's entirely useless. It becomes nothing more than a party trick to show off to friends about how neat it is.

Autopilot is great. Love it, especially on road trips. But until FSD is good enough to drive my drunk ass home from a party and even park in my driveway, I'm staying away (and staying sober!).

[0] Here's the curb it hit: https://maps.app.goo.gl/BLeqSywhXHRyf7a39. With how wide that lane is, there's no reason it should have cut it so close that it hit the curb. There wasn't even anybody next to me.


I am so bitter about my FSD purchase in 2019. At the time, there were some enhanced autopilot features and the rest was promised by the end of the year. I occasionally try it out and it's so bad. It feels completely unsafe. FSD has left such a bad taste that I almost never use even basic autopilot. I feel like I was conned out of $6000. I will never buy another Tesla as long as fraudster Elon is at the helm.


I got mine for only $2000 and I feel conned. Nonstop empty promises, suddenly removed from the site as the date comes and goes. e.g., "Your car will be able to drop you off, park, and come get you by the end of the year. Really."

That was two years ago at least.


Remember that bullshit demo video where it drops the guy off in front of the office and parks itself? I don't understand how Elon isn't behind bars by now. How many staged demo videos have we seen? How many promises? These silly investor presentations where he makes wild claims? Robotaxis? The Cybertruck being built for Mars? It's one lie after another.

The amounts by which he's inflated the company's value and his own net worth with his lies and scams dwarf anything Bernie Madoff did. He's lied to squeeze short sellers who are onto his bullshit. He ran that Dogecoin pump and dump. The scale and audacity of his schemes are truly epic.


That's why Musk supports Trump. Two conmen cut from the same cloth.


It's the only logical explanation.

Otherwise, Musk's hard-right turn just makes zero sense. People on the right generally don't want EVs, while the people on the left do. He alienated his largest potential customer base for what reason? FFS, Trump even said he'd ban EVs (Though we all know it won't actually happen) and Musk still supports him!

I'm convinced Musk just doesn't care about Tesla anymore. Cybertruck release has been an utter disaster and he has no desire to make it right.


It really helped me as an investor, which I am no longer. :)

But yeah, I dumped it during the robotaxi """news""" knowing it's another pump. It's really a meme stock and personal piggy bank for Elon at this point. And I don't want to fund fascism any longer. And you're right, how hasn't there been an Elizabeth Holmes-like arrest?


I feel the same. I enjoyed the FSD trial but it confirmed that I was right not to pay any money for it. It drove me all the way home successfully on a route I thought for sure would require my intervention. That was cool. But then one day it randomly decided the bike lane was the correct lane. That was funny feedback to give when it asked why I disengaged ... "You decided you are a bike but you are most definitely not."

What's weird is that it was awful on the highway. I once rented a 2018 Model 3 with FSD before I bought my first Tesla, and it was delightful on the highway. It could switch lanes and everything. But the current FSD ... I tried to tell it to switch lanes, and it got halfway between the lanes and then changed its mind. Didn't disengage, just sat there for a bit before deciding to wander back into the lane I started in. Another time, right after I engaged it, it decided to switch lanes (successfully this time, at least) but right in front of another car coming up on us. And there wasn't anybody in front of us to begin with. WTF.

AP is fine. I'll keep that. Though I'd like to have the dumb cruise control available as an option, because sometimes the adaptive cruise is a bit psychotic.


I know people who like it but the expectations need to be totally different from the marketing. You're spending $10K or whatever for a better (in many cases) system that still requires you to pay attention in general.

As MIT's James Leonard said almost 10 years ago now, he didn't expect to see a true self-driving system in his lifetime. Arguably it's getting closer in a few areas. But the idea that the typical teenager won't need to learn to drive is still a fantasy in most places, barring significant compromises.

And there's probably a contingent of people who feel very betrayed because they're not getting the reasonably near-term future they thought they were promised.


In less than a year, you will be able to go from east coast to west during your sleep. That was said in 2016 IIRC. 8 years later we’re still waiting for it, and it’s not anywhere close.

You are an enabler of this fraud. Great for you that you love it.


> In less than a year, you will be able to go from east coast to west during your sleep. That was said in 2016 IIRC. 8 years later we’re still waiting for it, and it’s not anywhere close.

FWIW, I didn't believe it then, and I don't believe it now. Even ignoring needing to get out of the car to plug in, it can't park itself at a charger. It can barely navigate parking lots in general as evidenced by the terrible results of Smart Summon (which I did not try during my FSD trial because fuck that).

> You are an enabler of this fraud. Great for you that you love it.

I'm an enabler only in the sense that I bought a Tesla. I did not buy FSD.

And if it's worth anything to you, while I do love my Model 3, I would not buy a Tesla today for reasons far beyond lies about FSD capability timelines.


Fair, but I still don't understand why someone would buy a car from a company while knowing it lies about the car's capabilities and safety to its customers and shareholders.


> Anybody buying FSD is a fool

Anyone treating FSD as anything more than an L3 in beta (i.e. not L3) is a fool. That still leaves plenty of safe-enough entertaining use.

> if you have to babysit a self-driving feature, then it's entirely useless

I don't think so. My Subaru has L2 lanekeeping functions. I absolutely have to babysit it. But it's fun. And it reduces the stress of driving.

FSD is more stressful. But it's also more entertaining, and I don't think materially unsafe if it's being properly monitored.


I feel that simple lane-keeping is different. I mean, that's all Autopilot is: traffic-aware cruise control and automatic lane keeping. It does not navigate. It does not obey stop signs, traffic lights, etc. It won't even change lanes.

Technically, yes, AP requires babysitting, but it's at an entirely different level from FSD on city streets. Highway driving requires less attention than city driving. You're mostly keeping eyes forward with the occasional check of your mirrors. City driving requires lots of looking around, more turns, more lane changes, and overall just a greater awareness of what's happening around you. When driving on FSD on city streets, you still need to keep that heightened awareness, so it's useless.

Highway driving is just drudgery. Maintaining speed and a lane. AP removes the drudgery. Babysitting it just means making sure it doesn't suddenly decide to swerve or abruptly change speed, whereas babysitting FSD means doing the exact same checks for oncoming cars and traffic signals that the car is doing.


Is entertainment value really a good goal for such a system? A key part of entertainment is typically surprise, and that's also normally a bad characteristic of a safety-critical system.


> Anyone treating FSD as anything more than an L3 in beta (i.e. not L3) is a fool.

"I'm extremely confident that Tesla will have level five next year, extremely confident, 100%" Elon Musk, July 2020

https://www.businessinsider.com/elon-musk-interview-axel-spr...


When Musk announces a pre-sale of tickets to Mars it might be a good idea to pass;)


The "Dear Moon" project [1] promised to Yusaku Maezawa by Musk back in 2018 [2] was cancelled in 2023 because he just could not determine when SpaceX would deliver. So don't feel bad about your FSB purchase - it happens to the richest of us;)

[1] https://dearmoon.earth/ [2] https://www.geekwire.com/2018/spacex-elon-musk-bfr-around-mo...


You chose wisely. I got FSD in 2019 and it was frankly terrifying.


Yeah, I drove >4000 miles a few weeks ago, mostly on FSD. Wouldn't do it otherwise. It does dumb shit sometimes but it's a game changer and I won't buy a car that doesn't have autonomy as good as this.

I bought it at full price twice (2020 and 2024) and don't regret either time.


Still ignoring school bus stop signs as well.

https://x.com/bradsferguson/status/1828031824158683439

As it has been for literal years, which has resulted in at least one recorded instance of it literally and knowingly running down a child exiting a school bus:

https://x.com/tesla2moon/status/1770599114494898310

https://www.youtube.com/watch?v=_ZiSZbWIrzA

https://www.youtube.com/watch?v=Ly6Juveo-7Y

Reported by Tesla confirming ADAS usage as incident 13781-5100 in the NHTSA SGO database [1].

https://www.wnct.com/on-your-side/crime-tracker/tesla-driver...

[1] https://static.nhtsa.gov/odi/ffdd/sgo-2021-01/SGO-2021-01_In...


I've failed inspection for a mere bulb being out or a code not being set, but there are people flying around in heavy, fast death traps running beta driving software. It's absurd.


Accidents are not good, but how does FSD safety compare to human drivers in general?

Human drivers run over and kill school children too, so which is actually worse statistically? Human or FSD?

I'll side with the FSD / Elon haters if FSD is significantly worse. Until then I ask questions and remain neutral.


I've tried FSD enough that I can say with perfect confidence that it is orders of magnitude less safe than an average human. Even a drunk one.

Every few miles of driving it would do something that would most likely have caused a fender bender at minimum. That's pretty risky. Compare that to human drivers, who average roughly 1 fatality for every 100,000,000 miles driven. And that's including drunk drivers.

My guess is FSD in the current state would probably average 1 fatality for every 100 miles or so, if you were not closely supervising it. Assuming none of the wrecks along the way disabled the car.
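
For scale, here's a quick back-of-the-envelope sketch of the gap between those two figures. Both numbers are the ones from this comment (the rough human average and my unsupervised-FSD guess), not measured data:

    # Back-of-the-envelope ratio of the two figures quoted above.
    # Both numbers come from this comment, not from any measured dataset.
    human_miles_per_fatality = 100_000_000   # rough US average, drunk drivers included
    guessed_fsd_miles_per_fatality = 100     # my guess for unsupervised FSD

    ratio = human_miles_per_fatality / guessed_fsd_miles_per_fatality
    print(f"Implied gap: {ratio:,.0f}x")     # prints "Implied gap: 1,000,000x"

If that guess is even in the right ballpark, the gap is about six orders of magnitude, which is what I mean by "orders of magnitude less safe."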


What version of FSD? In what year?


Tesla's decision not to release the data required to make such an assessment might be considered some sort of indication about what that data says


It's indicative of dishonesty and fraud.


> Accidents are not good, but how does FSD safety compare to human drivers in general?

We'll never really know and tech companies aren't trustworthy.

A new software update will invalidate what we've observed about the last version and we'll be back to knowing nothing.

Humans in aggregate are predictable and nothing will make that entire fleet more dangerous overnight.

Legacy car companies came about in an era when they faced accountability from both consumers and lawmakers that we can no longer count on.


> A new software update will invalidate what we've observed about the last version and we'll be back to knowing nothing.

"Move fast and break things" is being replaced with "move fast and kill people."


Don't want to hear it so long as Tesla holds me legally responsible for whatever their software does.

Unless and until Tesla is fully legally responsible for whatever their FSD does, then it is NOT "Full Self-Driving." Calling it "Full Self-Driving" is fraud and those who purchased it should file a class-action lawsuit.


It is Tesla's duty to provide rigorous, scientifically and statistically sound evidence for their claims of safety, as Waymo does [1][2]. Instead, they present bare numbers with no base rate estimations, confidence intervals, methodology, or model supporting them, despite having already killed tens of people. They deliberately market falsified numbers [3], and their models are unable to detect individual sources of error that result in underestimating crashes by a factor of 500%. That's from a single well-known factor (differences in airbag deployment rates, which literal internet posters have been pointing out as a potential source of bias for years), ignoring any other sources of error, which we should assume are abundant.

They are either incompetent or intentionally misrepresenting their actual safety statistics. Until the experts at Tesla can find some rigorous evidence that their system is actually safer, it is safe to assume that it is not; they have had plenty of time for a competent team to produce sound estimates.

[1] https://waymo.com/safety/

[2] https://assets.ctfassets.net/e6t5diu0txbw/54ngcIlGK4EZnUapYv...

[3] https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf


Sounds reasonable.

Weird that the (US) government can't force Tesla to publish that data as a precondition for being allowed to offer FSD at all.


Waymo's numbers aren't dishonest and their accomplishment is real, but this data should be taken in its proper context.

1) The comparison lumps in a lot of humans who cynically and selfishly break the law. Currently Waymos attempt to obey the laws of the road, but Tesla FSD shows that self-driving companies can and will market toward people who want to speed, run red lights, etc. So I am deeply concerned that a formal comparison of law-abiding self-driving cars will be used to encourage development of autotaxis which compete with human Ubers on travel time by speeding or rolling through stop signs like a human would. It would ultimately be a correlation/causation error: Waymo's training and development correlates with better safety records when the causative variable is ultimately risk aversion, with better engineering being a secondary factor. In particular, I suspect Waymos will eventually be safer than humans at legal driving but dramatically more dangerous at illegal driving, with Waymo executives using motivated reasoning to dispel the concerns. Again, FSD literally offered different options for how much you wanted to speed or roll stop signs. This isn't hypothetical.

2) This is less concerning than reckless driving, but these stats include a lot of human accidents due to mechanical failure. This is something I expect self-driving to handle very poorly without human intervention. The most cognitively demanding moment of my entire life was having a rear tire blowout while driving a heavily-loaded truck with little experience - as the rear end started to fishtail I quickly realized what had happened and carefully applied the brakes, hit the emergency lights, paying close attention to the tactile feedback in my feet and hands, then pulled over as soon as I possibly could. If I were an AI and this were an edge case the engineers forgot to train on, Robo-Me would have easily flipped the truck. Mechanical failures aren't occurring in Waymos now because the cars are still new. Eventually it might be a real problem.


Meanwhile, Waymo can sensibly move through a street with crowds of people walking around: https://x.com/TechTekedra/status/1828816255404687773


The thing I wish we better understood about Waymo is how much the remote human operators are actually intervening on a daily basis - maybe I don’t know where to look, but I’ve never gotten a clear answer here. AFAICT the success of Waymo means Tesla needs to have a similar level of human oversight for its FSD vehicles. But since Waymo’s PR is all about the autonomy, way too many people have the impression that Waymo’s advantage is solely about better algorithms along with humility around limited geographical range. These are probably important factors! But I strongly suspect we don’t appreciate the human oversight.


Their remote operators have pretty much nothing to do with the video shown, though. It's been described in detail the way that remote operators interact with the car and it's not in a way that would be particularly helpful here. You may be under the common misunderstanding that remote operators can directly control the vehicle.


AFAICT it has not been described "in detail," the only writeup I know of is this vague, flowery blog post: https://waymo.com/blog/2024/05/fleet-response/

> it's not in a way that would be particularly helpful here.

Why would you say that? This claim seems completely unsupported. What if the human directed the Waymo to drive more slowly and carefully than it would otherwise? That sort of instruction seems entirely consistent with the blog post and would be critical to the impressive behavior we see in the video.

> You may be under the common misunderstanding that remote operators can directly control the vehicle.

I think you are under the misunderstanding that since humans don't control the vehicle, they must have only a second-order impact on safety. But I am not concerned about self-driving's technical ability to dodge obstacles, I am concerned about its judgment around passing school buses or not slowing down when there are lots of pedestrians. It's this sort of human-level reasoning that I suspect is critical for Waymo's safety record. Nothing in Waymo's blog post / infomercial suggests otherwise.


We can't appreciate it because it's unknown to the public. For Waymo's detractors, the assumption is its AI stands for Actually Indians; for its fans, it's Nvidia GPUs and Google TPUs with zero human input. The truth is somewhere in between, but as a customer of their product, how much do I need to care? If the system requires an army of contractors to actually work, raising costs, that's between Waymo's investors and their balance sheet. Though I guess as someone who owns Google stock, I technically am an investor.


Well, as I looked at my stock portfolio today: basically anyone who invests in funds even tangentially related to the tech sector, including ones with a bond component, is in Google, Apple, Nvidia, etc., considerably.


Meanwhile Waymo just yesterday went the wrong way on the street.

https://abc7.com/post/wrong-waymo-driverless-car-goes-oncomi...

Waymo hits stationary pole.

https://www.youtube.com/watch?v=HAZP-RNSr0s

If Waymo is as good as you claim, then why don't they release data on the number of manual interventions?

It's just that Waymo's missteps are unlikely to show up on HN, unlike Tesla's.

FSD can navigate crowds too.


They do release data on interventions in CA during autonomous testing with safety drivers [1][2][3] (distinct from driverless testing). As an ADS under test, they are legally required to, unlike Tesla, which characterizes its system as an ADAS and is thus absolved of any responsibility for robust data collection or reporting.

Their numbers indicate ~17,000 miles between any safety driver disengagement. Tesla does not release comparable data, but the highly pro-Tesla biased community-reported numbers[4] average ~42 miles between disengagements.

[1] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...

[2] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...

[3] https://thelastdriverlicenseholder.com/2024/02/03/2023-disen...

[4] https://teslafsdtracker.com/
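
For a rough sense of the gap between those two figures (Waymo's ~17,000 miles per safety-driver disengagement vs. the community tracker's ~42 miles for FSD), here's a minimal sketch using only the numbers cited above; neither is an official apples-to-apples statistic:

    # Rough ratio of the two disengagement figures cited above.
    # Waymo's number is from CA DMV testing reports; Tesla's is from the
    # community-run tracker, which is not official or directly comparable.
    waymo_miles_per_disengagement = 17_000
    fsd_miles_per_disengagement = 42

    ratio = waymo_miles_per_disengagement / fsd_miles_per_disengagement
    print(f"Waymo goes roughly {ratio:.0f}x farther between disengagements")  # ~400x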


There are far more Teslas with FSD purchased than there are Waymos, and the Waymos are trained on those specific streets.

Maybe Tesla should treat FSD like Waymo, only available in certain cities to a few hundred drivers? As a pedestrian that would make me feel a little better about it.


> Waymo goes the wrong way on the street

I'm familiar with that intersection in Tempe. There is a bar for beating the average human driver on it, and going the wrong way is sadly not above it.

I'm also not aware of any fatalities resulting from a Waymo accident. (Waymo also doesn't seem to be deluded about its system's safety. That's why they have human drivers on standby, remote and in person.)


From the video, it's hard to see what happened there. It's in the middle of an intersection. It kind of looks like the Waymo vehicle was attempting a left turn but something prevented it from finishing it.


If Waymo is as good as you claim why don't they release data on the number of manual interventions?

They didn't even release the video when a Waymo hit a bike while turning left at an intersection a few months ago.


They do. They report "disconnects" to California's DMV.[1] Collisions are reported in detail.

Tesla doesn't report to the CA DMV because they classify their system as only a "driver assistance system".

[1] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...


I would not be surprised if the Waymo was left "stranded" in the intersection due to red light runners.


In SF, at least, the Waymo vehicles are nigh perfect. I happily occupy the street with them as a pedestrian and bicyclist.


Yeah. You can tell riding in them and watching the little screen, that the Waymo has a superhuman ability to detect pedestrians and cyclists around it.

I would much rather my children bike on a street where every car was a Waymo, than a street full of human-driven SUVs.


I also like that the presence of Waymos has an overall traffic-calming effect even when human drivers are present.


This sounds like a nice side-effect - I wonder why it happens.


Same. If we ever get (bike+AV)-only roads in SF, I'll be happy to let my children go to school on them. Very trustworthy cars.


With v12.5.1.1, my Tesla ran into a curb; I had to get 2 wheels and tires replaced.

https://twitter.com/gourneau/status/1821005220190798038


Who paid for this in your case?


So far I have paid out of pocket. I have an insurance claim, but I expect to at least max out my deductible.


So your insurance rates are going up while Tesla avoided all liability for what should be considered buggy safety-critical software?


Yikes. I don't want to use this product myself.

Still, it's worrisome to judge based on anecdotes. I'd really like to see some sort of overall statistics that compare Tesla FSD to other cars. Tesla claims their FSD system, overall, saves lives. Does it?

The Waymo statistics, I find pretty convincing that Waymo is safer than human-driven taxis. There are still bad Waymo anecdotes, and they should continue to improve, but overall it seems like a good thing for safety.

Tesla FSD, I just don't know.


The fact that Tesla will only release limited cherry-picked statistics tells us a lot. If FSD were actually as safe as Elon wants us to believe, he would release the data to prove it. Until then we should not assume that the system is any safer than the anecdotes of crazy failures imply.


FSD is surprisingly good when it does work. But the problem, as I have said on this very web site for more than a decade, is that the failure mode is unacceptable. You have to button up all the nigh unto infinite edge cases to truly be fully autonomous.


is this a technical or cultural feature?


Both. It's a technical failure that the system is running red lights, and a cultural failure for it to be pushed out to production.


Interesting to consider it might be cultural. If Tesla is using training data from their other cars, maybe this is something the car has just figured out that human drivers do?

They already allow the cruise control's automatic speed limit adjustment to have an offset applied, so that everywhere you go you're speeding by 10% or 15% over the legal limit.

On the low-speed roads mentioned in the article, people ignore unoccupied stop signs and red lights all the time. It was literally 6 days ago that I was biking home from work, stopped at a stop sign (I know some cyclists campaign for Idaho Stops, but they're not legal in my state yet and I'm out there as much for advocacy as I am for anything else), when someone blew through the 35mph 4-way stop without even putting their phone down. There were no cars, just me on my bike, so I guess she felt she didn't have to stop?


It's because it's trained on real Tesla drivers...


"FSD" is not running red lights, the so-called "driver" is. Ultimately you are responsible for what your car does, when you are in the driver's seat. If you allow it to run red lights, it's on you.


What does FSD stand for again?

If Tesla wants to sell Fancy Unpredictable Cruise Control they should label and market it as such. It is insane how many dangerous lies Elon gets away with. Anyone else without so much wealth and power would have faced serious consequences by now.


Curious who gets a ticket when FSD runs a red light?


The driver is responsible for maintaining control of the car at all times, so they will get the ticket even if they're using FSD.

There is an argument that Tesla is releasing products that are fundamentally unsafe, and they may be open to some torts there. Of course, Tesla itself would likely point to some language in some agreement that basically amounts to "you're an absolute fool for thinking that our FSD product is any good, and have no ability to sue us for releasing dangerous products;" they've historically been very aggressive in arguing that they bear no responsibility in any FSD or Autopilot crashes.


I think the only company who takes liability when you use their self driving feature at the moment is Mercedes, and it is only active in very specific driving situations.

https://www.theverge.com/2023/9/27/23892154/mercedes-benz-dr...


Curious to hear what you think the other options are apart from the driver.


The auto manufacturer, as they're the entity actually operating the vehicle. You can absolve yourself of responsibility with enough legalese ("self-driving doesn't work, it will kill your kitten, you're responsible when you plow through a Denny's") but it doesn't make it right. Mercedes is taking liability; it's possible.

You have no control over how your car's self-driving operates, you can take over control but that doesn't mean you're driving. You're the passenger in one of those student-driver cars, you might have your own steering wheel and brake but it doesn't make you the driver. The life-guard isn't the swimmer.


> The auto manufacturer, as they're the entity actually operating the vehicle.

Not at level 2 they are not. At level 2 the driver is in control. FSD is level 2.

The Mercedes system you are talking about is level 3.

https://en.m.wikipedia.org/wiki/Self-driving_car

> Honda was the first manufacturer to sell an SAE Level 3 car,[12][13][14] followed by Mercedes-Benz in 2023.


Mercedes is confident in their system, that's the difference


The driver, as it should be


Is SAE Level 2 perhaps 'good enough' for most people, especially on highways (either flowing or stop-and-go), for now?

Comma's 3X seems to be something that increases driving convenience for people, but doesn't over-promise / under-deliver on its capabilities. Should Level 2 be what manufacturers are shooting for when they're shipping their 2024/5 models?

(Though certainly strive for higher levels in the future.)


'FSD' is L2, not L3.


> 'FSD' is L2, not L3.

The reality of FSD may be L2, but is it being marketed as L2, or something more?


...running them safely tho?


> running them safely tho?

I'll grant safely jaywalking as a thing that can be done. Running a red light in a vehicle safely, no.

Best case the signal is malfunctioning. But that's a classic situation in which an L3 car should ask for a human.


No


What do you mean "No"? I am asking if it's traversing red lights while they are otherwise safe to traverse (no lateral traffic). What's with this amazing pessimism infecting this forum lately?


You asked whether or not the FSD ran the red lights safely. You were given the answer of "No". You never, at any point, asked whether or not "it's traversing red lights while they are otherwise safe to traverse (no lateral traffic)" which is a question that contradicts itself.

If you ask a question, someone answers, and you start berating them because they were either not able to read your mind to determine what the actual question you're asking is or gave you an answer you did not like, then why ask the question?

The article you posted a comment to shows a video of numerous instances of the FSD trying to run a red light when it is absolutely "not safe" to do so.


So you have no lateral traffic and run over the person walking across the street?


Is FSD running red lights with no lateral traffic and running over people walking across the street?


Yes


[citation-needed]


What you're asking for was already linked in this thread ~30 minutes ago, at the very top. I doubt you didn't already see that.


Usually HN engineers are able to compartmentalize and correctly answer questions like this, but it’s a lost cause with Tesla for some reason. They’re unable to separate “it did a dangerous thing, but did not cause harm” in their minds.

Yes, it has problems. Yes, it has regressions. Yes, sometimes it does dangerous things. But it drove me 400 miles, door to door, last week and did fine. There’s clearly something historic happening here, but all HN can talk about is flaws.

If someone developed a warp drive, and 25% of the time it turned the operator into jelly, we wouldn’t sit here talking exclusively about jelly. We’d talk about how warp drive is cool and what the path to fixing stuff is.

It’s really bizarre.


This is a good point. HN posters have some difficulty looking at the objective reality of this situation, for example some are missing the obvious comparison to a completely unrelated sci-fi hypothetical that exists in your head.


Breaking my own rule only once: My car, that I own as a consumer, drove me through town, on the freeway, through another town, and into a parking lot four hundred miles away.

Twenty years ago, that was sci-fi. HN would have been able to reason clearly about it twenty years ago.

(Yes, I know HN wasn’t around in 2004. You don’t need to nitpick that)


This is a thread about Teslas running red lights. Your story about a trip wherein that didn’t happen has no real bearing on this topic.

Your nice trip aside, no sci-fi hypotheticals make running red lights acceptable either.

“What if we invented a reverse microwave but it made everything smell like cheese?” or “What if we invented an interpretive dance that solved climate change but only people over 6’4” could do it?” have the same amount of relevance to Teslas running red lights as the human jelly machine you’ve conjured in your mind.


I actually generally agree with you. It’s amazing. And I actually feel safer with waymo than human drivers. But the idea that running a stop sign is safe because there aren’t other cars is the issue here.


> If someone developed a warp drive, and 25% of the time it turned the operator into jelly, we wouldn’t sit here talking exclusively about jelly. We’d talk about how warp drive is cool and what the path to fixing stuff is

What? Who? We'd ask why the hell people are being put in it.

The hackers of yore, the ones we respect, weren't terrorists. When they phreaked the phone company they didn't try to take down 911.

> did not cause harm

Tesla's FSD has killed people [1]. It's also been the subject of multiple recalls by federal agencies.

[1] https://www.tesladeaths.com


> Tesla's FSD has killed people [1]

I looked at the very first one on the list and it says someone drunk in a non-Tesla hit a Tesla and resulted in a death of the Tesla driver.

> According to the Albuquerque Police Department, on July 1, Sandoval-Martinez was driving drunk, speeding, and without a license when he ran a red light and hit Tiger Gutierrez’s Tesla, as well as a Toyota Corolla.

Not sure why the website is putting that under a Tesla death, probably to inflate counts since there aren't many Tesla deaths due to them being very safe cars.

Why are you attributing and referencing these incidents as FSD killing people?


> Why are you attributing and referencing these incidents as FSD killing people?

Nobody is.

The first entry, case No. 432, appears to have a null value in the Autopilot claimed and Verified Tesla Death columns. The first Autopilot claimed death is No. 410 [1].

> probably to inflate counts since there aren't many Tesla deaths due to them being very safe cars

How is 555 deaths across millions of cars sold refuting that? The bold text at the top clearly states "Tesla Deaths is a record of Tesla accidents that involved a driver, occupant, cyclist, motorcyclist, or pedestrian death, whether or not the Tesla or its driver were at fault."

[1] https://www.kiro7.com/news/local/charges-filed-against-tesla...


> Nobody is.

You're the one who referenced that site as a source for FSD allegedly killing people. And now the only thing you have is an Autopilot death, plus a quote of the site text saying it includes deaths whether or not the Tesla or its driver were at fault? Where are the FSD deaths you claimed? Most people don't check sources and assume a comment is true because one is linked.

The fact that deaths are included as a 'Tesla death' regardless of whose fault it is shows that the site operator has an axe to grind and that the data cannot be trusted.

> How is 555 deaths across millions of cars sold refuting that?

555 deaths over 11 years worldwide is quite low given that about 45,000 people die in auto accidents just in the US every single year and 1.19M people die worldwide every single year.


> Where are the FSD deaths you claimed?

I believe the NHTSA's fatal FSD crash refers to No. 225 on that list [1].

> Most people don't check sources and assume a comment is true because one is linked

I'd assume anyone responding to a comment would be curious and competent enough to look at a source.

> fact that deaths are included as a 'Tesla death' regardless of who's fault it is shows that the site operator has an axe to grind and that the data cannot be trusted

Judging fault is subjective. Judging whether someone died is not.

> 555 deaths over 11 years worldwide is quite low given that about 45,000 people die in auto accidents just in the US every single year and 1.19M people die worldwide every single year

Yes. That's the point. It's a dataset that shows Teslas to be safe, Autopilot not so much and evidences FSD having killed at least one person.

[1] https://static.nhtsa.gov/odi/inv/2022/INCR-EA22002-14496.pdf Table 1


> it drove me 400 miles, door to door, last week and did fine

I call BS. Even in perfect weather on perfect roads with perfect visibility, 400 miles is at least 10 times as long as FSD can go without a disengagement.


I dunno what to tell you. It did it. A disengagement every 40 miles is not what I’m seeing on my car.

(By the way, this is part of the reason Tesla owners get called Elon-worshipers: we see the sentiment in places like this, it doesn’t line up with the reality we’re seeing every day, and trying to correct the discrepancy comes across as worship. It’s not, it’s just that it seems like something other than truth is driving these conversations.)


How did it even go 400 miles in the first place without stopping to supercharge?

I have a Tesla. My second so far, and I've been driving one since the first Model 3 was released. Pretty happy with the car. FSD is cool, better than I expected in many ways, but I just can't envision it going anywhere over single-digit miles without a disengagement. I never got more than 3 or 4 miles before it did something requiring me to take over. And it was worse on the highway, not better.

Not interested in a fight, but I remain very skeptical every time someone makes a wild claim of hundreds of miles continuously on FSD. It just doesn't match my experience at all, nor that of any of my friends who also drive Teslas and have experienced FSD.


It would not shock me if you simply mentally blocked out all recollections of disengagements or other self-driving issues.


I’m not insane, my ability to remember instances of my car trying to kill me is quite well developed.

Are you assuming good faith?


400 miles is at least 5 hours of driving, which is a bit long for not even having a pee break. Since you're driving a BEV, it's also above the maximum range of your car, so you had to have stopped at some point, which suggests that there are details already missing from your claim because (you feel) those details don't matter to the essential truth of your statement.

I believe that you believe you made a 400-mile drive with full self-driving without any issues. But I also believe that belief can be made in good faith even if there were issues during that drive--either the things that did come up you don't mentally classify as issues, or you've just failed to commit what did happen to memory.

That's not a knock on you, by the way; recently, I've been struck by a number of instances where I have physical proof that something happened yet have absolutely no memory whatsoever of it happening, despite feeling that I would surely remember it. Human memory is notoriously finicky and fallible.


It doesn't even have to run over a pedestrian to be dangerous; such a move could very well lead to the driver being pulled over and extrajudicially executed by a trigger-happy cop.


No


I love all the Tesla critics running to this forum, their complainy thoughts in hand. We all run red lights, repeatedly, we all run red lights. I've been saying for years Tesla FSD is better than racecar experts.



