Tesla Accused of Deception in Promoting Autopilot (bloomberg.com)
142 points by jijojv on May 24, 2018 | 85 comments



The advertising is clearly deceptive. Read their marketing materials:[1]

"Build upon Enhanced Autopilot and order Full Self-Driving Capability on your Tesla. ... All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you."

Tesla does not have that technology. Not even close. Their cars do not have the sensor suite to do it. They did a demo back in 2016 of a limited self-driving capability on quiet roads in Los Altos Hills. They haven't said much since.

[1] https://www.tesla.com/autopilot


Did you forget the next paragraph, where they clearly state that it is not yet available? The first words are even in bold: "Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction. It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval. Please note also that using a self-driving Tesla for car sharing and ride hailing for friends and family is fine, but doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year."


> Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory approval, which may vary widely by jurisdiction

So the software is ready, it's just those damn regulators that are holding it back.

Come on.

The way this is phrased makes it sound like the features are just around the corner, not that the software is years from working, if it's even possible at all.

I know somebody who was told by a Tesla salesman that "Destination Based Autopilot is only months away". The advertising is absolutely deceptive, and I say this as a Tesla owner and fan of the company.


I agree with you. I don't personally have a problem with the claims around Enhanced Autopilot, but the FSD option is really bad marketing, IMO. It is nowhere close to being ready to do the things they talk about, and may never achieve some of them.

The owners who buy that are really donating a few thousand dollars to Tesla without getting anything in return. It's really strange.


Indeed.

A lot of people ask "Does it have enough sensors for Full Self Drive?", but my question would also be: does it have the compute?

As vehicles get closer and closer to full self drive, the compute requirements have shot up, because a lot of the building blocks are just more machine learning models, pattern recognizers, and a more complete picture of the world. It's all well and good having a dozen sensors, but only if you can process their input.

A lot of the key cycles cannot be offloaded to The Cloud because of the tolerances (e.g. sub-second query/response for safety). One university project I happen to know of had $20K worth of compute sitting in the trunk. That will obviously come down as it is optimized but a vehicle sold in early 2018 isn't evergreen.


Musk mentioned this in the last IR Q&A. He said that it might indeed not have the compute, but that swapping in hardware with enough compute would be a 'plug and play' operation.


That's what he said but it won't be.


Come on... We all know that sales people will quite literally say anything to make a sale.


This is understood, and we all expect them to carefully phrase things to put their product in the best possible light. Misleading statements are the norm.

Outright lying is another matter entirely, and in some cases could even be criminal. If I have you on video telling me I can drive the car I'm buying at 90 mph toward a brick wall and the car will prevent a collision, and I then test that statement and die, the video will be used against you in court by my surviving family members.


> This is understood, and we all expect them to carefully phrase things to put their product in the best possible light. Misleading statements are the norm.

Only where enforcement is weak. Two years ago, the government of Germany shut down Tesla's "autopilot" advertising for making false claims.


And that's no reason for us to accept it without comment. Especially when it encourages people who are not quite as smart as you to do things that put other people (and themselves) at risk.


Hand-waving statements about the autopilot being safer than human drivers, and blaming passenger-drivers for not taking control earlier whenever there's a wreck, is ridiculous, but it sounded more like overconfidence than bad faith.

That second paragraph is a far worse thing. Intentionally leading people into believing that government regulators are why a technology doesn’t exist makes it sound like they’re venturing into Theranos territory.

The last sentences about selling rides sound like the sort of false concern someone would try to raise in a pitch to misdirect investors away from asking about software problems that they internally worry might be insurmountable.


With his latest "fake news" attack he's starting to sound more and more like Trump:

http://money.cnn.com/2018/05/23/technology/elon-musk-media/i...

He liked being the media's golden boy when they were fawning over him for a decade; now things get tough because he and his company exaggerate, and he wants to take his ball and go home.


Eh, I think at this point Tesla exiting the auto market would have relatively little impact on the future of electric cars. They may have played an important early role in proving the technology is ready and creating demand, but the demand is there now and many manufacturers are trying to meet it.

(Whereas, SpaceX exiting the market would probably set back space travel considerably.)


If Tesla fails, then EVs are essentially dead for another decade. Hybrids will fill the role for ecologically respectful transportation.


This is a ridiculous statement, Tesla accounts for less than 10% of global EV sales.

January-March 2018: 304K EV sales, 25K Tesla sales

https://insideevs.com/monthly-plug-in-sales-scorecard/


Apple accounts for a similar fraction of smartphones, but is far more influential than any other maker. Tesla occupies a similar role in terms of EV mindshare. At least in the US, carmakers are developing EVs out of fear that Tesla will succeed and suck out all the profits.


As others have pointed out, Tesla's failure in the EV market won't do much damage, as it really isn't the big player and its poor execution is only handing money to competitors. Worse for Tesla, they have already given established brands time to enter and/or announce their own EVs. Many people are waiting for their preferred brand of car to come with an EV option.

The real threat is the damage Tesla is doing to the public image of self-driving. After a series of highly publicized crashes where Tesla handled its response poorly, journalists are looking at every Tesla crash for involvement of the Autopilot software. Every crash is under suspicion. That alone is a hole that Tesla is going to have a hard time crawling out of, and combined with other companies' missteps it will only keep setting back the field.


Not a chance. Look at what is happening in China.


Who would have guessed that two blowhards who inherited their wealth would throw temper tantrums when the media doesn't paint them in the kindest light possible?


I'm not sure why you say "more and more like Trump". He always sounded like someone with a similar approach to handling a lot of things.

OTOH, that doesn't make some of his charges less accurate. The media does engage in a lot of hyperbole about Tesla crashes and always has. We don't get hyperbolic reports every time some other car has an AEB failure. We definitely don't get them about every other car's post-crash fire. And a lot of the financial opinion pieces are only marginally better.

On the other side of the coin, you have the company's misleading marketing materials. Some will say that is just as bad as the bad journalism, I guess.


By the way, this was also very offensive to me:

> Please note also that using a self-driving Tesla for car sharing and ride hailing for friends and family is fine, but doing so for revenue purposes will only be permissible on the Tesla Network, details of which will be released next year.

Why does Tesla get to tell me what to do with my car?


Frankly, it's not your car, no matter how much money you forked over for it.

And, if you have a Tesla, you can verify this by disabling its ability to talk to Tesla. After some period of time, your car will no longer run.

Or you can crash it, get it fixed by a non-Tesla service provider, and watch it also fail to run. Not because of a mechanical failure or even a software failure, but because Tesla hasn't OKed it to run.

I'm curious about what happens if/when Tesla goes out of business. What happens to all those Model S and Model 3's out there - do they get OTA patched to no longer phone home, does the data coming out of the Tesla flip over to whoever purchases that capability from Tesla, or do they all just stop? It will be interesting to see.


Your car perhaps, but probably not your software - check the EULA (it is an issue that bugs me too - I am offering this in way of explanation, not justification.)

The clause is probably there because Tesla wants to become the self-driving Uber, and it cannot (even if it wanted to) limit the restriction to corporations because, as we know, Uber does not have drivers or cars.


Probably because the autopilot tech isn't just your car. It's connected to the rest of the Tesla "fleet", all the ML that makes it work is done by Tesla, and all the updates you get to improve the capabilities are done by Tesla.


I guess Tesla believes that while under self-drive it is no longer your car. That's going to be an interesting case for the law/lawyers in the future.


Tesla owns their software, not you. Why would they enable other ridesharing companies to profit off of their autopilot work?

If you want to drive your car for Uber, you can. If you want Tesla’s software to drive it for Uber for you, that isn’t part of the end user license agreement.


There will probably be court cases over this.

At one time automakers tried to say if you used third party oil filters or other parts, or if anyone other than the dealer did work, then your warranty was void. That didn't hold up.

Publishers tried to say that you could not re-sell books and recordings and software media after you purchased them. That didn't hold up.

Not sure that this will hold up either.


You’re referring to physical parts. To my knowledge, no automaker provides their software and related tooling to end users (please correct me if I’m mistaken).

You’re going to need a copyright law overhaul to change this status quo.


And you're referring to what you can physically do with your own vehicle after you buy it. To my knowledge that has absolutely nothing to do with software copyright law. I don't understand your basis for arguing that software's copyright can be used to infringe other rights.

For example can my water heater's manufacturer ban me from washing my dog in the bath because they own copyright on the software in the thermostat micro-controller?

If Tesla were going to provide liability insurance, I could absolutely see them having the ability to limit usage (under that insurance). But software copyright is an unusual argument.


To use Tesla FSD (full self driving) with a rideshare network, you will need to hook into the Tesla vehicle onboard software or Tesla’s API. Those actions will require Tesla’s cooperation. Otherwise, feel free to sit in the car while it drives itself on autopilot, Tesla isn’t stopping that, only the use of its full autonomy software (when it arrives) for commercial purposes on rideshare networks other than its own.

Existing law supports this, and I don’t foresee policy changing in this regard.


> To use Tesla FSD (full self driving) with a rideshare network, you will need to hook into the Tesla vehicle onboard software or Tesla’s API.

If there is an operator in the vehicle, they could just enter the destination the old fashioned way.

> Existing law supports this, and I don’t foresee policy changing in this regard.

It really doesn't. I'm not sure which "law" you're even referring to.


They are I think referring to first-sale doctrine.


> There will probably be court cases over this.

Well, probably not, because realistically Tesla's self-driving isn't going to be happening anytime soon.


Tesla seems to think software license agreements don't matter though, because they've been violating other people's license agreements for years by shipping GPLed code without source. About a week ago they finally released some (not all) of the code they're required to.

https://electrek.co/2018/05/19/tesla-releases-softwar-open-s...


This is almost certainly illegal, and if you have a Tesla, you should use the self-driving ability for Uber or Lyft and literally just dare Tesla to sue you for it.


The simplest explanation to me is that the self-drive software is essentially provided under a non-commercial license.


> Why does Tesla get to tell me what to do with my car?

My guess:

Because they think it's their car, even after they sold it to you. And because, more to the point, they expect that (contrary to their claims) retaining that control and exploiting it is the only way they will be able to be profitable with the cars.


> Did you forget the next paragraph, where they clearly state that it is not yet available

Where does it clearly say that? "may vary widely by jurisdiction" and "it is not possible to know" are anything but clear.


It's a shame that those specific details pertaining to the Full Self Driving feature were accidentally entered on to the Tesla AutoPilot page. The mistake seems to have caused a cascade of errors, because right after the FSD caveat, there's a video, but it's for the full-self-driving demo back in 2016, something that has nothing to do with the Autopilot feature.


> If you don’t say anything, the car will look at your calendar and take you there as the assumed destination

What, so you'd get into your car to vacuum-clean it or something, and suddenly it'll start driving to some destination it "assumed"? :/


I am glad people are starting to see the irresponsibility in some of Tesla's marketing. It was clear from day one that Tesla's Autopilot is a glorified lane assist, and therefore it should not be sold as some sort of autonomy-ready vehicle.

When I first complained about how autopilot was being marketed here on HN, I was down voted and told I don't know what autopilot means.


> When I first complained about how autopilot was being marketed here on HN, I was down voted and told I don't know what autopilot means.

Had the exact same experience: "any commercial airline pilot knows autopilot doesn't do everything for you !!!one" (which isn't even true; Cat-IIIc ILS autopilots are basically the equivalent of the car being able to do everything outside of leaving your garage, and fully automated flight, from takeoff to landing included, was a thing in 1947: https://www.flightglobal.com/pdfarchive/view/1947/1947%20-%2...).

One could consider it a difference in what people are talking about: Tesla's tech roughly matches plane autopilot capabilities but not experience, because plane autopilot capabilities are nowhere near sufficient for unassisted autonomous driving (autopilots are assisted by automated ground-control systems amongst others).


> which isn't even true, Cat-IIIc ILS autopilots are basically the equivalent of the car being able to do everything outside of leaving your garage and fully automated flight from takeoff to landing included was a thing in 1947

Only a fraction of the world's runways have Cat-IIIc capability. You could argue that a reasonable equivalent in the car world is that it can auto-drive between any two places, so long as those places are on major motorways/interstates.

Aircraft autopilots also have no way of avoiding traffic; ATC can't control them directly, and they have no automatic ability to do so themselves.

On balance I'd say the abilities of Tesla's autopilot are fairly compared to an aircraft autopilot.

Unfortunately I think you are correct in that Tesla have marketed the system well beyond those abilities.


Additionally, most people are not pilots and do not know much about autopilot. From movies and other media, even just the term "autopilot" gives a sense of complete automation.


Don't despair! Each new generation starts hopeful, and goes through its own hype-bust cycles. It is human to dream, and businesses have evolved to sell hope accordingly.

An eye for simplicity and a good set of BS detectors are not easy skills to learn. There's a whole spectrum between being a guinea pig for the latest fad and living like the Amish.

Having worked in the ML industry for a decade, the idea of buying anything "AI-driven" where a simple mechanical/manual option exists leaves me shuddering. Never mind autopilots and other "mission critical" systems: the first thing I did after buying our home was to rip out all the "smart home" features and replace them with manual thermostats and switches. Year after year, I feel sorry hearing our neighbours moan about their "smart homes".


In addition to today's Autopilot, Tesla is charging customers $3,000 for a hypothetical future "Full Self-Driving Capability" feature on the Model 3 and others.

> "All you will need to do is get in and tell your car where to go," Tesla's ordering page says. "Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed."

This is very likely a huge balance-sheet liability, and it probably won't be a real possibility until the cars are really old and near the end of their life. They don't give a real timetable, so maybe they can get by with not delivering it for a couple of decades, but it seems like that would land them in some kind of class-action legal trouble.


It's not only deceptive but it puts the public at risk.

The other day I was on 101 in the evening heading south (this is one of the main highways in the San Francisco Bay Area), behind a Tesla in the fast lane. The Tesla in front of me was acting rather erratically: fast, jolt-like corrections once in a while, etc. So I got annoyed by the erratic movements and passed the car. As I passed, I glanced over. Sure enough, the driver was on his phone doing something, and I'm guessing he was relying on Autopilot.


Autopilot is generally pretty smooth - what you describe is more characteristic of texting while driving, which idiots can achieve in any car.


Yes, but Tesla is giving its owners the false confidence that it is ok to text while driving because the autopilot will drive the car.


What above comment is saying is that the guy was probably not on autopilot and steering erratically himself.


Most states have a law against texting while driving, or even against holding your phone at all while driving. Would that still apply if the car is driving itself? I'm guessing that at least right now it would still be illegal. Does anyone here know?


To be clear, the story here is that a couple of consumer groups have written the FTC to complain about Tesla. That might be enough to get something started, but public complaints != formal investigation. That said, the FTC did successfully order VW to pay $1.2 billion for its diesel shenanigans:

https://www.ftc.gov/news-events/press-releases/2017/05/feder...

I get that people respect and are excited by Musk's vision and work, nevermind the cars, but I feel even Tesla fans should be able to agree that the current Autopilot homepage, which is going to be the main info source about Autopilot for anyone not in possession of the owner's manual, is misleading.

Someone at Tesla made the deliberate decision to have the page's headline and very first words be, "Full Self-Driving Hardware on All Cars", followed by a paragraph of text about full-self driving, followed by order buttons for Model S/Model X.

The next thing on that page is a video. It's the first and only video on the Autopilot page, and it is the 2016 self-driving demo, a capability that Tesla has (as far as we know) not been able to push to production or test extensively (assuming they test in California, which requires a mileage report).

When you buy a Tesla, aren't the purchase options for Autopilot and (future) full-self driving separate options, at a cost of $5000 and $3000, respectively? Yes, buying AP is required to buy FSD. But my point is that Autopilot exists as its own separate feature completely independent of FSD.

So to belabor the obvious, why is the only visual demo on Tesla Autopilot a 2016 alpha demo of full self driving? On Youtube I can find plenty of amateur demos of the AP features. What possible reason is there for not creating separate pages for Autopilot and FSD, other than to have consumers conflate the two systems?


I'm at probably 2,000 miles in AP on my Model 3, and at this point, I feel like I'm very knowledgeable about where I can trust AP and where I can't. I do fear that people will give it too much trust in the beginning, and not try to learn how it behaves, because then, it's dangerous.

Dying in a Tesla is relatively hard to do, but getting hurt or even just totaling your car because you didn't understand how AP works is bad (and dangerous for others, too!)

I feel like maybe they shouldn't turn on AP until TACC has been active for some period of time, so that you can get used to the car and how it behaves. Even now, I am sometimes surprised (like the odd behavior when stopped traffic is ahead of you a ways, and the car right in front of you changes out of your lane: your car doesn't notice the stopped traffic, so you have to stop yourself or you would slam into the stopped cars). I have to assume that, at some point, the front-facing camera will supplement the radar and say, "hey, that looks like stopped traffic - I should stop", but I can understand, from a software development perspective, why that's a hard problem.


I don't own a Tesla, but from what I understand you can get OTA updates which change the way AP works. So if you need to build a mental model of how to "use" AP safely, and then AP changes without notice, I can see how that could end badly...


Why would that be a hard problem? Shouldn't it be easy for the radar to detect if there are stopped obstacles ahead?

I am not familiar with AP tech, so I would appreciate it if someone could explain why it's hard to detect stopped objects using radar.


> Shouldn't it be easy for the radar to detect if there are stopped obstacles ahead?

No. It's very difficult. Radar reflects off of everything. The only way to get a useful signal is to look for things that are moving at a different speed than you, i.e. moving relative to the road. So the radar software filters out all returns that have the same speed as the road ahead.

This means the radar won't see anything stopped on the road, because it's filtering out a huge number of returns from the road. It gets even worse because radar reflects very differently depending on the material: it goes right through plastic, while water (humans) absorbs it.

Even if you could try to read through the noise of the road, radar doesn't have enough resolution to tell the difference between the road going uphill and a stopped firetruck.
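The filtering described above can be sketched in a few lines. This is a toy illustration with made-up numbers, not Tesla's actual pipeline:

```python
# Toy illustration (not Tesla's actual pipeline) of how Doppler
# filtering hides stationary obstacles from an automotive radar.

EGO_SPEED = 30.0  # m/s, our own car's speed (assumed)

# Each detection: (range in m, radial velocity in m/s relative to us).
# Anything stationary -- road, signs, a stopped fire truck -- closes on
# us at exactly -EGO_SPEED.
detections = [
    (120.0, -30.0),  # overhead sign: stationary clutter
    (80.0,  -30.0),  # stopped fire truck in our lane: the danger!
    (60.0,   -5.0),  # car ahead doing 25 m/s (closing at 5 m/s)
]

def filter_moving(dets, ego_speed, tol=1.0):
    """Keep only returns whose speed over the ground is nonzero,
    i.e. drop everything that moves like the stationary world."""
    return [d for d in dets if abs(d[1] + ego_speed) > tol]

tracked = filter_moving(detections, EGO_SPEED)
print(tracked)  # only the moving car survives; the fire truck is gone
```

By velocity alone, the stopped truck is indistinguishable from harmless roadside clutter, which is exactly why it gets filtered away with everything else.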


Thanks for the explanation.

"radar won't see anything stopped on the road, because it's filtering out a huge number of returns from the road"

What does that mean? Sorry, I am still confused.

Isn't the whole point of radar to detect objects around you?

It shouldn't matter if the object is in the road ahead, or on the side of the road, right? Because they are all obstacles that the car should avoid, no?

Like, a stopped car in front of you, vs a tree next to you on the side of the road, are both things the software should avoid, and NOT filter out, right?


Not an expert either, but the gist I got from previous threads is that the problem is distinguishing objects which are obstacles from others which aren't (e.g. because you'll turn before hitting them). According to TechCrunch, that's also what happened in the fatal Uber crash: the system detected the pedestrian but didn't consider her a real obstacle to the car.

That said, the impression I get is that Tesla's system is worse than even their initial version (provided by Mobileye).


They need lidar/radar, then they have a chance to detect objects.

Stopped traffic especially is a problem for vision-only algorithms. It's easy to see moving objects, harder to see stopped ones. We humans are quite good at this, but machines have a long way to go here. Even with stereo view, everything that's a few hundred meters away shows hardly any difference between the left and right picture. Add to that that at a few hundred meters it's just a few pixels, so if it isn't moving it's like detecting a small object in a photo.

Lidar/radar are the only sensors which can reliably detect stopped cars at a distance if you ask me.
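A back-of-the-envelope sketch of the stereo geometry shows why distant objects yield almost no depth signal. The focal length and baseline here are illustrative assumptions, not Tesla camera specs:

```python
# Back-of-the-envelope stereo geometry. Focal length (in pixels) and
# camera baseline are illustrative assumptions, not Tesla specs.

def disparity_px(depth_m, baseline_m=0.3, focal_px=1000.0):
    """Stereo disparity in pixels: d = f * B / Z."""
    return focal_px * baseline_m / depth_m

for z in (10, 50, 100, 200):
    print(f"{z:4d} m -> {disparity_px(z):5.2f} px")
# At 10 m the left/right images differ by 30 px of disparity; at 200 m
# only 1.5 px -- barely above pixel noise, so depth is unusable there.
```

Since disparity falls off as 1/Z, doubling the distance halves the signal, which is why a small, motionless object a few hundred meters out is so hard for stereo cameras.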


They have always had radar. It’s not enough for detecting stopped vehicles at high speed though.


I mean, I won’t claim it’s easy, but my Subaru can do it without breaking a sweat.


Tesla seriously jumped the gun in marketing their partial autonomy as "Autopilot" when it's still level 3 autonomy at best (requires a human driver to intervene when necessary). Until we leap over the 'uncanny valley' of partial automation and have level 4 autonomy available it would be disingenuous to call such a feature "autopilot".


> when it's still level 3 autonomy at best

It's most definitely an SAE level 2 system, not remotely SAE level 3. SAE level 3 requires that the human driver respond to a request to intervene in all cases where the automation system figures it's going to fail.

Tesla's is not close to guaranteeing that the driver will be forewarned of an issue (at all, let alone in time to intervene); the driver must be aware of their surroundings and ready to override the automation system for safe operation, because the system will not guarantee safety.


The frustrating part, I find, is that "autopilot" is actually a reasonably good description for pilots. Pilots are paid and trained to be alert the whole time that autopilot is on.

It's also well known that an autopilot in a plane is no AI, it's simply following a course.

But I definitely agree it's a disingenuous name. It primarily gives the impression that you can take your hands off. So people will definitely get distracted for up to 6 seconds, which, from the Uber crash, is all it takes. Beyond that, you can have one hand on the steering wheel and be doing anything you want.

It's quite damning that even Uber was humble enough to realise how badly their system had failed after the fatal crash, while Tesla is doubling down, releasing PR statements that try to spin the statistics. I've seen interviews with Musk discussing the crash where he really seems not to be particularly bothered.

I can see Tesla setting back self-driving by years if a legal clampdown comes. They are the company with by far the most miles driven with a level 3 system, but there's nothing at the moment to suggest there won't be further crashes in Tesla Autopilot cars that skew the statistics.

I've heard it said that self-driving cars will have to prove they're 10x safer. That takes on the order of a billion miles per expected death to prove. If self-driving is deemed illegal on open roads because of Tesla's recklessness, it'll take much longer to prove that it is safer.
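The billion-mile figure roughly checks out. A quick sketch, assuming (my assumption, not from the thread) the commonly cited US baseline of about 1.16 deaths per 100M vehicle miles:

```python
# Rough arithmetic behind "a billion miles per death". The baseline
# rate is an assumption: ~1.16 deaths per 100M vehicle miles (US).
baseline_rate = 1.16 / 100_000_000   # deaths per mile, human drivers
target_rate = baseline_rate / 10     # "10x safer"
miles_per_death = 1 / target_rate
print(f"{miles_per_death:,.0f} miles per expected death")
# One *expected* death at the target rate already takes ~862M miles;
# a statistically confident comparison needs several multiples of that.
```

So even under generous assumptions, demonstrating a 10x safety margin is a multi-billion-mile undertaking.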


For me, it comes down to the fact that replicating the functionality does not replicate the experience. Tesla's Autopilot may well replicate the functionality of an aviation autopilot, but a plane has fewer factors to consider (such as proximity of other traffic), and if something does go wrong, or there is something the autopilot can't handle, there is, in most cases, time for a human pilot to take over. Getting the same level of functionality that works in the air _will_ cause a crash on the ground, but calling it by the same name is going to make people think the _experience_ is the same (i.e. that the vehicle is fine left to its own devices for at least some period of time, which would require a significantly more complex system on the ground).


Yeah pilots don't use autopilot when taxiing in an airport.

The equivalent environment of a plane's autopilot for a car is being on an empty race track. I'm sure Autopilot works well there.


An empty racetrack that is perfectly mapped out in advance, at that.

Plane autopilot just automates the control inputs/outputs that are needed to follow a predefined flight plan. For anything beyond cruising or following pre-programmed flight patterns like holding loops, it needs guidance from systems on the ground that have no other purpose than telling the plane exactly where the runway is and how to descend to it. There’s not a single system involved that uses vision-based methods and/or any form of object detection. Even the most advanced plane autopilot will not be able to land the plane if the airport systems are down.

So in a way, the level of autonomy that Tesla’s autopilot provides is not that far from plane autopilot. That’s clearly not how Tesla PR sells it though.


The advocacy groups say Tesla’s promotions of Autopilot suggest otherwise and are deceptive. Among the examples cited in the letter is Tesla’s Autopilot website, which proclaims Tesla vehicles have “full self-driving hardware” and contains a video that, when played, begins with text reading “the person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.”


I agree that deceptive marketing should be punished.

With that out of the way, I am genuinely curious:

Have you or anyone you know ever believed that Teslas have a true autopilot?

Anyone I have ever spoken to basically thinks that Teslas have a feature to keep you in the lane -- an advanced cruise control -- and that you must have your hands near the wheel.

Please note that even if nobody thought Teslas have autopilot, I still think deceptive marketing should be punished. I'm simply trying to gather some perspective from others, if this is actually a thought that is out there in the wild.


Good. I like most of what Musk did, but I always had issues with them bringing out a semi-capable drive assist that clearly made some accidents possible: not causing them, but enabling irresponsible behavior while coating the product with glitter at the same time. I don't think it's right for companies to sell that sort of thing.


Frankly I'm surprised it took this long to happen. It's a bit sad that people had to die when we could potentially have prevented it from happening.


This is obvious once you use it.


What ever happened to that coast-to-coast demo?


I used to admire what Musk did at Tesla; I even put the $1k down for a Model 3 reservation on the first day over two years ago.

Since then I've lost my faith in his honesty. He's been consistently and intentionally misrepresenting what Tesla can deliver and when.

Worse, he is taking a page from Donald Trump's playbook. Faced with negative media coverage, Musk is now attacking the media and wants to build a bubble of alternative truth controlled by himself.

Rather than fix Tesla's communications, he wants to build some kind of website where "the public can rate the core truth of any article" and shame individual journalists:

https://twitter.com/elonmusk/status/999367582271422464

Can he be so naïve that he expects an Internet public to arrive at a consensus on complex issues? Of course not, he hasn't been in a coma for the past twenty years. His proposed "Muskopedia" would just end up controlled by online partisans devoted to embellishing his takes.

(Reminds me that I should probably take my $1k out. The Model 3 ship date for Europe is still a year away.)


He's been overestimating deliverables for way longer than two years.


But I wasn't paying enough attention. To my knowledge he also wasn't trying to build a private media bubble until now.


Repeat after me: "Autopilot does not mean autonomous".


And so it begins, the tearing down of a highly successful company.


> And so it begins, the tearing down of a highly successful company.

Alright. I think it's important to be clear what Tesla is. It's certainly a very promising company, but highly successful? That it is not - at least not yet. Tesla is not profitable, and without electric car subsidies it would be in even worse condition. It has a lot of promise, but has yet to deliver on that promise IMHO.


On what metrics is it highly successful? By every standard measure it's about as unsuccessful as you can get. Financially they are under water. Their CEO refuses to acknowledge valid investor questions and uses his shares as collateral for private ventures. Announcements such as the semi truck are completely forgotten about and pushed to the side once the hype dies down... Where is the silver lining?


I like how the only pro Tesla comment in this entire thread is grayed out at the bottom. Not suspicious at all...


Tesla's lawyers would have this covered. It sounds to me like a bunch of shorts who are sweating Tesla's recent moves to cut production costs and raise the price of the Model S. I hope they get their comeuppance.


Tesla’s lawyers should’ve advised against marketing a glorified ADAS (and AP 2.0/2.5 was even a pretty bad ADAS for quite a long time) as an autonomous driving solution.

They market it as if it could be a robotaxi, and Musk has been saying “autonomous driving is a solved problem” at every opportunity for four years now. This isn’t really ideal.

Tesla’s marketing aside, the real problem is the concept of Level 3 autonomy in the first place (on open roads). Humans can’t operate in a “hands off only” mode; it’s either all in or all out.

I’m really hoping that we can get to a good real-world, all-conditions L4/L5 (these are essentially the same level of autonomy) before Level 3 systems kill enough people to push autonomous driving back three decades.

Don’t get me wrong: in controlled, restricted, and enclosed environments (for example truck yards, ports, large factories, airports, power stations, mines, etc.), Level 3 makes a lot of sense, provided you have predictable routes where trained operators know when they can be hands off and when they cannot be.

But this doesn’t and will never work on real roads with real people, because eventually everyone gets “comfortable” with such a feature, especially when it appears much more capable than it actually is, and when Musk and many in the Tesla community make it sound like the only reason your Tesla isn’t working for Uber while you are at work is pesky regulation.


Tesla only has SAE Level 2: stay in lane and brake for obstacles (sometimes). They did a few videos of hands-off self-driving in controlled conditions on quiet roads, so they were trying for Level 3, but never really got there.



