Musk's point has always been to combine vision with radar, instead of Lidar. I'm amazed that this combination is usually overlooked in discussions of Tesla/Lidar
Exactly. What is rarely mentioned is his exact quote on the reasoning for radar vs. lidar:
“If you’re going to use active photon generation, don’t use visible wavelength, because with passive optical you’ve taken care of all visible wavelength stuff. You want to use a wavelength that’s occlusion-penetrating like radar. LIDAR is just active photon generation in the visible spectrum.”
This article is still missing the point when talking about redundancies. LIDAR only works in essentially perfect weather ("not occlusion-penetrating"). Even if it serves only as a "redundancy" there's no advantage in relying on a sensor suite that operates in a less-safe mode in the most adverse road conditions. So basically if you are driving in snow or fog, your LIDAR-based AV has to fall back to Radar+Cameras. If that system can pass all the safety tests in the worst-case road condition then there is no value in the additional sensors that add expense but no safety margin.
What's even more overlooked is power consumption. LIDAR is far more power intensive, especially when we're talking about multiple packages per vehicle. In the future world of Autonomous Electric Vehicle Fleets, the vehicles using LIDAR will get significantly less range efficiency than their radar counterparts and cost significantly more to build. In a fleet scenario where every margin counts this will result in a significant economic pressure to ditch LIDAR.
So in the end I think Elon will be proved right. Those currently investing in LIDAR-based systems will eventually ditch it for purely practical economic reasons. Those that don't will be completely destroyed in the open market.
The real competitive advantage for AEVs is in the software, not hardware. LIDAR is a crutch for bad software that reaches a theoretical maximum far short of what is needed for economically-viable Level 5 autonomy.
I'll restate this clearly: there's simply no economic or technological advantage to using LIDAR for AEVs.
Lidar doesn't use visible-spectrum light; it's usually infrared, so Musk's quote makes no sense.
You're making a lot of strong assumptions to draw your conclusions: that commodity cameras and radar can compete on measurement accuracy with lidar systems, and that lidar costs won't decrease with additional investment (we've already seen costs decrease by roughly a factor of ten in less than a decade). The power questions also aren't cut and dried: if you need extra in-vehicle GPUs to support the radar+camera approach, you may well be using more power than a lidar-based approach.
There's also no real requirement that AVs operate in snow or dense fog. Those are only considerations in certain climates in certain seasons. You don't actually need the safety system to pass the safety tests (that don't currently exist) in worst case conditions if the vehicle works anyway. Why optimize for the worst case first?
I'll respond clearly: We're multiple computer vision leaps forward away from what Elon needs for success. They're easily half a decade behind Lidar based systems. And people die as a consequence of putting those systems on the road.
> The power questions also aren't cut and dried: if you need extra in-vehicle GPUs to support the radar+camera approach, you may well be using more power than a lidar-based approach.
All current lidar-based approaches I'm aware of also supplement with radar+cameras. LIDAR isn't sufficient in isolation. GPUs consume way too much power, there's no way you can cavalierly just add more of them as a scaling solution.
> You don't actually need the safety system to pass the safety tests (that don't currently exist) in worst case conditions if the vehicle works anyway. Why optimize for the worst case first?
Not even sure how to interpret such a statement.
> I'll respond clearly: We're multiple computer vision leaps forward away from what Elon needs for success. They're easily half a decade behind Lidar based systems. And people die as a consequence of putting those systems on the road.
The custom ASICs for ML that Tesla is building are fairly well-understood tech at this point. High-end smartphones have used similar tech for years now (though perhaps an apples-oranges comparison).
Only criticism I would level against Tesla's current approach is their overly optimistic time estimates, and hand-waviness about the complexity of solving certain very complicated edge-cases. However their technological approach is quite sound.
> All current lidar-based approaches I'm aware of also supplement with radar+cameras. LIDAR isn't sufficient in isolation. GPUs consume way too much power, there's no way you can cavalierly just add more of them as a scaling solution.
Yes, but they don't need to get point-cloud-level spatial data from a suite of cameras. That takes more compute power to do accurately at 60fps with cameras than with a lidar.
> Not even sure how to interpret such a statement.
Let me rephrase: You don't need the vehicles to work in worst case conditions at all, if economically they're still a success if you don't allow them to run in those conditions.
If Waymo or Cruise or whoever has self-driving taxis in 2020 deployed in temperate cities, and Tesla doesn't have L4 autonomy until 2024, at which point it also works in a blizzard, it doesn't matter if Tesla's cars are cheaper and more effective than theirs. Tesla will have already lost.
> The custom ASICs for ML that Tesla is building are fairly well-understood tech at this point.
This is one small piece of the puzzle. You also need algorithms that Tesla doesn't appear to have, and (camera) hardware that Tesla claims to have but others seem to agree can't support what they want.
> If Waymo or Cruise or whoever has self-driving taxis in 2020 deployed in temperate cities, and Tesla doesn't have L4 autonomy until 2024, at which point it also works in a blizzard, it doesn't matter if Tesla's cars are cheaper and more effective than theirs. Tesla will have already lost.
If those self-driving taxis are not available for retail purchase and cost their operators six-figure sums to add to their fleet - and Tesla has it working in its promised $35k Model 3 - then Tesla will win in the long term.
Also, remember the car industry moves very, very slowly (no pun intended). The Model 3 has been out for over a year now and has barely captured a fraction of its possible market.
Consider that people will buy/lease a car for individual use for 3-5 years before replacing it: someone buys a normal car in 2021 because they need one at the time, but will still buy Tesla’s FSD car when it comes out.
The “autonomous taxis will replace individual car ownership” trope only applies to hyper-urbanised environments where parking-spaces are luxury lifestyle accessories - and where people are already well-served by public-transit infrastructure.
Finally: people who try an autonomous (but unaffordable for exclusive individual use) taxi service in 2020 might be so impressed with the experience that they vow to buy the first individually purchasable autonomous car - and even if that doesn’t happen until Tesla’s 2024 models come out, Tesla still wins.
By analogy, consider that Tesla is probably the /hottest/ car brand in the world today - and they did it without any traditional advertising - and they got started over 10 years ago. Enough people influenced by reviews and YouTube videos of the Roadster and 2012 Model S translated that into real money being spent on Model 3 purchases today. That’s an anticipation gap of at least 7 years - that’s impressive. Can you imagine people waiting 7 years for an “affordable” version of a luxury product? Why weren’t any of the other manufacturers doing anything to meet this clear demand for physically attractive EVs with over-hyped autonomous capabilities?
That's where most of the population lives in western countries: both the US and EU have ~10% of the population in the 10 largest cities, and that's without the well-connected metropolitan areas, which probably at least double the numbers. So if the futuristic predictions of shared vehicles actually materialize, the market for the "personal" vehicle would shrink and manufacturers will have less incentive for whatever they produce for the "personal car" slice of the market. Unless we also get a decentralization trend and people start leaving urban centers.
> Why weren’t any of the other manufacturers doing anything to meet this clear demand for physically attractive EVs with over-hyped autonomous capabilities?
While I agree with the "physically attractive EVs" (although for me it's more of a nice to have than a must), the second part is pretty cynical. You're asking why aren't manufacturers knowingly killing people to sell more cars. Well they are. Some promise you clean diesel, some promise you self driving. Both missed the mark a bit. And both of those promises sold lots of cars.
>> I'll respond clearly: We're multiple computer vision leaps forward away from what Elon needs for success. They're easily half a decade behind Lidar based systems. And people die as a consequence of putting those systems on the road.
>The custom ASICs for ML that Tesla is building are fairly well-understood tech at this point. High-end smartphones have used similar tech for years now
joshuamorton is right and you didn't really address the point made. It isn't about understanding the hardware, it's about understanding the software. You point me to any individual who can tell you concretely how a specific convolutional neural network (CNN) arrives at a given solution and I'll take this all back.
We know so little about how CNNs work, and plain old neural networks (NNs) for that matter, that it's embarrassing. They're prone to adversarial attacks; not only that, you can successfully mount the same attack against _any_ NN that was trained on a common dataset. We don't know how to effectively defend against these yet in a white-box setting.
We have no idea what the solution space looks like. We barely understand why one of the simplest optimization algorithms outperforms almost all others at these tasks. We barely understand why randomizing data visitation results in solutions that perform completely differently.
The counter-argument goes something like "we tested it on a big eval set and it aced it." Well, I'm here to tell you that things change. Assumptions made in creating that eval set might not generalize to all real-world cases. And in this case, bad assumptions result in death.
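The adversarial-attack point is easy to demonstrate even without a CNN. Here's a toy sketch of the FGSM idea (Goodfellow et al.) applied to a plain linear classifier; every number is made up for illustration. Real attacks on CNNs work the same way, just with a gradient you can't eyeball.

```python
import numpy as np

# Stand-in for a trained model: a linear classifier on a 10,000-"pixel"
# input. FGSM perturbs each input coordinate slightly, in the direction
# of the sign of the loss gradient (which, for a linear model, is just w).
rng = np.random.default_rng(0)
d = 10_000

w = rng.normal(size=d)   # "trained" weights (illustrative)
x = rng.normal(size=d)   # an input
if w @ x < 0:            # ensure the model confidently labels x as +1
    x = -x

def predict(v):
    return 1 if w @ v > 0 else -1

# A ~10% per-coordinate nudge: individually tiny, but the effects add up
# across all 10,000 coordinates and overwhelm the decision margin.
eps = 0.1
x_adv = x - eps * np.sign(w)

print(predict(x), predict(x_adv))  # the perturbation flips the label
```

The uncomfortable part for safety arguments is that the per-coordinate change is small relative to the input, yet the classification flips; CNNs inherit the same high-dimensional vulnerability.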
As someone who has spent a good portion of my life working on this stuff I've learned that sometimes (most of the time) the best choice is the obvious, known, simple one. The fact that a good portion of cars may be controlled via something we know so little about should worry you.
On a different note, I think you're also missing why Tesla have opted to invest in a perception suite that doesn't use LIDAR. That reason is cost. Tesla needs to sell cars now. They and their customers cannot afford to put LIDAR on their current platform. At the same time they need to move the metal and they know their customers want AEVs. I think their strategy is sound from a purely business cost/benefit analysis. It's a risk but from a financial perspective a good one, because if it works they hit pay dirt.
> The fact that a good portion of cars may be controlled via something we know so little about should worry you.
I bet that we know much, much more about CNNs than about our brains.
And today pretty much all cars are already controlled by something we know basically nothing about. Why doesn’t this worry you?
The fields of neuroscience and psychology are far more advanced compared with the infantile field of deep learning.
Humans also learn independently from one another which means outlier events aren’t as much of a problem. You have the ability to observe the world around you, say you see an accident due to some rare weather event, and learn some abstract lesson extremely quickly. A CNN has to learn this over millions of samples out of band.
Based on the number of actual reproducible controlled experiments done on CNNs, I'm not convinced the field is less advanced than psychology at this point.
Well, we know quite a bit about how humans perform at the relevant levels of abstraction, and that's the level at which driving automation systems should be evaluated. They are not there yet, but there's no reason to believe the problems are insurmountable.
> I think you're also missing why Tesla have opted to invest in a perception suite that doesn't use LIDAR. That reason is cost.
Even beyond pure cost, one problem Tesla has is that they already SOLD tens of thousands of FSD option packages, predicated on NOT needing to retrofit the cars with LIDAR.
So arguably, Tesla might be worse off if they deliver a LIDAR based FSD (and need to retrofit tens of thousands of cars, or pay off the owners), than if they just plod on with a camera based FSD that never quite works safely.
>one problem Tesla has is that they already SOLD tens of thousands of FSD option packages
It's always seemed sketchy to me that they're selling FSD option packages when they don't know when FSD will actually happen or whether the current in-car hardware will be adequate to support it once it does happen.
He's likely banking on a bimodal distribution where FSD either happens very soon (within a year or so) in which case the hardware likely will be fine OR not for 5+ years in which case these cars likely will not be on the road any more and it becomes a moot point.
Safety systems certainly need to work in all conditions, but “work” in this case may mean refusing to activate in conditions outside its design range. It’s fine for a self-driving system to refuse to drive in a blizzard; it’s not ok for it to try and then fail to drive in the blizzard.
Refusing to drive could work when starting out, but handling changing conditions on the road is harder.
A scaled request for the driver to take over as conditions get worse can train the driver not to use the self-driving system in adverse conditions, so hopefully it wouldn't have to refuse (unless the driver is negligent).
Getting back to the OP, Musk may have a point: people are terrible at evaluating risk for low-probability/high-consequence events like a car accident, so LIDAR might lose in the market even if it is worth it. But if there were standards for when the car asked the driver to take over and LIDAR is able to pester its drivers less often because it is more capable, then perhaps LIDAR can justify its place in the market.
> Moderate rain is not an acceptable condition for cars to refuse to drive. Especially in the middle of long trips.
It is absolutely acceptable for a self-driving system to refuse to drive in any conditions it can’t handle, but it must also deactivate in a safe way when such conditions arise during operation (e.g. pull over to the side of the road if the driver hasn’t positively acknowledged resuming control).
Commercial viability of any given system is a separate issue, but it’s pretty well accepted moral responsibility to not accept control of a vehicle if you’re unable to operate it safely, regardless of the consequences of that refusal(1). I see no reason to not hold consumer-facing self driving systems to the same standard. Otherwise, they require some specialized training for the operator to be able to recognize the situations in which they are safe to use.
(1) Actual life-and-death situations change this a little, but future availability of rescue personnel and equipment generally weight the conclusion towards operating safely in those situations as well.
The pulling over to the side of the road solution doesn’t work at scale. What happens when it starts snowing on a 10 lane freeway during rush hour and 50% of the cars are self driving with this limitation?
It’s unlikely that, in a region that gets such snowstorms, a self-driving system that can’t handle them reaches 50% market penetration. That seems more like a “we’ll cross that bridge when we come to it” scenario.
OK then QED. The point some are making is that with at least the current type of lidar, that bridge may not exist. It may make more sense to devote resources to better radars and radar processing.
And I’m not disputing that at all. None of my statements say anything about any particular self-driving technology as I’m not an expert in the various technologies. My point is that what’s “acceptable” is a fundamentally a moral stance, and stated the minimum bar I believe all drivers (automated or not) need to meet.
This is simply a constraint that any system needs to work within, and it’s entirely possible that precludes the commercial viability of LIDAR-based systems. It’s also possible that there’s some niche market that fair-weather-only systems can be successful in long before the general problem is solved, and we shouldn’t artificially throw those out as “unacceptable” when there’s a reasonable framework for them to operate under.
Unless you're driving on a road, and heavy snow develops earlier or more severely than you and the weather service anticipated. Instead of going the next 2-3 miles safely to your exit, your car will decide it's best to strand you in the middle of the motorway in white-out conditions until the weather passes? Color me skeptical.
> Instead of going the next 2-3 miles safely to your exit
You’re making an assumption here that the system is capable of continuing to travel safely. Obviously being safe at home is better than being stopped on the side of the road, but that’s not the choice you’re actually faced with. Similarly, a system that can operate safely in adverse conditions is obviously better than one that can’t.
> I don't give automated driving any allowances, if it can't do what I do, it doesn't belong on the road
In that case, you’re almost certainly grouping a lot of current and safe drivers in the “it doesn’t belong on the road” category. Not having lived in a place where I’ve needed to deal with white-out conditions myself, I doubt I’d feel comfortable continuing to drive. I know this about myself, though, and am likely more conservative than you about canceling or rescheduling a trip when such conditions are a possibility. This is the same bar that I’m proposing automatic driving systems need to meet; if that means they only work on cloudless days with 0% chance of rain, so be it. In practice, this means there’ll need to be some way for a human to take control for quite a while yet.
One industry that is very close to fully autonomous operations is aerospace. In a controlled environment with fewer things to run into. And even they disengage the autopilot in certain conditions. Why would we aim at full autonomy for cars in an uncontrollable, chaotic environment with a lot of stuff to hit under all circumstances? In some countries you are not allowed to drive on summer tires in winter for safety reasons. So why not limit self-driving capabilities in similar fashion?
From the business perspective, this would not be desirable because it destroys a lot of prospective business cases for self-driving vehicles. Basically all of them that are as-a-service. What good is a taxi service that stops working during bad weather? That's even when there would be more demand for it because people don't want to walk or bike anymore.
Says a lot about some of these business ideas, doesn't it?
But in all seriousness, if self driving cars are not ready as fast or as performant as expected a lot of these long term bets may be in a lot of trouble. This potentially includes Uber, Lyft, Tesla by self-declaration, and others.
And safety isn’t limited to working in harsh conditions. Mistaking a picture on a billboard for a real object, or mistaking a truck’s color for the sky, can also get you a crash in clear weather.
The crutches analogy isn’t a very good one. If you ask a doctor, he will probably advise you to use crutches until you can stand firmly on your own two feet!
Those don't sound like things Lidar would have a problem with. The color doesn't matter, only whether the surface is reflective (basically anything not matte black). Otherwise, objects are easy to detect and range.
The problem with weather is different, as the individual droplets bounce the Lidar light before it reaches the object.
My point is rather that those are things CV would have problems with that Lidar disambiguates.
And yes there are conditions where neither work well, and frankly where humans barely function either. Much of driving in heavy fog / heavy snowing is really a leap of faith at low speed.
> So in the end I think Elon will be proved right. Those currently investing in LIDAR-based systems will eventually ditch it for purely practical economic reasons. Those that don't will be completely destroyed in the open market.
High quality cameras are insanely complex pieces of electronics and optics. Ditto for processors capable of doing quality image recognition. Large scale manufacturing has made them cheap nonetheless. LIDAR is relatively niche, but if it proves useful to deploy it at scale, I'd expect costs to drop very significantly. The underlying technology uses very simple physics (relative to the algorithmic complexity of image recognition); seems like a solid basis to build a sensor off of.
> LIDAR is a crutch for bad software
You could invert this and say that high precision image recognition is a crutch for ill-suited hardware. The final combination is a product of hardware and software. If LIDAR is currently too expensive or energy-intensive to compete cost-wise at acceptable safety levels, that's one argument, but saying LIDAR is a crutch is just moving the goalposts from "good system" to "cheap hardware".
[edit] Also, just want to point out that RADAR resolutions are way too low to operate a vehicle safely (never mind road signage or other things).
> The underlying technology uses very simple physics (relative to the algorithmic complexity of image recognition); seems like a solid basis to build a sensor off of.
The crux of the argument is you will still need the algorithmic complexity in the end with or without LIDAR, so it doesn't add any advantage.
I'll just paste the quote FTA:
> "Lidar is really a shortcut," added Tesla AI guru Andrej Karpathy. "It sidesteps the fundamental problems of visual recognition that is necessary for autonomy. It gives a false sense of progress, and is ultimately a crutch."
> The crux of the argument is you will still need the algorithmic complexity in the end with or without LIDAR, so it doesn't add any advantage.
The approach described still runs the same ML on a Lidar-style data representation, but with an extra image-recognition step to get the camera data into that format. So image recognition only adds computational complexity after the sensor, because you first need to generate a Lidar-like 3D map. That is conceivably a long-run advantage for Lidar, given that its method for producing the 3D representation of the space (upon which the ML runs) is dirt-simple physics-wise, much simpler than the image-recognition algorithms and camera electronics. Lidar just isn't cheap yet, whereas processors and cameras serve a huge market and have become incredibly sophisticated and inexpensive at scale. Again, it's hard to imagine an idea as simple as Lidar not getting much cheaper with scale.
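To make that "extra step" concrete, here's a minimal stereo-triangulation sketch. The focal length and baseline are illustrative assumptions, not numbers from any actual AV stack. A lidar measures depth directly via time-of-flight; a camera rig has to recover it from per-pixel disparity, and finding that disparity is the expensive matching step.

```python
# Depth from stereo disparity: Z = f * B / d.
f_px = 1000.0   # focal length in pixels (assumed)
B = 0.3         # baseline between the two cameras, metres (assumed)

def depth_m(disparity_px: float) -> float:
    """Triangulated depth for a matched pixel pair."""
    return f_px * B / disparity_px

# The matcher must get disparity right to sub-pixel accuracy at range:
print(depth_m(10.0))  # 30.0 m
print(depth_m(9.0))   # ~33.3 m -- a single-pixel matching error moves
                      # the estimate by over 3 m at this distance
```

That sensitivity to matching error, computed per pixel at 60fps, is where the extra compute (and the "algorithmic complexity") goes.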
> I'll just paste the quote FTA
My point wasn't an ad hominem one, so I don't think the fact that an expert like Karpathy said "crutch" changes the fact that it's slanted phrasing. But given that Karpathy works for Tesla and has a vested interest in assuaging investors and consumers, it makes sense why he used slanted phrasing. "Lidar is a crutch" makes Tesla sound more visionary than "we don't want to pay for Lidar because it's still too expensive and we think we'll be able to rival Lidar with image recognition." It's a nice way to subtly jab at competitors who are investing in Lidar and frame it as if Lidar was the tech that had catching up to do (when, in fact, the opposite is true).
Since the key point of the article is that ML algorithms work better with Lidar data representations, it's pretty hard to see it any other way. They both go to the same intermediate data representation, and Lidar still wins in a head-to-head. Again, you can argue validly that cameras + image recognition will be good enough, but calling Lidar a "crutch" seems like pro-Tesla spin by a high-up Tesla employee whose job is to make Tesla look good.
While your comments have gone a long way towards changing my mind, Karpathy's comments are a bit rich, given that Tesla's systems could have benefited from a crutch to help avoid running full-tilt into large obstacles.
If all you use is words, it is of course easy to omit that the spatial resolution of radar is barely enough to tell whether there are one or two vehicle-sized objects in front of you, maybe one to the side, and they had better be moving.
Ford did some research to improve Lidar for use in the rain/snow using a filtering algorithm.
> Ford’s autonomous cars rely on LiDAR sensors that emit short bursts of lasers as they drive along. The car pieces together these laser bursts to create a high-resolution 3D map of the environment. The new algorithm allows the car to analyze those laser bursts and their subsequent echoes to figure out whether they’re hitting raindrops or snowflakes.
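A rough sketch of the idea (an illustrative heuristic, not Ford's actual algorithm; all fields and thresholds are made up): precipitation tends to produce weak, spatially isolated echoes, while solid objects return strong echoes with many neighbouring points.

```python
# Toy lidar returns. "neighbours" = number of nearby points in the cloud.
returns = [
    {"range_m": 4.2,  "intensity": 0.05, "neighbours": 0},   # likely raindrop
    {"range_m": 31.7, "intensity": 0.80, "neighbours": 12},  # likely vehicle
    {"range_m": 8.9,  "intensity": 0.07, "neighbours": 1},   # likely snowflake
]

def is_weather_clutter(r, min_intensity=0.15, min_neighbours=3):
    # Weak echo AND few nearby points -> treat as precipitation.
    return r["intensity"] < min_intensity and r["neighbours"] < min_neighbours

solid = [r for r in returns if not is_weather_clutter(r)]
print(len(solid))  # 1 -- only the strong, clustered return survives
```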
FWIW, lidar is very effective in finding the surface below forest canopies. In fact, it will often identify the underbrush, in addition to the canopy top and the surface.
Aren't there still problems with multiple active sensors sweeping the environment?
I remember it being a problem in cars that used Lidar but cannot find the info anymore.
I think Lidar could still be of help and even the perfect software can use any form of sensory redundancy. But I agree that there might be alternatives.
edit: A laser is probably a lot cheaper than cameras and imaging DSPs if comparable production scales are reached.
>So in the end I think Elon will be proved right. Those currently investing in LIDAR-based systems will eventually ditch it for purely practical economic reasons. Those that don't will be completely destroyed in the open market.
Maybe, but there's no reason to leave a local maximum until you actually have something better.
Musk badly wants for you to not realize that nobody is proposing LIDAR-only, but are rather proposing LIDAR+optical+radar. Musk argues against straw men.
(Also the radar Telsa is using has jack-shit for angular resolution. It can't tell the difference between a tree next to the road and a fire truck parked right across it. Consequently that radar has very limited utility.)
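A back-of-the-envelope check on the angular-resolution claim. All numbers here are assumptions typical of automotive radar, not Tesla's actual hardware: a 77 GHz radar with an ~8 cm antenna aperture is diffraction-limited to a beamwidth of roughly lambda / D.

```python
# Diffraction-limited beamwidth and the resulting cross-range cell size.
c = 3.0e8   # speed of light, m/s
f = 77e9    # automotive radar band, Hz
D = 0.08    # antenna aperture, m (assumed)

lam = c / f                           # wavelength, ~3.9 mm
beamwidth_rad = lam / D               # ~0.049 rad, ~2.8 degrees
cell_at_100m = beamwidth_rad * 100.0  # cross-range cell width at 100 m

print(f"{cell_at_100m:.1f} m wide at 100 m")  # ~4.9 m: a fire truck and
                                              # a roadside tree can land
                                              # in the same cell
```

Under these assumed numbers the radar genuinely can't separate two objects a few metres apart at highway sensing distances, which is consistent with the complaint above.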
More accurately, Musk doesn't care what others think is needed for self-driving, so your aspersions about Musk badly wanting us to think one way or another are not supported by facts.
Neither Tesla nor Musk make a big deal of lack of lidar.
The only reason his views on the subject are public (and so hotly discussed) is because during Autonomy Investor Day he was asked by an investor why Tesla doesn't use a lidar.
So he answered the question. You might not agree with his reasoning but he's not on some "NO LIDAR" publicity tour, trying to change your mind.
He should care what experts think. Whenever someone probes Musk in their own area of expertise, they find his knowledge is that of a stubborn, overconfident dilettante.
I think any expert is going to be able to outshine Musk in their area of expertise, but where Musk is great is in seeing the bigger picture. Musk simply doesn't have the time to become an expert in everything, he has to execute now.
> "More accurately, Musk doesn't care what others think is needed for self-driving"
That's obviously wrong. Musk is deeply invested in the matter, particularly the public's perception of the matter. He is currently promoting his 'solution', which has not yet proven itself, as being hardware complete and is selling it to consumers right now. Public perception is his priority and he shit-talks LIDAR whenever he feels doing so is necessary to defend the public perception of the product he's trying to sell.
Actually, there was a presentation given by one of their lead data scientists describing their ML architecture. At no point was radar mentioned. They are purely relying on vision to identify cars, obstacles, traffic lights etc., with dozens of models each focused on one particular 'type'.
Radar by the sounds of it is being used purely as a fallback.
The question is: if the vision systems fail to recognise an obstacle at high speed, is the radar long-range enough to compensate in time?
"A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength that is able to see through heavy rain, fog, dust and even the car ahead."
So it's rather bizarre that you would speculate about how they're not using radar when they explicitly say they do.
There is zero chance a radar will recognize a traffic light. It is prone to false positives, to boot. Its only use is tracking large metal objects: cars and trucks, for the purpose of adaptive cruise control.
Most of the non-Tesla systems do. Waymo uses lidar, radar, and cameras. The cruise vehicles I see around have lidar and radar as well, and I just assume everyone has cameras because they're cheap and easy to stick somewhere.
To be specific about waymo (to be clear I work at Google, but don't actually have any special info on this), look at the photo in [0]. The cone thing on top is a lidar, but also has cameras in the larger part under the cone. The spinny thing on the front is also probably a lidar. The fin that looks like an extra mirror on the back, and the two on the front have radar. There's also probably a forward facing radar mounted on the nose somewhere near the grille.
So self-driving cars are basically going to be really expensive for the first while: the sensors will take time to come down in price, plus there's the computer in the back and the range it costs the battery.
Sounds like a reasonable trade off. No one needs to own these cars just rent them on demand. Plus some wealthy people in the early adoption curve.
Lidar used for long range. Vision used for things like colour recognition, e.g. is a traffic light green/red, or are an ambulance's sirens on. Radar used for reversing etc., where Lidar, given its mounting location, might not be able to see that close.
I suspect that optical will also supplement LIDAR in cases where very precise angular resolution is needed, such as human gesture and posture recognition (which is necessary if only because sometimes humans direct traffic, but also for things like profiling pedestrians to anticipate which is likely to jump into the road without looking.) Being able to detect which way a human head is facing will at the very least be necessary, and while you might be able to read faces with LIDAR from a distance, my gut says that optical will give you better data for that.
Of course, they were presumably using a mostly-stock car with custom niche sensor products, so that comparison would be a bit more favorable in production.
Currently, LIDAR is very expensive, certainly too expensive to build into every Tesla being manufactured. So Musk would not be able to sell a "full FSD capability" option on his cars if he acknowledged LIDAR is useful/necessary to autonomous driving.
The number one link if you search "Tesla" on HN is "All Tesla Cars Being Produced Now Have Full Self-Driving Hardware." It's been an extraordinarily effective marketing gimmick.
They will certainly have an economic value of several hundred thousand dollars, but that applies to all manufacturers. So if a manufacturer is able to produce non-LIDAR self-driving cars and sells those cars to consumers for $100,000 less than the competition, you can bet that they’ll still capture the robotaxi rental value that is there, through an app store-like agreement. Leaving the money on the table would obviously not happen, unless it was intended to drive the competition out of business.
There would probably be room for both these models (direct sales that capture much of the self-driving value and leasing), but regardless there are obviously strong incentives for a 5- or 6-figure reduction in costs.