As far as I can tell, this Drive Pilot thing is precisely the existing lane assist product. What's changed is that they've heavily constrained it so that it disengages if you go over 60 kph, or outside the hand-maintained list of blessed areas. Presumably they feel they can detect and disallow construction areas and other complexities fast enough to avoid accidents.
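(For concreteness, the gating they describe amounts to something like this; a toy sketch, all names made up, with only the constraints mentioned above baked in:)

    from dataclasses import dataclass

    MAX_SPEED_KPH = 60.0  # the cap the regulator signed off on

    @dataclass
    class VehicleState:
        speed_kph: float
        in_blessed_area: bool     # hand-maintained whitelist of road segments
        construction_ahead: bool  # detected live or flagged in the map

    def drive_pilot_may_stay_engaged(state: VehicleState) -> bool:
        # Hard gate: if any constraint is violated, the system disengages
        # and hands control back to the driver.
        return (state.speed_kph <= MAX_SPEED_KPH
                and state.in_blessed_area
                and not state.construction_ahead)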
Which is to say... this is mostly a stunt. Teslas are literally driving people around now[1], and the rest of the industry feels they need to do something. Announcing a "SAE Level 3" product, no matter how constrained, at least gets them marketing hits like this that look like an advantage.
[1] No joke: mine takes my kids to and from school reliably. FSD beta isn't finished, but it's really, really good. There remain some path planning and confidence glitches that force me to disengage every dozen miles or so, but in 500 miles since I got it I've yet to see the car attempt anything genuinely unsafe. Mostly it just annoys other drivers by refusing to enter traffic.
If Mercedes-Benz is able to define conditions where their self-driving software is guaranteed to work fully autonomously without human intervention, and Tesla can't, that in itself is an actual advantage over Tesla's system, even if Tesla's system can do more with human monitoring in other conditions.
If Tesla is also able to get approval for this (which might be hard if they keep pushing OTA updates that completely change how the software works, with no guarantee against regressions), then they should by all means do so.
That's sort of true in the abstract, but misses the point. Autonomy isn't about "defining conditions", it's about solving real problems. Cars that are L3 only in traffic jams on the Autobahn aren't very autonomous, nor very useful. If it's stop-and-go and the system gets above 60 kph, it cuts out. If the stoppage is due to construction, it won't work.
I mean, sure, it's an advantage for people who actually need to drive in exactly those conditions.
But it's not an engineering advantage of the system. Fast forward a year and look at where Daimler will be and where Tesla will be. Who's going to get to an L3 system on general roads first? Who's going to be running L4 driverless vehicles first?
Which is to repeat: it's just a stunt. It's a way to "sound like" they're "ahead" in this area of technology, when clearly they aren't. In fact this car is doing more or less exactly what Teslas were doing in the first version of Autopilot more than six years ago. And note that in all that time, those Teslas haven't been hitting anyone in slow traffic jams on well-maintained limited access highways either.
Tesla "will be there in a year" since... 2016 or so? (Musk said in January that he is confident Tesla will achieve Level 5 autonomy in 2021, looking forward to that)
> those Teslas haven't been hitting anyone in slow traffic jams on well-maintained limited access highways either.
Weird, then, that Tesla, despite showman Musk at the helm, hasn't pulled the simple stunt of putting their money where their mouth is: getting L3 approval for their system and actually taking on the risk for something that is supposedly "never happening", instead of finding ways to blame the driver every time. It would be an easy way of providing an additional, actually useful capability to their users on top of what they already have.
No, it's not some massive technical leap, but "ok, we'll take the blame for our system failing" is still a big legal step.
> Autonomy isn't about "defining conditions", it's about solving real problems.
Sounds pretty hand-wavy. Autonomy is about defining the conditions in which a self-driving system can safely operate, also known as the Operational Design Domain (ODD).
Of course, ODD as a concept is foreign to Tesla FSD, because it is "50% of the time, it works every time", i.e. you don't know when it works and when it doesn't. It's a YOLO driver-assistance system with a misnomer. Not sure how many "real problems" that is solving, and it definitely won't be L4 anytime soon.
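(To make "defining conditions" concrete: an ODD is an explicit, machine-checkable envelope the system claims to handle. A minimal sketch, with a made-up Conditions record; the 60 kph motorway envelope is just Drive Pilot's, as discussed above:)

    from dataclasses import dataclass

    @dataclass
    class Conditions:
        road_type: str    # "motorway", "urban", ...
        speed_kph: float
        weather: str      # "clear", "rain", "snow", ...
        is_mapped: bool   # inside the pre-mapped coverage?

    def within_odd(c: Conditions) -> bool:
        # Inside the envelope the system takes responsibility;
        # outside it, it must hand control back.
        return (c.road_type == "motorway"
                and c.speed_kph <= 60.0
                and c.weather == "clear"
                and c.is_mapped)

The point is that an L3 system can tell you, ahead of time, whether you're inside that envelope; a system with no declared ODD can't.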
> That's sort of true in the abstract, but misses the point. Autonomy isn't about "defining conditions", it's about solving real problems. Cars that are L3 only in traffic jams on the Autobahn aren't very autonomous, nor very useful. If it's stop-and-go and the system gets above 60 kph, it cuts out. If the stoppage is due to construction, it won't work.
> I mean, sure, it's an advantage for people who actually need to drive in exactly those conditions.
> But it's not an engineering advantage of the system. Fast forward a year and look at where Daimler will be and where Tesla will be. Who's going to get to an L3 system on general roads first? Who's going to be running L4 driverless vehicles first?
> Which is to repeat: it's just a stunt. It's a way to "sound like" they're "ahead" in this area of technology, when clearly they aren't. In fact this car is doing more or less exactly what Teslas were doing in the first version of Autopilot more than six years ago. And note that in all that time, those Teslas haven't been hitting anyone in slow traffic jams on well-maintained limited access highways either.
If it's a race to see which company can develop fully self-driving cars that work in all conditions in the next 5 years, then maybe Tesla could be "ahead" of Mercedes-Benz (although it seems to be way behind other companies like Waymo).
However, if we are actually decades away from fully self-driving cars that work in all conditions, it is much more useful to have cars that can be trusted to operate without human supervision in limited but well-defined conditions. A car that lets you do something other than focus on driving SOME of the time is more useful than a car that assists you more in some ways but requires supervision ALL of the time.
If Tesla's approach precludes them from making self-driving technology work reliably under limited conditions like this, it doesn't matter that their self-driving technology works better (but not reliably) in other conditions, because in practice it won't be as useful as more limited but reliable self-driving systems.
Some countries already require the approach Mercedes-Benz is taking here for regulatory approval, and given the number of accidents Tesla vehicles have been involved in, it's possible that the US could adopt the same approach.
If this happens, it will not be useful at all to have a more advanced but incomplete and unreliable self-driving technology in the short term (though perhaps it can keep being developed without shipping in cars until it reaches a sufficient level of reliability).
> As far as I can tell, this Drive Pilot thing is precisely the existing lane assist product. What's changed is that they've heavily constrained it so that it disengages if you go over 60 kph
Nope - it's a new system powered by LIDAR and cameras and only works on pre-mapped roads, comparable to what Tesla and Waymo are doing (but erring on the side of caution).
The 60kph limitation is temporary because that's what the regulator was comfortable with.
Lane assist is a different tech stack and continues to exist alongside this for regular driving.
I don't see anywhere in those slides where that's substantiated. Tesla and Waymo (and Cruise and Mobileye, FWIW) are making navigation decisions: they'll change lanes, take turns, wait for traffic, use roundabouts, read traffic lights and speed limit signs, etc.
Drive Pilot doesn't seem to be doing any of that. It's just a lane assist package: it will drive straight, in its marked lane, behind another vehicle, and that's all it will do. Maybe it will someday, sure. But it's not exhibiting these features anywhere, nor is Mercedes claiming that it has them. Am I missing something?