Hacker News

> You don't actually need the safety system to pass the safety tests (that don't currently exist) in worst case conditions

I do not agree: making the systems work in all conditions is the hard part of true automated driving.

Also, worst-case conditions happen something like 1/50th of the year (a guess), so they're not that rare.




Safety systems certainly need to work in all conditions, but “work” in this case may mean refusing to activate in conditions outside its design range. It’s fine for a self-driving system to refuse to drive in a blizzard; it’s not ok for it to try and then fail to drive in the blizzard.


Refusing to drive could work when starting out, but handling changing conditions on the road is harder.

A scaled request for the driver to take over as conditions get worse can train the driver not to use the self-driving system in adverse conditions, so hopefully it wouldn't have to refuse (unless the driver is negligent).
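That kind of scaled request is essentially a mapping from sensed conditions to an escalating alert level. A minimal sketch of the idea (the thresholds, sensor inputs, and function name here are all invented for illustration, not taken from any real system):

```python
# Hypothetical sketch of a tiered driver-takeover request that escalates
# as sensed conditions degrade. All thresholds and names are assumptions.

def takeover_alert_level(visibility_m: float, grip_estimate: float) -> str:
    """Map sensed conditions to an escalating alert level."""
    if visibility_m > 200 and grip_estimate > 0.7:
        return "none"           # conditions well within design range
    if visibility_m > 100 and grip_estimate > 0.5:
        return "advisory"       # gentle prompt: consider taking over
    if visibility_m > 50 and grip_estimate > 0.3:
        return "urgent"         # insistent prompt: take over soon
    return "handover-required"  # system begins a safe stop if ignored
```

The point of the graduated levels is exactly the training effect described above: drivers learn where the system's comfort zone ends before an outright refusal is ever needed.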

Getting back to the OP, Musk may have a point: people are terrible at evaluating risk for low-probability/high-consequence events like a car accident, so LIDAR might lose in the market even if it is worth it. But if there were standards for when the car asks the driver to take over, and a LIDAR-equipped car could pester its drivers less often because it is more capable, then perhaps LIDAR could justify its place in the market.


Moderate rain is not an acceptable condition for cars to refuse to drive. Especially in the middle of long trips.

This means self driving cars must also operate without LIDAR, though the safety advantage when it is operating can still be a net gain.


> Moderate rain is not an acceptable condition for cars to refuse to drive. Especially in the middle of long trips.

It is absolutely acceptable for a self-driving system to refuse to drive in any conditions it can’t handle, but it must also deactivate in a safe way when such conditions arise during operation (e.g. pull over to the side of the road if the driver hasn’t positively acknowledged resuming control).
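The "deactivate in a safe way" requirement amounts to a small state machine: request a handover, and fall back to a minimal-risk stop if the driver never positively acknowledges. A rough sketch under stated assumptions (the states, grace period, and names are all illustrative, not any real system's design):

```python
# Hypothetical handover state machine: ask the driver to resume, and if
# no positive acknowledgment arrives within a deadline, execute a
# minimal-risk maneuver (pull over and stop). Values are invented.

from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    HANDOVER_REQUESTED = auto()
    DRIVER_CONTROL = auto()
    MINIMAL_RISK_STOP = auto()   # pulling over to the side of the road

ACK_DEADLINE_S = 10.0  # assumed grace period for driver acknowledgment

def step(mode: Mode, conditions_ok: bool, driver_acked: bool,
         seconds_waiting: float) -> Mode:
    """Advance the handover state machine by one decision tick."""
    if mode is Mode.AUTONOMOUS and not conditions_ok:
        return Mode.HANDOVER_REQUESTED
    if mode is Mode.HANDOVER_REQUESTED:
        if driver_acked:
            return Mode.DRIVER_CONTROL     # positive acknowledgment
        if seconds_waiting > ACK_DEADLINE_S:
            return Mode.MINIMAL_RISK_STOP  # never silently drop control
    return mode
```

The key property is that the machine never transfers control implicitly: it either gets an explicit acknowledgment or degrades to a safe stop.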

Commercial viability of any given system is a separate issue, but it's a pretty well accepted moral responsibility not to accept control of a vehicle if you're unable to operate it safely, regardless of the consequences of that refusal (1). I see no reason not to hold consumer-facing self-driving systems to the same standard. Otherwise, they require some specialized training for the operator to be able to recognize the situations in which they are safe to use.

(1) Actual life-and-death situations change this a little, but the future availability of rescue personnel and equipment generally weights the conclusion towards operating safely in those situations as well.


The pulling over to the side of the road solution doesn’t work at scale. What happens when it starts snowing on a 10 lane freeway during rush hour and 50% of the cars are self driving with this limitation?


It’s unlikely that, in a region that gets such snowstorms, a self-driving system that can’t handle them reaches 50% market penetration. That seems more like a “we’ll cross that bridge when we come to it” scenario.


Rain and fog are much bigger issues for these systems than snow. Being unable to sell cars in places with rain does not leave many options.

However, for rare events it's entirely possible that each individual is fine with the behavior, and only collectively do you end up with major problems.


OK then QED. The point some are making is that with at least the current type of lidar, that bridge may not exist. It may make more sense to devote resources to better radars and radar processing.


And I’m not disputing that at all. None of my statements say anything about any particular self-driving technology, as I’m not an expert in the various technologies. My point is that what’s “acceptable” is fundamentally a moral stance, and I stated the minimum bar I believe all drivers (automated or not) need to meet.

This is simply a constraint that any system needs to work within, and it’s entirely possible that precludes the commercial viability of LIDAR-based systems. It’s also possible that there’s some niche market that fair-weather-only systems can be successful in long before the general problem is solved, and we shouldn’t artificially throw those out as “unacceptable” when there’s a reasonable framework for them to operate under.


Unless you're driving on a road and heavy snow develops earlier or more severely than you and the weather service anticipated. Instead of going the next 2-3 miles safely to your exit, your car will decide it's best to strand you in the middle of the motorway in white-out conditions until the weather passes? Color me skeptical.


> Instead of going the next 2-3 miles safely to your exit

You’re making an assumption here that the system is capable of continuing to travel safely. Obviously being safe at home is better than being stopped on the side of the road, but that’s not the choice you’re actually faced with. Similarly, a system that can operate safely in adverse conditions is obviously better than one that can’t.


I know in my good-old human-driven mode of transport what choice I would make.

I don't give automated driving any allowances: if it can't do what I do, it doesn't belong on the road.


> I don't give automated driving any allowances, if it can't do what I do, it doesn't belong on the road

In that case, you’re almost certainly grouping a lot of current and safe drivers in the “it doesn’t belong on the road” category. Not having lived in a place where I’ve needed to deal with white-out conditions myself, I doubt I’d feel comfortable continuing to drive. I know this about myself, though, and am likely more conservative than you about canceling or rescheduling a trip when such conditions are a possibility. This is the same bar that I’m proposing automatic driving systems need to meet; if that means they only work on cloudless days with 0% chance of rain, so be it. In practice, this means there’ll need to be some way for a human to take control for quite a while yet.


One industry that is very close to fully autonomous operation is aerospace, in a controlled environment with fewer things to run into. And even they disengage the autopilot in certain conditions. Why would we aim at full autonomy for cars, under all circumstances, in an uncontrollable, chaotic environment with a lot of stuff to hit? In some countries you are not allowed to drive with summer tires in winter for safety reasons. So why not limit self-driving capabilities in similar fashion?


From the business perspective, this would not be desirable, because it destroys a lot of prospective business cases for self-driving vehicles, basically all of the as-a-service ones. What good is a taxi service that stops working during bad weather? That's exactly when there would be more demand for it, because people don't want to walk or bike.


Says a lot about some of these business ideas, doesn't it?

But in all seriousness, if self driving cars are not ready as fast or as performant as expected a lot of these long term bets may be in a lot of trouble. This potentially includes Uber, Lyft, Tesla by self-declaration, and others.



