Replaying real life: how the Waymo Driver avoids fatal human crashes (waymo.com)
139 points by Crash0v3rid3 on March 9, 2021 | 175 comments



You can read the California DMV's set of autonomous vehicle accident reports.[1] Almost all the Waymo reports are "vehicle was entering intersection, detected cross traffic, stopped, was rear-ended by human driver". There's one Waymo report where someone ran a stop sign and hit them.

The rear-ending problem will be solved as automatic emergency braking becomes standard on cars. Already, it's shipping on almost all high-end cars and 70-80% of midrange cars. There's a US auto industry goal of it being standard by 2022. That's probably the main feature needed for self-driving cars to coexist with human-driven cars.

[1] https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...


>The rear-ending problem will be solved as automatic emergency braking becomes standard on cars. Already, it's shipping on almost all high-end cars and 70-80% of midrange cars. There's a US auto industry goal of it being standard by 2022. That's probably the main feature needed for self-driving cars to coexist with human-driven cars.

I really wish we could just enforce proper education of being observant.

Most times when I hear about people getting rear-ended it's because the driver behind them was looking at their phone or otherwise distracted. We shouldn't invent features that allow that to continue.

On the other hand, I think this is fine in some rare cases: say an accident or some other incident happens up ahead, a driver stomps on their brakes, and the person behind them, who doesn't know what's happened, doesn't stop soon enough. But even in that case, you could argue that the DMV says to keep enough following distance that in most cases you should still be able to come to a stop with barely a moment's notice, as long as you're paying attention. Of course, most people rarely, if ever, leave that much space.


Any solution that relies on "let's just make human beings better" is basically doomed to failure and little more than hand-wringing and virtue signalling.

Cars have been pervasive for over a century and virtually none of the actual improvements in safety have come from "hoping humans do better".


The safety record of human drivers is very different between different countries. There are places where drunk driving is the norm and there are places where it almost never happens, et cetera. So there is definitely something going on there.


>Cars have been pervasive for over a century and virtually none of the actual improvements in safety have come from "hoping humans do better".

Improvements to signage and traffic controls, cracking down on reckless and intoxicated driving and the massive campaign to get people to actually wear their seat-belts (listed in no particular order) seem to fall solidly in the "make humans do better" category.


>>> I really wish we could just enforce proper education of being observant.

>> Cars have been pervasive for over a century and virtually none of the actual improvements in safety have come from "hoping humans do better".

> Improvements to signage and traffic controls, cracking down on reckless and intoxicated driving and the massive campaign to get people to actually wear their seat-belts (listed in no particular order) seem to fall solidly in the "make humans do better" category.

None of those things actually involve "making humans do better [at driving]". They either involve implementing technical solutions or preventing humans in certain conditions from driving at all.


>They either involve implementing technical solutions or preventing humans in certain conditions from driving at all.

I disagree, they facilitate (signs, markings), persuade (public awareness campaigns) and incentivize (DUI penalties) "doing better". Arguably road design, markings and signage are a technical solution but the humans involved are heeding them of their own free will. It's not like an airbag that goes off based on certain criteria regardless of how you feel about the matter.


I argue the opposite, humans adjust their driving depending on what is expected. The current bar is too low. If we were to raise the bar, people would adapt. I would be very interested in some data from places with draconian traffic laws.


> Cars have been pervasive for over a century and virtually none of the actual improvements in safety have come from "hoping humans do better".

I disagree, given how standards for getting a driver's license differ between countries and are getting tighter and tighter.

In the US, unless things are different from TV, you can get trained by someone else with a license in a regular car and do an exam.

Over here, you need a driving instructor in an adapted car (dual controls), and depending on how well you pick it up, spend anywhere between 15 and 50 (or sometimes more) hours practicing driving and traffic rules. There's a written / theoretical exam you have to study for (traffic rules), then a practical exam with an examiner.

In my case, it cost me about €2500 in lessons to get my driver's license. Then I spent another €1500 or so on top of that to get my motorcycle driver's license, where there's even more emphasis on personal safety and safe driving.

With that in mind, I'm confident in my claim that Dutch drivers are better than American ones.


Please stop thinking that one can know how “the states” are from a distance. The US is quite large and diverse. I’ve lived in four states and they were each significantly different in terms of legal structures, policies, culture, daily life etc. As an American I would rarely claim to know how things are in “the states.” I only know the parts I’ve lived in, or the few things that are actually federal in nature.

The US may be more homogeneous than Europe given its shorter history and mostly common language, but it’s far from some uniform block that you can summarize from what you saw on TV or a tourist visit to New York.


As someone who has had a driver's license in the US for a very long time, my experience is that--as with many things--one of the main things that makes you a better driver is putting in hours driving. Could you move more of those hours into a supervised learning environment? Probably. But you do that at the cost of making getting a license more difficult, expensive, and exclusive (in an environment where driving is often not an optional thing if someone wants to make a living).

I'm not sure how "more expensive training" eliminates some people texting while driving.


> I'm not sure how "more expensive training" eliminates some people texting while driving.

You can do both. It's not about eliminating it entirely. If someone is great at something, that doesn't mean they're good at teaching that skill. Further, just because someone has driven for a long time doesn't mean they're a good driver. Likely they've improved, but it hasn't been checked.

When I got my license I learned about a lot of changes that older drivers weren't aware of.

> But you do that at the cost of making getting a license more difficult, expensive, and exclusive

A country can change their car reliance. It's also strange to focus on the cost of the license while not checking whether better drivers would result in lower costs elsewhere, e.g. fewer accidents (lower insurance), lower medical costs, etc.


It would probably be great if everyone could take a high-end week-long defensive driving course. That's also not very realistic. I'm not sure the typical driver's ed imparts any particularly unique insights that generally responsible adult drivers don't have.


"A country can change their car reliance."

Not to be bleak, but I doubt this will happen for a long time, if ever, in the US. It -may- happen regionally among a state or two, but good luck getting multiple states to coordinate on a project like this.

We can't even get things like voting centers done properly, so how in the world are we going to implement public transit? There is no incentive for politicians, and I bet the car industry would quickly give them incentives otherwise if they even attempted it.


You've got a point in general, but here specifically I don't think OP was wrong.

The US definitely has an overarching philosophy, regardless of differing per-state policies. From my own experience with one of the more restrictive states in the US, all that is required to get a driving license is a written test and a road test once you reach 18 years old. Most people want to get their license before then, so they opt into training classes with an instructor and a car with dual controls. But that isn't required. I don't know how things are in the other more restrictive states, but I'd guess what I said goes for at least 90% of the country.


One thing that was darned frustrating (but in my opinion a good idea) with the UK driving licence test is that the theory test includes a section on hazard perception, where you watch a series of videos and click when you spot something that is turning into a hazard. You need to click at the point where there is enough evidence from what you can see that the hazard may exist, regardless of whether the driver actually needed to take any avoidance action. It teaches you to look ahead, not just respond to what is happening right now. And yes, I took two attempts to pass it.

These are things like a small child that you could previously see on the pavement has disappeared behind an obstruction (so may suddenly step out into the road). Or a football coming across the road from a playing field (a child may run after it). Or a car on a side street going at a speed that is inconsistent with them stopping at the junction.


> In the US, unless things are different from TV, you can get trained by someone else with a license in a regular car and do an exam.

Be cautious about believing TV drama is realistic. States I’m familiar with require 30 hours of licensed classroom instruction and 6 hours of licensed driving instruction beyond the training via a parent or other adult.

In any case, no amount of training seems to stop people from using their phone while driving.


I'm teaching my kids, insurance companies stopped giving any discount for drivers ed. My daughters all passed the exam for a learner's permit on the first try. I made them read the DMV book on the state website. I also quizzed them on various scenarios. They have friends who took classes and still failed the exam multiple times.

IMHO, the problem has more to do with the entitlement around driving. People don't view it as a privilege.


> I'm teaching my kids, insurance companies stopped giving any discount for drivers ed.

What's not clear is the difficulty of the test. Class-based training, I'm not a fan of. Only 6 hours of actual driving seems very minimal. Learning from another driver: they'll likely pick up bad habits.

> They have friends who took classes and still failed the exam multiple times.

That's good. Here it's a practical exam that often takes several tries to pass, plus a theoretical one. It covers a lot: not just knowledge, but also e.g. judgement about safety. So in a case where something is unsafe, the answer isn't "yes, it is legal" but rather "there's a hazard".

From my perspective it's more difficult to unlearn a bad habit than to learn it properly (and immediately address problems) right from the start. And it's great that some are good drivers, but I'm more concerned with the number of bad drivers passing their "skill" on to someone else.


>IMHO, the problem has more to do with the entitlement around driving. People don't view it as a privilege.

Large parts of the US have little to no public transit, and can only be accessed via car. You can't say driving isn't an entitlement and then arrange your infrastructure in such a way as to require driving.


Welcome to the USA. You shouldn't, but you certainly can and we have. Read any laws about driving licenses and reasons for suspension.


The problem is that the laws are trying to thread a needle between two entirely contradictory ideas - that it is both well and normal to build the world in a manner that expects everyone to drive - and that driving is a privilege that can be revoked from you.

You can't legislate your way out of a basic contradiction in your society, so we end up in a middle ground where nobody is happy.

For another example of a basic contradiction: we allow and encourage people to drive intoxicated (after all, bars have parking lots, and do not enforce that every group have a designated driver), but not too intoxicated, and not in a manner that is too unlucky. The bar between the two is either an objective test from a breathalyzer (a test you can't take until you're pulled over, and one that sets the bar too low for many people, in my opinion), or a fully subjective test of "did you get into an auto accident while sitting at 0.05 BAC?"


> People don't view it as a privilege.

It's difficult to view it as a privilege in a society that, over the past 70 years, was deliberately constructed with the expectation that everyone drives.


I did not have a driver's license prior to moving to the US.

The process of getting it was a multiple-choice quiz, followed by 20 minutes of paid 'instruction' as I drove around the neighborhood... and then a 'test' that repeated the exact same route. I 'passed' with a score of 83/100.

In contrast, I failed the test four times, while taking it in BC, and because it would have to be scheduled 3 months in advance, eventually gave up on it.

I am glad that the examiner feels confident that, after I stumble-assed my way through driving 20 and 25 mph on empty residential streets, I am now capable of safely driving on freeways.


>In the US, unless things are different from TV, you can get trained by someone else with a license in a regular car and do an exam.

What is this TV source you speak of? The 11 o'clock news? Seinfeld?

> Over here, you need a driving instructor in an adapted car (dual controls),

Same.

> ...and depending on how well you pick it up, spend anywhere between 15 and 50 (or sometimes more) hours practicing driving and traffic rules.

Yup. We do that too.

> There's a written / theoretical exam you have to study for (traffic rules), then a practical exam with an examiner.

Again, we do that as well.

> In my case, it cost me about €2500 in lessons to get my driver's license. Then I spent another €1500 or so on top of that to get my motorcycle driver's license, where there's even more emphasis on personal safety and safe driving.

That is a lot of money.

> With that in mind, I'm confident in my claim that Dutch drivers are better than American ones.

The ones on TV or in your imagination?


> and getting tighter and tighter.

It seems to be a trend to pile more and more laws, red tape and regulations around driving in certain countries, and really around young people in general. All of it voted in by an aging population (who don't have to submit to the same requirements, since their "rights" are grandfathered in!) that's increasingly afraid of young people.

That, coupled with the out-of-control unemployment numbers for young adults in the EU and the sky-high cost of everything, makes me wonder what the brain-drain situation is in Europe. You meet a lot of young, talented, educated European engineers in the Bay Area, for instance; are there the same number of young Americans in Europe?

> Over here, you need a driving instructor in an adapted car (dual controls), and depending on how well you pick it up, spend anywhere between 15 and 50 (or sometimes more) hours practicing driving and traffic rules. There's a written / theoretical exam you have to study for (traffic rules), then a practical exam with an examiner.

> In my case, it cost me about €2500 in lessons to get my driver's license. Then I spent another €1500 or so on top of that to get my motorcycle driver's license, where there's even more emphasis on personal safety and safe driving.

I see how driving schools and bureaucrats in charge of examination might like this guaranteed work. No innovation required, and grandfathered right to licenses issued to older drivers (so good support from an aging population).


Yay. Let’s make driving a perk of being rich.

All you poor schmucks can hoof it.


To a large extent, car ownership is already a perk of being "rich." Look at how many people have trouble getting to voting locations or to vaccination sites.


The requirements in my US state are very similar to yours, except the amount of time you need to spend with a licensed instructor in an adapted car is fixed (don't remember the number). But after that, and before you can take your practical exam, you're required to spend a significant (fixed minimum) amount of time driving with a parent or guardian in the passenger seat.

And the cost is far less than the nearly $3000 you paid.


> you're required to spend a significant (fixed minimum) amount of time driving with a parent or guardian in the passenger seat.

That seems pretty unsafe from my viewpoint. Just because you can drive doesn't mean you can teach/instruct. I think people here generally do 20 hours of driving with a licensed instructor. There's no minimum I think; it's just that the (practical) exam is expensive, so people only take it when they think they're good to go. Licensed instructors are also rated on how often their students pass the exam.


You're only allowed to drive with a parent after classroom instruction and driving with an instructor for the required amount of hours and passing. So it's not unsafe at all, because where you'd be allowed to just start driving on your own, my state requires you to drive with a parent for X hours before that.


While you may be correct in this case, I'm not sure what you're claiming is generally true. An equally valid observation is "trying to address social problems with technical solutions is doomed to failure". I wouldn't advocate for solving my country's domestic violence issues with technology for example - a shift in social norms addresses the underlying issue and is far more likely to be long-term effective.


Unfortunately, current cars have to be controlled by humans, and the human brain has certain characteristics (selected for by evolution) that make it nearly impossible to react to 100% of driving situations. There are built-in efficiency circuits (similar to CPU branch prediction) that take immediate action based on what they typically see.

So if 99.9999% of the time, when a car in front starts to accelerate, it doesn't slam on the brakes immediately after, the brain assumes the car in front will never slam on the brakes and devotes less brain-CPU time to "oh no, the car in front just slammed on the brakes" processing.

In general, human drivers don't do this unless they are very inexperienced. You don't mash on the gas and immediately mash on the brakes; instead, you make sure there is nothing coming. If you can't see well enough to tell whether something is coming, you ease up slowly and only mash on the gas when you have enough information that you won't get side-swiped going through the intersection.

And yes, intersections with a "stop line" far enough back that you can't see cross traffic are poorly designed. Or, if cars are really expected to stop every time before a crosswalk and then advance two feet and stop again, that needs better enforcement so the behavior becomes normalized to the point where the car behind expects it (one way of doing this is to put up two stop signs, one before the sidewalk and one at the intersection).


... but isn't rush hour traffic basically that? Car in front of you starts to move forward then sometimes suddenly stops. People usually have experience with this, though it's a whole other category compared to entering a seemingly empty intersection as the second car and the first car slams the brakes because it turns out the intersection wasn't empty.


> but isn't rush hour traffic basically that? Car in front of you starts to move forward then sometimes suddenly stops

It may be sudden but it's also predictable. The participants know that the car in front of them will stop, possibly quickly and are prepared.

A car (human or AI piloted) stepping on the brakes in the middle of a green light, because a pedestrian anticipating the car's departure from the intersection began to step off the curb, is far less predictable and arguably violates established traffic behavioral norms.

Predictable traffic is safe traffic.


That last point isn't a "you could argue" or "most cases", it's literally the law. Rear-ending is always the rear car's fault and it's always their responsibility to leave enough space and time to react and stop safely.

If the lead car braked when there was no reason or danger, that driver might also be cited for recklessness or some such, but a rear-end collision is always an error by the rear driver.


> it's literally the law. Rear-ending is always the rear car's fault

Rear ending being the rear car's fault is a norm of the insurance industry that is used to settle disputes in the absence of other information. Furthermore, in the overwhelming majority of jurisdictions the government specifically does not establish "fault" for routine vehicle accidents (the government generally only decides matters of fault inside a court room) and the paperwork that comes with a police report of an accident usually says something along the lines of "we don't establish fault, just record the facts" prominently on it.


As everybody who has watched traffic-accident videos on YouTube knows, this has been misused in Russia. Want a new car for free? Stop, then reverse into the car behind you. That's why dash cams became ubiquitous in Russia and other countries.


No, it's more complicated than that. If someone cuts in front of you and you end up rear-ending them, it's not your fault. See all the fucking brake-check idiots. (And the videos where someone does it to a car, which then turns out to be a cop.)

That said, yes, in theory you have to be alert, as the car in front of you can at any moment do something unexpected. (Eg. the driver falls asleep and crashes into the barrier.)


There's also an insurance scam where a driver slams on the brakes for no reason on the highway, hoping you rear-end them - literally from 50 mph to 0. I barely dodged that one, since I allow 3 lanes' worth of stopping distance, but I still needed to swerve toward the median, and the guy behind me hit me, the one behind him hit him, and so on.


> I really wish we could just enforce proper education of being observant.

I know we're talking about California here, but the other straightforward explanation is that self driving cars are braking unpredictably compared to regular traffic, possibly when other drivers have judged that they wouldn't possibly need to brake.

Such accidents are only the fault of the rear car in a simplistic prima facie sense. Unexpectedly stomping on one's brakes in a fit of road rage is generally seen as causing any resulting accident, as should overzealous braking by a self driving vehicle that isn't following the norms of the road.

For example, from the Waymo report of February 16, 2021:

> While in autonomous mode, the Waymo AV was stopped at a red traffic light at the unprotected intersection of 16th Street and Market Street. After the traffic light turned green, the Waymo AV began to proceed and then came to a stop as it yielded to oncoming traffic also entering the intersection. A passenger vehicle behind the Waymo AV then made contact with the stationary Waymo AV’s rear bumper

This could have been the Waymo turning left and needing to yield to oncoming traffic, OR it could have been the Waymo going straight through the intersection and slamming on its brakes due to someone coming in from the right that hadn't stopped before the Waymo's judgement margin. It seems impossible to tell from the accident report, but the two have very different root cause analyses.


That's better performance than most humans. Look at 16th and Market.[1] It's a 6-way intersection. Visibility to the left is very poor. The stop line is half a block back from Market Street. It's not even visible that there's a second street coming in from the left.

The Waymo planner is very aware of where its scanning is blocked, and makes worst-case assumptions about what it can't see. The mapping system knows there are 6 streets here, one of which is totally hidden at intersection entry.

It's permitted to turn right on red from northbound Noe to Market, but not from northeast-bound Market to 16th. So traffic from Noe gets stuck in its dedicated turn lane within the intersection. That traffic is supposed to wait and turn right, but it's quite possible that Waymo's system doesn't trust it to do so. This intersection also has dedicated bike lanes and a streetcar track. It had 7 accidents with 4 injuries in 2018.

It's hard to fault Waymo's system for cautious driving in that intersection.

[1] https://earth.google.com/web/@37.76429991,-122.43305656,39.4...


> That's better performance than most humans

I don't know how you can blindly assert that with the limited details in the accident report. I agree that type of intersection is a bit of a free for all, but that doesn't really inform judgement of how Waymo handles it.

My overall point is that city driving necessarily involves a bit of assuming that cars coming at you that are supposed to stop are going to stop. And I can completely see an autonomous vehicle company programming their cars to be overly cautious, and not being tolerant of other cars with just a little too much velocity for comfort (the kind that gives your passenger white knuckles, but that you have to tolerate as a driver) - as you said, making worst-case assumptions rather than average assumptions.

I don't have any personal experience being around self driving vehicles, but in this very thread are people referencing the idea that a self driving car would stop, accelerate to move forward a few feet, and then come to a quick stop again - that's liable to be completely antisocial and unintuitive for other road users. A following driver should be able to handle that and even foresee it based on the car's movement pattern (the car they're following could just as easily be someone looking down at their phone), but continually putting this to the test is questionable.

If we could look at a video of that accident I referenced, it's possible we would see the Waymo doing something absolutely boneheaded like suddenly slamming on its brakes for no reason. It's also possible we would see the Waymo behaving completely reasonably, and the following driver spacing out. My point is that it's impossible to know from such limited reports, and it's a mistake to assume that being hit from behind automatically absolves the autonomous vehicle.

Your original comment did not directly claim that, but it did sort of imply it while stating the solution is for other cars to develop automatic braking, even though that's a long ways off for most cars on the road.


> I really wish we could just enforce proper education of being observant.

I know everyone considers themselves to be a good driver. I consider myself to be a relatively good driver and generally keep very conservative follow distances (no reason to be on someones bumper).

Even when paying attention, I've been in situations where I've had to apply the brakes harder than I should have to avoid a rear end. Sometimes you just don't read the situation correctly and need to play catchup.


The reality is that average human reaction times at speed are not very good and many people do not care to pay much attention or even follow the rules of the road. Most roads are also not designed for safety. They are usually designed for speed/throughput.
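The point about reaction times can be made concrete with some back-of-the-envelope arithmetic. The 1.5-second perception-reaction time below is a commonly cited rule-of-thumb figure, not a measured value, and the whole thing is just an illustrative sketch:

```python
# Distance a car covers before the driver even touches the brakes,
# assuming constant speed during the perception-reaction delay.
# The 1.5 s default is an assumed rule-of-thumb figure.
def reaction_distance_m(speed_kmh: float, reaction_s: float = 1.5) -> float:
    """Metres travelled during the driver's reaction time."""
    speed_ms = speed_kmh / 3.6  # convert km/h to m/s
    return speed_ms * reaction_s

for v in (50, 100):
    print(f"{v} km/h: {reaction_distance_m(v):.1f} m before braking starts")
```

At highway speed that's over 40 m gone before any braking happens at all, which is why attention matters at least as much as raw braking skill.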


I support tackling this problem with both your solutions. But I certainly wouldn't bet my money on yours.


Please tell me more about this "proper education of being observant." Do you believe such a thing exists? Does it reliably make people consistently observant? Would it work on me? I would pay a lot of money for such a thing.


Yes, there are defensive driving courses offered at most racing schools and they're generally a prerequisite for taking more advanced racing courses but are also pitched as a terminal course for people who simply want to improve their daily driving skills. They'll take you through some of the basic theory of safer driving and then take you out onto the track to practice some basic defensive maneuvers.

People who take the course generally say that it has real, long term effects on their driving as it provides them a mental framework for how to evaluate traffic from a fundamentally defensive perspective and they find themselves practicing the same skills decades after they've taken the courses.

I'm not aware of any long term, longitudinal studies on whether defensive driving courses decrease raw accident rates but any study would be tricky to do since people who self select for defensive driving courses are likely to have any number of confounds that would make observational studies less than clear.


I understand this was supposed to be sarcastic but I've been informally working on this for years. I have a technique which seems to work wonders with students but I haven't the resources to test it scientifically, so do with this information as you will.

The technique is simple: ritually venerate the thing you wish to be more observant of, and never denigrate its value. So in the case of traffic stopping, have a thing that you do whenever you notice it (e.g. tap your braking foot), even while you're not driving, and never ever look at your phone whilst driving. Another way of thinking about it (if you don't like the word ritual), is as a variant of the Alexander technique: whenever you notice that thing, consciously note that you noticed it, and emotionally reward yourself for doing so.

I'm not sure where this fits on the scale of "outrageous quackery" to "blindingly obvious" but it seems to be news, and also useful, to fifteen year olds.


Indeed, this is a "take my money", "I'll subscribe to your newsletter" kind of breakthrough that I would dearly like to see everywhere (and in myself!), not just in the context of driving cars.


> I really wish we could just enforce proper education of being observant.

You'd have to give people whole new brains.

My grandmother had a condition where her eyes would force themselves shut, completely out of her control. She insisted she could still drive without issue. She ignored multiple friends and family telling her she'd kill someone.

Last Saturday I had to dodge out of the way of a car (fortunately moving very slowly). The woman driving was staring right at me as I waved my arms. It eventually clicked that she had nearly run me over, and she rolled down her window to apologize.


> I really wish we could just enforce proper education of being observant.

If I had a dollar for every person I see driving around in thick rain, or in full darkness with their lights off, I could probably buy a nice new macbook air.

It's truly shocking how many people are far from alert and observant while driving. Either because they really don't care, or they've been conditioned to take driving very casually.

I use the number of cars I see driving with their lights off as sort of a gauge of how many 'bad' drivers might be on the road, in a 4-5km radius, at any given time... And it's a percentage that is too damn high. My working theory being that if people are putting a car into gear and getting into traffic without checking that their lights are on first, they're probably far from alert and ready in all other driving scenarios as well.


This is more of a problem caused by emissive displays replacing traditional gauge clusters. There is no longer any feedback in low light conditions from illegible gauges forcing you to turn on the lights. Automatic headlights (without needing a selectable auto mode) should be mandatory on every vehicle with such a display.


I don't know, I was trained in the navy to drive ships and sometimes I have a near accident when driving my car, and a car is way easier to drive. I think people are just imperfect and sometimes have accidents, so we should allow for that when engineering tech.


I'm certainly not blaming the victim for getting rear ended, but I've noticed that a lot of people brake a lot harder and more suddenly than needed. It's easy to see how that can lead to getting rear ended, even if the other driver is only distracted for a second.

I try to spread my braking over the space I have to do so. Lower braking force over a longer time is much safer. It both gives people more time to react, and if a collision does occur the speed differential is lower, and less damage will occur. This doesn't save you from getting rear ended while stopped, but that's a minority of rear endings from what I have seen.
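The speed-differential point can be sketched with toy kinematics (the deceleration and reaction-time numbers below are invented for illustration, not from the comment): while a distracted follower hasn't reacted yet, the closing speed grows at the lead car's deceleration rate, so gentler braking by the lead car means a smaller differential if contact does happen.

```python
def closing_speed(lead_decel_ms2: float, distracted_s: float) -> float:
    """Speed differential (m/s) built up during the follower's reaction delay,
    assuming the follower holds speed until they react."""
    return lead_decel_ms2 * distracted_s

hard = closing_speed(8.0, 1.5)    # near-maximum braking -> 12.0 m/s differential
gentle = closing_speed(2.5, 1.5)  # braking spread over the available space -> 3.75 m/s
```

With the same 1.5 s of inattention behind you, the gentle stop leaves roughly a third of the impact speed that a panic stop would.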


> I really wish we could just enforce proper education of being observant.

The problem: on long stretches of road, especially with tunnels or trees acting as side walls, human biology gets in the way - we literally stop perceiving what's happening properly (and as usual, German has a word for it: Tunnelblick, tunnel vision). The additional factors of people being (too) tired to drive or stressed out by children, traffic jams etc. also won't disappear no matter how much enforcement there is... the only things that enforcement can suppress are phone usage and drug usage.

Better safety systems save lives!


> The problem: on long stretches of road

In Netherlands some of those are mitigated. Long straight roads have found to be more accident prone (doesn't matter if it's a highway or some local street). So infrastructure now tries to avoid such long stretches. It won't solve it, it's the Swiss cheese model of airplanes (try to allow multiple things to go wrong before it results in a failure/accident).


While I agree, people are unpredictable. And even when they are paying attention, they may still react too slowly. Even with a competent driver at the wheel paying attention, crash prevention systems can still outperform them.

Anyway, I don't think the goal is to prevent accidents entirely; that would be ideal, but it'll be really hard. More important is to prevent injury and death. A car is just hardware; that's what insurance is for, in the end. In that regard, even without automation, cars have gotten safer over the decades with roll cages and airbags everywhere.


When do the automatic emergency brakes kick in? Normally you're supposed to maintain at least 5 cars distance on the highway -- will it brake to always maintain that, eliminating those idiot tailgaters? :)


No thanks.

I drive an aged car, and I recently drove a friend's Mercedes long distance. The thing has a feature where it brakes when it thinks it's good to brake, often just when I need to accelerate to get out of a tight spot. I need the vehicle to respond to my inputs directly, not mix my inputs with what it thinks must be done. It's extremely unpredictable and it feels like I'm not in control of the car.


If studies show that automatic emergency braking cuts down on deaths overall, I think it's a worthwhile tradeoff -- I'd even be happy to have it be mandatory/non-disableable. It sounds like the car you were driving had some kind of defective implementation of automatic emergency braking. That doesn't invalidate the good that these types of systems do when implemented properly.

Also I'm wondering what kind of situations there are where you "need to accelerate to get out of a tight spot?" I'm a pretty experienced driver and I think the number of "must accelerate" situations I've ever encountered could be counted on a hand. Almost universally, you can solve whatever problem you're in by being more patient and decelerating instead. For example, if you're trying to make an aggressive merge into the passing lane because someone is coming up quickly behind, you could just wait until that person passes. If you're trying to get up to highway speed as you merge, but someone is slow in front of you, it might be best to leave that person more of a gap and wait until they go. Even in the case of passing on a two lane road (a situation where I doubt emergency braking would trigger erroneously), you've already made a mistake if a rapid acceleration is needed to avoid a dangerous scenario. Etc.


> Even in the case of passing on a two lane road (a situation where I doubt emergency braking would trigger erroneously), you've already made a mistake if a rapid acceleration is needed to avoid a dangerous scenario.

Are you talking about a road with each lane in one direction? Outside US that is the most common way to travel between cities.

It is easy to say that acceleration shouldn't be needed, but that's life right now. When somebody starts overtaking, the better acceleration they have, the safer the maneuver. By that point, the car behind the one being overtaken may have closed the gap to slot back into, and emergency braking can be fatal.


> When somebody starts overtaking, the better acceleration he has, the safer he can do the maneuver.

If you overtake someone while driving in an oncoming lane, and cut it close enough that the emergency braking of a modern car kicks in, you didn't have enough room to perform the maneuver in the first place. Even safer than performing the dangerous maneuver is not performing it at all.

This is coming from someone who regularly drives a 150-mile stretch of road where, on many parts, dangerous overtaking maneuvers are the only way to overtake, and you're sharing the road with articulated vehicles speed-limited to 10mph lower than you on steep hills. The choice is always to _not_ make the maneuver.


I think he's talking about AEB engaging on the car you're trying to pass. If you are overtaking on a 2-lane you want to back up, wait for your opportunity and close on the vehicle to be overtaken at a decent rate of speed before pulling into the other lane in order to minimize time in the wrong lane. Also if the other person decides to be a dick and try and prevent you from passing this gives them less of an opportunity to do that.

If AEB forced me to get in the other lane before stepping on it because it thought I was gonna rear end the car about to be passed I would be very annoyed.


>Also I'm wondering what kind of situations there are where you "need to accelerate to get out of a tight spot?" I'm a pretty experienced driver and I think the number of "must accelerate" situations I've ever encountered could be counted on a hand. Almost universally, you can solve whatever problem you're in by being more patient and decelerating instead.

There are a myriad of situations, often involving merges of some sort, where the moves of various traffic participants leave you in a spot where aggressive acceleration is the better option: accelerating up to speed and into traffic, up to the speed of another lane so you can get into it, or closer to the car in front of you to make space behind you for someone else. That results in a better overall situation, with fewer people needing to react and therefore fewer opportunities for someone to get it wrong and cause a near miss or an accident.

Yes, you could likely solve these situations by braking, but in doing so you are roping more traffic participants into the situation and making it more dangerous, which makes accelerating and merging the more courteous approach. Braking for everything is definitely the more stupid-proof heuristic, and if followed blindly it will yield infinitely better results than accelerating for everything, but it leaves a heck of a lot of efficiency on the table.


Also someone barreling toward you from the side when you're in an intersection. I'm not sure how far down perpendicular roads Waymo cars can see, but I'd like to see them solve this weak spot to eliminate the rear-enders.


This is a good example. In this situation you’d actually prefer to lightly rear end the car in front of you and deal with those consequences than possibly lose your life because someone asleep at the wheel hit you from the side.


It's not a worthwhile tradeoff because erroneously functioning humans are still there, and this tech enables them! Today it's crashing a car, tomorrow it's forgetting to walk your dog or pick up your kids. Mindless humans are the problem and auto braking is just a band aid.

By all means, if auto braking (and in the future - auto driving) is enabled, there must be an option in the OBDII interface to disable it.


You're surely one of them though, no?

There's not a single person who can claim they've never been distracted while driving.

People hear "distracted" and their minds go to texting and their virtue signaling goes to 11... case in point

Realistically humans are easily distracted. Distracted can be an errant thought, a random sight, a bad mood, literally limitless possibilities for why our brains would wander from the task at hand.

I don't have a problem with saying automatic braking should be controllable, but the diatribe about mindless humans... we're all mindless humans.


Interesting you should ask that, I had a lot of problems in college with distraction so it's something of a goal to be concentrated outside my house. Call it a coping mechanism.


If you look at driving patterns in developing nations it’s not the same as in America.

Driving data is cultural data, not some inviolable aspect of the physical world.

There is nothing that stops the behaviors in, say, India from happening in America: everything from stray cows on the road (farm escapees) to people driving in the wrong lane.

This is also before we get to things like thieves who work in gangs during heavy traffic.

Handling non-standard human behavior will become more important as you reduce the more common cases where humans screw up.

Side note - I don’t look forward to the moment in this conversation where government decides what patch firms must put out, or begins asking for access to control car behavior during emergencies - the last of which is a fair ask.


I’m getting downvoted for voicing my opinion on a darling HN technology, is that it? Next time I’ll just keep my mouth shut (as I usually do because of the downvote brigade).

This time, I'll bite. I guess you guys have never driven in Africa. Or Asia. Braking is not always the best course of action, and is actually dangerous in some situations. For example, if I can see someone screaming up behind me in the fast lane and there's a car next to me so I can't move out of the way, I need to accelerate to get past that car and move over. If I don't get out of the way, I'm now in a stressful situation with an unpredictable and often reckless driver behind me, often right up against my tail.

Maybe it’s better on average but personally I prefer to have full control of my vehicle.


>Like if I can see someone screaming up behind me in the fast lane, there’s a car next to me that means I can’t move out of the way, so I need to accelerate to get past that car and move out of the way.

Sounds like that guy's car should have automatic braking.


Even if the car isn't "screaming up behind you", if someone is closing at a decent rate of speed your two non-jerk options are to speed up and then move over, or slow down and then move over. Depending on what's beside you, faster might be the obviously better option.


I think it will be a long time before AEB is mandatory in Africa or Asia, so you should be good in that case.

Personally I think this is a sort of moral hazard scenario, where exactly the people who most need AEB would be the ones who would turn it off. So even if it makes a few people unhappy, I'd probably support not allowing the user to have control over this safety feature. Of course we'd need to study the issue to establish whether this is the case.


In near-critical situations there tend to be disjoint optima in how the group of participants needs to behave in order to avoid an incident. If the automated system is programmed to prefer the more, how would you call it, calm, civilized, obedient optimum, but the participants prefer the more aggressive one, I can see your problem. Sometimes just squeezing through resolves a situation more directly and quickly. The designers of those systems are usually not allowed to prefer that, even if the system could recognize it properly (which needs much more processing power, because the search space is much bigger and, as mentioned, non-convex).

I don't think you should be downvoted, because this is one of the core ethical aspects we need to discuss for those systems, and for how they interact with humans. The SV dystopia of "soon there won't be humans in the loop" doesn't help here.


> Like if I can see someone screaming up behind me in the fast lane, there’s a car next to me that means I can’t move out of the way, so I need to accelerate to get past that car and move out of the way.

Those vehicles exist outside of the countries you mentioned, and the correct thing to do is to keep going and move out of the way when it's safe. What if the car on the inside sees the other car too and accelerates to give you room to drop behind it? Now you've made the situation worse for three people.


I have auto-braking on my car and it has never once activated during the 5 years I've owned it.

It has warned me with a beep signal, I think twice, when it deemed the car too close to other traffic.


Have you considered reflecting on your driving behavior? I've personally rarely (I honestly don't remember when I last had to do this, but let's say less than once a year) had to accelerate quickly to get out of a tight spot.


People who believe they should accelerate out of dangerous situations, and who later comment upon such beliefs, should have their licenses automatically revoked by autonomous natural-language-processing robots that crawl around on the web for that specific purpose.


AEB is great, and I'm super happy that it's going to be standard, but it's not going to totally eliminate the rear-ending problem. From Wikipedia [1]:

> Forward collision warning plus autobrake is associated to a 50% decrease in front to rear crashes.

Similar stats are reported from around the world.

Low friction surfaces, weird car angles, poor radar/camera visibility and too high entry speed all mean the system isn't going to be 100% effective.

[1] https://en.wikipedia.org/wiki/Collision_avoidance_system


Or very slow entry speed. My wife rear-ended someone while stopped because her foot wasn't fully engaged on the brake and the car crept forward until it was touching the car in front. It's a Tesla, so it started beeping, but not until it was too late. No visible damage, but that didn't stop the other driver from collecting insurance info and reporting it.


> The rear-ending problem will be solved as automatic emergency braking becomes standard on cars. Already, it's shipping on almost all high-end cars and 70-80% of midrange cars. There's a US auto industry goal of it being standard by 2022. That's probably the main feature needed for self-driving cars to coexist with human-driven cars.

Not necessarily. I have a Honda, and the system in my car is deliberately designed to only mitigate a collision, not prevent one (e.g. it will only engage at the last second to slow the car, not stop it). I'm speculating, but I'm guessing the idea is to discourage people from "testing it out" or relying too much on it.


The system I have in my Subaru is pretty useful, and warned me a few times when the vehicle in front of me has suddenly slowed down or stopped. However, it has too many false positives, and if it actually tried to brake every time, the false positives would have been very annoying and I would disable it.


It's generally split into two steps: forward collision warning and automatic emergency braking. Usually you can adjust the distance of the FCW in your infotainment system. I had to, or even slight curves would trigger it.

I have only had AEB kick in once when a driver decided to suddenly u-turn on a 45mph road and it saved my car from being totaled.

As always, read the owner's manual to know the limits. Most AEB systems are only rated to reduce the severity of a crash, not prevent it from occurring.


Yeah, that makes sense. The automatic braking has never triggered for me, but the collision warning triggers pretty regularly (probably at least once a week) on one part of my commute, right around going uphill here https://goo.gl/maps/r7LpqzHq3Wqg6N8S7 where the system thinks I'm about to hit the parked white van, instead of realizing that the road is curved.

> Most AEB systems are only rated to reduce the severity of a crash not prevent it from occurring.

I think that Subaru actually calls their system Pre-Collision Braking.


It will be due to false positives. By the time the system is really sure you are going to hit something it’s probably too late to stop.


> Almost all the Waymo reports are "vehicle was entering intersection, detected cross traffic, stopped, was rear-ended by human driver". There's one Waymo report where someone ran a stop sign and hit them.

I've driven some recent cars with a lot of safety systems and more advanced cruise control, and they always felt like a bad human driver. They seem to be unpredictable and abuse the brakes, especially on highways, where you should just drive with the flow and watch what's going on at least 2 or 3 cars ahead of you. I suspect that technically the Waymo cars are behaving as they should, but some of their actions are just too unpredictable or come all of a sudden, surprising the drivers behind them with sudden braking and actions that a human driver wouldn't take, or would be more predictable about.


Solving the automated driving problem will require these systems to behave in ways that regular human drivers consider acceptable, at least for a transition period where both share the road. Sudden hard braking in situations where humans see absolutely no need for it is unacceptable behavior to human drivers. Just because it's not your fault doesn't mean you can't do things to prevent it. I have prevented many crashes throughout my life by mitigating others' driving mistakes. We should expect self-driving cars to do this at least as well as, not worse than, humans.


I'm speculating here, but it's possible that the Waymo vehicle being rear-ended is the optimal outcome. If the Waymo vehicle had otherwise collided with a car in the intersection, it would probably be a much worse crash than a rear-ending at low-to-moderate relative velocity.


It might be locally optimal in the "out of the frying pan, into the fire" sense, but there is no way this is optimal.


It could be that the Waymo car stops earlier or harder than a human driver would.

For instance, the driver behind them might think "ah, cross traffic coming, but the guy in front of me is trying to go through anyway", and suddenly the Waymo stops.

It is true that laws typically state (I don't know the applicable Californian law, but I assume so) that the driver behind has to drive in a way that they can always safely stop, but that's not how most people drive; people drive with expectations about the car ahead.


> That's probably the main feature needed for self-driving cars to coexist with human-driven cars.

How is that feature related to self-driving cars? Are you saying self-driving cars get rear-ended more than human-driven cars do?


Sounds like Waymo just can't adapt to the same situation.


This does not, by the way, imply that the machines are working correctly and human drivers are the problem. These cars are rear-ended more often than human drivers, which suggests they stop more suddenly or at times humans don't expect.

Slamming on the brakes is a flaw. Even if every car on the road had the ability to respond instantly to the car ahead, being forced to brake hard still risks loss of control especially in bad conditions.

Also, having automatic emergency braking on all vehicles is at least decades out. It will never work on motorcycles or scooters or bicycles, period. It cannot be retrofitted onto existing cars. Requiring it for all new cars would require pretty hamfisted regulations. I expect human drivers to be widespread for at least the rest of my life.


> Even if every car on the road had the ability to respond instantly to the car ahead, being forced to brake hard still risks loss of control especially in bad conditions

Driving on the road is not some kind of high-speed car chase. If one has left enough space to the car in front based on the current road conditions and speed, one doesn't have to brake hard. Anyone can adjust their speed to increase the distance to the car in front. Often arguments make it sound as if driving too close or too fast were an inevitable fact of life.
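The "enough space" here can be made concrete with the standard two-part stopping-distance formula (reaction distance plus braking distance); the reaction time and deceleration below are assumed values for dry pavement, not anything stated in the comment.

```python
def stopping_distance(speed_ms: float, reaction_s: float = 1.5,
                      decel_ms2: float = 7.0) -> float:
    """Distance (m) covered while reacting, plus distance to brake to a stop."""
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)

v = 100 / 3.6                  # 100 km/h expressed in m/s
needed = stopping_distance(v)  # roughly 97 m under these assumptions
```

The quadratic braking term is why following "just a bit" too close at highway speed costs disproportionately much margin.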


> This does not, by the way, imply that the machines are working correctly and human drivers are the problem. These cars are rear-ended more often than human drivers, which suggests they stop more suddenly or at times humans don't expect.

I had a look and couldn't find an answer; does their increased rear-ending frequency result in a decrease in head-on collisions? Is waymo getting rear ended to avoid being t-boned (or a head on collision with someone else)?


They can use it to start fixing the roads, not the cars.

Just look at the cross intersection they use as an example, typical of the US. The ROAD is the death trap there. People are crossing each other at high speed; this is the BUG, and it should be fixed.

The proper solution is putting a roundabout there, and you will instantly have fewer fatal accidents.

The US was the first to use roundabouts, but those early designs did not work well, and as a result we have monstrosities like the one in the picture all around the US.

With electric cars that record accidents, the first bugs we should fix are those in the roads, like concrete barriers that have no smooth transition, just a front wall out of nowhere.


> Just look at the cross intersection they use as an example, typical of the US. The ROAD is the death trap there. People are crossing each other at high speed, this is the BUG, and should be fixed.

> The proper solution is putting there a roundabout, and you will have instantly less fatal accidents.

People are only "crossing each other at high speed" in these simulations because the scenario involves someone running a red light:

> Here, for example, you can see on the bottom of the screen that the simulated Waymo Driver avoids a reconstructed version of a real-life fatal crash by obeying the speed limit—and not running a red light, as the initiator did in real life [emphasis mine]...

I don't think "roundabout all the things" is the answer to every traffic and road safety problem. This looks like an intersection between two 3/4-lane roads, and you'd need a monster roundabout for it, which seems like it would be confusing and/or slow. Also, a road can "instantly have fewer fatal accidents" by being closed or turned into a traffic jam. IMHO, road engineers need to balance accident avoidance, throughput, and user-friendliness.


I live in Germany. We have some roundabouts. And let me tell you: they rarely work better than red lights. Traffic flow requires more than roundabouts: it requires proper planning, green waves, proper sizing, and still separate lanes within the roundabouts. Don't forget about semis either. Just throwing roundabouts somewhere won't solve anything. I've seen that often enough. That will just create more traffic jams.


The literature says differently, and is also aware of your opinion.

https://www.safetylit.org/citations/index.php?fuseaction=cit...


Undersized single-lane roundabouts are not a fit for certain places. As I said, I have seen many traffic lights replaced with roundabouts. If you go to the Netherlands, you see roundabouts executed well. Denmark also has well-made roundabouts.

Just putting a small roundabout somewhere without putting thought into it is making things considerably worse.


I do agree that better road design would drastically cut down on excess speed and crashes. I'm not so sure about roundabouts as the solution, though. Roundabouts may allow more cars to travel through a place, but they are not great for other road users. I think you are also correct about speed: cars are mostly driving too fast. We have the tech to solve this problem today in a rather straightforward manner, but the very idea of it makes people go crazy: speed governors on vehicles. The funny thing is, some places mandate them on things like scooters and e-bikes, but a car with 700 HP that can go 180 mph does not have one and drives on the same road. There is a LOT of low-hanging fruit.


https://www.youtube.com/watch?v=Ra_0DgnJ1uQ shows an alternative approach to regular accident spots.


I wonder who the insane engineer is that thought an 8-lane cross intersection was a good idea.


The one paid and told to make an 8-lane intersection there.


I really dislike the way the data is being summarized as "100% avoided or mitigated*".

First, they're omitting the types of case they're failing to mitigate (rear endings in this case). Cue "60% of the time it works every time" anchorman scene.

Second, "mitigated" is defined to mean a 25% relative reduction in chance of serious injury. Which is good, but again smacks of "60% of the time it works every time".

Basically, it looks to me like someone wanted to be able to show a bold blue 100% and required the definitions they were using to be massaged to match. And apparently they got their way. This has damaged my trust in future summaries put out by Waymo.

A more truthful summarization style would be e.g. "X fewer serious injuries over 72 incidents", referring to expected injuries and simulated incidents of course.
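The critique can be illustrated with toy numbers I made up (not Waymo's data): a dataset where every incident clears the "avoided, or mitigated by a >=25% relative risk reduction" bar produces the bold "100%", while the expected-injury accounting carries strictly more information.

```python
# Each entry: estimated probability of serious injury before and after
# the intervention. All three values below are invented for illustration.
simulated = [
    {"before": 0.90, "after": 0.00},  # crash fully avoided
    {"before": 0.90, "after": 0.65},  # barely clears the 25% bar (~28% reduction)
    {"before": 0.80, "after": 0.55},  # ~31% reduction, still far from avoided
]

# Headline metric: fraction "avoided or mitigated" per the post's definition.
hits = sum(1 for s in simulated if s["after"] <= 0.75 * s["before"])
headline = hits / len(simulated)  # -> 1.0, i.e. the bold blue "100%"

# The more informative summary: expected serious injuries avoided.
saved = sum(s["before"] - s["after"] for s in simulated)
```

The headline hits 100% even though two of the three crashes still carry substantial injury risk; reporting `saved` over the incident count exposes that.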


I agree the results are excellent but the way they try to over-present them is very scummy. I don't see why they didn't think the results were good enough to stand on their own.


This is really cool.

I suspect that when we have self-driving cars, adoption will take off very quickly due to insurance. In particular, if you are in a non-self-driving car and get in an accident with a self-driving one, it'll be hard not to be shown as at fault. So insurance rates, I would guess, will go up for human drivers based on the percentage of self-driving cars on the road, and will likely be really cheap for self-driving cars.

I wonder if there's a period though where people cut off self driving cars, knowing they won't get hit. I suspect pedestrians in cities will jaywalk a lot more (and maybe that's ok).


I suspect the adoption process will be slow because consumers really don't want this product. Self driving cars will drive the speed limit and infuriate both other motorists and their owners who are getting passed by speeding human driven cars. There will also be well publicized stories of the few times that self driving cars perform worse than humans. All drivers think they are above average and therefore the statistical comparison to the average driver does not apply to them.


If there's one thing I don't care about on the road, it's infuriating other motorists with my safe driving. But, for what it's worth, I drive the speed limit and I pretty much never see anyone get angry at me for it. If you hang out in the right lane, all the type A people driving 20 over are in the other lane, zipping past you. You simply don't encounter the type of people who would get mad at speed limit driving in the slow lane. As long as the self driving cars are implemented to follow passing rules, it's not going to be an issue if they were to drive the speed limit (although as another person pointed out to you this is a counterfactual).


> and their owners who are getting passed by speeding human driven cars.

I personally drive 5-6 mph over the speed limit on freeways when the weather and visibility is good.

But I absolutely 100% do not care if the vehicle is driving the speed limit if I do not have to be controlling it. What do I care if the vehicle takes an extra 10 minutes to get to the destination? It'll just mean 10 extra minutes of work or nap time.


I think people will want it. They'll get to watch netflix or read or take meetings on their commutes. They'll get to live further from the city in a bigger place than they could otherwise.

I also suspect we'll see cars looking more like living rooms or offices.

The other thing is that as we have more self driving cars, speed limits will be able to be relaxed. Most humans can't safely drive at 100mph, but on a road of only self driving cars that seems very possible. So that 1 hour commute radius gets bigger and bigger.


I agree that a world of 100% fully self driving cars that talk to each other makes the road capacity and travel speeds much greater. I don't see any way, however, to get from here to there (at least in the US). Are we going to give cars to people that currently rely on beat up used cars to get to their jobs? It's the same reason we can't have single payer healthcare -- the chasm is too far for us to cross politically and economically even if the destination would be superior.


> because consumers really don't want this product

> Self driving cars will drive the speed limit

Tesla is proving both to be false, today.


Teslas aren't self-driving. They are helping humans drive. For a Tesla to not drive the speed limit, it would have to either be told to do so or fail.


Maybe you should drive a Tesla before you claim what it does.

It will gladly speed without driver input. There's an offset setting, since not everywhere has the same customs; here, about a 7% increase is safe without getting pulled over.

The FSD beta is rolling out to everyone that wants it (which is a lot) in the next week or so. Yes it’s not perfect, but saying it’s not self driving is doing a huge disservice to the entire development team.


If I take a drive on a highway in the Tesla and the sign says 80, the car won't just decide to suddenly go 125 because it feels like having some fun. You speed; your car doesn't, unless it is faulty.

Teslas aren't self-driving, no matter how many times Elon says so. It's not even the best in most tests of driver assistance.

Being a fan of a car and not being able to see its faults isn't healthy.


ok


When I was young people would never buy a car with automatic gearing because that was not cool (in Europe).

I remember the first mobile phones and everybody said that will stay niche, because why would you need to call if not at home? (Except sales people)

Convenience is a very, very strong motivator.


Consumers definitely want a self driving product.

What they don't want is a self driving product where they bear liability for the mistakes the self driving product makes.


There are plenty of people who are afraid of driving. They might be the self-driving car audience, if self-driving cars did not require a driver's license.


I have two conflicting thoughts on self-driving cars.

* On the one hand I totally agree with you, the majority of people don't care about driving and just want to get from Point A to Point B. Given the opportunity to sit back and watch Netflix vs fighting traffic and having lower insurance premiums I think will make self-driving cars a tempting combination.

* On the other hand, outside densely populated places, where there is less traffic and more unpredictable terrain, self-driving cars are going to face an extremely uphill battle. Without heavy traffic, there isn't much of a case for needing a car to drive itself.

Then there's the issue of trust. Say you're out in rural Idaho, nearest anything is 50+ miles away. Out there, all those cool tech gizmos and self-driving tech are a massive liability. If you're out in the boonies and your car loses connection and refuses to drive, what do you do? What about if some weird internal computer system breaks? This is why cars like the Toyota 4Runner and Tacoma are so popular: they are old-school, simple cars, but more importantly, you can take them out to the middle of nowhere and trust that you won't get stranded by "cutting edge technology".


On the other hand, if self-driving cars are as good at avoiding accidents as the article says that may cause the price of insurance for non-self-driving cars to fall.


I recall reading a few years ago that insurance companies are looking to partner with car manufacturers. The idea is, when you purchase a full self-driving vehicle, insurance will be included with the cost of the car.


So who's going to pay for that? Tesla?

No, tech companies will just hide a "Not actually self-driving, driver takes full responsibility" sticker near the VIN. It'll be easy to blame the human for negligence then.


It's easy to be cynical. Volvo has already said they take responsibility [0], and I assume other manufacturers will also accept blame if their self-driving was involved.

0 - https://www.caranddriver.com/news/a15352720/volvo-will-take-...


Will Volvo directors go to jail if a Volvo car kills someone? I don't think so. That's not responsibility.


A big caveat to this study is that, in the reconstructed simulations, once the self-driving car (SDC) deviates from what the human-driven car did, the behavior of all other agents becomes unrealistic. Often in these sims, instead of reacting to the new SDC trajectory, they replay their original behavior.

Ideally, once the SDC deviates, all other agents are simulated as well. A tall order, but necessary if these counterfactuals are to hold weight.
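To make the distinction concrete, here's a minimal toy sketch (all classes and numbers hypothetical, not Waymo's actual simulator) of a log-replay agent versus a reactive agent once the SDC deviates from the logged human trajectory:

```python
from dataclasses import dataclass

@dataclass
class State:
    position: float  # meters along the lane
    speed: float     # m/s

class LogReplayAgent:
    """Replays the recorded trajectory, ignoring the simulated SDC entirely."""
    def __init__(self, recorded_speeds):
        self.recorded_speeds = recorded_speeds
        self.t = 0
        self.state = State(0.0, recorded_speeds[0])

    def step(self, sdc_state, dt=1.0):
        self.state.position += self.state.speed * dt
        self.t = min(self.t + 1, len(self.recorded_speeds) - 1)
        self.state.speed = self.recorded_speeds[self.t]

class ReactiveAgent(LogReplayAgent):
    """Starts from the same log, but brakes when the SDC's new trajectory
    leaves it less than ~2 seconds of headway."""
    def step(self, sdc_state, dt=1.0):
        gap = sdc_state.position - self.state.position
        if 0 < gap < 2.0 * self.state.speed:
            self.state.speed = max(0.0, self.state.speed - 4.0 * dt)  # brake
        self.state.position += self.state.speed * dt

# The SDC deviated from the logged human behavior by stopping at 25 m.
sdc = State(position=25.0, speed=0.0)

replay = LogReplayAgent([15.0, 15.0, 15.0])
reactive = ReactiveAgent([15.0, 15.0, 15.0])
for _ in range(3):
    replay.step(sdc)
    reactive.step(sdc)

print(replay.state.position)    # drives straight "through" the stopped SDC
print(reactive.state.position)  # slows and stays behind it
```

The replay agent ends up past the stopped SDC (an impossible, unrealistic outcome), while the reactive one stays behind it, which is why counterfactuals built on pure log replay have to be read with care.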


> Road safety is a major, global public health crisis. More than 1.3 million people die on the world’s roads every year, according to WHO. That’s more people than die from HIV/AIDS, and is equivalent to a passenger plane’s worth of people crashing every single hour—or one death every 30 seconds.

> I’ve spent over 20 years working in crash avoidance research, in the belief that improved driving technology is the key to reducing these needless deaths.

This is a noble goal, but I find it really damn hard to ignore the possibility that this is all just another ploy by Google to steal our data and violate our privacy. Just imagine trying to push back against a technology that saves lives for something as "frivolous" as privacy.

Why does it seem like the consumer can never win? The free market is supposed to be self balancing, yet I can't remember a time where it actually felt that way in the technology sector.


I think if we want to solve large scale intersectional problems like traffic (where lots of individuals with their own goals and own variables intersect) we will need to give up SOME of our privacy in terms of where we are headed (just heading), speed, car model/make or some kind of capabilities estimate... This could result in a smarter traffic flow, road design, or in Google's case powering safer self-driving cars.

I'm up for any of those outcomes and am also concerned about giving up data unnecessarily. I think the crux of the article besides hyping up self-driving simulated driver's better decision making is that roads aren't designed as safely as they could be and some minor inconvenience to the user (time, some data given up) could save lots of lives.


1) Would a random driver (random steering and throttle) also have avoided these crashes? It seems like they were fatal because several variables coincided very exactly, and really any other input would have avoided them.

2) What input is the Waymo driver using here? If these were not originally Waymo cars, then is it perceiving simulated video or getting raw access to ground truth vehicle positions?


They should try human drivers in a simulator. You would have to be careful not to signal when the accident was about to happen, because accidents in real life are very uncommon.

You could estimate how avoidable an accident was by the percentage of human drivers in the simulated "responder" vehicle that avoided it.

Maybe some accidents are easily avoided by humans because the real responder driver was distracted. I'm not impressed if the Waymo Driver can also avoid these.

But some accidents may be very difficult for humans to avoid. If the Waymo Driver can perform better in these cases, then it's truly remarkable.
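The avoidability estimate proposed above is just a per-scenario fraction. A minimal sketch with made-up outcomes (all names and numbers hypothetical):

```python
# Hypothetical outcomes: for each reconstructed fatal crash, did each of
# the N human test drivers in the "responder" seat avoid it in simulation?
human_outcomes = {
    "rear_end_04":  [True, True, True, True, False],    # mostly avoidable
    "left_turn_11": [True, False, False, False, False],  # hard to avoid
}
# Whether the Waymo Driver avoided the same reconstructed crash.
waymo_avoided = {"rear_end_04": True, "left_turn_11": True}

for scenario, outcomes in human_outcomes.items():
    human_rate = sum(outcomes) / len(outcomes)
    hard_for_humans = human_rate < 0.5
    verdict = "truly remarkable" if hard_for_humans else "expected"
    print(f"{scenario}: humans avoided {human_rate:.0%}, "
          f"Waymo avoided {waymo_avoided[scenario]} ({verdict})")
```

The interesting rows are those where the human avoidance rate is low but the Waymo Driver still avoids the crash.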


For 2, Waymo has a simulation environment with a whole simulated physics and sensor suite. They use this environment to train their driving systems on far more virtual miles than the number of physical miles they drive. So presumably they induce the reconstructed initial conditions from the crash and then they run the simulation a lot of times to see if their driver also gets into the crash.


This shows (sort of) that the computer does better in situations where humans are known to fail. It's entirely likely that the computers will have new and unexpected failure modes that humans would never run into.


Other humans put in place of the ones who caused the accidents would probably have a similar high avoidance rate.

It just shows Waymo was better than a small sample.


Would they spot someone running a red light at speed? I imagine most drivers are just waiting for green and doing nothing more than that.


That's ridiculous. All humans make mistakes.


Figure 3 has the responder slowing unusually soon. Makes me wonder if their model is overly fit to the sample. Still, if it's genuinely that good at responding and mitigating then the future looks bright.


Figure 3 almost looks like the responder is lagging. Feels like it slows down and speeds up in several places.

Assuming that’s not a gif/video rendering bug, it’s a great way to get rear-ended.


Here's the semi-annual reminder to everyone that you are probably following too closely, and this is exactly why you shouldn't.

The rule of thumb is that you want 3-4 seconds between cars, which at highway speeds is 80-100 meters or yards.

Yes, almost no one leaves that much room. That's part of why so many accidents happen. It's easy for you to do better.
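The arithmetic behind the 3-4 second rule, as a quick sketch (function name is mine, not from any library):

```python
def gap_distance_m(speed_kmh: float, gap_seconds: float) -> float:
    """Distance covered during the following gap: km/h -> m/s, times seconds."""
    return speed_kmh / 3.6 * gap_seconds

# At typical highway speeds a 3-4 second gap works out to roughly
# 80-120 meters, in line with the rule of thumb above.
for kmh in (100, 110):
    for secs in (3, 4):
        print(f"{kmh} km/h, {secs} s gap: {gap_distance_m(kmh, secs):.0f} m")
```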


I leave plenty of space for this exact reason. Doubly important on a motorcycle because we have less grip.

My point was more that the gif looks choppy and I don’t think that’s how the car actually drives. Imagine being a passenger in a car that brakes hard and accelerates every 2 meters while approaching an intersection.


Waymo does not drive like that. It's smooth, actually. Watch this guy's channel for daily videos of how the driver behaves in practice. https://www.youtube.com/watch?v=eq-QndXs-nI


Worth noting these are also animations of exceptional circumstances.

There's that little qualifier that "in the vast majority of events" it drove smoothly while avoiding the simulated fatal crash.

Now ideally, they'll be able to iterate to go from "vast majority" to "all" ... but if my future self-driving car drives smoothly most of the time, but I occasionally experience it braking suddenly or swerving without warning, and it is avoiding a fatal crash by doing so, I'm not going to complain.

Of course, if it periodically slams on the brakes because an empty plastic bag blows into the road or some other false positive for a potentially fatal collision ... I'm a little curious what the acceptable false positive rate is on potentially fatal car crashes. Would I (or others) accept 10 random brakings per necessary braking? More, less?


Maybe not so easy. In my experience, in any even moderately heavy traffic, if you try to stay that far behind anyone, then other cars will constantly move into that empty space.


That's fine.


"We suck at driving, you should compensate."

Nope. It's normal to expect technology to outdrive humans, not struggle to reach us.


Why limit simulations to fatal crashes instead of all reported crashes?


How could they recreate every crash that's happened? It seems the police only keep detailed records of the fatal crashes that happened there.


This gap in data is a bugbear for cycling/walking advocates such as myself who feel that non-vehicles do an unseen and unfair portion of the work of preventing crashes which might otherwise involve them. That they basically paper over a myriad of bad driver behaviours and ghastly infrastructure designs by always being on high alert and generally getting really good at "jumping out of the way." You can partly tell how effective this system is because when pedestrians are killed, commentators express regret that the victim didn't just jump out of the way like a normal pedestrian, or imply that it was their fault for other reasons such as wearing the wrong clothing, looking at their device, or choosing an inappropriate time of day to go for a walk. All of this leads to an attitude among drivers and politicians that current solutions are fine, because the only numbers they have are literally deaths.

Anyway, one would think with SDVs that there'd be an explosion in data for this kind of thing, since the cars will witness and participate in many more close calls than could ever be reported or tracked by anyone, much less the police. I wonder how much Google is able to process their car data for these kinds of things? Potentially it could be a huge aid, not even just for law enforcement, but also for urban planners looking for the close call hotspots.


"The simulations were not conducted independently of the company, nor were they reviewed by any third-party for verification prior to the company making them public." from https://www.theverge.com/2021/3/8/22315361/waymo-autonomous-...


Why is that relevant to my point? How can they do all crashes without knowing the specifics of each crash?

"This research was made possible because of the thorough crash reports the Chandler Police Department and Arizona Department of Transportation (ADOT) prepare following fatal crashes, and we’re grateful to ADOT, Chandler Police Department, and Arizona Department of Public Safety for making these reports available to the public upon request."


But what about choosing between hitting one doctor versus a group of school kids? /s


This is why we need a social credit-rating type system, then it just totals the points.

Kids are faster, less likely to be hit, and generally have worse credit so it all works out.


Plus making more kids is easier than making more doctors. It's a no-brainer!


Another thing that would avoid fatal crashes is reducing the number of cars on the roads, making them smaller and making them slower. That's a lot cheaper than making them more complex.


Not leaving your house would solve a lot of things. Why don’t people do that? Anyway, slowing down traffic is not the answer. Driving should either be more automated or more engaging so drivers stop using their phones. 2 hours of driving 20 mph in a straight line will be boring enough to create more accidents.


That comparison is obviously nonsense. I don't drive and I can live my life just fine, without staying confined indoors.

If people had to drive 2 hours at 30 km/h, they'd be more inclined to quit driving. It's a stupid addiction which is way more lethal than smoking. Better public transport is a much better idea than messing around trying to improve driving. The situation where the majority of the population has (or "needs") a car is ridiculous.


If cars were limited to a certain speed it would for certain cut down on crashes. It would also cut down on fatalities. When the news article says "speed was not a factor", they are only referring to the fact that the driver was going at or under the posted speed. That does not mean speed was not a factor. You think it would not work because driving is boring? Perhaps if you can't handle driving responsibly, you should not do it?


Europe has forever built roads that make driving engaging. Europe has far fewer issues with phones while driving, etc.

Germany also solves that problem by having no speed limit when it gets boring. Making driving engaging is the best way to create mindful drivers.


What do you think makes for an engaging road? One thing that comes to mind is road width. A narrower road forces a driver to pay more attention.


Regular corners. Road width isn't even that important, really. Smaller roads make some people drive very inconsistently with their speed, which annoys people and encourages overtaking at possibly risky spots.


See also: fewer accidents at rotaries, because people know they need to be on their A-game.


What do you call it when you subconsciously suppress any tests or scenarios you know will break your code? Like you notice something's kinda racy, but it involves other components you don't control, so you don't do the parallel tests you probably should. That kinda thing.

Because that is exactly the bad feeling I get with Waymo never actually testing anywhere that doesn't have mm-precise maps and the all-important California weather. There seems to be this implicit understanding that when you drop the thing in light snow it will just careen off the side, and nobody is particularly incentivized to find that, yep, that is exactly what happens.

(Even in these simulated scenarios here, you got what looks like a barren wasteland and exactly two cars)


They have been testing in snow in Michigan, rain in Washington and a bunch of other places. You just don’t see it because they’re not operating a commercial service there yet.

Arizona is where bulk of their operations is currently and where they offer a commercial service, so it makes sense for them to focus on that area in their safety reports.


Well, devil's advocate here, but teaching a car to drive in good weather is hard enough, and has lots of applications! It doesn't seem totally unreasonable to focus on that first?


Yes, if you're basing your entire model on visible road markings, which breaks under a light coating of crystallized water.


Oh. Personally I learned how to drive in good weather first.


If the road markings are visible, do you use them? Or do you ignore them and never use them since they may sometimes be covered?

The former?

So makes sense for an automatic system to do the same?

> a light coating of crystallized water

Do you mean snow?


Not if the roads are mapped. You have prior knowledge of exactly where the markings are, even if they're not entirely visible due to rain.


Waymo is opening testing in Ohio right now, so it’s going to deal with terrible weather soon enough. https://www.limaohio.com/news/438416/waymo-to-open-ohio-test...


Of course they're incentivized to test and implement for other areas. Their system will be worth astronomically more if it works everywhere and if it requires less precise mapping. Why would you assume otherwise? Also, small thing but it's AZ weather, not CA.


Even if you require precise mapping, that's not at all a deal breaker. Google went and mapped rural Kenya to make Google Maps.

Mapping wealthy cities with >1 million population for a product with vastly more revenue potential is a trivial hurdle.


So long as they are intentionally constraining themselves to well mapped places with good weather, what's the problem? Summer tires don't work in the snow either, but that's fine because they aren't expected to.


> There seems to be this implicit understanding that when you drop the thing in light snow it will just careen off the side and nobody is particularly incentivized to find that yep, that is exactly what happens.

Most people from Southern states will careen off the side when dropped into light snow--I'm exaggerating, but not by a lot.

If the Waymo car does nothing else short of "maintain appropriate following distance" it is already ahead of 90% of all drivers in all weather.



