Hacker News
Police Say Uber Is Likely Not at Fault for Self-Driving Car Fatality in Arizona (fortune.com)
315 points by tlrobinson on March 20, 2018 | 436 comments



My takeaway from the comments in this thread: self-driving Ubers don't drive defensively, almost surely can't "read" pedestrian behavior, and even with lidar may never have seen Elaine Herzberg. Oh, and they speed.

This sounds like straight out of the "hell" scenario for driverless cars: "we cannot accept responsibility for what happens to pedestrians and cyclists foolish enough to stray out of their designated safe zone".

I'm referencing the "heaven or hell" framing from Robin Chase, the Zipcar founder - https://www.citylab.com/transportation/2014/04/will-world-dr...

For a recent, thoughtful talk about the damage self-driving cars might do to the city if we are not careful, see this Congress for the New Urbanism talk about "Autonomous Vehicles & The Good City" at https://www.youtube.com/watch?v=utnPEbDNbrE

It's also worth mentioning that there's a bill (AV START, or S. 1885) in the current Senate that would actually preempt local and state regulation of autonomous car safety, but it has been put on hold.


>Oh, and they speed.

3mph variance is perfectly normal even on modern cruise control. And even on most modern cars, what you see on the speedometer is often off by ~1 mph at low speeds, and even more at higher speeds. (This is the reason you will often see people talk about 'GPS verified' speeds when discussing the top speed they have achieved in a car.) Federal law in the US only requires that speedometers be accurate within 5% when using a standard tire size on normal pavement with the recommended tire air pressure. If you have over- or under-inflated tires, or have upgraded your car to larger tires for whatever reason, that 5% figure can be very different. Even if the Uber was perfectly maintained, that could be 1.75mph of variance on the speedometer @ 35mph.

Keeping to exactly the speed limit also just isn't how anyone drives. It wouldn't be, even if speedometers were perfectly accurate. It's impractical. It's hugely fuel inefficient. Everyday driving, be it by person or cruise control, involves accelerating and then letting off the gas. This means if you are targeting a speed limit of 35, you're swinging a few mph in either direction. Add in changing road conditions - every highway I've ever driven on has random patches of road made of different material, or with different levels of wear after potholes have been repaired - changes in road incline, weather conditions (winds), the behavior of cars in front of you - a vehicle changing lanes results in different air resistance, especially a larger vehicle - and plenty of other variables, and the idea that any vehicle can be kept at exactly the speed limit becomes pretty unreasonable.

Speedometers aren't perfect. Cruise control or even excellent human driving will never keep you at exactly the speed limit. While saying "38 MPH in a 35 MPH zone is speeding" is technically accurate, it's not particularly useful.
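The tolerance arithmetic is easy to check (the 5% figure is the parent's claim; the numbers below just work through it):

```python
# Working through the claimed 5% speedometer tolerance at an indicated 35 mph.
indicated = 35.0   # what the speedometer shows, in mph
tolerance = 0.05   # claimed accuracy requirement (+/- 5%)

low = indicated * (1 - tolerance)    # 33.25 mph
high = indicated * (1 + tolerance)   # 36.75 mph
print(f"true speed could be anywhere from {low:.2f} to {high:.2f} mph")
print(f"that's {indicated * tolerance:.2f} mph of variance either way")  # 1.75
```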


3 mph is almost 5km/h. If their technology is unable to stay within 3mph of a certain speed (which is laughable), they should drive 3mph slower than the speed limit.


In Toronto, if a sign says 60 (km/h), the driving instructor will tell you that you're supposed to go at about 65. In Sao Paulo, a 50 sign often means the radar is calibrated to ticket you at 50.1. We can't really make any judgment about being off a few miles unless we know what's the prevailing interpretation of speed signs in Tempe.

Besides, some commenters have already pointed out that Google Maps imagery of the road in question shows the speed limit is actually 45 northbound (the direction the car was going), whereas the 35 number is the speed limit southbound.


I would be very surprised if your cruise control kept within 3mph of the set speed at all times in all conditions. My car certainly doesn't, and it's a 2017 model with a sophisticated adaptive cruise control.

I'll ignore ACC for a moment and talk about the modern implementation of regular cruise control. You'll have your target speed stored in a control unit, which cares about 3 to 4 things: your set speed, your current speed, the flow of fuel to the engine, and in newer models, access to the brakes. So, how do we determine the current speed? The drive shaft will have a magnet mounted on it, and each time it spins around, a sensing coil sends a signal to the control unit. Based on the frequency of these signals, it determines what your speed is and compares it against your set speed. If your speed is low, it will send a signal to a servomotor connected to the accelerator, sending more fuel into the engine and hopefully increasing your speed. If you are going too fast, it will cut fuel, and on some models, apply the brakes if you are more significantly over the set speed.

However, this is a pretty dumb system. It only knows how fast you want to go, how fast you're going, and whatever formula it follows for increasing the flow of fuel to the engine. It doesn't know about the millions of other factors that affect how fuel efficient that acceleration is going to be. It is constantly readjusting based on new speed readings. To my knowledge, all manufacturers also tune their adjustments for a "smooth" driving experience, rather than for most quickly achieving the set speed. When you take the limited data available to a cruise control system and combine it with built-in smoothing to keep the ride comfortable, you end up in a situation where the cruise control makes frequent adjustments that land on one side or the other of your target speed. The lack of perfect information means they purposefully do not attempt to perfectly match the target speed - attempting to do so would require so many adjustments that it would be pretty terrible for fuel economy.

Now, hopefully, the Uber vehicle has even more data available than my ACC-equipped car. But even with my ACC-equipped car, almost all of the work has gone into better determining what the set speed should be, and significantly less into making sure it hits it exactly. I expect it's probably the same on the Uber vehicle. Because quite frankly, it's just not that important. To be quite blunt, the difference between being hit at 35mph and 38mph is probably just how far your corpse flies.

As for saying they should drive 3mph slower than the speed limit: then they might end up going 6mph under the speed limit instead of 3mph under.

I suppose my fundamental argument, in the previous post and now this one, is: What outcome is expected from them being able to stay perfectly at 35mph instead of having enough variance to hit 38mph? What benefits are there? A few mph makes the most difference, from an injury standpoint, at very low speeds, and those are the speeds at which variance is almost always significantly lower - there's no evidence here that Uber cars are going 8mph when they're meant to be going 5mph, for example, and that would not be the type of variance we see in cruise control systems in general - and very little difference to the outcome at high speeds, where the greatest variance is likely to appear.


That's a lot of text for explaining why a car might go 3mph too fast. There is a big difference between a cruise control and an autonomous car. In Germany you can get a ticket for going more than 3km/h (~1.9 mph) too fast. It's not acceptable for a self-driving car to go so fast that you can get a ticket. If that means going up to 6mph too slow, I guess that's the way it is. Also, going 5km/h faster means you will need an extra 5m of braking distance or so in case of emergency braking. With electric cars, regulating the speed smoothly will be even easier.


It's an explanation of how cruise control works.

It sounds like cruise control is dangerous to use in Germany, based on your description. My 2012 VW GLI certainly did not stay within 3km/h of the set speed when utilizing cruise control.

>If that means going up to 6mph too slow, I guess that's the way it is.

If you want to be in a situation that is more dangerous than going 3mph over the limit, sure. If everyone were going 6mph under the limit this would be fine, but staying closer to the speed of traffic around you is less likely to cause accidents than going significantly slower. (And surely, if 3mph is significant, then 6mph is even more so.)

>With electric cars, regulating the speed smoothly will be even easier.

I agree here. That just means solving the speed determination issue, which is not simple. GPS is the best method we have now, but is very problematic if there is no signal, or a degraded signal.


This would seem like a major problem then. In Silicon Valley it’s very common for freeway traffic to move at speeds that are 10 or 20 mph above posted limits. You either stay under the limit, creating a dangerous situation as a result of the speed differential, or violate the limit and match the speed of surrounding traffic. How does the engineer not become liable in this context, like the VW engineers in the emissions scandal?


> It sounds like cruise control is dangerous to use in Germany, based on your description. My 2012 VW GLI certainly did not stay within 3km/h of the set speed when utilizing cruise control.

You don't need to stay within 3km/h of the speed limit to follow the law. You need to stay below 3km/h over the speed limit. So if your cruise control has a variance of 5km/h, you should set your cruise control at 2km/h below the speed limit.

> If you want to be in a situation that is more dangerous than going 3mph over the limit, sure. If everyone was going 6mph under the limit this would be fine, but staying closer to the speed of traffic around you is less likely to cause accidents than going significantly slower.

Unless your cruise control has a significantly larger variance than all of the other vehicles, you should be fine because everyone else should be setting their cruise control to roughly the same thing as you.


Sounds like maybe the culture is different in Germany for this. In the states 5 mph over the speed limit is normal. While you could get a ticket for it, you won't unless you're in a small town that's just trying to pad its budget. In some parts of the states even 10 mph over is regularly tolerated on highways.


I read your comment and thought that was a ridiculously tight margin of error, even for the Germans. Then I looked up the fine.

€15 for an in-town violation, €10 for out-of-town, no points for less than a 20 km/h variation? That's more of a secret tax than an actual deterrent.

Electric cars can't regulate the speed more smoothly than gas cars while on cruise control. It's about the control algorithms, not the source of power, and the gas systems could be on tighter control ranges if it weren't so inefficient and uncomfortable for the passengers. Adaptive cruise control runs on a PID loop control scheme and overshoot is inherently part of the game.

While on cruise control a car will go slightly faster than the speed setpoint for a few seconds, then slower, then faster, until it settles on exactly the correct speed. Then you go down a hill and it takes a little while to slow down, or you go up a hill and it takes a little while to speed up. Electric cars can use regenerative braking when the car is going down a hill but they're still constrained by the nature of control loops and aggressive braking will just lead to wonky acceleration-braking cycles while hunting for the setpoint.
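To make the overshoot-and-settle behavior concrete, here is a toy PID cruise-control simulation; the vehicle model, gains, and limits are all invented for illustration and not taken from any real car:

```python
# Toy PID speed controller on a crude 1-D car model (all constants invented).
def simulate(setpoint=15.6, dt=0.1, steps=600):
    """Return the speed history (m/s) of a car chasing `setpoint` from 10 m/s."""
    v, integral = 10.0, 0.0
    prev_err = setpoint - v
    kp, ki, kd = 300.0, 60.0, 50.0    # hand-tuned gains, deliberately underdamped
    mass, drag = 1500.0, 0.8          # kg, quadratic drag coefficient
    history = []
    for _ in range(steps):
        err = setpoint - v
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        force = kp * err + ki * integral + kd * deriv
        force = max(-4000.0, min(4000.0, force))   # throttle/brake limits
        v += (force - drag * v * v) / mass * dt
        history.append(v)
    return history

hist = simulate()
# The car briefly exceeds the setpoint, then hunts and settles near it.
print(f"peak {max(hist):.2f} m/s, final {hist[-1]:.2f} m/s (setpoint 15.6)")
```

The integral term winds up while the car is still below the setpoint, which is exactly what produces the overshoot described above; a tighter tune reduces it, but never to zero without sacrificing ride comfort.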

That's also how humans drive. We just don't do it as well. If you're worried about tickets set your cruise for 3 km/h under the posted speed.

The setpoint is the setpoint. Key element of controls engineering. Secretly subtracting from the setpoint behind the scenes to compensate for your local driving laws would be the car lying to you and just leads to more trouble than it's worth.

Every day people tell me that they want a setpoint to be the temperature that the room never exceeds or the temperature it never falls under or five degrees above the highest temperature the boiler hits. It's like setting your clock ahead by five minutes to avoid being late. You can do it if you want to but it's ridiculous functionality to build into the timepiece.

Further, outside of detection errors like the one in this article, the reaction time of a self-driving car is orders of magnitude faster than a human's. You don't need 5m more braking distance per 5km/h, because within milliseconds of the computer noticing the issue the car will hit the brakes. Human reaction time is more like 0.7s to 3s. You still need more distance to brake as you go faster, but the computer doesn't need as much as our human laws already give us.

In a case like this where the victim enters the path of the car closer than the car's brakes are capable of stopping, even given instant detection, a human would have just hit the victim at a much faster speed because of the reaction time. That's going to happen sometimes. It's just how it works.
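A back-of-the-envelope comparison (the reaction times and friction coefficient below are illustrative guesses, not measured values):

```python
# Stopping distance = reaction distance + braking distance v^2/(2*mu*g).
MU, G = 0.8, 9.81          # friction coefficient (dry pavement, a guess), gravity
MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph, reaction_s):
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v * v / (2 * MU * G)

for label, t_react in [("computer, ~50 ms", 0.05), ("human, ~1.5 s", 1.5)]:
    print(f"{label}: {stopping_distance_m(38, t_react):.1f} m to stop from 38 mph")
```

Even with instant detection the car still needs roughly 18 m of physical braking distance from 38 mph, which is why a pedestrian appearing inside that envelope gets hit regardless of who or what is driving.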

Basically everything you're saying is based on an outdated notion of how cars work and in particular your very German desire to follow the rules exactly. The rules are going to change, dude, and in the meantime you're free to set your cruise at 47 km/h to avoid accidentally triggering a photo-radar trap if you like.


> 3mph variance is perfectly normal even on modern cruise control.

That'd be a variance between true ground speed and the speed as estimated by the car. Chances are that the 38mph number came from the car itself, so there should be no "variance", let alone 3mph over the limit. Modern cruise controls are perfectly capable of maintaining a set speed, even on a considerable downhill.


> Chances are that the 38mph number came from the car itself, so there should be no "variance", leave alone 3mph over the limit.

My understanding, which may be incorrect, is that the 38mph number came from GPS recording, and that "cruise control" in Uber's self-driving vehicles is not based on GPS speed, due to the impact a poor or lost GPS signal would have on it.

>Modern cruise controls are perfectly capable of maintaining set speed, even at a considerable downhill.

I'm not particularly handy with repairing cars, but I have a pretty decent understanding of how cruise control systems work due to nerding out on it when I bought a car with adaptive cruise control. Everything I have read on the design of cruise control says this simply isn't the case, because it isn't even a design goal. Much more important in the design is keeping close to it while maximizing fuel efficiency and driver comfort, and even in modern cruise control not all systems have access to the brakes, which would be required for keeping it at the set speed going considerably downhill. Anecdotally, all of the cars I have owned fit this and do not keep my vehicle at exactly the set speed, including the most recent one with ACC.


> Keeping to exactly the speed limit also just isn't how anyone drives.

People should not "keep to the speed limit". In England at least the speed limit is the absolute maximum speed at which you drive. You should be driving slower than the speed limit at all times, with no exceptions. Anything else is almost always incompetent driving.

I find it hard to understand why drivers think the laws don't apply to them. Sure, it's annoying when cyclists ignore road laws, but cyclists are far less likely to kill other people.


> In England at least the speed limit is the absolute maximum speed at which you drive.

In theory this may be true but it's absolutely not in practice. It's not uncommon to see people at least 20mph over the limit on motorways. On the M4 that is practically the norm.


Try sticking to the speed limit through long 40 or 50mph roadwork sections. I've done it for a laugh before. You end up getting overtaken non-stop including by trucks which, to me, feels far more dangerous than just fitting in with the prevailing speed.


I think there's something off about your arguments: human drivers speed, but they don't do it because of properties of the speedometer or the cruise control, or air resistance, or potholes - they do it because they don't care much about the speed limit.

It's completely possible to stay below the speed limit in a car, a lot of people just don't bother. Cruise control won't apply the brakes (at least in older cars) if you encounter a significant downhill gradient - that's up to the driver, and while speedometers are often out and GPS is more accurate, speedos universally over-read in my experience. If you're doing GPS 38mph your speedo is probably reading 40, which is not really within an excusable error margin of 35mph IMHO.

I would expect a fully autonomous vehicle to be able to accurately obey the speed limit and I suspect the reason it wasn't/doesn't is because that behaviour has been programmed in or developed. You could make arguments that doing what the other cars around you are doing is safer than being an outlier, but that's not the same argument you're making. I think the 38mph figure is significant.


Current systems cannot know the exact speed of a car due to the mechanics involved:

A speedometer's reading is measured at the gearbox output, but there are a myriad of little things after that - including variance in the final drive, tire size, and tire pressure - all of which can put the "calculated" speed and the real speed within a few % of each other. All of these components are affected by wear too.

At the moment you need a GPS to conclusively measure a car's speed with >99% accuracy but GPS signal isn't available/sufficient everywhere. One could argue that we should create some sort of highly accurate local system that measures speed and we certainly can but we'd be trying to solve a 0~10% speed measurement problem: we'd get no real value from it.

In other words: a car measures how fast a specific cog is rotating, adds a few variables and estimates how fast the car is going based on that.

edit: and all of this assumes the car is going straight, calculating speed when the car is curving is more complex.
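The estimate described above amounts to multiplying a measured rotation rate by an assumed tire circumference; a sketch with invented numbers shows how a slightly smaller real circumference (wear, low pressure) turns into an over-read:

```python
import math

# The car multiplies revs/sec by the circumference it *assumes* the tire has.
# All numbers below are invented for illustration.
MPS_TO_MPH = 2.23694

assumed_diameter_m = 0.668   # nominal rolling diameter the car was calibrated for
real_diameter_m = 0.655      # worn / underinflated tire: ~2% smaller

revs_per_sec = 7.5           # rotation rate measured at the gearbox output
indicated = revs_per_sec * math.pi * assumed_diameter_m * MPS_TO_MPH
actual = revs_per_sec * math.pi * real_diameter_m * MPS_TO_MPH

print(f"indicated {indicated:.1f} mph, actual {actual:.1f} mph "
      f"({(indicated / actual - 1) * 100:.1f}% over-read)")
```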


>they do it because they don't care much about the speed limit.

Oh, certainly, and I apologize if my argument came across as "It's not possible for people to not speed" or "People only speed because of the properties of speedometers or cruise control".

My point is that speed within a few mph is already inaccurate, and that 3mph is pretty much within the current margin of error.

>speedos universally over-read in my experience

I don't know that there's a common lean one way or the other in actual variance due to speedometer construction, but over-reading is the most common occurrence because low tire pressure causes it, and a huge number of people drive with low tire pressure. Overinflated tires, or moving to larger tires in general, will result in under-reads.

>I would expect a fully autonomous vehicle to be able to accurately obey the speed limit and I suspect the reason it wasn't/doesn't is because that behaviour has been programmed in or developed.

I don't agree. lagadu covers this plenty well in his comment, however.


Speed isn't the issue, reasonable speed is the issue. As a human driver, if there are a lot of pedestrians - or I notice pedestrians stumbling/fumbling or seemingly not too conscious of where they're walking, I slow down.


I think you might be confusing "defensively" with perfectly. There are situations even defensive driving cannot prevent. For example, any car going 35 mph will need a few feet to brake if something jumps directly in front of it. Defensive driving just accounts for one's surroundings and likely outcomes. Seeing as the cars have been on the road in Arizona for a while, and this is the first we are hearing about a pedestrian collision, it seems the algorithm is pretty safe.


An important observation is that the probability of death drops very rapidly with drops in speed. A rough calculation shows that at 38 mph, with a near-instant reaction (as a robot should have), ~1 second of applied braking force would take you from a ~70% chance of death to ~5%.

A very rough calculation: v(t) = -μgt + v_0. To reach 32 km/h (20 mph), t ≈ 0.82/μ seconds; guessing a fair μ = 0.8 and using the chart here https://ec.europa.eu/transport/road_safety/specialist/knowle...
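Plugging in the numbers (same assumptions as the parent: constant full braking at deceleration μg, with μ = 0.8):

```python
# Time under constant deceleration mu*g to slow from 38 mph to 20 mph.
G = 9.81
mu = 0.8
v0 = 38 * 0.44704   # 38 mph in m/s
v1 = 20 * 0.44704   # 20 mph in m/s

t = (v0 - v1) / (mu * G)
print(f"t = {t:.2f} s")   # ~1.03 s, i.e. 0.82/mu with mu = 0.8
```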


For people who practice street math: kinetic energy goes as (velocity)^2. So halving the speed reduces the energy to 1/4, a third of the speed is 1/9th the energy, etc.

As you go faster, your energy goes up quadratically. This is why there are bitter fights about increasing the speed limit on roads; they become disproportionately less safe (big caveats apply, naturally).


So if you are going 10% faster (38 vs 35) you have 121% the kinetic energy.


My math says it's ~1.18 times the KE.

But yes, about 1/5th more KE with just a 3 mph difference.
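For the record, the ratio works out to:

```python
# KE scales with v^2, so 38 mph vs 35 mph:
ratio = (38 / 35) ** 2
print(f"{ratio:.3f}")   # 1.179: about 18% more kinetic energy
```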


Except if the speed limit is 35, you need to drive 35. You can't drive 25 just to be more "defensive", that will get you a ticket.


Typically, no. There are some municipalities and stretches of road with a minimum speed limit, but in the US those are very rare. Granted, a law enforcement officer has wide discretion to determine whether you are driving 'safely', and a very low speed may become unsafe. But they are unlikely to ticket you and are more likely to try to help you out. Driving under the speed limit may be annoying, but it is nearly universally not illegal.


It's actually more dangerous as it forces other cars to change lanes to get around you.


WTF? Spoken like an inveterate speeder... it's at best equally dangerous; the effect you are speaking of happens for both higher and lower speeds. At low speeds, other cars pass you. At high speeds, you pass other cars. Either way, every pass is an opportunity for failure, and the probability is higher the higher the relative speed. However, the road, barriers, trees, etc., are always going 0 mph, so not just relative but also absolute speed controls crash severity. Twice the speed means 4 times the kinetic energy... speed kills.



What's your point exactly? The Quora poster's argument is just a rehash of the OP, and it can be restated symmetrically for overspeeding as for underspeeding, so it adds nothing here.

It also fails to address my point about kinetic energy, which is proportional to the square of speed!


> Typically, no.

Typically it will get you pulled over. Obstructing the flow of traffic is a thing.


>Seeing as the cars have been on the road in Arizona for a while, and this is the first we are hearing about a pedestrian collision, it seems the algorithm is pretty safe.

One fatality in 4 million miles is safe? Humans are 20 times better than that.


>> One fatality in 4 million miles is safe? Humans are 20 times better than that.

And this is considering the fatality rate for all human drivers, including those who are drunk, playing on their smartphones, sleepy, speeding, street racing, etc. I would not be surprised if those cases account for a very sizable majority of fatal car accidents. Now imagine you could reduce those numbers by relatively simple measures that don't involve AI. Let the car enforce speed limits, mandate that smartphone companies provide a 'car mode' that cannot be shut off, mandate attention-tracking capabilities for cars, blind-spot indicators, automatic emergency braking, etc. Maybe even mandate an alcohol lock, or at the very least a black box that records driving behavior so it can be analyzed in case of a crash, with severe penalties after the fact if the black box shows signs of drunk driving, etc.

The list of things you can do to improve road safety without requiring AI cars is endless, yet billions are invested in trying to fully automate driving. This just shows that self-driving cars are not about safety, but purely about business opportunities.


And conversely, those 4 million miles will (presumably) be highly biased to safe/above-standard driving conditions.

As you've basically said, the human stats include all the very worst drivers in the full range of driving conditions, whereas the auto-cars not only have the very best driving conditions, but likely also have the statistical benefit of a likely number of cases where the human actually stepped in and upped the auto-car score even further (since we can't account for the auto-car deaths/fatalities that humans have successfully stepped in and stopped).


Yes, so far the statistics we have about the safety of AI cars are mostly meaningless to say anything useful about how they would scale if applied universally.

Another thing that bothers me is how often the argument 'human drivers are terrible, far worse than computers for situations A, B, C, etc' is used to suggest that AI systems will improve road safety. I will be the first to admit that many people are terrible drivers, they make mistakes, don't pay enough attention, don't obey traffic laws, have bad reaction times, etc, no argument there. But so far I have never seen any statistics about how many of the accidents caused by these kinds of human failures actually lead to fatalities. Taking the human out of the equation could in theory improve all kinds of car accidents, but how many would be just fender-benders as opposed to road fatalities?

It's not like it is super-easy to accidentally cause a fatal road accident, and everyone knows one or more people who were involved in one. Who can prove that AI cars will not decrease the total number of accidents of any kind, but increase the kinds of accidents with fatalities, like this one with the Uber car? Maybe human drivers, with all their faults, would have avoided this particular accident by driving more defensively or using social cues to be more prepared for this woman crossing the street?

I'm not trying to make an argument about the viability of AI cars that are safer than humans here by the way. I personally don't believe AI cars will ever be safer overall than humans in all possible situations that people drive in, but that's a different discussion. What bugs me is that suddenly it seems as if the whole world is willing to bend over backwards to interpret statistics about the presumed safety of AI cars in the most favorable way possible, and at the same time ignore all these relatively simple things that could actually improve safety right now, at much lower costs than developing fully autonomous cars.


Yeah, I have always felt that a lot of these easy reasons for crashes (texting, drinking alcohol, speeding) are, while statistically linked with crashes of course, also emphasized due to psychology -- we all want to feel like there's something straightforward we can do to avoid serious wrecks. If it's moralistic, even better, that means morally good people don't wreck and wrecks happen when people do morally bad things.

There's not enough emphasis on the strategy and skill of driving, and that's for the same reason, I think. We've tried so hard to make driving as basic as walking, something anyone should easily be able to do, and that doesn't fit well with something that takes a long time to master, that you can always learn a little more about.

So instead there's a narrative that all a driver has to do is obey some cut and dry rules, stop at all stop signs, never drink and drive, always do exactly 5mph over the speed limit (or is it 10? or is it 4? 7? 0?), and if they do all that they can zone out and not worry about anything but what's right in front of them in the windshield.

No immediate source, but I'm pretty sure I read somewhere that the safety stats between the best and worst drivers differ by one or two orders of magnitude. And the above is why. The best drivers are playing chess, the worst drivers are playing checkers. The worst drivers look at the same playing board but ignore almost all the information and just stay zombie-like in their lane till they need to brake to avoid something or their GPS says it's time to turn.

I'm really curious how self-driving software approaches all this. I have a sad suspicion they're programmed from a naive rule-following non-defensive point of view but would love to be proven wrong.


That's also an interesting perspective, it's not just about accidents caused by irresponsible behavior, but also about a difference in how people mentally engage the task of driving.

To me, these kinds of observations only solidify the idea that AI that is 'better than the average driver' does not necessarily mean 'less road fatalities', and that to improve driving safety, it would be more effective to start by taking the most common factors in accidents out of the equation first.


Yeah, in a nutshell, there is a huuuuge gap between two kinds of human drivers.

1) people who, like, you know, just drive, man, and like, as long as everyone else does the right thing, then like, i guess things will uh, like, work out dude

2) defensive drivers.

Obviously, self driving cars need to be more like #2 if they plan to actually be as safe as an average human, and have a shot at actually beating humans on this... I suspect Waymo gets this and Uber, well, is uh, "moving fast and breaking things".


I wonder what the margin of error is on that one in 4 million rate since the sample size is one.

Also of note:

> Arizona has experienced a surge of pedestrian fatalities recently, with more than 10 in a single week of March in Phoenix alone. The state has the highest rate of pedestrian fatalities in the United States.


The sample size is not one. The sample size is 4 million. The 5% and 95% quantiles of the posterior are roughly one in 11 million and one in 800,000.
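One way to reproduce numbers in that ballpark (this is my guess at the method, not necessarily the parent's): treat fatalities as a Poisson process, put a flat prior on the rate, and observe one event in 4 million miles; the posterior on the rate is then Gamma(shape=2, scale=1/miles).

```python
import math

# Posterior on the fatality rate after observing 1 event in 4e6 miles,
# assuming a Poisson model with a flat prior: Gamma(shape=2, scale=1/miles).
def gamma2_cdf(x):
    # CDF of Gamma(shape=2, scale=1): 1 - e^-x * (1 + x)
    return 1 - math.exp(-x) * (1 + x)

def gamma2_ppf(p):
    # Invert the CDF by bisection on [0, 50].
    lo, hi = 0.0, 50.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if gamma2_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

miles = 4e6
rate_lo = gamma2_ppf(0.05) / miles
rate_hi = gamma2_ppf(0.95) / miles
print(f"5% quantile:  one fatality per {1 / rate_lo:,.0f} miles")   # ~11 million
print(f"95% quantile: one fatality per {1 / rate_hi:,.0f} miles")   # ~800,000
```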


> any car going 35 mph will need a few feet to brake if something jumps directly in front of it.

It seems like you're working pretty hard to justify this woman's death.

She was apparently walking her bicycle through an intersection. Did she grab her bike and jump directly in front of the car? Maybe, but let's not jump to assumptions.

> it seems the algorithm is pretty safe

It ran at least six red lights in San Francisco and would apparently swerve into the bike lane when making turns, now it's killed a pedestrian. It just doesn't seem that good. How many similar incidents have you heard about with Waymo? They've driven more than twice as many miles in city streets.


I agree but one nitpick: merging into the bike lane before a right turn is the correct maneuver.


"Rather than merging into bike lanes early to make right-hand turns, as per California state law, the Uber vehicle reportedly pulled across the bike lanes at the last second, risking collisions with oncoming cyclists"

https://www.theverge.com/2016/12/20/14020720/uber-self-drivi...


At least in some jurisdictions. I remember seeing articles about Uber getting in trouble for doing this in jurisdictions where it is not the correct behavior and for not doing it in jurisdictions where it is the correct behavior.


I think you might not be familiar with the concept of defensive driving.

Defensive driving means slowing down and paying more attention in areas of low visibility, especially if there are likely pedestrian crossings. It means increasing your following distance. It means increasing the odds that even if something unlikely happens you, the driver, will still be in a better position to react, or you will be less likely to cause harm. It's not about prevention, it's about risk management and harm reduction. This is definitely something that is likely to be difficult to teach to autonomous systems.


> This is definitely something that is likely to be difficult to teach to autonomous systems.

Why? What you listed seem like conditions that should be easy enough to recognize even with simple algorithms.


Yup. An algorithm should be in a much better position to recognize its confidence about what it sees and what can happen, and alter the driving speed to reduce the number of uncertain spots that fall within its stopping range.

That is, unless they use neural network magic black boxes to implement the self-driving part.
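The speed-for-uncertainty tradeoff in the first paragraph is just kinematics. A toy sketch (the deceleration and reaction-time numbers are made up for illustration, not real vehicle specs):

```python
import math

def max_safe_speed(clear_distance_m, decel=6.0, reaction_s=0.5):
    """Largest speed v (m/s) such that reaction distance plus braking
    distance fits inside the range the sensors are confident is clear:
        v * reaction_s + v**2 / (2 * decel) <= clear_distance_m
    Solved as a quadratic in v. decel ~6 m/s^2 is roughly hard braking
    on dry pavement; both defaults are illustrative assumptions."""
    t, a, d = reaction_s, decel, clear_distance_m
    return a * (-t + math.sqrt(t * t + 2.0 * d / a))

# If perception is only confident about the next 30 m, speed should drop:
v = max_safe_speed(30.0)
print(f"{v * 2.23694:.1f} mph")  # ≈ 36.3 mph
```

The point being: "how far ahead am I confident?" maps directly to a maximum speed, whether or not the perception stack is a black box.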


>That is, unless they use neural network magic black boxes to implement the self-driving part.

Is this actually done? I thought neural networks were more or less used for the vision system to detect parts of the scene (pedestrians, stop-signs, etc). This is the only area where to me it makes sense to take the blackbox tradeoff right now.


Neural nets are also likely to be used to model the behavior of other road users (i.e. 'how is this car/pedestrian/etc likely to move in the next few seconds'?). No-one seems to be seriously considering an end-to-end neural network (i.e. inputs are sensors, outputs are throttle and steering wheel), but neural nets are pervasive components of the full system.


Judging based on anecdotal feelings, you find that it "seems the algorithm is pretty safe"

But if you looked at the data (just numbers I've seen repeated a lot in this thread, heh), you'd find Uber's achieved 1 fatality after only around a couple million miles of driving. The average human goes a hundred million miles between fatalities. Not a lot of data, but so far Uber looks 50 times more dangerous than an average human driver. They have a lot of catching up to do.
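The back-of-envelope arithmetic behind that 50x (both mileage figures are the rough numbers repeated in this thread, not audited statistics):

```python
uber_fatalities = 1
uber_miles = 2_000_000                  # "a couple million miles"
human_miles_per_fatality = 100_000_000  # rough US average cited upthread

uber_miles_per_fatality = uber_miles / uber_fatalities
print(human_miles_per_fatality / uber_miles_per_fatality)  # → 50.0
```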


The google maps streetview sign says 45, so the car was not speeding. The article most likely had a typo...

edit: https://www.reddit.com/r/SelfDrivingCars/comments/85ozqr/exc...


"The police said that the vehicle was traveling 38 miles per hour in a 35 mile-per-hour zone, according to the Chronicle—though a Google Street View shot of the roadway taken last July shows a speed limit of 45 miles per hour along that stretch of road."

(https://arstechnica.com/cars/2018/03/police-chief-uber-self-...)


Do we know which way the car was going? The speed limit is 35 in that section going southbound.

https://goo.gl/maps/GNfKZzePD9t


The Uber was going northbound.


Google Street View says 45 MPH, but they might have changed it in the past seven months.

https://www.google.com/maps/@33.4350531,-111.941492,3a,75y,3...


Thanks for the YouTube video; it makes sense to me now and explains what I see as a cyclist. I have thought about 'investing' in an electric vehicle and participating in the brave new future of autonomous driving machines, but it seems quite badly thought out. I think I will stick with the bicycle: combined with rail it really is the fastest transport, there is no quicker way across town, and these autonomous cars are not going to change that. Public transport and bikes are the only way.


I don't see why autonomous vehicles are so big. Virtually all travel is one person, ideally it would be a tiny 4 wheel electric with the sole passenger highly reclined.

The hangover from internal combustion is real.


The same reason non-autonomous ones are so big. It's a status symbol, plus there's the arms race of "bigger truck wins in an accident"


> bigger truck wins in an accident

And that's why I go by tram. :)


Jokes aside, it seems foolish to create an entirely new class of vehicle that drives itself while completely ignoring the physics of big bits of metal.


Even the original DARPA Grand Challenge in 2005 was won by a system installed on a VW Touareg, an enormous two-tonne SUV.


They used a big SUV to get rough road capability, necessary for the Grand Challenge, and to have the space to rack all the equipment they needed. 13 years later, an Nvidia Jetson (a single board computer) has more computing power than their whole setup, and a medium drone can carry it.


What does autonomy have to do with it? Most regular cars would be better off as single-person vehicles; there are a few small-scale efforts in this direction (the Smart car, the G-Whiz electric car, I think BMW has an enclosed-canopy motorbike) but they largely haven't caught on.


BMW built the C1 from 2000-2002. They showed an electric version, the C1-E as a concept in 2009.


As autonomy is implemented and refined, and safety improves dramatically, new regulatory and economic models will emerge. This will spark design innovation. No doubt both single rider and shuttle-bus options will emerge, with much lighter builds as crashes decrease.


It's a test-vehicle, not the end-product.


Because A) crash safety and B) luxury are still found in the bigger packages.


> According to the Chronicle, the preliminary investigation found the Uber car was driving at 38 mph in a 35 mph zone and did not attempt to brake. Herzberg is said to have abruptly walked from a center median into a lane with traffic. Police believe she may have been homeless.

I know this isn't the most pertinent issue here, but I'm surprised the self-driving AI is allowed to speed, though with further research, this seems to be the case with Google/Waymo software too:

http://www.bbc.com/news/technology-28851996

Though in that BBC article (which is from 2014), the stated scenario is when the self-driving car is surrounded by traffic exceeding the speed limit. It doesn't sound like this was the case for the Uber vehicle.


Of course a self driving car should be able to speed. My understanding is that speed limits are set by measuring the speed people are driving on the road and then determining a speed at which roughly 15% of the drivers are exceeding that limit. There is an expectation that people will speed, but by setting the speed limit a little bit lower almost all traffic will fall under safe bounds. Driving 38 in a 35 zone is completely reasonable. I probably would have been going 45 - 50 if the road was empty. If self driving cars aren't allowed to speed, they're going to be a nightmare for passengers and other drivers. Obviously different people have different feelings about the issue of speeding, but personally I lose my mind when I get an Uber driver that's driving too slow. Try and think about what it would be like if a computer followed every traffic restriction to the letter of the law rather than driving like an actual human.

People will vote me down because the idea of a speeding self driving car sounds scary, but I personally can't wait until the software has advanced to the state that it can confidently drive 200 mph on the 5 from LA to the Bay Area. (Presumably they'd have to build a much bigger better 5, with special lanes dedicated to autonomous vehicles)


> My understanding is that speed limits are set by measuring the speed people are driving on the road and then determining a speed at which roughly 15% of the drivers are exceeding that limit.

That is not how all speed limits are set; it is just one possible technique: https://safety.fhwa.dot.gov/speedmgt/ref_mats/fhwasa12004/
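The engineering-study version of that technique looks roughly like this (the sample speeds are made up for illustration; real studies use free-flow spot-speed measurements and round to the nearest 5 mph):

```python
# hypothetical free-flow speeds (mph) measured on one road segment
speeds = sorted([31, 33, 34, 35, 35, 36, 36, 37, 37, 38,
                 38, 39, 39, 40, 40, 41, 42, 43, 44, 46])

# nearest-rank 85th percentile: the speed at or below which ~85% of
# drivers travel; the posted limit is typically this, rounded to 5 mph
idx = int(0.85 * (len(speeds) - 1))
p85 = speeds[idx]
limit = 5 * round(p85 / 5)
print(p85, limit)  # → 42 40
```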


Thanks for clarifying, and the link! Actually a really interesting read.


Talking about a future in which AVs are allowed to go as fast as physically possible seems irrelevant when we're talking about current policy and infrastructure, in which AVs have to live in a world with majority manually-operated vehicles, driven by people who do speed, but are also punished for going over the speed limit.


An autonomous vehicle should also be pulled over and ticketed for operating unsafely. At this point in time the driver is still liable for the actions of the car and is required to monitor the autonomous system and take over if necessary. However, like a human, it should have some discretion in interpreting the law. Should it go 120 in a 60 zone? Definitely not. But if it thinks it can do 38 in a 35? That sounds fine.


>> People will vote me down because the idea of a speeding self driving car sounds scary

Or because they don't think that speed limits are set so that everyone can keep under them, but so that keeping under them means you're driving safely.


It's one thing to state that it's common and simple to, say, overclock a CPU, even if the processor specs have a speed a few percent lower. Typical headroom, operator requirements, and a decent cooler suggest that everyone could probably bump clock speeds by a few percent without issue.

But you wouldn't expect a plane crash investigation to mention offhand that the flight computers were mildly overclocked.

A test drive of a new self-driving car should not speed, even if it's common or normal.


They should not be able to speed arbitrarily, but it seems sensible that they would match the prevailing traffic speed.

One car carefully observing a 30mph limit creates a hazard when everyone else is going 35-40.


Going at speed limit NEVER creates a hazard. It is always the drivers going over or under it. I am not getting a speeding ticket for peer pressure. If you want to speed then you can pass me at your own risk.


Going the speed limit OFTEN creates a hazard. Technically speeding is not "exceeding the speed limit" but "driving too fast for conditions." In a torrential downpour with slick roads and poor visibility, the legal speed limit may easily be too fast for conditions, and you can be ticketed, to pick one scenario that refutes your "NEVER." And driving the speed limit in the passing lane creates an obstacle to faster traffic trying to get by, a frustration and a hazard. There was a video of some idiot "pranksters" with your exact attitude who created a rolling roadblock on a freeway by coordinating five cars to drive abreast at the exact speed limit, proving ... something. They were ticketed for hazardous driving. cthalupa is correct, though more diplomatic than I. It's more hazardous to be out of step with the surrounding traffic flow than it is to "speed."


>Going at speed limit NEVER creates a hazard. It is always the drivers going over or under it.

Please be in the right lane if you are going slower than the flow of traffic. Regardless of whether or not you feel the people speeding are ultimately at fault, if everyone is speeding, it is more dangerous to have a car going a dissimilar speed than it is for you to be speeding.

Keeping right is a small mitigating factor.


Of course everyone else going at 35-40 are a hazard themselves in the first place.


Or everyone else creates a hazard when exceeding the speed limit.


What speed would you drive if there was no speed limit?


What speed would you drive? Would you take close corners at 120km/h? People (on average) drive at the speed their actually very well tuned survival instinct tells them to.


I disagree that the average driver has well-tuned instincts. Accidents are rare enough events that you can drive poorly for a long time and still not end up in a seriously bad situation, especially in areas where inclement conditions are also rare.

The first time you lose control of your car might be in icy conditions, and might be not just losing control, but spinning out of control in traffic. The line between in control and not in control can be very thin, and so hard to know how close you are to it.


I mean self-preservation instincts, not driving instincts. I agree totally with the latter!

By self-preservation instincts what I mean is that most people will try to be as safe as possible on the road, meaning they drive slower in low-visibility situations, when there are many pedestrians around, etc. You cannot account for unforeseen circumstances or lack of knowledge, but otherwise people will try very hard not to get into harm's way.


Without a speed limit, I'd take I-70 from KC to STL at about 250 km/h (~160 mph) for 95% of it. I'd make day trips just to get White Castle at white knuckle speeds.


I don't think that is your average American! At least I hope not ;)


I-70 is a mostly straight highway with a divider, it'd be pretty safe as long as it's not crowded. Autobahn speed limits are certainly doable in many of America's highways.


I don’t think that’s true at all (though I only have my own experience to go off of). I tend to use the speed limit as a point of reference, and then drive a little above it. Prior experience has told me that going the speed limit plus some percentage ‘feels’ safe. If there were no speed limit I’d actually have to pay more attention to the road and surrounding environment to determine what a safe speed is. Right now most people’s ‘survival instinct’ is basically speed limit + some delta in clear conditions. That delta may change from driver to driver, but in general everyone drives within 10-20% of the speed limit.

As a fun thought experiment, imagine what would happen if your residential speed limit were raised from 25mph to 70mph. People may not hit the 70mph limit, but I can almost guarantee that the average speed would increase significantly, perhaps even dangerously in some cases. I don’t have much faith in people’s ability to judge space and speed accurately, particularly when they aren’t used to doing so. And this includes myself.


Ah but that is a whole different issue, the problem is that for the last few decades residential areas have been designed more and more like high-speed areas, and that had a larger impact on driver speed than the actual posted sign. I cannot seem to find the original articles but here are a couple of related ones:

- https://www.citylab.com/design/2015/11/some-20-mph-streets-a...

- http://plannersweb.com/2013/09/wide-neighborhood-street-part...


A book on the topic quotes:

> "Note that the people drive the speed they feel comfortable with regardless of the posted speed limit if enforcement is not present"

- ref: https://books.google.co.jp/books?id=tF7uyw7feOMC&pg=SA13-PA1...

And a paper:

- "PEOPLE DRIVE THE SPEED THEY ARE COMFORTABLE WITH ON GRAVEL ROADS REGARDLESS OF SPEED LIMIT, K-STATE RESEARCHERS SAY"

- https://www.k-state.edu/media/newsreleases/april09/speedlimi...


This is well-studied. It turns out that the 85th percentile speed (the speed where 85% of drivers are below that speed, 15% above) is nearly insensitive to speed limit. Road design is the biggest single determinant of travel speed. http://www.lsp.org/pdf/troopc85thSpeed.pdf


Amazing insights:

> Won’t raising the speed limit cause people to drive faster and cause more crashes?

> The Federal Highway Administration studied nearly 200 roads in 22 states where speed limits were raised, lowered or left unchanged. Prior to the speed limit change, 55 percent of drivers exceeded the posted speed limits. After speed limits were raised or lowered as much as 20 mph, there was a slight change in speed, but generally less than 1 mph. There were no significant changes in crashes, although crashes tended to decrease where speed limits were increased to realistic levels. Also, there was little effect on speeds or crashes on intersecting or nearby roadways.


This is for federal highways; traffic in a city should not go faster than 25mph in most shared spaces. Motorways are a different subject, but there should only be overpasses or redesigned roads and traffic lights for such places.


I would like the speed limit on my interstates and highways to go up to 70 (they're all straight; 55 is just the only speed limit around here). Driving at 65-70 feels safe in normal conditions. Driving above 70 feels a bit too fast. Speed limits should be more variable depending upon car size and braking power though. I hate having to fear that I might be pulled over for going a bit above the speed limit when there are so many more threats than just speed. I don't believe speed limits should be anything more than a suggestion, with dangerous driving being the focus.


I find this is extremely dependent on car. I've had cars that felt like you were about to die at 55 mph, and I've had cars that you couldn't tell you were going fast until you were really moving. For laughs, as my wife and I were driving home one afternoon earlier this year on a completely empty stretch of I-5 south of Portland, I casually accelerated up to about 110 and left it there for maybe a minute, then eased back down to normal speed as we came up into traffic. She never even looked up, didn't notice. But then again, we were in my brand new Camaro SS and I'm not positive it even got over 2000 RPM for that exercise, so I guess that's not a surprise.

So I guess I'm saying I agree. Speed limits are pretty arbitrary, there are a lot of things drivers do every day that are far more dangerous than merely speeding. Closure rate, unsafe passes, tailgating, cutting people off, etc.

/don't get me started about how people drive around you if you're towing a travel trailer, what the heck is it with people?


On most roads around where I live, if you are driving exactly the speed limit, you are a slow moving obstacle that is probably more of a danger due to the extreme speed differential with the other drivers who are 10-20 MPH over the limit. I’d hope self driving cars were programmed to deal safely within the bounds of reality. If there is nobody else on the road, sure drive the speed limit. If cars are whizzing past you on all sides you’re a hazard even if you’re obeying the law.


Depends on the conditions. With moderate temperature and dry conditions every single corner on a highway can be taken at 120km/h on any modern car, the overwhelming majority of them can be taken far in excess of 200km/h depending on the car's characteristics.

Then again under bad conditions on a 90km/h roads my survival instinct like you mentioned hasn't allowed me to go past 20-30km/h sometimes.


Prevailing traffic plus 10-15 mph. Or 135-ish with good conditions and no other cars. My days of real speed runs are behind me, and 135 is fast enough to keep things interesting and make good time without getting too crazy.


I would say they should be able to drive up to 15% higher than the speed limit at their discretion. So in a 35 zone they could go up to 40. But that's not a question for me to answer. I just know that the answer isn't that they follow the letter of the law all the time.


In Europe there are global definitions for speed limits. With a few notable exceptions, there's no such thing as "no speed limit".


It may not be programmed to speed per se, but if you’ve ever had cruise control turned on you’ll know it doesn’t keep you exactly at 55mph. It might be 54 with the throttle applied to speed up, or it might be 57 which doesn’t trigger the brakes yet but is coasting to a slower speed.

Imagine the car in front of you cycling rapidly between the gas and brake pedal to keep the speed exactly at 55mph. One, that’s completely impossible with how cars are designed, and two, that’s really illogical. Cars have to follow the laws of physics too.
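A crude way to picture that deadband behavior (toy sketch, not how any real cruise controller is implemented; production systems use smooth PI control on the throttle, and the setpoint/band numbers here are illustrative):

```python
def cruise_command(speed_mph, setpoint=55.0, band=2.0):
    """Toy hysteresis controller: throttle below the band, coast inside
    it, brake (or just coast, on systems without brake authority) above
    it. The band is why the indicated speed drifts between roughly 54
    and 57 instead of pinning to exactly 55."""
    if speed_mph < setpoint - band:
        return "throttle"
    if speed_mph > setpoint + band:
        return "brake"
    return "coast"

print(cruise_command(52.0), cruise_command(55.0), cruise_command(58.0))
# → throttle coast brake
```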


There is no brake involved with cruise control. What you are describing is bad drivers who do not use cruise control. It's actually extremely pleasant and predictable to be following a driver who is using cruise control. Following a well-regulated self-driving car would be similar.


My car has cruise control with brake functionality. Not sure if I've ever seen it actually do it - I only really use cruise control on the motorway, where I expect wind resistance is a major factor - but apparently it will apply the brakes to maintain the set speed when necessary.

At town speeds I would expect it to use the brakes a lot more readily. At 30mph in 3rd (2000rpm) or 4th (1500rpm) there's not really all that much engine braking, and wind resistance won't slow it much either.

(Mine is a 2010 model, but I think this stuff was introduced in 2004-6, something like that... it's not exactly new technology.)


My car has a CVT (Continuously Variable Transmission). With cruise control enabled, going downhill, it will actually change the effective gear ratio and "downshift" automatically.

(Once or twice when going downhill without cruise control, just using the brake pedal, it "downshifted" automatically while I had the brake pedal pressed.

CVTs are computer controlled and better than manual/standard for gas mileage, but with the downside that you can't tow heavy loads.)


> but with the downside that you can't tow heavy loads.)

And the frustrating rubber-banding.


I've never heard of that phrase before, so I went to look it up. Apparently it has different meanings to different people when it comes to CVTs. I haven't experienced any of those different issues.

When it comes to accelerating and high RPM, I just don't mash the gas pedal down. By pressing the gas pedal on my car just enough, I can accelerate past other cars without high RPM just fine.

Gas pedal management is definitely different with CVTs and other types of transmissions if you want a good driving experience.


My car absolutely uses the brake when required to maintain the set cruise control speed; I've watched it happen. The car I owned before my current car didn't use the brake to keep to the set cruise control speed, though, so I get why you may think there is no brake involved. It largely seems to depend on the make and cost of the car you're driving.


Our Passat uses brakes for cruise control, our older Kia Ceed did not. I actually hate when it uses the brake. The non-brake method is much better imo. Feels like such a waste to heat the brakes on a steep incline instead of just rolling with it.


I'd have said age and which assistance systems there are. In many older cars there's just no path for the CC to control the brakes.


Yeah true. Seems like the more feature complete assistance systems are making their way down to even some of the more low-end brands these days. Level 1 autonomy is getting pretty old hat now I guess.


All SDVs that I know of operate some kind of longitudinal controller that has access to throttle and brakes at a minimum. That and steering wheel angle (or torque, depending) are the two essential control primitives. All the motion planning stuff happens on top of this.



There can be several ways to explain this but generally speaking you are right.

It shouldn’t go over the speed limit, and under no circumstances should it take the liberty drivers usually take of being within 10% above the limit.

I hope no one wrote it that way.


I don't think that's true at all. There was an article posted about the subject a while ago, but the thrust of it was that researchers think they need to do more to make cars behave like (fully attentive) human drivers rather than being weird anomalies that behave in surprising/irritating (to humans) ways if they're going to actually go on the road, and I think that's completely right.


"(fully attentive) human drivers" probably cause deadly accidents every day, and they certainly behave in surprising/irritating ways. Anyways, a human driver going the speed limit is no hazard, so I fail to see how having self-driving cars follow the limits could be a problem.


Autonomous cars need to share the road with human drivers, so they should not behave in a way that is surprising to humans, to the extent possible.


Why not? The range between 1-10 MPH over the speed limit is statistically the safest. (Yes, I know, good luck finding a citation for a study I read 20 years ago...) It's hard to argue against the notion that you're safest when you keep up with prevailing traffic. That means that self-driving cars shouldn't be programmed to go slower.

Speed limits are usually too low. Often they have nothing to do with engineering or safety considerations, but are primarily revenue-oriented. If the advent of autonomous vehicles forces regulators to listen to the engineers for a change, that will be an unalloyed Good Thing.


Man, sorry for the downvotes. This link -> http://www.lsp.org/pdf/troopc85thSpeed.pdf doesn't include any references, unfortunately, but the statements are that:

-- 55% of drivers regularly exceed the speed limit, meaning that the speed limit represents the 45th percentile speed

-- driving between the 50th and 90th percentile speeds is the safest

Taken together, driving at the speed limit is significantly unsafe.


Man, sorry for the downvotes.

You have to remember that many people on here probably don't own a car, or necessarily even hold a driver's license. They most likely either live at home or at school, or take a bus to work (NTTAWWT).


As far as I understand your comment, driving at the speed limit is unsafe because everybody is speeding. Sounds like a self-fulfilling thing. Maybe it would be even safer if the majority would obey the speed limit?


Did you read the link I posted? There's lots of empirical data out there that says that drivers drive the same speed on a road, regardless of the posted limit. That speed is based on their judgment of the road - sight lines, shoulder width, sharpness of curves, etc.


People speed because of poor road design and poor enforcement of speed limits.


Just curious (and not trying to challenge or be confrontational), how long have you held a driver's license?


And the idea that the speed limit itself is the problem... is there any room in your worldview for that possibility?

Why not lower it to 5 MPH on a nationwide basis? Think of all the lives that would save!


Speed limits in cities and residential areas may also be set to reduce road noise in the neighborhood.


I remember early reports about self-driving experiments saying that some of the issues were because the cars were driving at the speed limit, which is usually not the case in the US with human drivers.

Maybe they decided to change it.


If they just track human driver habits then we're no safer! What's the point of self-driving cars then? Freeing drivers of even more responsibility when they kill people?


> What's the point of self-driving cars then?

The point of a self-driving car is unburdening the user from the task of driving it, allowing them to engage in other activities if they chose to do so.

That it has the potential of being safer is a nice perk but definitely not the primary goal. Right now the challenge is whether we can make them as safe on average as human drivers are, so that we can avoid effectively sacrificing people for the convenience such a system would provide.


> Free up drivers of even more responsibility when they kill people?

That's a strange way to put it. Surely if someone is not driving their car, they can't be responsible if the car kills people (assuming of course the car was well-maintained and the model was legal). I don't see how that's a bad thing, since they have zero control over the situation.

Do you mean that it could make car users feel less responsible for deaths, even though cars will presumably still kill people?


I'm not sure if you meant it, but you essentially answered my question with a "yes." It's to absolve individuals of responsibility in killing people.

If that's the case then I'm not sure how it is a benefit to society.


Responsibility isn't being absolved, it's being shifted to the one that actually operates the vehicle - probably either the car vendor (who either manufactured or licensed the self-driving car software), or the fleet operator depending on what kind of sales and operations model self-driving cars adopt. When human escalator operators were replaced with automatic escalators, was there a problem with shifting responsibility over to Otis and other escalator manufacturers?

As with automatic escalators, there are two primary benefits to society: greater aggregate safety[1], and freeing up people's time. You point to the lack of a human operator to hold responsible as a detriment, but the statistics point the other way around: human drivers are horribly irresponsible and even if we can punish them when they do wrong that doesn't substantially change their behavior. Freeing up people's time is self-explanatory.


So I think you have a great analogy, and I generally like your perspective. The problem is that I think this accident clearly demonstrates that the suggestion that AV's would be safer than human drivers is now suspect. A post elsewhere compared the deaths per mile for human drivers to that of this pilot program, and it's worse by orders of magnitude.[0]

That's my point. I 100% agree: if self-driving cars did make things better, then the safety aspect is a good thing.

I think the freeing up people's time is okay too but only if all of society has access to AV's. However, like many innovations of the past, advancements which are not accessible to all people just free up the time of some people while offsetting burdens onto others (ie., those who can't afford AV's). BUT, that is a separate issue.

[0] It is just one data point, but it's a bad sign to be so quick to kill a pedestrian so soon out of the gate.


Worse by an order of magnitude is only the case if you exclusively measure fatalities - and as you point out there is exactly one data point and not enough miles logged to produce any meaningful conclusion. The data on non-fatal accidents for Waymo indicates that self-driving cars are an order of magnitude less likely to be at fault for an accident than human drivers: https://www.huffingtonpost.com/entry/how-safe-are-self-drivi...


You're reducing the entire benefit of self-driving cars down to whether they follow the speed limit.


The #1 cause of collisions is inattention, which ought to be impossible with automated cars.


As another driver on the road, I'd far rather that autonomous vehicles maintain speed with the flow of traffic around them, in lieu of strictly sticking to the posted limit.


There is no traffic "around" them, just the car behind. And if the car behind is speeding it's up to them to slow down if the car in front is obeying the speed limit.

This kind of argument is just used to justify speeding.


You don't seem to drive. If you don't follow traffic flow, people will cut off in front of you, tailgate, zigzag, and do other dangerous lane changes/maneuvers much more frequently. And that's when the trigger-happy ones don't yell obscenities at you.


>You don't seem to drive

LOL. I'm not sure if you're trolling, rude or just don't pay attention to the cars around you when you're driving, but I'll guess it's a combination of all 3.

>If you don't follow traffic flow, people will cut off in front of you, tailgate, zigzag, and do other dangerous lane changes/maneuvers much more frequently.

From my 45 years of observing people's driving and driving myself for about 30 years, I find that most people generally obey the speed limit or don't go much faster. Probably about 90% of people in general. At rush hour, it drops to about 50%. Some of the other 50% certainly do do all those fucked up, rude and dangerous things you mention, but it doesn't mean that the 50% of people obeying the speed limit are at fault. It's the impatient speeders who are at fault.

It's sometimes a bit of a Darwin award scenario. I live on Vancouver Island, and there is only a single highway over the mountains to Victoria. During commute times, about 50% of people speed significantly, and they do some of the dangerous and rude shit you mention. A few times a year one of them drives too fast for the conditions and ends up killing themselves, which then shuts down the entire north-south highway (the only fucking route) for 4 hours. The government is currently spending a shit ton of money to put barriers in the centre of the highway to try and prevent the excessively speeding idiots from killing themselves so often.

So no, I don't have much patience for the rude people who tailgate me, go 50km/h over the limit even in rain, and then get themselves killed and hold up the entire island for 4 hours because of their stupidity and lack of knowledge of basic physics.

(And I'm not calling out everyone who speeds, I do it myself sometimes, it's only the ones who go way over the limit with no consideration for weather conditions or safety, and the rude idiots who are in too much of a hurry to extend basic courtesy to other road users).


> I find that most people generally obey the speed limit or don't go much faster.

Regional culture definitely plays into this. I grew up in the midwest and almost everybody speeds. Now I live in Seattle and I'm the fast guy in the left lane going 5 MPH over the speed limit.


You are ranting about reckless speeding, which I agree 100% with you that it's a stupid thing to do. But the comment you replied to was talking about following traffic flow, which is about minimizing speed differentials between vehicles so as to minimize triggering fits of impatience in drivers around you.


My point is that:

[1] It's not my problem if others get irate because I'm at the speed limit.

[2] It's not dangerous to be at the speed limit, it's dangerous to be over it. And you're not likely to hit someone going at the speed limit if you're dangerously speeding, you're likely to hit the median or an oncoming car, and more likely to get killed doing so (much greater speed differential than between the speeder and the person going at the speed limit).

[3] There is only a differential if you are passing, and that is easily managed by waiting for a sufficient gap in traffic.


> It's not my problem if others get irate

It's not like in kindergarten, where kids point at each other yelling that the other one started it. If someone does something dangerous around me, it absolutely concerns my family's safety, regardless of what triggered the dangerous situation. Thinking two steps ahead to prevent these situations from occurring in the first place is at the core of defensive driving philosophy, and it's why going with the flow of traffic is often recommended over going at the posted speed limit if traffic conditions force you to choose.

I agree that on many roads, going above the posted limit is a very accurate measure of recklessness, but that isn't the case everywhere, and it isn't at all uncommon for the flow of traffic to be 10 or even 20 km/h over the posted speed limit on freeways or countryside roads.


I disagree that going at the posted speed limit when someone behind me is going faster is dangerous, or that the person going at the speed limit is at fault. If anyone in this scenario is creating danger, it is the person speeding. Generally it is not "everyone" who is speeding, only a percentage. If everyone is going over the limit and it creates a dangerous situation when someone drives at the limit, then there is some serious problem with that road and it needs to be fixed urgently (either by increasing the limit, or by enforcing the existing limit). It's unhelpful and silly to suggest that people must break the law to be safe, and it completely defeats the point of speed limits and the rule of law, not to mention common sense.

Defensive driving means looking ahead and predicting when someone is going to do something stupid, not braking hard if someone is behind you, being very careful when changing lanes, driving in a parking lot, etc. It definitely doesn't mean driving above the speed limit to keep people behind you happy.


> It's unhelpful and silly to suggest that people must break the law to be safe, and it completely defeats the point of speed limits and the rule of law

The law doesn't necessarily codify what's safe. You could go at 40km/h on a highway and be technically entirely within the law, but that's certainly not a safe thing to do. Similarly, you could go at exactly the posted speed limit in a snowstorm when everyone is driving slower and you would be entirely within the law, but you're probably going to get yourself killed.

If I always drive in the rightmost lane, being pedantic about posted limits is probably safe for a variety of reasons, but if I'm in the left lane of a 100km/h 3-lane highway, I wouldn't exactly feel comfortable going at 100 if everyone else is going at 125 (including cops).


> There is no traffic "around" them, just the car behind.

How about when merging, or changing lanes? The speed limit is not _always_ safe or practical to obey strictly.


>How about when merging, or changing lanes? The speed limit is not _always_ safe or practical to obey strictly.

In the case of overtaking, you just need to make sure you don't cut someone off in the overtaking lane. No need to go over the speed limit just because you're too impatient to wait for a proper gap in the traffic.

I've never seen a situation where you're merging onto a freeway and everyone in the right-most lane is speeding excessively. That would be a very dangerous situation, and if you yourself go significantly faster than the speed limit to merge you're just adding to the dangerous problem.


This is probably less the case with self-driving cars, as I imagine they're using pretty new or well-maintained vehicles, but there's usually a 5mph leeway with speed limits due to slightly mis-calibrated speedometers. I anecdotally remember hearing this being the reason for speed limits that end in "5" rather than being multiples of 10.

https://travel.stackexchange.com/questions/40113/do-any-stat...


Nowadays it's much more electronic than it used to be, so speedometers are much more consistent.

The way speedometers used to work is pretty interesting: http://www.explainthatstuff.com/how-speedometer-works.html


Wouldn’t that be mitigated by the GPS and the map? It would seem that an automated car using maps as an important input would know its exact position and therefore its speed.


GPS can be remarkably bad in urban environments. You can't really trust it for second-to-second speed or positioning. It's only really good for long term/high level planning. For speed, the wheel sensors are going to be way more reliable and consistent.


The typical approach is to fuse GPS, inertial, and wheelspeed sensors to track a vehicle's motion in space. They all have their shortcomings, and all compensate for each other.

GPS is, as you said, wildly inaccurate, disconcertingly often. Wheelspeed is great when you're moving, but at low speed it tends not to have the granularity you want. Inertial sensors are pretty amazing nowadays, but they still drift.

To illustrate, if your wheels aren't moving, then you can ignore the IMU telling you that you're sinking into the ground at 5mm/s. Watching the monitor of a mapping vehicle as it depicts the car returning to mother Earth is pretty hysterical though. Until you have to fix it.
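The variance-weighted blending described above can be sketched as a toy one-dimensional filter. This is a minimal illustration, not any vendor's actual AV stack, and all the noise variances and speed readings are made-up assumptions:

```python
# Toy 1-D Kalman-style fusion of two noisy speed sources (illustrative only;
# real AV stacks fuse full GPS position, IMU, and wheelspeed in 3-D).
# All variances and readings below are made-up assumptions.

def fuse(estimate, est_var, measurement, meas_var):
    """Blend a prior estimate with a new measurement, weighting by variance."""
    k = est_var / (est_var + meas_var)           # Kalman gain
    new_est = estimate + k * (measurement - estimate)
    new_var = (1 - k) * est_var
    return new_est, new_var

# Start with a vague prior, then fold in a noisy GPS speed and a
# more trustworthy wheel-speed reading.
speed, var = 0.0, 100.0                     # initial guess: unknown, high variance
speed, var = fuse(speed, var, 17.2, 4.0)    # GPS speed (m/s), noisy
speed, var = fuse(speed, var, 16.9, 0.25)   # wheel-speed (m/s), precise

print(round(speed, 2))  # → 16.88: the estimate lands near the low-variance sensor
```

Each fuse step pulls the estimate toward whichever sensor currently has the lower variance, which is the sense in which the sensors "compensate for each other".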


Actually, with a good receiver you can use GPS for very precise speed measurements by using the doppler shift of the carrier signal. It's just that most receivers don't give you access to that so the only other option is using the current and previous position and the timings of them.
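The core of the Doppler method is just the Doppler relation applied to the GPS L1 carrier. A back-of-the-envelope sketch, ignoring satellite motion and line-of-sight geometry (which real receivers must solve for, per satellite):

```python
# Line-of-sight speed implied by a carrier Doppler shift on GPS L1.
# Simplified: assumes the receiver moves straight toward the satellite
# and ignores the satellite's own motion.

C = 299_792_458          # speed of light, m/s
F_L1 = 1_575_420_000     # GPS L1 carrier frequency, Hz

def speed_from_doppler(shift_hz):
    """Convert a carrier Doppler shift (Hz) to line-of-sight speed (m/s)."""
    return C * shift_hz / F_L1

# A ~53 Hz shift corresponds to roughly 10 m/s (~36 km/h).
print(round(speed_from_doppler(52.55), 2))  # → 10.0
```

Because the carrier frequency is so high, sub-m/s speed resolution only requires resolving the shift to a few tenths of a Hz, which is why carrier-based speed is so much better than differencing position fixes.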


That could easily be the case, sure. Some other people mentioned that it could be just to more closely emulate a human driver. I don't think we'll have a good answer without knowing the implementation.


I'd rather the cars travel at safer speeds than an arbitrary speed limit, which is unsafe 99% of the time.


Not speeding would cause it to impede traffic flow. Speed limits are often set for revenue generation, not safety.


On a 35 mph road with no traffic to impede, there is no reason to speed.


There is always a reason to speed: getting from point A to point B faster.


This is just police standard operating procedure. In any kind of manslaughter where a vehicle is involved, save on paperwork and blame whichever party ended up dead before the driver has even spoken the words "she came from nowhere". Here is the police chief spelling it out for Uber:

she came from the shadows

I wonder what that LIDAR thinks about the shadows. I mean, this system is in place because everybody drives and so we've legalized recklessness in the pursuit of "There but for the grace of god go I" but who benefits if police declares an AI driver blameless before anyone has even downloaded the recordings?


You aren't kidding about it being standard operating procedure. Almost every time a pedestrian or cyclist is killed in New York City, the police come out and blame anyone but the driver. And then video will turn up afterwards showing that the driver was driving unsafely, and that the fatal accident could have been avoided. Still, no charges are filed. It's like they have an allergy to paperwork.

I want to see video of this fatal crash. Nothing else will suffice.


I wasn't saying that in jest or exaggerating for dramatic effect. It's simply reality that today, unless you are drunk or high, you can kill someone with your vehicle while they're walking in a crosswalk and get away with not even a ticket:

https://twitter.com/KeeganNYC/status/530515713405231105

Afterwards, police will, as in this case, make up a story how "the little kid broke free from its grandma" and decline to charge.


According to the news reports on the case that tweet is about, the police in fact ticketed the driver. A judge dismissed the tickets later, against the recommendation of the police.


It's standard operating procedure in SF, and not only will the police not bother to look at surveillance, but they will lie and claim it doesn't exist [1].

[1] https://sf.streetsblog.org/2013/08/23/sfbc-finds-what-sfpd-d...


This is really scary I think!

In Denmark it is standard procedure to immediately arrest the other party and charge them with manslaughter when there has been a fatality in an accident, no matter who might have been at fault. This way the police can gather information quickly from the driver and release him shortly after. An in-depth investigation is always carried out to map out precisely what happened (this can take weeks), and only then are charges dropped, depending on the result.


The difference is probably because US culture is very automobile focused and in some places downright hostile to cyclists and pedestrians, whereas in Denmark cycling is very popular.


The difference is that in NYC the cops can get away with that just by taking the surveillance tapes from the local businesses and then destroying them. In this case presumably they know the video will eventually come out.


Are there any reports that NYC police actually do or have done that?


When there's a crash, all information related to the crash should be public information: all raw telemetry collected by the car, and the logic used to evaluate that telemetry. The idea that any portion of this is proprietary is dangerous. If a police officer asks a human driver questions, and they refuse to answer, they can be compelled. If they are asked whether they can think of anything else relevant that hasn't been explicitly asked about the accident, and they say no, and it turns out they did and were withholding information, that's a crime.

There must be equivalents for autonomous driving. They can't be allowed to shield themselves behind "proprietary information" claims.


In the United States lying to local police is generally speaking not a crime (certain lies can be, but saying you have no further information when you do is not one of them). Also you cannot be compelled to be a witness against yourself so you have every right to not answer questions if you suspect you are under a criminal investigation.

That being said in the United States non-human persons such as corporations have no such rights.


> I wonder what that LIDAR thinks about the shadows.

I honestly don't follow these systems very closely, but I thought that was one of the big selling points: LIDAR can see into tough situations and things like "the shadows" and help avoid situations where people would fail.


I think your parent post was being sarcastic, since indeed LIDAR isn't supposed to be affected by "shadows"


And yet the speeding vehicle didn’t even attempt to stop.

https://www.theverge.com/2018/3/19/17140936/uber-self-drivin...


Can we please stop calling 38 in a 35 speeding? In many places that speed would be impeding traffic.


Uh, but it is speeding. I can't be the only person who's gone through a town that happens to have a zealous patrol officer who brings in huge revenues for the town (seems to be common around college towns). Exceptions are generally made when it comes to maintaining flow of traffic, but it seems like the Uber vehicle was by itself?


In most places the driver isn’t an experimental automated vehicle. Maybe while trying new tech that can apparently fail to see a woman with LIDAR, stick to the damned speed limit? That seems fair to me.


"Exceeding the stated speed limit by 8.6%" better for you?


So? That is because the traffic is speeding. What sort of twisted logic do you have that someone has just been killed, and the point you want to make is that 3 mph over is not really speeding?


This really isn't an argument. It's just saying that because someone has been killed we shouldn't be able to have rational discussion because our logic is "twisted".


It's region specific. In many regions, yes, it is speeding, and ticketable, sometimes automatically by speed cameras. I have no idea what is normal in this particular region.


I thought computers excelled at matching numbers?


I've seen people charged with a misdemeanor for going 5 over the limit where no one was harmed. If a human can be held to that standard, I think a self-driving car should be in the case of a fatal collision.


+1. Calling this speeding is willful ignorance of the practical flow of traffic everywhere in the country.


Police write tickets for being 3mph over the speed limit. Can’t speak for other states but California DMV test does not give any kind of allowance to go over the speed limit. So why is this “ignorance”?


The DMV might not give any allowance, but enforcement is nowhere near as harsh in California. I-5 in Southern California is filled with people cruising at 80 (even the highway patrol!), 15 miles above the limit, and no one gets a ticket.

The same could be said for the 85 and 101 in the Bay, where the cruising speed is 75-80 during low traffic and almost always in the carpool lanes.


I wasn’t claiming that enforcement was strict or consistent. Just that it is patently absurd to say that 38mph in a 35mph isn’t “speeding”. I don’t care what your personal anecdotal experience is, we’re talking about what a court can penalize you on and what is written in the legal code.

And yes, I go over the speed limit plenty of times myself. I remember in HS how my friend was let go even after being caught at +10mph because he told the officer he was late to our calculus study group. Cops, and all human entities, make tons of allowances and exceptions.


Request denied. The sign says speed limit. What about a plain-language sign are you not grokking? Legally it is speeding, and it's an opinion to call it impeding traffic. And as a result of speeding, there's certainly civil liability attached, because the car would not have been where it was, when it was, had it been driving at or below the speed limit.


Well the funny part is that it turns out it wasn't speeding. The limit was 45 and it was doing 38. If it had been speeding it wouldn't have been where it was when the person was jaywalking.


LIDAR indeed doesn't care about ambient light, but it can get confused by certain surfaces. I've seen LIDAR not be able to see a clean black car. The light just got absorbed and there wasn't enough return for it to register.


I think he is saying that she probably would have died even if a human were driving.


Sure, but presumably that's because a human wouldn't be able to see the pedestrian in the dark; you would expect LIDAR to be able to detect them and so the autonomous car should have had the same chances to spot this person in the daytime or the nighttime.


Well, LIDAR certainly doesn't care that it's dark.


>before anyone has even downloaded the recordings

The second paragraph of the article states "video footage taken from cameras equipped to the autonomous Volvo SUV potentially shift the blame to the victim herself."


It's video footage from cameras at night! Of course on the footage the victim comes from the shadows because cameras become notoriously useless at night when you still want to hit reasonable framerates (and you do).

That's why that thing has a very expensive LIDAR on top.


Modern dashcams have very good low light capability. Here's a cruise self driving video at night.

https://www.youtube.com/watch?v=KSRPmng1cmA


The point of that person's comment was that you asserted nobody has looked at the videos. Clearly they have.


In France, the driver is always liable no matter what the cyclist or pedestrian did. This is the jurisprudence set by a case where a drunk cyclist, riding at night without lights, in the wrong lane and against traffic, was hit by a passing car. Though the driver obviously wasn't at fault, the decision was to always protect the vulnerable and feeble, i.e. the cyclist.


I came here to say the same: if a car ever hits a pedestrian, the car is always at fault. Pedestrians follow the laws of physics; they don't just "come out of the shadows". It amazes me that the police could even think about issuing this statement, and also that people take it at face value. Guys, if you ever see a pedestrian in your path and you are not able to avoid them, you were driving too fast, simple as that.

And I would disagree with you that the driver was not at fault. He may have mitigating circumstances, but he was there to survey the environment and override the machine precisely in cases like this. The situation required him to lower the speed in order to avoid hitting a potential person "coming out of the shadows", and he did not do that. Once the person did show up, maybe it was too late for him to react, but he should have acted before. And let me take that back: he should at least have hit the brakes; maybe it was too late to avoid the collision, but not hitting the brakes at all? Definitely at fault.


> if you ever see a pedestrian in your path and you are not able to avoid them, you were driving too fast, simple as that.

That flies in the face of biology and physics.

In the scenario where someone rapidly moves into the traffic from behind a parked car or some other visual obstruction: It takes up to 500ms for a human to take note (not react, just take note) of an obstacle, add reaction time to that and you'll arrive at a mean of over 1 second[0], usually more[1], before a driver can react to a new obstacle and this is for attentive drivers under good visibility.

Your argument relies on the fact that slower speeds result in shorter braking distances, but braking distances are always non-zero, so there is always a distance at which someone could move in front of the car and get hit even if the reaction time were 0 ms instead of ~1,300 ms.

[0] https://www.researchgate.net/publication/274973324_Braking_R...

[1]http://copradar.com/redlight/factors/index.html
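To put rough numbers on the argument above: using the ~1.3 s mean reaction time from the cited study and an assumed deceleration of about 7 m/s² (dry pavement, good brakes; an illustrative figure, not data from this crash), the non-zero floor on stopping distance is easy to see:

```python
# Rough stopping-distance estimate: reaction distance plus braking distance.
# The 1.3 s reaction time comes from the study cited above; the 7 m/s^2
# deceleration is an assumed typical value for dry pavement, not crash data.

def stopping_distance_m(speed_mph, reaction_s=1.3, decel=7.0):
    v = speed_mph * 0.44704               # mph -> m/s
    reaction = v * reaction_s             # distance covered before braking starts
    braking = v * v / (2 * decel)         # kinematics: v^2 / (2a)
    return reaction + braking

print(round(stopping_distance_m(38), 1))                   # → 42.7 (meters, ~140 ft)
print(round(stopping_distance_m(38, reaction_s=0.0), 1))   # → 20.6 even with 0 ms reaction
```

So even a driver (or computer) with zero reaction time still needs on the order of 20 meters to stop from 38 mph; someone stepping out closer than that cannot be avoided by braking alone.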


Pedestrians don't teleport in front of your car. It takes them time to cover the distance from the sidewalk to the middle of your lane. At a normal walking speed (3 mph) it takes 2.73 seconds to cover the width of a lane (12 feet). No matter how "out of the shadows" someone comes, no way that distance is less than 6 feet, which takes 1.36 seconds. Situations where the distance a pedestrian has to cover to go from invisible to in front of your car is short are numerous, like people crossing the street in front of a stopped bus. But that's when you slow down, and instead of 38 mph, you do 5 mph.

Look, you can quote science and studies, or you can talk to anyone who has a driver's license. If you have one yourself, then nothing I'm saying here should be news to you, or unreasonable.


You are referring to someone walking from the sidewalk. The parent is referring to someone leaping from behind a parked car or visual obstruction.

The distance to travel in the parent’s situation is basically zero, and is incredibly common in most cities with a parking lane.


Arizona is in bed with Uber on this, they were enthusiastic about getting this on the road, so political cover is much needed. I read this like a joint press release by the Governor and Uber.


Your comments on this heated topic stand out as being particularly unsubstantive and flamebaity. Could you please not do that here? It isn't just that they're bad comments for HN, it's that they encourage worse.

If you'd (re-)read https://news.ycombinator.com/newsguidelines.html and take the spirit of this site to heart when commenting here, we'd appreciate it.


I've been trying to find video evidence of the crash, and have been unable to do so. The closest I've found is a myriad of news reports showing the stopped car and police scene.

On reviewing this evidence however, there is one thing that sticks out: the damage to the car appears to be entirely on the passenger side. If the woman "darted out from the median," then she managed to clear at least half of the traffic lane before being struck. That makes it more likely that certain prompt evasive actions could have prevented the fatality, which does not look good on Uber's part.

I, for one, am glad that NTSB is looking at the crash.


Yeah, there just seem to be so many things that aren't adding up yet. The news footage also showed a damaged bicycle, but there is no mention of a cyclist in the article. I hope they release footage soon.


There was mention of her walking her bike.


Which means she wasn't moving quickly.


"walking your bike" is just an expression, you can certainly run while "walking" your bike – especially across a street


Sure, but starting and stopping takes time, which means she did not "suddenly" run into the street.


Video evidence hasn't been released. I don't even know if it's been said what lane the Uber vehicle was in. It's possible she walked out from the center median and far enough that the impact would be on the passenger side of the Uber AV. But that would suggest she did so at a distance that the Uber vehicle could at least begin braking, even if it was still too late (in the same way train operators can hit the brakes despite it being way too late when first seeing a car on the tracks).


The massive earlier discussion is https://news.ycombinator.com/item?id=16619917, but since the current thread is rising we can let it have the next shift on the front page.

Edit: some people have objected that it's biased to let the discussion shift to "Police say Uber is likely not at fault" in place of the original report about the death. That's a fair point, so we've taken the downweight off the original story.

Just so it's clear: usually what we do when there's a major ongoing story is let discussion hop from an earlier submission to a later one as significant new information arises. We link to thread n-1 from thread n, so people who want to follow back can do so, but the idea is to have only one thread about the story—the latest one—on the front page at a time.

In this case, though, it's debatable whether "police say" counts as significant new information. And we don't want to be biased. So we'll restore the previous thread and err and have two on the front page for a while.


> “The driver said it was like a flash, the person walked out in front of them,” Moir said. “His first alert to the collision was the sound of the collision.”

I don't know if that is supposed to make anyone feel better about this; if anything, it makes it seem worse. I thought one of the big benefits of these systems was that they were supposed to be better at exactly these kinds of situations?

More like "autonomous cars are better than people at stopping in a flash" rather than "just like people this car didn't stop in a flash".

I'm really not trying to trash these systems, I'm just surprised at this explanation. Though, I guess based on the other comments now, maybe I shouldn't be.


Personally, I am constantly scanning for people near the edge of the road and anticipating their movements. I bet most of us have had the experience where someone stepped out unexpectedly but you already took action because you could tell they were not paying attention. Good luck programming that.


Yeah, there are a lot of subtleties that are hard to program; that's why it must be extremely effective at the things that _can_ be programmed, such as quick braking, quick steering, preferring lanes farther from nearby pedestrians, beeping when entering a non-illuminated zone, and hundreds of other details.


Quick braking is the one thing you would have thought they could get right for sure, and they didn't manage even that on this occasion.


Quick braking can also be dangerous. Perhaps even LiDAR has to deal with false positives every now and then.


Then it shouldn't be deployed. I've had several occasions where I overlooked somebody (luckily without consequences), but never one where I falsely registered someone or something that would warrant braking.


I don't think it will work. Instead the failing effort will be cover to add more and more rules in an attempt to hide the obvious while blaming the victim(s) for breaking the rules.

https://www.youtube.com/watch?v=UEIn8GJIg0E


Beeps are annoying. Better to artificially increase the engine sound.


I actually think this is a fairly trainable attribute. Given that Tesla does this kind of object detection and classification and records many miles of human driving, it wouldn't be surprising if this were a learned trait.


It's the corner cases... like Halloween, or body posture, or crossing behind a semi-transparent asymmetric object.


Fair point. This is far less likely.


A X,000lb car going 38mph takes a certain amount of time and distance to stop, even with a perfect reaction time. If she really did step out in front of the car with less than that distance to spare, then there wasn't much to be done, sadly.


https://www.theverge.com/2018/3/19/17140936/uber-self-drivin...

The brakes didn’t even start to engage. Just how fast do you think this woman moved exactly?


There's miles of difference between the brakes not engaging and not showing significant signs of slowing down.

It's enormously unlikely that the Uber car failed this badly. The most plausible scenario is that the car did brake but just didn't leave skid marks. The car not recognizing a human in its path would be a shocking failure.


Just an update, the Uber car did fail that badly, and it was a shocking failure.


If she was behind another parked car and stepped in front of the car that hit her, she could have been at a normal walking pace. The police stated that the car did not show signs of slowing down before the accident. I doubt the police had the records from the car’s system.


I doubt the police had the records from the car’s system.

With all due respect, then why are they using their bully pulpit to blame this woman for her death at this early stage? The vehicle made no attempt to stop, which seems worrying if the sensors and algorithms were working as intended, so maybe it would be worth looking at those systems.


I think you are confusing the legal term “at fault” with the subjective term of “blame”.


How does the police determine that a vehicle "made no attempt to stop" today, without video and sensor data?


I don’t know, but made no attempt to stop is according to the police.

Uber’s self-driving car was traveling at a speed of 40 mph when it struck a 49-year-old woman in Arizona Sunday night, and did not show significant signs of slowing down, police said today.

https://www.theverge.com/2018/3/19/17140936/uber-self-drivin...

If you’re saying that the whole “blame it on the homeless lady” routine is premature the im with you though.


Yes, I think it is premature. What I am trying to say is that I don't think the police used the proper data from the car to reach that conclusion. If they were looking for brake marks from the tires on the road, for instance (I have no idea what method they use), finding none doesn't necessarily mean the car wasn't braking.


Where the skid marks start (if there are skid marks at all), probably. Witness reports might help, too.


A modern car that makes skid marks under heavy braking is a car with a broken brake and safety system. Modern ABS brakes actuate fast enough that they don't leave skid marks under maximum braking.


They might have meant skid marks indicating an emergency stop, though with the prevalence of ABS that is no longer a surefire sign.


The same way they do it now??


That’s what I am asking. How do they do it now?


But isn't this the reason why most drivers, when seeing someone walking along a median (with a bike full of junk, no less) or even biking in a dedicated lane, will move into another lane to give some clearance? It doesn't sound like there were any other cars near the Uber AV, and it was relatively far from the intersection (in case it needed to make a left turn). So I'm wondering why it didn't make the decision to change into the right lane temporarily to make it less likely to be surprised by unexpected abrupt behavior from the pedestrian.


It didn't even attempt to brake, so likely it never had any idea the person was there.

Also I know I wouldn't be programming my self-driving car to jump to a different lane every time it saw a pedestrian in the distance.

But then again I wouldn't be programming my self-driving car to speed either...


If somebody jumped in front of the car at the last second, it would be physically impossible to brake in time.

I assume what people mean by "stopping in a flash" is closer to taking the right actions faster (e.g 40ms vs 800ms) when there's still a chance of changing the outcome.


She was walking a bicycle and had several bags of stuff on the bicycle or her person, so she probably wasn't darting in and out of traffic, but you don't need to move that far from a median to get struck by a car.


If somebody jumped in front of the car at the last second, it would be physically impossible to brake in time.

...and if somehow the car was able to stop that quickly, its occupants would likely be the ones fatally injured if they weren't solidly strapped in.


If only there were some sort of safety harness system installed in the car to prevent that sort of thing...

(Yeah, I get what you're trying to say, but I couldn't resist.)


800ms is pretty slow for a sober human I think

https://www.humanbenchmark.com/tests/reactiontime/statistics


800ms looks like it is actually pretty fast for a sober human driving [1]:

"This study examined the effect of brake and accelerator pedal configuration on braking response time to an unexpected obstacle. One hundred subjects drove in the Dynamic Research, Inc. (DRI) Interactive Driving Simulator through a simulated neighborhood 21 times, each time with a different pedal configuration. Each subject was presented with an unexpected obstacle only one time, for one of three previously selected pedal configurations, to which he or she was instructed to brake as quickly as possible. Foot movements were recorded with a video camera mounted above the pedals. Data were analyzed manually, using time and course location information superimposed on the video data. Response times were analyzed using ANOVA to determine effects of pedal configuration and various driver factors. Response times ranged from 0.81 sec to 2.44 sec with a mean of 1.33 sec and a standard deviation of 0.27 sec. There was no significant effect of pedal configuration on response time. Driver age was significant, with increased age corresponding to increased response time. Car normally driven, gender, driver height, and shoe size had no significant effect"

[1] "Braking Response Times for 100 Drivers in the Avoidance of an Unexpected Obstacle as Measured in a Driving Simulator"

https://www.researchgate.net/publication/274973324_Braking_R...


Isn't that linking to the results of people playing a game where they're paying attention and looking for something to react to?


That's with people expecting to click; ~500ms is more realistic for a scenario where you need to recognize, decide, and react.


What is the driver supposed to say in this scenario?

"Actually officer, I saw her coming from miles away, but I thought to myself: let's just see what happens..."

Of course a "driver" in such a vehicle is going to report the victim came out of nowhere; otherwise they're basically confessing to homicide, or implicating themselves in not paying attention and the car in not sensing her. It literally adds nothing to the evidence base...


Since it was dark out, the story seems pretty plausible. It's really difficult to see pedestrians at night, even when they use a crosswalk, much less when they are crossing in the middle of the road.


The point isn't that it's plausible (which it is); it's that it's the only story you would expect from a driver who wasn't already confessing to negligence/manslaughter, irrespective of what actually happened.

Because if he had seen her beforehand (i.e. she didn't "come out of nowhere"), he would presumably have taken control, braked, swerved, etc.

We already know that the car appeared not to have done that, so the statement from the driver literally adds no further evidence, since that is exactly what we would expect every driver who unintentionally mows down a pedestrian to say...


You would think LIDAR might help in such a situation...


The point is that if someone runs in front of your car, physics dictates that you can't stop a 1,000+ pound vehicle on a dime. You can lock the brakes and stop the wheels from turning (ABS usually won't allow this) and the car will still continue forward for some time. In addition, many other avoidance strategies won't work. Sure, you could swerve, but what's in your path then?
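The swerving option is bounded by the same kinematics, just sideways. A rough sketch of how little lateral displacement is achievable in the available time (the ~0.7 g of usable lateral acceleration is an assumed typical passenger-car limit, not a figure for this vehicle):

```python
# How far sideways can a car move before impact? Assumes ~0.7 g of
# usable lateral acceleration; real limits vary with tires, surface,
# and load. Displacement from constant acceleration: d = a * t^2 / 2.
G = 9.81

def lateral_offset(time_s, lateral_g=0.7):
    """Sideways displacement in meters achievable in time_s seconds."""
    a = lateral_g * G
    return 0.5 * a * time_s ** 2

print(round(lateral_offset(1.0), 2))   # 3.43 m with a full second of warning
print(round(lateral_offset(0.25), 2))  # 0.21 m with a quarter second
```

With a fraction of a second of warning, a swerve moves the car only centimeters; with a full second, a couple of meters, which is why early detection matters more than evasive skill.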

Machines don't get tired or bored, they don't screw around with the radio or put on eyeliner, or get distracted by text messages. They have other issues but we hope they'll be less prevalent.


You can sure as hell swerve.


Just for posterity: most cars are > 3,000 lb.

The Volvo XC90 (the car used by Uber) is ~4,400 lb before you add their hardware.


> More like "autonomous cars are better than people at stopping in a flash" rather than "just like people this car didn't stop in a flash".

It's a single example. You can't extrapolate to better or worse from it. A self driving system better than people by 2 orders of magnitude will still kill people. But that's neither a comfortable thought, nor something media will get used to over the next decade or so.

This specific car shouldn't matter outside of engineering analysis, which is happening now. What should matter for others is a good statistic over millions of miles.


3.2 fatalities per billion km driven by humans. So far, 1 fatality per 4 million miles by automated vehicles, which when you do the conversions... isn’t better. It’s not a lot to work with, but for the people speaking in stats, the actual stats don’t support them. A lot of people in this thread seem to be cashing in a human life against future hypothetical gains which have in no way been demonstrated. I sincerely hope that I’m not the only one disturbed by that.


Not sure it's that clear. Using data from Wikipedia, I got 11 deaths per billion km. And the 4 million miles are just from Waymo, not all autonomous cars in total (6 million+: https://medium.com/self-driving-cars/miles-driven-677bda21b0...). The confidence level on the 1 event over 6+ million miles is not great either.
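For anyone checking the arithmetic, the unit conversion behind these figures looks like this (using the 6 million autonomous miles and the ~11 deaths per billion km human baseline cited above; one event gives a very wide confidence interval either way):

```python
# Convert "1 fatality over N miles" into "fatalities per billion km"
# so the autonomous and human figures are on the same scale.
KM_PER_MILE = 1.609344

def per_billion_km(fatalities, miles):
    """Fatality rate expressed per billion vehicle-kilometers."""
    return fatalities / (miles * KM_PER_MILE) * 1e9

av_rate = per_billion_km(1, 6_000_000)
print(round(av_rate, 1))  # 103.6 -- far above the ~11 human baseline,
                          # but it rests on a single event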


Again, I’m responding to people touting the safety of autonomous vehicles based on a tiny sample. Live by the sword, die by the sword. At this point the truth is that we’re surrounded by unknowns, and yet post after post features people claiming inhuman safety records for a nascent technology.

The data, such as it is, does not support that.


A single death doesn’t tell you all that much statistically. But I’m with you on how sad it is, and I’m a bit appalled by some of the comments...


Here's my hypothesis: when it's physically impossible to stop (which sounds like it was the case, with the car moving at 38mph and a split second to react), the robot would plan to swerve around the obstacle instead of trying to brake. But the car can't turn 90 degrees immediately either, so that could also be dynamically infeasible. Still, maybe it has a higher probability of avoiding the collision?


You're opening yet another can of worms when you try to evade. Sometimes it's the right/optimal thing to do, sometimes it's worse (e.g. switch to the lane going in the opposite direction because it seems free - until another car your computer eyes could not see comes quickly around a corner and there are four dead instead of one; that's one of the reasons driving instructors tell us, at least in my country, to do a full brake and avoid evasion).


But that is due to the limited ability of humans to assess all those conditions instantly in a sudden, unpredictable situation. The machines should in theory be much better at that.


Grab some introduction to ethics book and think how you would implement what's written in there - and how you choose which one is the right ethical theory for the situation you're in. We had a lecture on that (lecturer had MSc in C.S. and was doing a PhD in philosophy), and many problems are very hard to reason about.

Even if you don't plan to build such software, I can really recommend any computer scientist to get an idea of ethics; in addition to the lecturers slides we used the book "The Cambridge Handbook of Information and Computer Ethics". Research in that direction is only gaining traction right now (not only, but also the "how do you implement an ethical decision automaton?"-part).

Remember: We humans can be lucky to have intuition and reflexes, which kind of put us out of the ethical issue: Usually accidents happen so fast, you can't think about the ethically acceptable reaction and just act more or less random/whatever your training and reflexes tell you to do (or you're so in shock when you realize what's about to happen you can not react at all). If you run over a child because you evade an elder person that's a bad situation to be in either way (assuming your driving was within legal parameters); if the computer actively decides to run over person X instead of person Y due to some algorithmic decision (or coin flipping, doesn't really matter) that's basically worse - because it ultimately decided person Y deserves the right to live more than person X.

That's why there was/is that large hype involving the trolley problem: https://en.wikipedia.org/wiki/Trolley_problem


My comment was largely about a situation where there there is no ethical choice involved.

For example: something comes in front of a car and a decision needs to be made, whether to turn left into the opposite lane, or whether that is dangerous and only the brakes should be applied. Humans have a hard time making this decision, because it's hard to assess the situation quickly.

The decision to only apply the brakes (common advice to drivers) is safe only in the sense that in most cases it will be the safe thing. There will be cases where it will not be safe, but they will be in the minority.

A machine will be better equipped at assessing whether driving onto the opposite lane is dangerous in that specific situation. If it is not dangerous for anyone - the machine will be expected to do that, instead of applying brakes, because in that specific situation it will be the safest thing.

The choice here is not an ethical one; it's a purely computational one. There are situations where there is no one in the opposite lane, and it is simply safe to turn there. But humans are generally not trusted with making that choice, because of poor computational ability (in such restricted timeframes).

Of course there are ethical considerations for other situations.


Self-driving cars are supposed to be a better driver than you on average, not primarily on the rare cases. There are lots of accidents that are due to carelessness with manual driving.


It is pretty hard to beat human drivers in the danger category. But OK, let's play this out a little longer before you start dismissing it.


> According to the Chronicle, the preliminary investigation found the Uber car was driving at 38 mph in a 35 mph zone and did not attempt to brake. Herzberg is said to have abruptly walked from a center median into a lane with traffic.

The autonomous vehicle was speeding and made no attempt to decelerate given the sudden presence of an obstruction in its path? I would expect some kind of reaction from the car within seconds, or milliseconds.


> given the sudden presence of an obstruction in its path

It doesn't even need to have been in its path; it should have detected a pedestrian near the edge of the road, concluded that they might step into the road (it's a possibility), and decelerated. Before she even stepped into the road, the Uber was at fault.


The cars would creep along at 5mph every time there's a pedestrian on the sidewalk then, obstructing traffic and causing accidents.


3mph above the speed limit


That is speeding and also the car was inattentive to potential hazards outside of its vision. Normal human drivers slow down when they see a wild lady on the median looking like she might dart into the road. Normal people slow down when there are kids playing with a ball on the sidewalk. Do these cars do that? Do they even drive the speed limit?

edit: Also accidentally killing someone with your car because of minor speeding or inattentiveness usually results in a misdemeanor vehicular manslaughter charge. Who is getting that charge here?


If I'm imagining this scenario correctly, I've experienced it tons of times and never seen traffic slow down. It's a road with a median and a person attempting to cross the road has temporarily stopped in the median waiting for a break in traffic so they can cross. Maybe I've just lived in 2 crazy states, or maybe I'm not imagining this scenario correctly, but I can't recall ever having seen traffic slow down for this situation.

It's possible to believe that the sensors/AI failed (no braking) while also acknowledging that the vehicle/driver is not at fault for the death. I suspect that may ultimately be the case here.


Where have you lived? Here in DC you regularly have to slow down because tourists and homeless people will blithely jump into the street. Same thing in Philadelphia or New York.


Traffic might not slow down, but I'd definitely keep an eye out for the person in the median.


True, but you also might not have time to react and hit the brakes if someone darts out in front of you.


Unless you were texting while driving, changing a song, eating, talking with a passenger, or talking on the phone (even handsfree).

Or, it was dark, your (night) vision isn't the best - but passes the drivers test, or you were tired.


You might also rest your foot on the brake in anticipation. But neither attention nor anticipating the controls is a concept that really exists in these autonomous systems. They're "always ready". (Yes, "attention" can exist in perception neural networks, but I am not sure that has much to do with keeping an eye out for a risky event.)


> Normal people slow down when there are kids playing with a ball on the sidewalk.

The driving-exam manual I had to read before getting my license was mentioning this exact scenario, i.e. if one sees kids playing on the sidewalk close to the road that the driver should slow down. It was even included among the questions given at the written exam itself.

But, then again, the authorities over here in Europe are more attentive when handing out driving licenses to people who are about to handle 2-ton pieces of metal inside populated areas. In this case it seems like the "US driving mentality" (only the driver counts, damn be the pedestrians) has also affected the actual engineers who have programmed these "AI" cars.


Sign in the area was actually 45 mph, so the car was 7 mph under the limit, it seems.


Yup. Here's a picture of the sign [1].

I found this based on the photo of the accident location in this article [2]. I went to Tempe, AZ on Google Maps, entered the name of the building in the background of the photo ("First Solar"), and from there it was pretty easy to find the photo location. Then it was just a matter of going backward until the speed limit sign for that road in that direction was found.

[1] https://www.google.com/maps/@33.4350531,-111.941492,3a,75y,3...

[2] https://www.reuters.com/article/us-autos-selfdriving-uber/se...


But you see, this is a computerized car. Whereas a human is subject to gauge observation error, there's really no excuse for a machine, especially a machine that has our lives in its hands.


But, in the US at least, the speed of traffic will be much faster than the speed limit, so driving the limit is unsafe.


Driving 38 vs 35 is hardly an unsafe difference...

These are machines. They should obey the speed limit.

For people learning to drive -- the first step is learning how to obey the rules. The second step is learning which rules can be bent. Let's focus on step one at the moment.


Where I'm from, at least, for people learning to drive the first thing you learn is that, if you go the speed limit, you are going too slow and will irritate everyone around you.


Driving the speed limit is never unsafe. And especially not in a residential area 35 MPH zone.


Driving the speed limit is frequently unsafe, e.g. in adverse weather conditions or unusual street conditions (like when passing an adjacent block party with lots of adults and children milling about).

The speed limit is the maximum speed limit in ideal conditions.


Driving the speed limit is frequently unsafe. I drive on highway 401 in Ontario frequently, where the speed limit is 100kmh, but traffic flows at 120kmh or above, with faster traffic going more like 130. Going the speed limit means you are going 20kmh slower than the speed of traffic, and 30kmh slower than faster moving cars, which can definitely be dangerous since accidents are more often caused by speed differential, rather than absolute speed.


Not long ago I seem to remember people listing the lightning fast speed at which an automated car could drive as just one of the many blessings the technology would gift to us. Now we want them to drive the speed limit? Boring.


Someone else has noted that the sign in the area indicates 45 mph, so it might've been a typo in the article.


That is immaterial.

The promise of autonomous cars is that they will be better drivers than human drivers.

That autonomous car, by going 3mph above the speed limit, already failed at the most fundamental level: following the rules.

And it was not even an obscure or ambiguous rule at that, as I assume that there are signs prominently displaying the legal speed limit. But even if there aren't, the rule book covers those situations in an unambiguous way (or at least it does in my country).

Accidents are excusable. Shit happens.

But deliberately failing to follow the rules? That cannot be tolerated.


For some reason I'm not at all surprised to learn that Uber programmed their vehicles to speed. Of course they did. This is literally "move fast and break things" at work.


Studies have shown that the rate of driving accidents is minimized when a vehicle travels at the same speed as other vehicles on the road. This means that in many cases, “speeding” with the flow of traffic provides the safest outcome. Although in the case of this accident, it appears that there was no surrounding traffic.

https://en.m.wikipedia.org/wiki/Solomon_curve


Odd the damage should be on the passenger side if she darted from the center median.

Odd they see the need to add the conjecture that she may have been homeless.

Curious too is that the center divider where she likely was has a nice walkway on it with a street lamp [2].

I'm suddenly quite curious about what is going on.

Accident site: https://www.google.com/maps/@33.4369934,-111.9429875,3a,75y,... from https://imgur.com/a/QH98H

[2] https://www.google.com/maps/@33.4366949,-111.9427747,3a,75y,...


You appear to be suggesting that multiple levels of police watched the videos, GPS tracking, lidar scene, all recorded by the on-board systems and have deliberately suppressed what really happened. I can believe an Uber employee being paid to cover this up but whatever your views of the police, I don't believe they would suppress something that could so obviously be a major public health issue.

And it's common in police reports to identify the victim and where they're from. The point about the carrier bags and the homelessness is that they've been unable to verify her situation; they're explaining why. Nothing strange here.

But I guess there's no convincing you without providing you the snuff film of it all happening.


I believe there are multiple layers involved:

a bizarre decision about median divider design that attracts jaywalking.

a governor with a stake in this subject.

an inexplicable disinterest in even minimal government oversight of the tests.

a company with a history of bad behavior.

a possibly reckless race to get a new technology market where trade secrets outweigh sharing safety results.

Clearly the data was not closely reviewed before issuing the statement. But I doubt the data retrieval is done by a police officer. More likely by an Uber employee; yet another potential issue.

The police report could easily be perfectly normal. Or perhaps there was a call from the Governor. Or perhaps the police saw the data Uber wanted them to see. Perhaps the city doesn't want to be sued about its median divider. I honestly have no idea what is going on in such a complex and invisible system, but I sure am curious.


I'm in the weird situation where I have to say that it looks like you might have been right, but for the wrong reasons.

My point was that it would have been stupid to attempt to cover anything up. This was never going to go like a normal car accident. Every layer of local law and order would have seen it, the governor would have seen it, Uber employees (people, not an amorphous evil blob) and then the public would have eventually seen it. Hiding anything would have been a PR disaster, perhaps with criminal after-effects, and there were too many people —whose own safety depends on this— for a cover-up to work.

So it's even more ridiculous now that I've seen the video: it shows an event where an able human would have spotted her, braked, and swerved, yet this Uber did not. Just using light, this should have been a near miss. But these vehicles supposedly have multiple sorts of radar-type equipment, don't they? They should have seen this well ahead of what a human could.

But yes, now there also needs to be a very serious investigation into the events that led to the statement saying it was the pedestrian's fault, or unavoidable.


Because historically money has had massive influence over the decisions made in the states. I side with the human over the faceless multi billion company.


There is no walkway in that center median. Source: I live a few blocks from there and drive on that road to my office every day.


I didn't notice it mentioned in this piece, but a different article on HN is suggesting she may have been homeless and was crossing where there was no crosswalk.

Homeless individuals frequently sleep under bridges, on the sides of off ramps and other infrastructure near highways. They are often walking where no pedestrian should be at all.

I would like to see the affordable housing crisis solved. Currently, there are only 30 affordable homes on the West Coast for every 100 poor families.* But it is entirely possible she did basically walk out in front of the car at a place where a driver had no reason to expect to see a pedestrian.

It would be nice if people would wait to see what evidence comes forth rather than jumping to conclusions about police conspiracies to hand wave away blame.

* https://www.geekwire.com/2018/every-100-families-living-pove...


It's like a story out of a dystopian sci-fi novel. A society that can make self-driving cars but cannot solve homelessness.


America seems to be poisoned by some poor mental models. People seem to see no real connection between homelessness and the lack of affordable housing. People also tell me there is no money in solving homelessness, which tells me how messed up our mental models are.


Strictly speaking, we can. We just choose not to.


Even worse, she was crossing at a crosswalk, but the city/state DOT "cancelled" the crosswalk and left a paved walkway leading up to the "not"-crosswalk.


That intersection isn't really. There's an overpass over Mill that she may well have lived under. In that case, it would not be well lit at all.


The point about the woman being homeless made something click for me. If you drive in DC, you’ll see panhandlers and people who are very erratic. You have to concentrate on their body language to anticipate whether they’re about to leap into traffic. You often slow down because you’re not sure what they’re going to do. How does a self-driving car do that?

It’s quite possible that a human driver going 38 wouldn’t have been able to stop in time. But could a human driver have been going 25 because they saw someone acting erratically on the median?


I don’t think that the opinion of the police is at all relevant here, since they have no tools for investigating how self-driving cars react in accidents.

The sensors and reaction time of a self-driving car are entirely different from a human, so it’s possible that a crash for which a human is not at fault could be averted by properly programmed self-driving car software.


You don't think the opinion of the police is relevant in a homicide? The opinion of people who have investigated hundreds if not thousands of similar accidents?

Who meets your criteria for expertise on saying whether the car could have stopped or not? Uber?


The police investigators have no way of knowing if a fault exists in the software. I hope the raw sensor data will be released to experts who can simulate how different versions of software would have handled the situation.


If there is data to be reviewed, that’s your best bet against someone who was not even there.


There is data...in the form of a video taken by the car that hit her. You don't need a team of PhDs to watch it and make a judgment call.


What about the other sensor data? Did the police watch the LIDAR pointclouds and the RADAR maps too? Or are they basing their judgement on only one sense input? Seems to be the latter which means they're spouting off without knowing what they're talking about.


But you need access to that data first.


Can't the police fairly easily determine who had the right of way? Given how simple the rule is, it seems like one of the first things a police officer would have done.


Right of way doesn't absolve you of fault. If you have the right of way, but a pedestrian goes on the street anyway, you have the duty to stop/slow down to the best of what's possible.


Pushing a bicycle laden with plastic shopping bags, a woman abruptly walked from a center median into a lane of traffic and was struck by a self-driving Uber operating in autonomous mode.

From viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” Moir said.

The "driver" (more like attendant) had no warning until the sound of impact. The vehicle was doing 38 in a 35 zone which I would've thought was odd - would the vehicles be programmed to push the limit a little as most other drivers often do?

Edit: Ignore last comment. Some have suggested the limit was actually 45 mph per signage.


Apparently my stored links on this topic are dead, but there are two worthwhile pieces of information on that point:

1) drivers who consistently drive below the average speed of other drivers on a road (i.e., who religiously drive the speed limit) are more likely to be involved in crashes

2) More accidents are caused by speed differential than by speed by itself.


Your assertion (true or not) is irrelevant to this case. This was a car impacting a pedestrian, not another car on the road. In any impact between a car and a pedestrian the speed of the car is critical in determining whether the pedestrian will be killed.


I didn't mean to suggest that it was relevant in this case. I was merely providing some context for the parent post's observation about an automous car exceeding the speed limit


It may have been difficult to avoid the collision but if they were driving at a lower speed the collision may not have been fatal. I do not mean 3mph lower to be within the speed limit. I mean significantly lower if passing in close proximity to a pedestrian who may step out onto the road.
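The physics behind "lower speed, less lethal": the kinetic energy a car delivers in a collision grows with the square of its speed, so modest speed reductions remove a disproportionate amount of impact energy. A quick illustration:

```python
# Kinetic energy scales with v^2 (KE = m v^2 / 2), so halving speed
# quarters the energy delivered in a collision. Mass cancels in a ratio.
def energy_ratio(v1_mph, v2_mph):
    """Ratio of kinetic energies at two speeds for the same vehicle."""
    return (v1_mph / v2_mph) ** 2

print(energy_ratio(38, 19))  # 4.0 -- 38 mph carries 4x the energy of 19 mph
```

This is why pedestrian-safety frameworks set such sharply lower limits in zones with possible car/pedestrian conflict.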


FWIW, this seems to be where the accident took place, based on news photos:

https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,...

It's not completely unlit, though I'm not sure that's supposed to matter given the AV's sensors. Even if neither Uber nor the human driver is legally at fault, I'd still be interested in knowing whether Uber's LIDAR and other sensors even detected the woman, and whether the algorithm is programmed to be cautious when someone is that close to the roadway at night.

Can the AV's sensors even tell the difference between a person slowly loitering on a median and stationary lightpoles or (slightly swaying) vegetation? I suppose you don't want the AV slowing down every time it senses a solid object near the roadway, but does it make different decisions when a stationary object happens to be a human rather than a light post?

edit: the article doesn't say what lane the Uber vehicle was in. Since the victim was walking down the median and yet managed to abruptly surprise the AV, can we assume it was in the left lane? But don't most human drivers, on an otherwise empty road, move toward the far lane to avoid driving at full speed next to someone who is walking their bike near the road? I do that for bicyclists even when they have their own bike lane (you never know when someone might abruptly fall). I would especially do that at the sight of someone walking a bike "laden with plastic shopping bags" late at night down a center median, because that is such an unusual situation.


I think I'll wait for the dust to settle a bit here, since it's hard to tell what happened. This is the wreck photo from the local news, showing a damaged bike on the sidewalk, and an Uber robo-car with front passenger-side damage, sitting in the right lane:

https://sharing.wcpo.com/shareknxv/photo/2018/03/19/poster_9...

It's hard to square that with someone darting over from a brushy median on the driver's side of the car. Hopefully NTSB will grab the sensor data and sort things out.

EDIT: This could get ugly...

https://www.azcentral.com/story/news/local/tempe-breaking/20...

tl;dr -- A homeless woman was walking her bike across the road. "Safety driver" of the robo-car may have a record for attempted armed robbery. Uber disputes the driver's identity.


This is the wreck photo from the local news, showing a damaged bike on the sidewalk, and an Uber robo-car with front passenger-side damage, sitting in the right lane

This looks like the corresponding street view: https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,...

It's hard to square that with someone darting over from a brushy median on the driver's side of the car.

One possibility is that the pedestrian was trying to cross to the weird sidewalk/median thing and came out from behind the tree here: https://www.google.com/maps/place/N+Mill+Ave+%26+E+Curry+Rd,...

"Safety driver" of the robo-car may have a record for attempted armed robbery. Uber disputes the driver's identity.

Good grief. I would normally be surprised if a company denied a claim like that without having solid evidence, but Uber would be much less surprising.


There's a paved area in the median that seems pretty useless since there's no sidewalk or crosswalk that connects to the median at that point:

https://goo.gl/maps/9WKX2DYEN862

If the Uber car was about to turn left at the upcoming intersection it might have been getting into the leftmost turning lane at the point of impact. The pedestrian would have been obscured by a slight curve in the road, vegetation, a road sign, and their bicycle.


Yeah, I saw that too. It's such a weird construct since, as you say, there is no legal way for someone to follow the median paths and cross either of the streets. In fact, how are pedestrians supposed to get on the median in the first place (legally, I mean)?

From the New Times:

http://www.phoenixnewtimes.com/news/medical-cannabis-extract...

> That spot is east of the second, western-side Mill Avenue bridge that is restricted to southbound traffic, and east of the Marquee Theatre and a parking lot for the Tempe Town Lake. It can be a popular area for pedestrians, especially concertgoers, joggers, and lake visitors. Mid-street crossing is common there, and a walkway in the median between the two one-way roads across the two bridges probably encourages the practice.


In fact, how are pedestrians supposed to get on the median in the first place (legally, I mean)?

There are even "no crossing" signs where the paths meet the road. Apparently you're supposed to go to the crosswalk, cross to the median, then navigate down a thin strip which contains a bush that you have to step over and a bunch of loose rocks that you could easily trip on. WTF.


> and their bicycle.

good point, could it be that the sensors failed to spot her because of the plastic bags on her bicycle?


So here's one thing about this incident where humans do have a system for this already.

People are very good at reading body language. It's possible to gather someone's likely intentions (wrt moving) in a near instant. A defensive driver will see body language suggesting someone could dart out into the street, slow down just in case, and give more attention to the person. We are really good at this.

I know you can say it was dark and therefore maybe harder to see this, but a self-driving car shouldn't take its eyes off anything anyway, so that seems irrelevant. Unless it's almost pitch black, people can read this sort of information.

So what gives? Uber's self-driving system clearly lacks the full suite of decisions made by human drivers. This makes me skeptical that their cars can be significantly safer than human drivers.


How do you know that this is not possible? It is not useful to make broad generalizations about the field because of a single company, especially with so little information.


I never said anything like that. Did you mean to reply to another comment?


The chief of police says: “It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway.”

I'm curious what you know that he doesn't.


It's cliche in America to absolve drivers of any culpability when killing pedestrians and cyclists, let alone in a case involving a local government that was actively courting a known unethical company.

Until they release footage I'll be taking anything from Uber and Tempe officials with a grain of salt; they'll spin this as hard as they can.

P.S. I'm surprised he used the word shadows considering the car was equipped with a fancy lidar unit.


> "It’s very clear it would have been difficult to avoid this collision in any kind of mode [autonomous or human-driven] based on how she came from the shadows right into the roadway,"

How the hell would the chief of police know how to estimate what is possible for an autonomous vehicle to avoid?

Has he even seen that video of the Tesla predicting an accident two cars ahead?


If we're not holding autonomous drivers to a higher standard than human drivers, I'm not convinced on the long term benefit of this technology.


This is a very important point. I made it above. Someone already mentioned the car was driving 3 mph above the speed limit and defended it because... you do it too?

If we allow autonomous cars to be as reckless and dangerous as human drivers (driving is already dangerous enough to be one of the top killers in this country), then what is the point of AVs anyway?


Driving no faster than the speed limit under any circumstances causes safety issues. You get a continuous stream of cars cutting in front, some of them irate about having to accommodate the SDV's law-abiding nature. I'm relatively sure you can replicate this effect easily in most American metropolitan regions.


Long term benefit of self driving car is that I don't have to worry about the tedium of driving. I doubt uber (and most people) gives a shit about improving driving standards.


I think it's worth referencing Sweden's Vision Zero [1] here, a project which aims to reduce all fatalities from automobiles to zero.

They have a system of maximum traffic speeds for different zones depending on what impact risks there are in those zones. For the zone where there are possible conflicts between pedestrians and cars the maximum speed is 19mph. The rationale for this is that below that speed the risk of a fatality from an impact between a car and a pedestrian is low but above that speed the risk of fatality rises very quickly.

[1] https://en.wikipedia.org/wiki/Vision_Zero
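The speed-to-fatality relationship behind that rationale can be sketched in a few lines of Python. The risk figures below are often-cited (and debated) approximations from road-safety literature, used here purely as illustrative assumptions, not numbers taken from Vision Zero's own documents:

```python
# Illustrative pedestrian fatality risk by impact speed. These tabulated
# values are approximate, commonly cited figures (an assumption, not data
# from the article); the point is the steep climb above ~20 mph.

RISK_BY_MPH = {10: 0.02, 20: 0.05, 30: 0.45, 40: 0.85}

def fatality_risk(speed_mph: float) -> float:
    """Linearly interpolate fatality risk between the tabulated speeds."""
    speeds = sorted(RISK_BY_MPH)
    if speed_mph <= speeds[0]:
        return RISK_BY_MPH[speeds[0]]
    if speed_mph >= speeds[-1]:
        return RISK_BY_MPH[speeds[-1]]
    for lo, hi in zip(speeds, speeds[1:]):
        if lo <= speed_mph <= hi:
            frac = (speed_mph - lo) / (hi - lo)
            return RISK_BY_MPH[lo] + frac * (RISK_BY_MPH[hi] - RISK_BY_MPH[lo])

print(fatality_risk(19))   # just under the ~20 mph threshold: risk stays low
print(fatality_risk(38))   # at the speed reported for the Uber vehicle
```

The jump between the 20 and 30 mph rows is the rationale for capping speed at 19 mph in zones where cars and pedestrians can conflict.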


I wonder if it is not a coincidence that the speed at which it starts to become fatal is around the maximum speed that humans can run.


>speeding

>didn't engage brakes

Definitely gives me confidence in self-driving cars, at least she "may have been homeless" so I can sleep easy tonight. Thanks Uber and Tempe!


>speeding

>didn't engage brakes

>not ubers fault

one of these things is not like the other.

The basic expectation is that if people are on the edge of a roadway you slow down and pay more attention. What exactly was the person in the drivers seat doing?


What exactly was the person in the drivers seat doing?

Nothing, because the car was clearly supposed to be driving itself.


The car didn't have a driver's license. The driver did, regardless of the amount of assistance provided by the car. What is troubling is that the brakes were never applied.


It's not that simple, I believe. Many governments have issued special registrations for self-driving cars, which waive the requirement of an active driver to some extent. In that sense, the car actually has a "license" while the person sitting in the "driver's" seat is more of an attendant.


That's a very good point. Thank you for pointing that out. AZ does allow self-driving cars without drivers, as long as they follow the traditional laws and rules of the road.

I would argue that in this particular case, from the information that has been released, the car may have followed the laws of the road but it didn't appear to follow the traditional defensive driving rules of the road that parents teach their children, such as, "see that person with the bike standing on the median? Be careful. Don't hit them."


If the edge of the roadway was full of parked cars, the person might not have been visible (to either man or machine) until she walked out from between them.


Years of driving a big truck have taught me to take irrational behavior very seriously. Sometimes you can only anticipate it by looking at people's faces, reactions, past behaviors, things like that. I wonder how you teach that to an AI.


What bothers me is that there was no braking happening, and maybe even apparently no notification to the car that it hit something.


Extremely tragic that this happened but I’ve heard of many cases where homeless people, generally mentally ill, will run into traffic.

My friend almost lost his life in SF a few years ago when a homeless person ran in front of him as he was riding his motorcycle. My friend hit the homeless person and went flying off his motorcycle and broke his arm and pelvis. The homeless person ran off (not literally but they couldn’t find him).


At some level, at some time, we're going to have to come to terms with the fact that these machines can be dangerous in certain ways. Probably ways that are slightly different than human controlled machines. Maybe safer on the whole. But certainly different.

Cars have been a menace for a century. We accept and adapt.


Are there plans for a beacon one could place on themself to make self driving cars recognize them better? Could be a light, a radio, or something else.

Perhaps there's a slippery slope before such things are required by law, but I am interested.


>"...the incident occurred roughly 100 years from a crosswalk."

How slow are Uber's cars driving?


I think driving 10-20% slower than the posted speed limit at night is completely sane. In Colorado we have some rural highways that are double posted with lower speed limits at night. If AI cannot make identical collision avoidance guarantees at night and during the day, then it should be law that they self limit their speed below the posted speed limit until there is separation parity.


> Police believe she may have been homeless.

Move along folks, Arizona cops said Uber and the victims story matches, victim isn't gonna sue, case closed.


There's going to be a bunch of these "OMG a robot car hit someone!!!" stories that then morph into "the robot actually had nothing to do with it." Perhaps it will finally clue people in to how intrinsically dangerous cars are.

I mean, even if autonomous cars are 100x safer to be around than human-driven cars, they're still orders of magnitude more of a risk factor for ordinary (U.S.) citizens than, say, terrorists or sharks.


Given just how poorly we react to terrorism and sharks, I'm not sure that's the direction I'd want to be headed in. I'm in favor of automation, but not with Uber's "move fast and break things" attitude. Responsible, ethical, and careful development of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must.

Keep in mind that some version of "killed by robots" has been in the zeitgeist of fear for decades now. In that sense, "sharks and terrorists" might not be a bad comparison. It will not take much to ruin a good thing (automation) with typically scummy practices. The important thing to remember is that automation doesn't require the likes of Uber playing fast and loose with it to become a reality.


> Responsible, ethical, and careful development of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must.

Agree. And I'd like to see an approach similar to an NTSB investigation of an air crash. Not to place blame, but to identify root causes and mitigate the chance that they recur.

Air travel is exceptionally safe. Has been for many decades. Yet we still investigate any air crash because any crash means something went wrong, perhaps something that can be corrected.

Self driving cars need to be treated the same way, at least for now, and until well after they have demonstrated themselves to be safer than human drivers.


This is the way to go, and it would help (I think) to ease the calls I've seen around for holding individual devs responsible. The point is not to collect scalps, but to prevent tragedy and malfeasance. Holding companies responsible, sure, but making some random programmer who missed a bug the scapegoat seems like a way to ensure companies avoid responsibility by foisting it off on a low-level employee; that can't be allowed to happen with lives on the line. Like aircraft software, self-driving car software is never going to be the work of a single dev. The NTSB isn't perfect, but as these things go, very admirable.

IIRC the NTSB is investigating this case too.


Responsible and ethical are usually not words used together with Uber, but in this case it seems like it was simply an accident. Computers cannot beat physics: if the woman suddenly stepped in front of a car that had no room to brake, it's a tragic accident.

The idea of the investigations compared to air-travel is pretty good and also much easier to execute given they record everything on camera.


With the level of tech commonly posted on this website (and in general), there seem to be many surprising dangers that the average lurker won't know about but the experienced developer will. It seems better to seek out and point out the non-obvious ("be careful about x, y, z") rather than hide such warnings in obscure blog posts and hope people pick up on some implicit message, especially since the evidence can sometimes be conflicting. A good example of this is the BIS export warning for crypto (assuming you're in the US); I've talked to several "experienced" blockchain people who had no idea about it, though if you browse enough crypto GitHub pages you can eventually find it.


This is commonly called "attempting to save someone's butt with a reasonable explanation, rather than trying to make a quick buck (or whatever) to their detriment."


Did you reply to the correct post? This seems unconnected to the topic at hand in the extreme, but maybe I’m missing something.


I was specifically responding to your comment "Responsible, ethical, and careful development of dangerous technology, even if the danger promises to be reduced compared to the norm in some future iteration, is a must" and not to the topic of Uber or self-driving cars specifically.


How is Uber's experiment different than any other 20 companies working on same thing?


> Perhaps it will finally clue people in to how intrinsically dangerous cars are.

Until you consider the _billions_ of miles that cars in the US drive every month. Cars are exceptionally safe, and they're getting better every year: we keep driving _more_ miles year after year, yet fatalities continue to drop.

Also, how do you interpret facts like: California has 30% more people living in it than Texas, yet TX has a _greater_ number of roadway fatalities than CA?

> they're still orders of magnitude more of a risk factor for ordinary

The problem is: you absolutely cannot view vehicles as a _single_ risk factor. Thus, any comparison between human drivers and AI drivers on these terms is flawed from the outset.


Cars are a leading cause of death in the US, especially among young people. That's a fact worth taking notice of, regardless of how many miles are driven or whatever else.

Yes, it's great that fatalities are trending downwards.


So cap the power, weight and speed of cars young people are allowed to drive. In the US you can only drink at 21, but you can join the army and kill people if you're 18, and in most states drive powerful cars at 16 (subject to some conditions, such as a curfew and limits on the number of underage passengers in the vehicle). In most European countries one can only drive heavy quadricycles (<=450 kg curb weight, <=15 kW, <=45 km/h by construction) at 16 years of age.

Same for autonomous vehicles: keep them off public roads until the NTSB or whatever federal or state agency is responsible decides which tech is mature enough and issues licenses for a given AV system. A person has to have a driver's license; AV systems currently do not. Why not certify which of them are able to operate heavier/more powerful/faster vehicles without human intervention (meaning they'd slow to a stop when the driver takes his/her hands off the wheel)?


The problem with the "billions of miles" metric is that I have nothing to compare it to. How does it stack up against the safety of trains or buses (at least in countries that properly fund their public transport)?


Cars are not anything close to exceptionally safe. Indeed, they are the exact opposite: they are the most dangerous thing that most people encounter on a daily basis.

There are many age and other cohorts of the population for which the leading cause of death is vehicles.

And compare against commercial aviation, which has a per-mile death rate three orders of magnitude less than driving.


> They are the most dangerous thing that most people encounter on a daily basis.

It's actually yourself, and it's worse when you're at work. Deaths due to vehicles: 36k per year. Deaths due to suicides: 44k per year. Deaths due to accidents: 190k per year.

> There are many age and other cohorts of the population for which the leading cause of death is vehicles.

This is not true for any cohort. [1] And yes, I'm aware that accidental deaths include vehicular deaths, but if you dig into the FARS data [2] and do a comparison between the two sets, you'll see that this statement can't be correct.

It is, however, true that young men 15-24 exhibit the highest risk rate in vehicles compared to anyone else (even 8x higher than females of the same age), but that doesn't make vehicles the greatest risk they face.

> And compare against commercial aviation, which has a per-mile death rate three orders of magnitude less than driving.

Yea, but a "deaths-per-journey" rate that's three times worse [3]. By the way, the aviation insurance industry uses the per-journey statistic, not per-mile. Which further supports my point: I don't think you can take a single-axis view of this information and come to any meaningful conclusion.

[1]: https://www.cdc.gov/injury/wisqars/pdf/leading_causes_of_dea...

[2]: https://www-fars.nhtsa.dot.gov/Main/index.aspx

[3]: https://en.wikipedia.org/wiki/Aviation_safety#Transport_comp...


If we are considering deaths per journey, then cars look even worse compared to planes: most people take many more journeys by car than by plane. Say we take 500 journeys by car for every one journey by plane (not that outlandish); then cars are 1500 times worse than planes.

Edit: it seems it's the other way around. Planes are three times worse than cars per journey, so flip that number around a bit (3 times better, but 500 times more trips, so cars are only about 166 times more dangerous than planes).
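A quick sanity check of that arithmetic in Python. The 3x per-journey figure and the 500:1 trip ratio are the comment's own assumptions, not measured data:

```python
# Back-of-the-envelope check of the relative risk figure in the edit above.
# Assumptions (from the comment, not measured data):
#   - planes are ~3x riskier than cars per journey
#   - a typical person takes ~500 car trips for every plane trip

def relative_car_risk(plane_per_journey_ratio: float,
                      car_trips_per_flight: float) -> float:
    """How much riskier cars are overall, comparing total trip exposure."""
    # total car risk   ~ car_trips_per_flight * r         (r = car per-journey risk)
    # total plane risk ~ plane_per_journey_ratio * r      (one flight)
    return car_trips_per_flight / plane_per_journey_ratio

print(int(relative_car_risk(3.0, 500.0)))  # -> 166, matching the comment's figure
```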


> but 500 more trips, so cars are only 166 times more dangerous than planes

In terms of fatalities, yes. My data completely discounts non-fatal car accidents; however, there are far fewer "non fatal aircraft accidents" for obvious reasons. So, there's not a lot to glean from this, particularly given the highly regulated nature of aircraft operations in the USA and most of the world.

You could reduce vehicle fatalities by a huge amount just taking motorcycles off the road. They afford no protection to their passengers and increase fatalities just by existing.

You could reduce vehicle fatalities a huge amount by not allowing young men under the age of 24 to have a vehicle with more than 90 horsepower. They have a tendency to lose control of their vehicles and drive them off the road and into trees or other solid objects. Many also die because they weren't wearing a seatbelt and got ejected from the vehicle. Some survive all of that and die because while laying in the road injured, they got hit by a vehicle completely unrelated to their crash.

You could also get some reduction by creating safety systems that do less damage to elderly bodies in an accident. Protection of the heads and necks of adult passengers in the rear of the vehicle needs a lot of work currently. Way more elderly passengers die in the back of the car than in the front, even in low velocity (< 40mph) accidents.

You could eliminate 1/6 of all vehicle fatalities, 6000 per year, by removing pedestrians from the road. Those fatalities are split pretty evenly across setting (rural/urban), sex, age and time of day. Putting sidewalks right beside traffic moving faster than 20 mph is pure insanity.

You could also definitively figure out _why_ Texas has not only higher fatalities _by number_ but a much greater fatality _rate_ over California. A statistic I opened with because this is a _huge_ point. Seriously, look up the numbers per state; because, when I do, I _cannot_ come to a blanket conclusion like "cars are dangerous."

When really: Cars present _multiple_ and often _unrelated_ risk factors and we still have room for all kinds of incremental improvements in their overall safety.

Unrelated to my argument, but still a worthwhile thought: How much room is left for improvements in Aircraft safety? The most highly regulated and automated vehicle and transport system in service today?


Deaths per journey is also useless because a vehicle owner might make many trivial car trips per day, but when you're flying somewhere in a plane it's usually meaningful. For the same death rate, I'd much rather go on an awesome vacation across the world than on three errands.


> This is not true for any cohort.

I think it is, for a pretty reasonable slicing of the data. Here[1] you can find a table on pages 33-34 that contains cause of death by age cohort - the 15-24 group has only a single row that beats motor vehicle accidents, and it is "accidents", which motor vehicle accidents is considered a subset of.

I think part of the confusion from the table you linked to is that it doesn't contain motor vehicle deaths, and they are hidden in a weird way - in the 15-24 cohort, motor vehicle deaths are a substantial subset of all of the top 3 causes - unintentional injury, suicide, and homicide.

> By the way, the aviation insurance industry uses the per-journey statistic, not the per-mile.

That's perfectly reasonable for insurance purposes; a lot of the risk is concentrated in the takeoff and landing, so risk per journey is a more stable measure than risk per mile. But as a person who doesn't want to die, that's a nonsensical choice - I don't fly across the country because I wanted to take one journey; I fly across the country because I wanted to get somewhere that was 3000 miles away.


> think it is, for a pretty reasonable slicing of the data. Here[1]

You missed the reference.

> group has only a single row that beats motor vehicle accidents, and it is "accidents", which motor vehicle accidents is considered a subset of.

Right, but then you have to ask why? You might find a dataset which further breaks the cohort down into "Male" and "Female" categories; because when you do that, you'll see the data does not line up between the two at all. There will, last time I checked, be about an 8x difference between the two.

This highlights the fact that the driver of the vehicle carries the risk, _not_ some inherent property of the vehicle itself. Ask yourself this question: if we took cars away from young men, would they be any less prodigious when it comes to fatally injuring themselves? History and psychology suggest that they wouldn't fare any better.

> in the 15-24 cohort, motor vehicle deaths are a substantial subset of all of the top 3 causes - unintentional injury, suicide, and homicide.

Suicidal behavior is not a risk factor you can translate to vehicles, nor is homicide, for the reasons I stated above. Finally, when you compare total vehicle fatalities for the cohort to the accidental-death category for the same cohort, you realize it's around 30% of the overall deaths. When the 15-24 group injures themselves, it most often _does not_ involve a vehicle.

> a lot of the risk is concentrated in the takeoff and landing

Aircraft bodies are only rated for a certain number of "pressure cycles." There's genuine risk baked right into the airframe.


Shit, I trimmed a paragraph at the end and dropped the reference with it. That's supposed to be https://www.cdc.gov/nchs/data/nvsr/nvsr66/nvsr66_06.pdf


> Until you consider the _billions_ of miles that cars in the US drive every month. Cars are exceptionally safe, and they're getting better every year.. because we keep driving _more_ miles year after year, yet fatalities continue to drop.

How many people do pedestrians kill per mile? Or bicycle? Or subway?

And either way, cars are still the number one killer if you look at deaths per mile.

https://en.wikipedia.org/wiki/Transportation_safety_in_the_U...

It is disgusting we allow this to encroach on society and kill so many people.


More like clue them in to how dangerous walking out into traffic is.

Cars don't stop on a dime. Everyone knows that. If visibility is OK, there's no excuse for getting hit by one that isn't going absurdly fast (assuming you don't use a circular definition of "absurdly fast").

Sure, the car isn't supposed to hit the pedestrian, but the pedestrian also has an equal obligation not to dart out into the road.

And before you ask, I don't drive to work.
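For a sense of why "cars don't stop on a dime," here's a rough stopping-distance estimate. The reaction time and friction coefficient are assumed illustrative values, not figures from the article:

```python
# Rough stopping distance = reaction distance + braking distance.
# Assumed values (not from the article): 1.5 s driver reaction time,
# 0.7 tire-road friction coefficient (dry asphalt).

G = 9.81           # gravitational acceleration, m/s^2
REACTION_S = 1.5   # assumed driver reaction time, seconds
MU = 0.7           # assumed friction coefficient

def stopping_distance_m(speed_mph: float) -> float:
    v = speed_mph * 0.44704            # mph -> m/s
    reaction = v * REACTION_S          # distance covered before braking starts
    braking = v ** 2 / (2 * MU * G)    # from kinematics: v^2 = 2 * a * d
    return reaction + braking

print(round(stopping_distance_m(38)))  # about 46 m at the reported 38 mph
```

Under these assumptions, a car at 38 mph covers roughly 46 meters, nearly half a football field, before coming to rest.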


If you can't tell that somebody isn't going to throw themselves under the wheel of your car, you need to drive more slowly, so you've got time to stop when it happens.


Agree with part of it, but no, pedestrians do not have an equal obligation. With power should come responsibility. In America, we seem to do the opposite: dissociate responsibility from those in power and blame victims.


Pedestrians have a higher personal responsibility since they bear the higher risk. Common sense dictates this.

When you cross the road you know there is a risk. Driving down the road, you typically don't face a huge risk.

Especially when it's legal to drive down the road and illegal to cross it outside the crosswalks.


I think you should take a step back and think about what you're saying.

A car can kill people. A car can create a large amount of property damage if it is driven carelessly. Cars can even, in extreme circumstances, be fire and chemical hazards.

Are you really sure driving doesn't carry a massive amount of risk to yourself and others compared to walking, period?


Walking in front of a moving car has a high probability of death. You are responsible for your own life.

Do you stand on train tracks and blame the train ?

How is this difficult to understand ?


With power should come responsibility, sure, but "power" means "ability to affect the situation", not "joules per second." In the scenario of a pedestrian suddenly running out in front of a car which is driving at the speed limit, the pedestrian has far more power to affect the outcome (by choosing not to run into the path of the car) than the driver (who could not reasonably predict the pedestrian would behave in this manner).


Right, but since the driver has seen there's somebody on foot on the pavement, how come they haven't slowed down already? Perhaps the person could trip and fall into the road. Perhaps the person could have a fit and fall into the road. Perhaps the person is just crazy and is contemplating leaping into the road. Perhaps the person is drunk, hasn't seen you, and just fancies crossing the road there and then. It really doesn't matter. These are not capital offences.

What if that person were your son, daughter, wife, husband, parent, or whatever? Would you be so blasé over their losing their life, and all because somebody couldn't even be bothered to just move their foot a bit and press a pedal?


The power I'm talking about here isn't ability, it's literally the might and momentum of the car. And because of that power, drivers need to take more responsibility when they drive. A driver doesn't become responsible the milliseconds before they kill someone, they become responsible the moment they set foot in a car.

The ability that drivers do have is to drive defensively and keep a lookout for pedestrians. That is what I'm talking about, not defying Newton's laws. And because of that extra capacity to do damage, drivers must be subject to harsher scrutiny than those who can more easily suffer damage.


I only have one question: Why didn't the software controlling the vehicle detect the pedestrian and attempt to brake or avoid collision?


>Police believe she may have been homeless.

Police: "don't worry, she was an undesirable."

Next, let's watch them publicly leak her criminal background.


They already did, she was in prison for a year on charges relating to the 'dangerous' drug marijuana.

I called almost verbatim what the chief of police actually said in the other thread and had my comment downvoted and flagged for it.


If you mean https://news.ycombinator.com/item?id=16620402, it's unclear what it means and how you meant it. I'm sure many readers took it literally.


I mean I did mean it literally as a factual legal statement, not as a normative statement on how society should work.


Right—that was unclear, though, and if some readers took it as an endorsement I'm sure you can understand why they'd flag it.


Yeah, the “may have been homeless“ line really pisses me off. Such a blatant bullcrap attempt to devalue a human life.


I wonder how much money Uber is pouring into Tempe, and how much they promised the local government in taxes and infrastructure, to get them to bend the knee so fast.


It’s interesting that so many people in the earlier post first reporting on this topic were very quick to rush to judgement.


Maybe we could put bumpers on these cars until they are official? They don't need to look cool right now.


Car forensics will be interesting. There will be data collected in the minutes/seconds before the crash: https://semiengineering.com/anatomy-of-an-autonomous-vehicle...


“The driver said it was like a flash, the person walked out in front of them,” Moir said. “His first alert to the collision was the sound of the collision.”

Notice how the chief of police refers to the human and AI as "they".


Once again Futurama predicted the future entirely correctly:

https://www.youtube.com/watch?v=0qBlPa-9v_M


Looking at the google street view I am astonished someone thought it was a good idea to allow a bunch of random plants to grow in the median.

It makes visibility insanely complicated, especially at night.


You're just saying the driver is even more to blame - going faster than the speed limit even though visibility is reduced should increase, not decrease, culpability.


Surely there's a blackbox somewhere that could clear this all up.


I don't want to hear any officer's opinion on this until the video and radar is publicly released. It's unclear on what evidence the officer bases this account of events.


Second paragraph:

> Chief of Police Sylvia Moir told the San Francisco Chronicle on Monday that video footage taken from cameras equipped to the autonomous Volvo SUV potentially shift the blame to the victim herself, 49-year-old Elaine Herzberg, rather than the vehicle.

How is that unclear?


How is it unclear that the person you're responding to wants to see the video and LIDAR data, not just take someone's word for it? "Potentially shifts the blame" as hearsay isn't good enough.


To answer your question, it is apparently deliberately unclear. "Potentially" is one hell of a weasel word.

We're at the point where the public no longer trusts its institutions to make impartial judgments - we all know the overriding societal bias towards cars, the tendency to blame the deceased, the political incentives of wanting Uber's continued support, etc. And if this initial analysis is indeed the correct one, then it's especially in everyone's interest for such doubt to be eliminated.

There are video(s). They should be publicly posted swiftly, before it can be edited down to show the most plausible narrative. Anything less feeds the ongoing breakdown in trust.


It's unclear because no facts are presented, just someone's interpretation of camera footage that isn't shown.


Well, was she already walking in road where the car should have had time to stop or did she walk in front of the vehicle at the last second? Did the car make an effort to evade?


Who knows? But one thing we can be sure of is that the video nobody has seen will potentially shift the blame to the homeless lady.


'the incident occurred roughly 100 years from a crosswalk' - deep space?


Yeah, saw that. Rather than a missing word ("light"), it's probably "yards". Based on what's reported (not suggesting it's true or the full story), she basically jumped out in front, in which case this indeed would have been hard to avoid.


Given that the car apparently didn’t even start to brake, maybe she was moving at light speed?

https://www.theverge.com/2018/3/19/17140936/uber-self-drivin...

The LIDAR probably couldn’t see with all the shadows, or maybe she teleported, literally “coming from nowhere.” I sure hope if I’m killed the filth blame it on me in less than 24 hours.


She definitely "jumped out in front" given no actual evidence that "she jumped out in front." I might even venture that "she lunged" given absolutely nothing to offer in supporting evidence.

Carry on Uber!


FYI you can tweet at reporters letting them know there's a typo, and they typically make the edit quickly. They usually seem appreciative.


Just to continue my line of unpopular opinions about this: whoever chooses to put a car on the road instead of traveling in a way that hurts others less (walking, transit, bike), is responsible at least in part for any damage, injury or death, just by being there. How many people die by wandering out in front of somebody else walking along? Nobody even has to call a cop, much less determine whose fault it was, because nothing happens.


This is called comparative negligence. The theory being that there really is no such thing as an “accident” and that generally all parties involved share some negligence in causality. Obviously there are still freak occurrences. Courts adjust damages based on this all the time.

People slip and fall and hold business liable all the time.


Do you also believe that long haul trains are inherently immoral, given that they are more likely to cause bystander injuries/death than airplanes?


Who said anything about morality? I'm talking about responsibility. Nice attempt to misrepresent my viewpoint. I call foul. Responsibility lies with you, friend, not the machine. Besides, how many trains and planes do you pilot daily? Also a train, unlike a car, is not a frivolous convenience operated by a privileged individual who has several other less-objectionable choices available for the purpose. And ironically if everybody got out of cars and onto trains they would actually be cutting the danger to others by a large factor.


Sent from your mobile phone, made possible by an 8-year-old slave working in the cobalt mines.


Where's yours sent from, an angel's ass? Anyway let me know when all my evil deeds finally make you innocent so I can take a rest.


Uber's autonomous cars speed? Why?


The car was speeding,

"the Uber car was driving at 38 mph in a 35 mph zone"

End of story. Uber is at fault.


Okay, I haven't been following this deeply, but the person hit was a cyclist, right? Not a pedestrian.

If so, how does a self-driving car not identify a bicyclist? How does a bike come out from the shadows and "surprise" a self-driving car?


There's been conflicting reporting, but from what I've seen it was a person walking a bike.


From the article: "Herzberg is said to have abruptly walked from a center median into a lane with traffic."


No, the woman hit by the car was a pedestrian, not a cyclist.


So fucking predictable.


I’m often cynical, but I have to ask...what makes this “predictable” to you?

Are you commenting on the evolving narrative which shifts the responsibility from a vehicle-initiated accident to Elaine Herzberg’s actions, or are you decrying something else?


Maybe so, but please don't post unsubstantive comments here. It doesn't help, and makes this place worse.


When it comes to traffic, there's the concept of the right-of-way. I don't know about the rules in Tempe, but here's an excerpt from the rules in New York.

> (a) Every pedestrian crossing a roadway at any point other than within a marked crosswalk or within an unmarked crosswalk at an intersection shall yield the right of way to all vehicles upon the roadway.

And that kind of makes sense, because if pedestrians were able to cross wherever they chose, it would become difficult for vehicles to move down a road at a stable velocity. (Increasing emissions, etc.)


Legally, the driver may not have been at fault. Even so, I was taught as a child learning to drive that when I see a person in the median, watch them. They may walk out in front of you. They might even fall into the road.

So, be prepared maybe even by changing lanes before that point in order to get out of harm's way. Apparently, this driver wasn't watching, choosing to let the car drive itself.


> Apparently, this driver wasn't watching, choosing to let the car drive itself.

And here I was, stupidly thinking that was the whole point of a self-driving car...

What's the benefit of a self-driving car that can't be trusted to drive itself again?


There are plenty of benefits, but you can also ask the dead person in AZ about the downside of defending a product that isn't ready. A self-driving car that drives itself without the most basic of defensive driving habits simply isn't ready. Ask yourself, why do the reports show that brakes weren't applied?


True. But if there was no crosswalk within 100m (per the article) and it wasn't a highway (only a 35mph limit), it's reasonable to cross without a crosswalk. Still, a pedestrian shouldn't put themselves in harm's way, but I don't think crossing outside a crosswalk should be held against someone in this circumstance.


It's not. But as the car has the right of way, it's the pedestrian's responsibility to make sure they're not hit.


Any reasonable human driver would try everything to not hit a pedestrian, regardless of right of way. The laws were written with common sense in mind, and if self-driving cars can’t emulate that then they’re not ready. Cars that only follow the letter of the law would be absurd.

