Because of the responsibility. In my country, if you kill someone on the road, you go to jail and pay a huge fine. Which Google exec is going to jail if Waymo kills someone?
You prefer more dead people and more people in prison, rather than fewer of both?
Also, the same event (e.g. someone dying in a car crash) doesn't always carry the same responsibility behind it. If I kill someone by driving recklessly, I bear more responsibility than if I kill someone because a bird crashes into my windshield. There are extreme cases where someone bears full responsibility, and extreme cases where an accident is just an accident and nobody is responsible. It may be that with self-driving, a larger percentage of cases fall on the "true accident" side. (It's just an idea, though; I agree there's an important question here that merits careful consideration.)
I don't prefer anything; I'm saying you have to assign responsibility, and that will slow adoption.
If the car causes an accident because it fails to spot something, do you write it off as "mechanical error"? Because in my country, that would mean the code has to be audited, the same way a mechanic has to inspect my car every X kilometers to prevent mechanical failures, taking on the responsibility if something breaks and kills someone.
I don't think Waymo will accept code audits, so the company has to take on the responsibility if a car kills someone. The only way to be sure that ends well is if Waymo is 100% sure their cars can't cause any accidents.
I think it's two to four years for a driving mistake (distracted driving, failure to yield, reckless speed, or being too old to see properly), a bit more for DUI (up to 10, but often around 6), and an intentional act (road rage) can climb up to 20, though the only case I've heard of got 12.