> Why would such a fund exist for something that can be pinpointed directly at the party at fault?
People should still be compensated when accidents happen, without the drag caused by trial attorneys attempting to extract as much as possible from self-driving vehicle manufacturers (which is going to slow down progress).
The benefits of self-driving vehicles from reduced fatalities and accidents alone are so great that a process and funding need to be in place to allow continued innovation (provided safety is put first).
> People should still be compensated when accidents happen, without the drag caused by trial attorneys attempting to extract as much as possible from self-driving vehicle manufacturers.
Of course, but current laws can already drag this out when people are involved; why would cars be any different?
> The benefits of self-driving vehicles from reduced fatalities and accidents alone are so great that a process and funding need to be in place to allow continued innovation (provided safety is put first).
We don't do this in any other industry, as far as I know. It's a weird mechanism to let the manufacturers of self-driving cars off the hook for accidents.
I understand the intent, but I don't know how that works within our current legal system. And wouldn't that encourage cheap, shitty-built cars, since companies wouldn't need to be liable?
The problem with mandatory compensation programs (aside from granting legal immunity to private-sector entities, which I disagree with) is that they tend to break the discovery process by circumventing it.
The discovery phase of a case is how truly damning evidence often comes to light. A fantastic example of this would be the Toyota unintended acceleration debacle.[0] If it weren't for the discovery process in those cases, nobody would really know what a total mess Toyota's code was.
As far as self-driving cars are concerned, I've no doubt top-tier companies like Google and Tesla are going to do the best job they can, but eventually everyone is going to be in the space, and when a company with an institutional disdain for proper safety-critical software engineering practices ends up killing people, I want their feet held to the fire.
> I understand the intent, but I don't know how that works within our current legal system. And wouldn't that encourage cheap, shitty-built cars, since companies wouldn't need to be liable?
It works if you allow self-driving vehicle algorithms to be patented. You could then open them for public examination by a government agency.
If the algorithm performed to the regulatory agency's expectations, accident victims would still be compensated for their losses, but punitive damages would not be exacted.
Regulation isn't a magic bullet, though. I've seen countless companies check the regulatory boxes for their software in the government space, only to have it fail spectacularly because it was built as cheaply as possible.
Regulation will never cover every way a company can act shitty; if companies find ways to do things cheaper while still checking that box, just so they have no liability, they will do it.
I don't think these types of get-out-of-jail-free cards, even though they're very well intentioned, are ultimately a good thing.