Proprietary software doesn't stop people from modifying it; it only slows them down.
And it doesn't slow people down anywhere near enough, hence all the talk about DRM, encryption, warranty voiding and anti-circumvention laws.
There is nothing wrong with allowing people to modify software; the issue is when people modify software to break laws (environmental law, in this case).
The solution is going to be some form of compliance certification/testing. The manufacturer provides the default software and obtains a certification proving it complies with the law. If users want to modify the software, they will have to prove that the replacement software also complies and obtain a certificate.
During annual vehicle inspections or licensing, inspectors can check that certified software is installed. Or maybe the ECU's bootloader only allows certified software to be installed, unless a developer mode is enabled (which would require the owner to keep logs to demonstrate compliance).
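A minimal sketch of what such a bootloader gate could look like, assuming the certification authority publishes hashes of approved images. The names are hypothetical, and a hash allowlist here stands in for the signature verification a real secure-boot chain would use:

```python
import hashlib

# Hypothetical allowlist of SHA-256 digests for certified firmware images,
# standing in for the signature checks a real secure-boot chain performs.
CERTIFIED_IMAGE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def bootloader_accepts(image: bytes, developer_mode: bool = False) -> bool:
    """Boot the image only if it is certified, or if developer mode is on
    (in which case the owner takes on the compliance-logging obligation)."""
    if developer_mode:
        return True
    return hashlib.sha256(image).hexdigest() in CERTIFIED_IMAGE_HASHES
```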
"Proprietary software doesn't stop people modifying software, it only slows them down."
See the DMCA and licensing agreements. Modifying your stuff might be a felony someday. Best to be sure the right to modify or repair is right there in the license: irrevocable, and transferring to the next buyer automatically. Not happening right now.
Making hacking of your own property into a felony has already been attempted under current law, after a fashion. Sony's lawyers claimed that George Hotz's PS3 exploit was a violation of the CFAA because he hacked "the PS3 system", and "the PS3 system" belonged to Sony (apparently equivocating between individual consoles and PS3 as a platform) [1]. I'm not sure whether a judge ever actually addressed this argument, since Hotz settled. And of course that's a different situation than actually being indicted under the CFAA, but you can see the gears turning.
I seem to recall that a similar argument was used against Jon Johansen at some point, but I don't feel like trying to dig up early-2000s history on the web today (IME going back to ca. 2003 is okay, but before that a lot of things have fallen down the memory hole).
I am all in favor of safety-critical software being open source, but I dislike the argument of forcing the issue by claiming the right to tinker, or really any variation on "I bought it therefore I can do whatever I want to with it". It's not true when the hardware in question has safety implications. For example, when I buy a car, I do not have a right to do anything I want to with it. I can't drive it in ways that would be dangerous to others, and the fact that I am legally restricted as to where and how I may drive it does not feel like an inalienable right being taken away from me. That's clearly a strawman but bear with me.
In a similar vein, if I were to physically modify the engine in such a way as to make it faster but fail environmental or safety regulations, that could also have financial or legal repercussions for me (or Volkswagen, to cite a recent example :). The same argument could be made against modifying automotive software: vehicles are certified by government agencies around the world based on the idea that they will perform a certain way WRT safety and emissions, therefore software behavior has to be just as unchanging and predictable as hardware behavior. The fact that people can and do hack their cars all the time doesn't change the regulatory environment for car companies.
I do believe that embedded software should be as open, modifiable, and testable as possible, but I don't think people claiming the right to tinker as a basic human right are going to change anyone's minds about this. The biggest impediments to (for example) car companies making their ECUs open source and hackable have nothing to do with a dogmatic attachment to proprietary software and security-through-obscurity (although those concepts have their adherents, misguided as they may be), and much more to do with compliance with government regulations and minimizing their potential liability.
There is at least a perception that having open source, hackable ECU software would be all downside with no upside to the companies selling those vehicles. The way to change this perception is to show it as being demonstrably wrong, rather than to simply claim a right to tinker. The best way to get open source ECUs is to either make the consumer market care about this (not likely) or to update the regulatory environment in such a way that car companies have a non-abstract motivation to go open source.
This is not unlike the dreaded binary blobs required to use certain 802.11 chips on open source operating systems: they exist because of FCC requirements, not because the vendors love binary blobs. Vendors are allowed to ship software-defined radios, but the only way they can guarantee that their SDRs don't broadcast on illegal frequencies is to lock down the software that controls the hardware. It's not an ideal solution, but it's the only solution they've got aside from spending even more money to ensure safety at the hardware level, which wouldn't do anyone any good.
"It's not true when the hardware in question has safety implications."
It's a sensible counterpoint. The modification to the claim would be that, for safety-critical hardware, you have a right to have modifications performed by professionals who aren't the original seller, or to perform them yourself if you are such a professional. This covers the two biggest use cases: repairs and enhancements, respectively.
EDIT to add:
" is to lock down the software that controls the hardware. It's not an ideal solution, but its the only solution they've got aside from spending even more money to ensure that safety at the hardware level"
Not true. They could do it right at the hardware level with simple bounds checks burned into logic, with the checks' settings fixed at manufacture or OEM configuration using something like anti-fuses. That's the kind of tech that's in 8-bit and 16-bit MCUs retailing for a few bucks; it would cost almost nothing and be more secure given its simplicity. They aren't doing it because they don't care, given the demand side of this issue. Plus, lock-in among oligopolies brings them long-term financial benefits. ;)
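A rough software model of what such a fused bounds check amounts to. The band edges and names below are illustrative (the 2.4 GHz ISM band), and in real silicon this would be combinational logic rather than code:

```python
# One-time-programmable limits, standing in for anti-fuse settings blown
# at manufacture or OEM configuration.
TX_FREQ_MIN_HZ = 2_400_000_000  # 2.4 GHz ISM band lower edge
TX_FREQ_MAX_HZ = 2_483_500_000  # 2.4 GHz ISM band upper edge

def transmit_allowed(requested_freq_hz: int) -> bool:
    """Hardware-style bounds check: reject any out-of-band transmit request,
    regardless of what the (possibly user-modified) driver asks for."""
    return TX_FREQ_MIN_HZ <= requested_freq_hz <= TX_FREQ_MAX_HZ
```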
Making modifying stuff a felony still just slows down people's efforts, unless there's a reliable way of enforcing it, which is... not really possible.
Red herring. It will still be a felony, and people will still go to jail over it. Most might get away with it, or might not, given the increasing surveillance state with its commercial and government partnerships. Regardless, buyers need to pick suppliers who voluntarily eliminate that risk with proper licensing, unless they want to risk jail time. They should not trust anything less in a license, given the precedents so far.
So if a company tried to apply "your code must meet emissions standards before you are allowed to install it" limitations via a license agreement, it would conflict with both GPLv2 and v3. If a company only allowed you to install software after it was signed by the company's own testing laboratories, that would conflict with GPLv3's anti-Tivoization clauses.
But if the government says "all software installed on car ECUs must meet emissions standards", then there is no conflict; in fact, this is already implied by current law.
The GPLv3 'should' also be fine with an "all car ECUs must only accept software that is signed and approved by a third-party testing laboratory" law. You face the exact same barrier to installing software on the ECU as the actual manufacturer of the car does.
The main example where the GPLv3 might conflict with a government law is when a law mandates that the car's ECU be locked down in a way that only the original manufacturer can change the software.
How can you certify a neural network?
If that were feasible, you wouldn't need a neural network in the first place: if the problem could be solved exactly with a well-known, certifiable algorithm, why add the "uncertain" response of a NN?
Same way you certify a person. You train them, then you test them in simulated environments.
It won't be a fixed certification procedure; one car manufacturer might hire an independent auditor to check over every line of code. The car manufacturer that stupidly uses a neural network in its emissions control system will have to hire an independent auditor who is comfortable putting their stamp of approval on the neural network.
Actually it's way easier to certify a neural network than a human: you can simulate millions of situations. You could even certify the code that ran the training environment, line by line.
As long as the government (or certifier) is happy with whatever method was used to create the report, a certificate is issued.
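As a toy illustration of the "test it in simulated environments" approach, here is what a Monte Carlo certification run could look like. The controller, scenario generator, and safety envelope are hypothetical stand-ins, not any real certification procedure:

```python
import random

def controller(sensor_reading: float) -> float:
    """Hypothetical controller under test; in practice this would be the NN."""
    return max(0.0, min(1.0, sensor_reading * 0.8))

def scenario_passes(seed: int) -> bool:
    """Run one randomized simulated scenario and check the safety envelope."""
    rng = random.Random(seed)
    reading = rng.uniform(-1.0, 2.0)   # randomized environment input
    command = controller(reading)
    return 0.0 <= command <= 1.0       # emissions/safety envelope check

N_TRIALS = 1_000_000
failures = sum(not scenario_passes(seed) for seed in range(N_TRIALS))
print(f"{failures} violations in {N_TRIALS:,} simulated scenarios")
```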
You need to certify every possible output for every possible combination of input.
It sounds impossible to me.
Even if it's not impossible, at that point you could substitute your NN with an algorithm that has the same response mapping.
If you don't certify everything, I can't really see how an open source certified autopilot would have saved the guy in the Tesla.
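For a sense of scale of "every possible combination of input": even a toy input layer of ten 32-bit sensor readings gives an input space no test campaign could enumerate (the input count is purely illustrative):

```python
# Ten hypothetical 32-bit sensor inputs: each can take 2**32 distinct values.
n_inputs = 10
input_space = (2 ** 32) ** n_inputs
print(f"{input_space:.3e} distinct input combinations")  # ~2.1e+96
```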
Certification is not a 100% solution. We can never be 100% sure that a neural network will produce the right result, just as we can never be 100% sure that hand-written software doesn't have bugs; certification and auditing can't guarantee that either.
Even with the most stringent software development and testing procedures in the world, NASA is unable to produce bug-free software for their spacecraft (which is why they have procedures to recover from bugs and patch software).
All the certification says is: no bugs were found during testing; based on the amount of testing and the quality of the code and the testing procedures, the probability of a fatal bug is x% (where x is a really low number with lots of zeros), and we deem x% to be an acceptable risk. In the case of autopilots, we only really need x% to be below the risk of regular human driving.
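One standard way to put a number on that x is the statistical "rule of three": if N independent test scenarios all pass, an approximate 95% confidence upper bound on the per-scenario failure probability is 3/N. The trial count below is just an illustration:

```python
# Rule of three: after N failure-free trials, the ~95% confidence upper
# bound on the per-trial failure probability is roughly 3/N.
n_trials = 10_000_000   # hypothetical count of passed certification scenarios
upper_bound = 3 / n_trials
print(f"p(fatal bug per scenario) <= {upper_bound:.0e} at ~95% confidence")
```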
No software change could have saved that driver. Tesla has a Level 2 autopilot (adaptive cruise control with lane following). It's not within the job description of a Level 2 autopilot to avoid all possible crashes; it's the job of the driver to be alert and ready to take control at all times. Tesla's autopilot doesn't even use neural networks outside of the camera sensor.
To be a Level 3 autopilot that would be expected to deal with this sort of incident, Tesla would need way more sensors and much smarter software, so the car would actually have a hope of detecting the faulty data from one sensor and taking the correct action (the truck was only within view of the camera sensor; the radar is calibrated to only detect things at bumper height).
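As a toy sketch of the kind of cross-sensor plausibility check a Level 3 system would need (the sensor set and 2-of-3 vote are hypothetical, not Tesla's actual design):

```python
def obstacle_confirmed(camera: bool, radar: bool, lidar: bool) -> bool:
    """Hypothetical 2-of-3 vote: act only when at least two independent
    sensors agree, so one faulty or blind sensor can't decide alone."""
    return camera + radar + lidar >= 2

# The fatal scenario: only the camera could see the truck, the bumper-height
# radar saw nothing, and there was no third sensor to break the tie.
print(obstacle_confirmed(camera=True, radar=False, lidar=False))  # False
```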
In the linked article the author says that the death of the person in the Tesla on autopilot could have been avoided if the autopilot software had been open source.
That is simply false, because you cannot ever certify the entire response mapping of a NN that complex.
And even if you could, there would have been no need for a NN, because you could write an algorithm with exactly the same response mapping.
> Most importantly, it ignores the fact that proprietary software in cars is at least equally, if not more, dangerous.
To me it seems that he is suggesting that proprietary software can be more dangerous than open source software.
From there you can infer that he believes open source software can be safer than proprietary software.
If death(proprietary) >= death(OSS), then !death(OSS) >= !death(proprietary).
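Read as probabilities, the inference is just complement monotonicity (a minimal formalization, not from the original comment):

```latex
P(\mathrm{death} \mid \mathrm{proprietary}) \ge P(\mathrm{death} \mid \mathrm{OSS})
\iff
1 - P(\mathrm{death} \mid \mathrm{OSS}) \ge 1 - P(\mathrm{death} \mid \mathrm{proprietary})
```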