> If you read free code yourself it’s fine, but if a machine does it for you it’s not? We overvalue humans.
No, it's not fine. Apparently, you missed the SCO and Oracle v. Google cases. Both cases argued that somebody looked at the code and copied it. In the SCO case that wasn't true, but the argument stretched the timeline rather successfully. In Oracle v. Google, copying function signatures opened a big can of worms.
So, just by copying a function signature, without filling it with the same code as the original, even for interoperability, you're entering a huge legal gray area.
Similarly, no sane Wine developer will read leaked Microsoft source code, let alone copy it. Likewise, no sane emulator developer will read leaked Nintendo code.
Reading the code "colors" your creativity, and if you're taken to court and enough similarity is found between your code and the leaked code, it's game over.
So, reading and copying code is not guaranteed to be legal; it depends on the license. When a robot does it, it's still illegal (licenses are breached during the code generation process), and immoral and unethical on top of that.
So, we don't overvalue humans; we overvalue AI, which is just informed search, by the way.
No, I didn't and don't read other people's code to understand how something works. I use books and official language/library documentation for that.
On the other hand, this is irrelevant to the issue at hand.
GitHub Copilot is not a tool for education. It's a tool for auto-completing code, which can be put into production, where licenses and other concerns come into play.
The issue is not code sharing per se. It's more of a legal problem, and an important one at that. In the software copyright sense, even reading code you can't import into a project (be it leaked, incompatibly licensed, or off-limits for any other reason) puts you at risk of legal trouble. This is why we have methodologies like "clean room development".
In Copilot's case, you're possibly deriving code from sources under many licenses, some of which are incompatible with what you're doing. As a result, you're in direct breach of any license that's incompatible with your code.
On a higher level, you're also breaching codes of ethics and morality by using code, or a derivation of it, under a license incompatible with your own, disregarding other people's wishes as codified in a case-tested, valid license.
As a result, if you think that using a derivative of GPL-licensed code in your closed-source application is OK on every front, then the reverse must be true as well: I can disassemble and reverse-engineer every part of your code, re-implement it bug for bug as GPL, and open it.
Because if you can breach my license and expect no consequences, I can breach your license without consequences as well. It's a two-way street.