
I haven't looked into this extensively or tried Copilot myself, so I might be completely wrong about this. But from what I understand, the code Copilot generates is generally different enough from the source data that it shouldn't be an issue. In a sense, Copilot reading lots of code on Github to train a code-writing model is analogous to a human reading lots of code on GitHub and learning from some of the design patterns they see—as long as the output is not too similar to the input, it should be fine.



> But from what I understand, the code Copilot generates is generally different enough from the source data that it shouldn't be an issue.

It's a completely closed system, and they refuse to disclose what they used as training data, so you will never know — which is exactly the problem the fine article raises, including Microsoft's refusal to engage.

The premise is completely unfounded. If I read the Windows source code and then went to recreate Windows functionality in Wine, Microsoft would sue the crap out of everyone, even if I didn't copy/paste Windows code. Why should we give Microsoft leeway?

Even Amazon lets you understand what licences went into the code you are copy/pasting.


code output is unrelated to the issue.

the issue is that if GitHub hosts your code under an open source license that does not allow for-profit reuse without concomitant sharing, like the GPL, they will incorporate that code into their product in violation of the license, and claim the license doesn't apply because the code isn't code but merely text.

they built their business on precise and detailed articulations of consent and its boundaries, but disregarded all of that, post-acquisition, because Microsoft has enough lawyers that they think they can get away with it.

so, for the subcommunities and creators who put their work on GitHub in the context of these very specific and fine-grained articulations of consent, this may be theft and is certainly betrayal.



But we never had humans who could read so much code and "learn" so much. :p The rules and logical conclusions of our old system, scaled up a level, now make us uncomfortable.



