> You write closed-source code, then make an AI that learns from that code; shouldn't its output be licensed as well?
> Ask the above, and suddenly Microsoft will agree.
Does Microsoft actually agree? Plenty of people have posted leaked/stolen Microsoft code (Windows, MS-DOS 6, and so on) to GitHub, and Microsoft doesn't make a very serious effort to stop it – they sometimes DMCA the repos hosting it, but others have stayed up for ages. They could easily build a system to automatically detect and take down leaks of their own code, yet they haven't. Given that, if GitHub Copilot was trained on all public GitHub repos, its training data likely included leaked Microsoft source code. If so, Microsoft evidently doesn't have a problem with people using the outputs of an AI trained on their own closed-source code.
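To be clear about how feasible that detection would be: a minimal sketch of one common approach, fingerprinting runs of normalized source lines and checking public code for overlap. Everything here is hypothetical (the paths, the shingle size, the 20% threshold, the assumption of C sources), just to show the technique isn't exotic:

```python
import hashlib
from pathlib import Path

SHINGLE_SIZE = 5  # consecutive normalized lines per fingerprint (hypothetical choice)

def normalize(line: str) -> str:
    # Strip all whitespace so trivial reformatting doesn't defeat matching.
    return "".join(line.split())

def fingerprints(path: Path) -> set[str]:
    # Hash every run of SHINGLE_SIZE consecutive non-empty lines.
    lines = [normalize(l) for l in path.read_text(errors="ignore").splitlines()]
    lines = [l for l in lines if l]
    return {
        hashlib.sha1("\n".join(lines[i:i + SHINGLE_SIZE]).encode()).hexdigest()
        for i in range(len(lines) - SHINGLE_SIZE + 1)
    }

def corpus_fingerprints(root: Path) -> set[str]:
    # Build the fingerprint set for the private corpus (assumes C sources).
    fps: set[str] = set()
    for f in root.rglob("*.c"):
        fps |= fingerprints(f)
    return fps

def scan(candidate_root: Path, known: set[str], threshold: float = 0.2) -> None:
    # Flag candidate files whose shingles overlap the private corpus.
    for f in candidate_root.rglob("*.c"):
        fps = fingerprints(f)
        if fps and len(fps & known) / len(fps) >= threshold:
            print(f"{f}: {len(fps & known) / len(fps):.0%} of shingles match known code")

if __name__ == "__main__":
    # Hypothetical paths: an internal source tree and a cloned public repo to check.
    known = corpus_fingerprints(Path("internal-source/"))
    scan(Path("suspect-repo/"), known)
```

A production system would obviously need more (robust tokenization, something like winnowing to cut the fingerprint count, and crawling at GitHub's scale), but the core matching is this simple, which is the point: not doing it is a choice.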