to utilize intellectual property from others (either directly in the model, as in the NYT case, or indirectly via web searches) without any rights
... and to put the liability for retrieving said property, and hence the culpability for copyright infringement, on the end user:
Since the output would only be generated as a result of user inputs known as prompts, it was not the defendants, but the respective user who would be liable for it, OpenAI had argued.
But wait, isn't this what we want? It means the models can be very powerful and that people have to use their judgment when they produce output, so that they are held accountable for whether or not what they produced was infringing. Why is that a bad thing?
Can I ask why the end user would be punishable for the pirating OpenAI did? That would mean governments have to take the next step to protect copyrighted material, and I don't even dare to imagine what we would face then.
https://www.reuters.com/world/german-court-sides-with-plaint...