This is old news. The Copyright Office already ruled in January that AI generative outputs are not copyrightable [1].
I think many have not understood the implications of the CO ruling. It means you don't own anything you build with LLMs. Your company doesn't own it either. If you're using Copilot and you have a copyright notice at the top of your source file, and that ever goes to court, you will learn that the copyright is not valid. You can't even put an open source license on the output, like the GPL, because...drumroll...you don't own the copyright.
It doesn't say that; it says that anything that's produced solely by simple prompting is not owned. I have seen very few works that seek copyright and are solely prompts.
From your own link:
"“To be sure,” the Court further explained, “the requisite
level of creativity is extremely low; even a slight amount will suffice."
"The Office agrees that there is an important distinction between using AI as a tool to assist in the creation of works and using AI as a stand-in for human creativity. "
"The Office concludes that, given current generally available technology, prompts alone do not provide sufficient human control to make users of an AI system the authors of the output. "
Where the US ruling differs from others:
"Repeatedly revising prompts does not change this analysis or provide a sufficient basis for claiming copyright in the output."
Whereas China has had 2 cases where the courts accepted multiple prompt changes + a watermark as sufficient.
Also, they don't rule out a change:
"There may come a time when prompts can sufficiently control expressive elements in AI-generated outputs to reflect human authorship. If further advances in technology provide users with increased control over those expressive elements, a different conclusion may be called for"
^ I would suggest (and have suggested) that the above would likely cover the masking tools available in most image generators.
It's certainly not the case that "AI generative outputs are not copyrightable".
Yes. And anyone who has stepped outside of the chat ecosystem and used something like NovelAI or Sudowrite will be familiar with the co-editing approach those tools use, which is easily accounted for by the above.
"There may come a time when prompts can sufficiently control expressive elements in AI-generated outputs to reflect human authorship. If further advances in technology provide users with increased control over those expressive elements, a different conclusion may be called for"
Because any "advancement" in this space is predicated on getting tighter control over the requested outcome.
You can already script a local image generator to come up with random images based on text searches or LLM output. That's already not copyrightable anywhere.
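For example, here's a minimal sketch of that kind of unattended generation loop (assuming the Hugging Face diffusers library and a locally downloaded Stable Diffusion checkpoint; the model name and subject list are just illustrative):

    # Hypothetical sketch: fully scripted generation, with no human
    # choosing or revising the expressive content of any output.
    import random
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
    subjects = ["a lighthouse at dusk", "a rusted bicycle", "a paper crane"]

    for i in range(10):
        prompt = f"photograph of {random.choice(subjects)}"  # picked at random, not by a person
        image = pipe(prompt).images[0]
        image.save(f"out_{i}.png")  # nothing is reviewed, selected, or edited

Nobody exercises creative control over any individual output there, which is squarely the "prompts alone" scenario.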
The "but" is literally in response to what you quoted.
For example, if I code an entire application in C by myself without AI and then tell AI to redo the whole thing in Rust, I would retain copyright.
If you just prompt the same application from scratch and by and large accept the outputs: no copyright. This is how the vast majority are using it, to create new systems, not as a tool to enhance majority human-generated code or images or books etc.
The more it creates from pure prompts, the less chance you have to claim copyright.
Largely covered by the other quotes. I think it would be quite difficult to create a product worth protecting using "prompts alone".
No debugging? No editing? Who put the graphics on it? Who built the database and schema?
The co-author/co-editing approach is already blessed in the document linked earlier. Code is already subject to some of the best co-editing tools in the ecosystem. Even if someone manages to avoid co-editing tools, launch a product having used "prompts alone" and monetise it, how are you going to prove that they didn't take the co-editing approach to development? And how are you planning to challenge their claimed copyright? Why would you challenge their claimed copyright instead of just generating it yourself?
I could conceive of some kind of anti copyleft organization that dedicates itself to challenging every unskilled software development firm, using the discovery phase to pull records of what tools were used. But who would fund such a witch-hunt?
Or maybe every time some firm tries to assert their copyright, we will see lawyers hit back with "Prove you coded this and didn't generate it whole cloth via LLM", clogging up the courts for decades.
>The more it creates from pure prompts, the less chance you have to claim copyright.
Yeah, but unlike image generators and media articles, it's going to be a lot tougher to prove.
NovelAI has a feature where it does text highlighting based on:
"User wrote this"
"User edited this"
"Generated"
It sets this on a per-sentence basis.
I have wondered for a long time whether this will become mandatory in some jurisdictions. But even then, if you copy the text and paste it into a new window, bam, it's all considered user-generated again.
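To make that concrete, here is a purely hypothetical sketch (in Python, not NovelAI's actual code) of what per-sentence provenance tags look like, and why a plain copy-paste loses them:

    # Hypothetical sketch of per-sentence provenance, in the spirit of the
    # NovelAI highlighting described above (not their real implementation).
    from dataclasses import dataclass
    from enum import Enum

    class Origin(Enum):
        USER_WROTE = "user wrote this"
        USER_EDITED = "user edited this"
        GENERATED = "generated"

    @dataclass
    class Sentence:
        text: str
        origin: Origin

    doc = [
        Sentence("The fog rolled in before dawn.", Origin.USER_WROTE),
        Sentence("It swallowed the harbour lights one by one.", Origin.GENERATED),
        Sentence("Only the old keeper noticed.", Origin.USER_EDITED),
    ]

    # The tags live in the app, not in the text itself, so exporting to the
    # clipboard and re-importing marks everything as user-written again.
    plain = " ".join(s.text for s in doc)
    reimported = [Sentence(plain, Origin.USER_WROTE)]

Unless the provenance travels with the text, any record like this is trivially lost.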
The Copyright Office recently ruled that prompts are not enough to gain copyright, no matter how detailed they are or how many iterations you go through.
Furthermore, the Copyright Office stated that prompts alone do not provide sufficient human control, as AI models do not consistently follow instructions in the prompts and often "fill in the gaps" left by prompts and "generate multiple different outputs"
What's crazy is this was always seen as a left-wing car. If you drove one of these into a small blue-collar town a few short years ago, there is a chance you would have been harassed.
Sounds made up. I live in a very lefty part of the Bay Area in California where Teslas abound, and there has never been any kind of campaign against them until recently, just some cynicism about Musk's grandiose claims.
My impression from reading your other comments is that you don't even live in the US. If that's wrong please correct me, but if not I think you should refrain from making counterfactual claims.
You don't need to read my comments, it's in my bio. I have worked with US (Bay Area and now Texas) companies (contracting) for the last 7 years. IDK why, but I get sucked into US affairs so much, which I do not like.
Elon has been harassed online by woke and antifa types for years now. Lately by actual people IRL too.
You just had 4 years of Lina Khan, who was supposed to be the hard-line anti-big-tech chair, yet she basically did nothing, picking loser cases like ATVI.
Why? It's mainly so people of that party are the ones choosing the candidates for that party. Without closed primaries, the other side purposely tries to sabotage them by voting for the worst candidate.
For example, in the State of Oregon there are ~1M non-party registrations, ~900k Dems, and ~700k Republicans. Oregon is a closed-primary state, so the largest single group of voters is not represented in the primary. https://sos.oregon.gov/elections/Pages/electionsstatistics.a...
How many of those non-party voters even care enough to vote in a primary? My guess is very few, and if they did, they could just switch their non-party registration to a party.
My experience with primaries is that only the most dedicated vote in them. For example, in Nevada you have to sit in rooms in schools and listen to every nut who wants to give a ten-minute speech to change your vote.
> I’ve found that almost no founders or friends I speak with have any vision for the future anymore.
I think in general there is a feeling that the time to get your bag is rapidly shrinking.
Once everything is built by these things, there will be no reason to create anything, as the platform owners (big tech) will be able to take everything for themselves and no longer have to share 70% with those pesky creators/small businesses/startups etc.
The Copyright Office recently ruled that outputs are generally not copyrightable without significant human changes. Prompts are also not copyrightable. I'm actually surprised this isn't discussed around these parts more, because it essentially means any app etc. built by AI has no copyright protection. Thus code from LLMs cannot be open-sourced under open source licenses, and businesses cannot stop users from stealing their apps etc.
It also means that if you build something with a popular LLM, big tech now has your inputs and outputs, and you have no IP to stop them from stealing it either.
The next ruling from the Copyright Office is around the fair use argument that big tech is pushing for.
[1] https://www.copyright.gov/ai/Copyright-and-Artificial-Intell...