Well actually, to me in 2025 "source available" sounds like you don't want to make it open source because you're scared that "someone will steal your work", yet you're more than happy to let your code go into the copyright-laundering machines that are LLMs.
AI companies take people's work without permission. Some source even has licenses or EULAs forbidding it; they don't care. So that's not a fair claim.
Also, LLMs aren't very good at writing the kind of code that would be worth stealing. I doubt truly valuable IP is usually laundered this way. I could see it happening to some one-of-a-kind, patented work.
> AI companies take people's work without permission.
Yes, which is exactly why making your work source-available results in letting AI companies take it. If you don't want AI companies to take it, keep it private.
> I doubt truly-valuable I.P. is usually laundered.
If that's what you think, then just open source your code. By stopping at source-available, you make sure that only AI companies can take it. Why would you purposefully screw open source users and help AI companies?