>do laws adequately protect people (kids)? Can they? Will this force a shift towards actually chasing producers, distributors, and diddlers?
It's extremely complicated. Actual CSAM is very illegal, and for good reason. However, artistic depictions of such are... protected 1st Amendment expression[0]. So there's an argument - and I really hate that I'm even saying this - that AI generated CSAM is not prosecutable, as if the law works on SCP-096 rules or something. Furthermore, that's just a subset of all revenge porn, itself a subset of nonconsensual porn. In the US, there's no specific law banning this behavior unless children are involved. The EU doesn't have one either. A specific law targeted at nonconsensual porn is sorely needed, but people keep failing to draft one that isn't either a generalized censorship device or a damp squib.
You can cobble together other laws to target specific behavior - for example, there was a wave of women in the US registering copyrights on their nudes so they could file DMCA 512 takedown requests at Facebook. But that's got problems - first off, you have to put your nudes in the Library of Congress, which is an own goal; and it only works for revenge porn that the (adult) victim originally made, not all nonconsensual porn. I imagine the EU's GDPR might be usable for getting nonconsensual porn removed from online platforms, but I haven't seen this tried yet.
I'm disgusted, but not surprised, that teenage kids are generating CSAM like this. Even before we had diffusion models, we had GANs and deepfakes, which were almost immediately used for generating shittons of nonconsensual porn[1].
This is true, though "AI CSAM" is an oxymoron. There is no abuse in the creation of such works, and as such they are not abuse material, unless of course real children are involved.
I get your argument, but there are definitely laws covering cartoon depictions of underage characters. Agree or disagree, the difference is that today you don't need to be a highly skilled artist to make something that people are going to fap to. (I definitely agree priority should be focused on physical abuse and on the people making the content, but this whole subject is touchy.)
Does non-consensual porn not qualify as defamation? That, and obscenity laws if they existed, should be able to handle most hyperrealistic porn, so that only pure speech remains.
Good question. US defamation law is fairly weak[0], but all the usual exceptions that make it weak wouldn't apply. e.g. "truth is an absolute defense against defamation" doesn't apply because AI generated or photoshopped nonconsensual porn is fake. I'm not a lawyer, but I think a defamation case would at least survive a motion to dismiss.
[0] Which, to be clear, is a good thing. Strong defamation law is a generalized censorship primitive.
Could this perhaps fall under something like trademark, as an unauthorized use of one's likeness? I'm sure I've heard of celebrity cases along those lines (the right of publicity).
> I'm disgusted, but not surprised, that teenage kids are generating CSAM like this. Even before we had diffusion models, we had GANs and deepfakes, which were almost immediately used for generating shittons of nonconsensual porn
I think the big difference now is that 1) it's much easier to do, and 2) the computational requirements and (more importantly) the technical skill required have dropped dramatically.
We should also be explicitly aware that deep fakes are still new. GANs in 2014 were not creating high definition images. They were producing fuzzy black and white 28x28 faces, poorly, and 32x32 color images where, if you squinted hard enough, you could see a dog (https://arxiv.org/abs/1406.2661). MNIST was a hard problem at that time, and that was only 10 years ago. It took another 4 years to get realistic faces and objects (https://arxiv.org/abs/1710.10196) (mind you, those images are not random samples), another year to get to high resolution, another 2 to get to diffusion models, and another 2 before those exploded. Deep fakes have really only been a thing within the last 5 years, and certainly not on consumer hardware for most of that. I don't think the legal system moves much in 10 years, let alone 5 or 2. I think a lot of us have not accurately encoded how quickly this whole space has changed. (Image synthesis is my research area, btw.)
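To make "GANs in 2014" concrete: the original setup was just two small networks playing a minimax game, a generator mapping noise to pixels and a discriminator guessing real vs. fake. Here's a rough sketch of that MNIST-scale setup in PyTorch. To be clear, this is illustrative only: the layer sizes and hyperparameters are made up, and the 2014 paper predates PyTorch entirely.

    # Illustrative sketch only (assumes PyTorch); layer sizes and
    # hyperparameters are invented, not from the 2014 paper's code.
    import torch
    import torch.nn as nn

    G = nn.Sequential(                       # generator: noise -> 28x28 image
        nn.Linear(100, 256), nn.ReLU(),
        nn.Linear(256, 28 * 28), nn.Tanh())  # pixels scaled to [-1, 1]
    D = nn.Sequential(                       # discriminator: image -> real/fake logit
        nn.Linear(28 * 28, 256), nn.LeakyReLU(0.2),
        nn.Linear(256, 1))
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()

    def train_step(real):                    # real: (batch, 784) tensor in [-1, 1]
        n = real.size(0)
        ones, zeros = torch.ones(n, 1), torch.zeros(n, 1)
        fake = G(torch.randn(n, 100))
        # discriminator step: push real toward 1, fake toward 0
        d_loss = bce(D(real), ones) + bce(D(fake.detach()), zeros)
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
        # generator step: try to make D call the fakes real
        g_loss = bce(D(fake), ones)
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

Trained for a while, something like this gets you blurry 28x28 digits. The distance between that and one-click photorealistic output on consumer hardware closed in under a decade, which is the point.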
I'm not surprised that these teenagers in a small town did this. But the fact that all of those descriptors - teenagers, small town - now apply is itself notable. Discussions of deep fakes like that Tom Scott video were barely a warning (5 years is not a long time). It quickly went from researchers thinking this could happen within the next decade and starting those discussions, to real-world examples making the news well ahead of those predictions (I don't think anyone expected how much money and how many man-hours would be dumped into AI).
[0] https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition and the later https://en.wikipedia.org/wiki/United_States_v._Handley
[1] https://www.youtube.com/watch?v=OCLaeBAkFAY