
You're thinking of copyright liability[0], but the real worry, straight from the mouths of the Stability people[1], is AI-generated CSAM. That will make the whole field of generative art legally radioactive.

At least with copyright law, there's an argument for training being fair use. If generative art becomes a notorious market for CSAM, everyone in the field goes to jail.

[0] Also, I'd like to know what your opinion is on GitHub Copilot. A lot of people decry Copilot for stealing code but love Stable Diffusion for being public, even though they're the same concept and trained in the same quasi-ethical way.

[1] https://www.reddit.com/r/StableDiffusion/comments/y9ga5s/com...



Maybe I’m naive, but isn’t AI-generated CSAM a good outcome, actually - because it doesn’t require actual children to be hurt?


This area is already well explored just with fake CSAM generated by artists using Photoshop, cartoons, etc. The modern thinking is that it supports and encourages a behavior that can lead to actual violence.

If you constantly watch videos of people eating cheeseburgers, you might want to eat a cheeseburger yourself.


>This area is already well explored just with fake CSAM generated by artists using photoshop, cartoons, etc.

I'm familiar with the research in this area, and that's not something you can say confidently; most work (and by that I mean two or three papers in total) has gone into investigating the role 'generated' depictions of CSAM play in the collections of hoarders. No psychological study, as far as I'm aware, has investigated those who enjoy cartoon material akin to what you might find in Japanese manga.

In fact, there's some evidence against what you're saying; anthropological research on fans of cartoon material ('lolicons' or 'shotacons') in Japan shows that their communities draw hard lines between '2D' and '3D' not just in this area of sexuality, but in their sexualities as a whole. This sexual inclination toward the 2D world is termed the 2D-complex and is akin to 'digital sexuality' or fictophilia, not pedophilia.

By way of analogy, perhaps BDSM would work as a good counterpoint. Many people (some studies suggest the majority) engage in 'rape fantasies' or other fantasies of an illegal or immoral nature. Yet although actual depictions of rape are rightly banned by the state, their simulated variants are not, and we are comfortable acknowledging that sexual desires do not always manifest in real life; sometimes the thrill of fantasy itself is the attraction. To make it real would, ironically, defeat the whole point.


One issue with this is that the fake CSAM wouldn't be the cartoons of a Japanese manga; it would (or could) be photorealistic. It could be photorealistic depictions of real children. That is bad not only because it might fuel or encourage pedophiles, but also because it opens up lots of other negative possibilities.

One example of a bad thing: you could easily imagine an Instagram bot that looks for pictures of people with their kids, then uses a Stable Diffusion-like model to produce pictures of the people having sex with their kids, or of horrible things happening to the kids, and replies to the target account. The bot might threaten to post the pictures and accuse the person of being a pedophile unless the person pays X in bitcoin (or whatever). Or the bot could just post such pictures for fun.

I think we don't know if fake CSAM will have a good or bad effect on pedophiles and, sadly, there is no real way to reliably test that (so far as I know). Fake CSAM might placate pedophiles, or it might whet their appetite. It's hard to know what to do.

I think we will eventually get to the point where very good unrestricted image generation models are available to the general public. When that happens there will be chaos - you will live to see man-made horrors beyond your comprehension.


That's a good point, and I agree - I only wanted to pick up on the point about 'cartoons'. As for whether realistic generated images sourced from real human data would have a good effect: they certainly wouldn't help prevent further child abuse, and again, as far as I know there's no evidence that they would 'placate' anyone in the sense of the (widely debunked) catharsis theory.

And of course, I'm not looking forward to the world ushered in by giving everyone free rein with this technology, mainly for the reasons you stated.


>I'm familiar with the research in this area, and that's not something you can say confidently

I’m referring to the legal situation; sorry, I should have specified.


> The modern thinking is that it supports and encourages a behavior that can lead to actual violence.

Hasn't this nonsense been thoroughly debunked by multiple studies at this point? I would assume evidence and "modern thinking" supports the exact opposite of what you claim, unless by modern thinking you mean the same thinking that tries to hide research they don't like.

Video games do not cause violence. End of story.


The pleasure centers activated by videogames and pornography are radically different; I would not assume that the reaction to simulated sexuality is the same as the reaction to simulated violence.


Then ban porn, especially scenes that depict actions society deems deplorable, like suffocation and rape. Or are the pleasure centers for those also different?

> I would not assume that the reactions to simulated sexuality is the same as simulated violence.

I would not assume anything. Conduct research and draw conclusions. Don't speculate.


The people arguing that playing a violent video game would make you want to commit actual violence were the ones who had no clue what videogames were. The counterargument players made was that they could "tell reality from fiction" - i.e. that when they played Mortal Kombat or Call of Duty, they put their "Real Life" brain away and put on their "Fictional Video Game" brain, so videogames can't make people violent.

This is the right conclusion, but the logic is entirely wrong.

The reason video games do not cause violence is that play violence is nowhere close to the real thing, not that people firewall off fiction from reality. There are plenty of cases in which a piece of fiction has changed people's views! Crime shows are notorious for skewing how actual juries rule on cases. Perry Mason[0] taught them to expect dramatic confessions and CSI[1] taught them to weigh whiz-bang forensics over other kinds of evidence.

In the specific case of porn, there isn't really a difference between "play sex" and "real sex": they poke the same regions of your brain. And the people who are responsible for keeping actual pedophiles from reoffending are pretty much unanimous that the worst thing you can do is give them a bunch of, uh... let's call it "material". So if you're already a pedophile, giving you access to simulated CSAM won't substitute for the real thing. It'll just desensitize you to reoffending.

[0] https://en.wikipedia.org/wiki/Perry_Mason_syndrome

[1] https://en.wikipedia.org/wiki/CSI_effect


A lot of claims and no supporting research. My position is clear: You need to give clear evidence that X causes harmful Y before we can discuss banning X. We don't ban X because you and I find it deplorable.

>> Conduct research and draw conclusions. Don't speculate.


Just like how the incredible availability of porn on the internet has led to millennials being the generation that has the most sex ever: https://news.ycombinator.com/item?id=12433236


Correlation, not causation.

There are way too many factors at play to simply point at porn, which is probably harder to obtain now, in all honesty. I found many random porn magazines/pages as a child. I never went looking for it, but finding it was always a thrill.

People buy fewer magazines now (judging by how convenience store shelves increasingly exclude them).


You think porn is harder to obtain with the internet? That seems... unlikely.


Modern thinking doesn't mean evidence-based thinking. On the contrary, it gets more politicized rather than more evidence-based. Here's an evidence-based counterargument [0].

[0] Evidence Mounts: More Porn, Less Sexual Assault. https://www.psychologytoday.com/us/blog/all-about-sex/201601...


This is like claiming video games cause violence, which is absolutely not the case.

More likely people will just generate more synthetic content to consume.


IMO it's more like claiming that video games lead to video game fans and addicts. Which is true.


Yeah, people who like looking at synthetic images will have easy access to more synthetic images (and could even generate them on their own machines).

But they are not harming anyone else.


...but they're becoming child porn addicts


>If you constantly watch videos of people eating cheeseburgers, you might want to eat a cheeseburger yourself.

This is retarded. By that logic, if you watch a movie, play a video game, or read a book with crime in it, you will become a criminal. We have a ton of shooter games and still no evidence that they have caused more gun violence around the world.


> This area is already well explored

It's not well explored at all and you just made that up lmao.

That's akin to the idiotic arguments of the past that "allowing people to see homosexuality will make them homosexual!"

Completely ridiculous.


And having a lot of LGBTQ friends leads one to become LGBTQ?


So if I start watching gay porn, I can become gay (or at least bi)? Why don't more people do this and double their dating pool?


If you are not gay you won't enjoy the porn, and it will affect you differently.

Same with CP. You have to be sick to enjoy it. Very sick.


Then why is GTA 5 legal? Or Hannibal? Is it ever possible to trust anyone with self-determination?


Which is why, after playing so many RPGs, I've become a sword-swinging serial killer. /s


That's what they said about video games. We know how that played out.


If a boy constantly watches media featuring pretty girls, he may want to become a pretty girl himself. Which is fine IMO, but traditional parents aren't much worried about this possibility.


Shouldn't Call of Duty and Game of Thrones be illegal then?


Only so long as the AI can remain creative. Once it has exhausted that creativity while niche consumers still crave more variation, that's when children start to get hurt again.


They were shamed into working with a nonprofit aimed at protecting children (Thorn) whose executive director stated publicly at the Stanford conference a few weeks ago that her organization is against the concept of synthetic images.


What an utterly predictable development. I was happy that Stability put their model out there without any waffling about "concerns" and "communities", but I was always skeptical they'd last. And well, now they're folding like cardboard when faced with a criticism that they should've seen coming. The most concerning thing here is that there's no conceivable approach they can take to prevent CP while keeping their model open; either it is open, and people can use/re-train it to make CP, or it is closed.


> If generative art becomes a notorious market for CSAM, everyone in the field goes to jail.

No one will go to jail, except maybe some people who get caught creating, distributing or collecting those images.


I thought there was case law establishing that "virtual" images were already legal. Does that not apply here because real images are used as the training dataset or something? If no illegal images are used as input, I don't see how the output could be (or should be) illegal. There's no nexus to any real person being harmed.


We can likely make the very strong assumption that the training data didn't contain any CSAM, so it would be more difficult for the model to produce CSAM. I would also imagine they trained the model without porn, so inferring CSAM from legal adult porn would be quite difficult too. Am I missing something?
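For context, this kind of filtering typically happens on the dataset metadata before training ever starts. A minimal sketch of the idea, assuming LAION-style parquet shards with a `punsafe` column (a classifier's predicted probability that an image is NSFW); the file names and the 0.1 threshold are illustrative, not Stability's actual pipeline:

    import pandas as pd

    # Hypothetical LAION-style metadata shard.
    df = pd.read_parquet("metadata_shard.parquet")

    # Keep only rows the NSFW classifier scores as very low risk;
    # only these image URLs would then be fetched for training.
    kept = df[df["punsafe"] < 0.1]

    kept.to_parquet("metadata_shard_filtered.parquet")
    print(f"kept {len(kept)} of {len(df)} rows")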


Actually, I saw Stable Diffusion-generated, semi-stylized / semi-photorealistic (kind of like photorealistic-ish anime) CSAM on 4chan literally a day or so ago, when I randomly visited the site and saw that AI art threads are super popular there right now.

Keep in mind, there have already been a lot of illustrated/anime-style CSAM pictures on that site for years (something that is legal in many countries), so it's becoming a blurred area: these AI art generators still produce somewhat similar material, but it's now getting more photorealistic.

As for the models not being trained on NSFW content, there were already leaked models that were, and there are unofficial models built on SD by outsiders, trained specifically on, for example, adult image websites.


These models are intended to converge to the capabilities of human artists and beyond.

A human artist is obviously capable of producing CSAM, even if they have never seen any before.

Filtering of training data is countered by increasing capabilities to generalize:

Two years ago, that was a viable strategy: models could do little more than reproduce what was in their training data.

Today models can generalize much better and compose concepts they have been trained on into new concepts that they haven’t.

Two years from now, filtering will be irrelevant.


Not only that, but with techniques like inpainting you could, right now, start with an image that wasn't CP and progressively make the model regenerate parts of it until the result is such an image. Stability saying they want to release an open version of SD that can't make CP is like a pen maker selling a pen that can't make CP: horseshit.
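For the unfamiliar: inpainting means masking a region of an existing image and having the model regenerate just that region, conditioned on a prompt and the untouched surroundings. A minimal sketch using the open-source diffusers library (the model ID, file names, and prompt are illustrative and deliberately benign):

    import torch
    from PIL import Image
    from diffusers import StableDiffusionInpaintPipeline

    # Load an inpainting-capable checkpoint (illustrative model ID).
    pipe = StableDiffusionInpaintPipeline.from_pretrained(
        "runwayml/stable-diffusion-inpainting",
        torch_dtype=torch.float16,
    ).to("cuda")

    # init.png: any source image; in mask.png, white marks the region to redo.
    init_image = Image.open("init.png").convert("RGB").resize((512, 512))
    mask_image = Image.open("mask.png").convert("RGB").resize((512, 512))

    # Only the masked region is synthesized; repeat with new masks and
    # prompts to rewrite an image piece by piece.
    result = pipe(
        prompt="a red fox sitting on a park bench",
        image=init_image,
        mask_image=mask_image,
    ).images[0]
    result.save("out.png")

Run that loop enough times and nothing of the original image remains, which is the point.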


The child porn problem is a double-edged sword.

Detection becomes easier - is it pornography with a child in it?

Generation starts to become trivial - this video, but this person has the features of an X year old.

At least in the latter case no one's actually getting raped.


Note that generated child porn that depicts no real children is actually legal in much of the world. The UK is more the exception than the rule.


The way I see things, it all starts from the interests of the participants. Stability got its publicity from opening its model, but its interest is squeezing maximal profit from it. And then there are DALL-E and Midjourney, with similar incentives.

Then there are narratives. They are woven so that the suggested actions and solutions will somehow fit the interests of the participants. The narrative can be CSAM, it can be the copyright of artists and owners of the training set, it can be disinformation. The narrative doesn't care that current laws do not prohibit anything and that it's all legal. The narrative justifies the actions the participants already wanted to take because of their interests.

And finally there are actions. They can push legislation, but that's not the only tool (and yes, it's slow). Companies can always comply and cooperate, especially when their interests align. Google itself is a participant, with Imagen. They can create a restrictive policy and kick things off their search engine, because that is in their interests too, not because of a narrative or legislation. Just as YouTube profited from every piracy site that was suppressed.

The interests of every single company are stacked against individuals running this at home for free. There are enough narratives to be woven to justify actions that would stop that.

For decades, and in many countries even today, just getting paid to drive someone in your car has been illegal without a "taxi license". It doesn't need to make sense. We could end up with a required license to use generative AI in 10 years, and nobody would bat an eye after decades of propaganda and narratives.



