25 arrested so far for sharing AI sex images of minors in largest EU crackdown

Can't wait for all the "no one is being hurt by my decision to generate AI CSAM locally and not share it! This is thought crime!" apologists who are so fond of these articles.
If you do that locally and don't share it, police have no way of finding out...
I'm not quite sure about the definition of CSAM, but I suspect some anime lovers might get arrested :)
 
Upvote
46 (49 / -3)

deadman12-4

Ars Scholae Palatinae
2,765
There are a lot of feelings about this, but I would propose considering: what's the point of making CSAM illegal? If it's to protect children from harm, then why this?
The story says one model was trained on real CSAM, which means the rest were not. Are we making CSAM illegal because people don't like it (a very bad way to pass laws), or to protect people? If no one was hurt in the creation of most of it (all the models NOT trained on real CSAM), then is it bad even if most people dislike it?

Reminder - on the whole "making it illegal because most disapprove of it" thing: many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.

edit: It's impossible to prove a negative. The concept of "prove it WASN'T trained on real CSAM" is impossible. But the same can be said about anything: prove you haven't murdered someone with that gun you own and hidden the body. It sounds preposterous, but it's the same logic. Hence why law is about proving, not disproving.
 
Upvote
62 (88 / -26)
Hmm, I kind of have mixed feelings on this; initially my gut said "good, get them." But thinking for three more seconds... I think the focus should remain on real-life abuse, since those are the people causing actual physical and mental harm. This harms no one, and in a way it may let people with this issue redirect their pedo energy into something harmless???

I just watched a movie with a CGI nuclear bomb. Should we arrest all the artists who worked on that too, since it normalizes nuclear weapons and mass destruction? We should also arrest a good chunk of the anime community next.
 
Upvote
52 (65 / -13)

CoryS

Smack-Fu Master, in training
77
This kind of thing is always awkward to argue against, but I'll try.

CSAM has been around for as long as people have been able to create images, unfortunately usually harming children beyond my ability to fathom in the process. If we can't get rid of it, why not let it move to a process that doesn't continue destroying children's lives?
 
Upvote
5 (21 / -16)
There are a lot of feelings about this, but I would propose considering: what's the point of making CSAM illegal? If it's to protect children from harm, then why this?
The story says one model was trained on real CSAM, which means the rest were not. Are we making CSAM illegal because people don't like it (a very bad way to pass laws), or to protect people? If no one was hurt in the creation of most of it (all the models NOT trained on real CSAM), then is it bad even if most people dislike it?

Reminder - on the whole "making it illegal because most disapprove of it" thing: many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.

edit: It's impossible to prove a negative. The concept of "prove it WASN'T trained on real CSAM" is impossible. But the same can be said about anything: prove you haven't murdered someone with that gun you own and hidden the body. It sounds preposterous, but it's the same logic. Hence why law is about proving, not disproving.

I would agree with you. The weasel-word argument at the end that this causes people to harm kids has no follow-up, and it sounds like the arguments about video games from the '90s and the Tipper Gore brigade.
 
Upvote
60 (74 / -14)

Shadowen09

Smack-Fu Master, in training
35
Subscriptor
“And there's growing consensus globally that, in general, AI-generated CSAM harms kids by normalizing child sex abuse through the increased prevalence of CSAM online.”

It would be nice to see the citations for this claim. I’m not familiar with the literature regarding the AI angle of this issue. But claims just like this have been made ad nauseam for decades without any genuine proof.

What little research has been done here hasn't demonstrated clear causation. Furthermore, the argument is prima facie spurious, based both on the copious research done on violence and video games and on the fact that all manner of porn is readily available and consumed on the Internet, yet sexual habits haven't been radically altered to a similar degree.

For once, I would just like to see this claim (or any of the others like it) backed up with good data and research.
 
Upvote
83 (90 / -7)
By the argument of "the AI imagery normalizes an activity and therefore should be illegal":

No AI imagery of violence of any kind. No AI imagery of a cat drinking a beer (that's animal abuse you heathens!!) etc etc etc

Why stop at AI imagery? The GTA games are now illegal because they normalize all the criminal things you can do in those games (including soliciting prostitutes and then committing violence against them afterward to get your money back).

Next apply to movies and TV shows. Books. Etc.

Granted, that's a slippery slope that probably wouldn't actually happen. Oh wait, they've tried to ban video games before? They used to censor the hell out of comic books? Oh...
 
Upvote
69 (79 / -10)

Oldmanalex

Ars Legatus Legionis
10,818
Subscriptor++
There are a lot of feelings about this, but I would propose considering: what's the point of making CSAM illegal? If it's to protect children from harm, then why this?
The story says one model was trained on real CSAM, which means the rest were not. Are we making CSAM illegal because people don't like it (a very bad way to pass laws), or to protect people? If no one was hurt in the creation of most of it (all the models NOT trained on real CSAM), then is it bad even if most people dislike it?

Reminder - on the whole "making it illegal because most disapprove of it" thing: many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.
There are hills to die on, and hills not to die on. Trying to block the rounding up and murdering of racial or religious minorities is a righteous, and meaningful, hill to die on. CSAM hills, in any form, seem much less meaningful, and probably difficult to get far up the righteousness bell curve. I agree that one can certainly conceive of photorealistic child pornography that involved no prior injury to any child, but nobody can guarantee a future trajectory for its consumers, and just as being drunk in a motor vehicle is an offense even though one probably would not have gone on to splat a few pedestrians, I can see a reasonable argument for not legalizing any form of CSAM, regardless of how "antiseptic" its origin was. And as for the camel's-nose argument, all but total morons should have noticed by now that the tent has become a camel garage.
 
Upvote
-17 (18 / -35)
There are hills to die on, and hills not to die on. Trying to block the rounding up and murdering of racial or religious minorities is a righteous, and meaningful, hill to die on. CSAM hills, in any form, seem much less meaningful, and probably difficult to get far up the righteousness bell curve. I agree that one can certainly conceive of photorealistic child pornography that involved no prior injury to any child, but nobody can guarantee a future trajectory for its consumers, and just as being drunk in a motor vehicle is an offense even though one probably would not have gone on to splat a few pedestrians, I can see a reasonable argument for not legalizing any form of CSAM, regardless of how "antiseptic" its origin was. And as for the camel's-nose argument, all but total morons should have noticed by now that the tent has become a camel garage.
The trouble is, "for the children" is the most common battle cry for taking away countless freedoms and spying on you and everyone else. Don't forget, trans people are becoming illegal "for the children." We MUST stop ANY blanket "for the children" rallying cry, because it's NEVER for the children, ever.
 
Upvote
57 (70 / -13)

Kazper

Ars Praefectus
4,182
Subscriptor
My main concern here is the edge cases. Obviously some such created images would be very clear-cut, but how do you distinguish an AI image of a 14-year-old from one of an 18-year-old, given that you already have real (porn) actresses who blur that line?

Also, I agree with others that I'd like to see the research on why this, of all things, normalizes behavior while all other fictional descriptions of illegal acts do not.

I do agree that as AI images get more and more realistic, it might reach a point where you can't definitively say whether an image is real or AI-made. In that case, and given there is no doubt at all that it depicts a child, it would have to be prosecuted as if it were real. You can't always expect to be able to identify the victims, after all.
 
Upvote
18 (21 / -3)

enilc

Ars Praefectus
3,783
Subscriptor++
Reminder - on the whole "making it illegal because most disapprove of it" thing: many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.

First they came for the pedophiles...

? Is that where the 'slippery slope' began?
 
Upvote
10 (20 / -10)
I get why investigators do this: they can investigate this case from their offices, the evidence is data rather than verbal testimony of minors, there are no messy family issues to deal with, no minor offenders, no need to get social workers involved.

It's all the benefits of being able to brag about protecting children without any of the difficulties of actually investigating and prosecuting child sexual abuse.
 
Upvote
59 (65 / -6)

Castellum Excors

Ars Praetorian
529
Subscriptor++
There are a lot of feelings about this, but I would propose considering: what's the point of making CSAM illegal? If it's to protect children from harm, then why this?
The story says one model was trained on real CSAM, which means the rest were not. Are we making CSAM illegal because people don't like it (a very bad way to pass laws), or to protect people? If no one was hurt in the creation of most of it (all the models NOT trained on real CSAM), then is it bad even if most people dislike it?

Reminder - on the whole "making it illegal because most disapprove of it" thing: many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.

edit: It's impossible to prove a negative. The concept of "prove it WASN'T trained on real CSAM" is impossible. But the same can be said about anything: prove you haven't murdered someone with that gun you own and hidden the body. It sounds preposterous, but it's the same logic. Hence why law is about proving, not disproving.

The whole line of questioning is moot. With Epstein, Diddy, etc., it just shows that people in power seem to be pedophiles themselves. It's fine for the rich and powerful. I don't see Interpol going after Prince Andrew, for instance. It feels like this whole thing is more of a smokescreen, if anything. Heck, we can't even get Epstein's list released in full. Hell, even the FBI cannot seem to get the full documentation from itself! But Interpol is going to go after AI-generated images instead.
 
Upvote
49 (50 / -1)

Amarillo3

Wise, Aged Ars Veteran
159
Subscriptor
What I've read is that some CSAM viewers start out as compulsive porn users (e.g., addicts) who seek out more and more extreme/illegal content over time as they become desensitized. If that's true, I could see viewing AI-generated CSAM as a step on the path to "real" CSAM. If someone thinks the AI-generated CSAM is safe and harmless viewing, they'd be more likely to take that step, I would guess, and end up where the harm is not arguable.

(Not an expert!)
 
Upvote
-13 (11 / -24)
If you do that locally and don't share it, police have no way of finding out...
I'm not quite sure about the definition of CSAM, but I suspect some anime lovers might get arrested :)
For better or worse, there IS no definition. The Supreme Court itself defined porn as "I'll know it when I see it." It's subjective turtles stacked all the way down.
edit: So is a picture of a teenager on the beach porn? Or just someone getting a tan? There is no specific line that can be drawn, because the same image can be sexual to some and non-sexual to others. Hence, when you look at any "legal definition," it's, well... subjective. Yes, there are cut-and-dried cases of obvious sexual activity (read: RAPE), but then many other cases where it's not. It's up to the investigator to rule on it as judge and jury.
This isn't an argument against making CSAM illegal, just pointing out that it can be incredibly subjective at times.
What I've read is that some CSAM viewers start out as compulsive porn users (e.g., addicts) who seek out more and more extreme/illegal content over time as they become desensitized. If that's true, I could see viewing AI-generated CSAM as a step on the path to "real" CSAM. If someone thinks the AI-generated CSAM is harmless, they'd be more likely to take that step, I would guess, and end up where the harm is not arguable.

(Not an expert!)
This is the "slippery slope" fallacy, which is well... a logical fallacy. I like eating meat, but chicken is boring. I'll try more exotic meats. 6 months later I'm a cannibal.....
 
Upvote
26 (33 / -7)
What I've read is that some CSAM viewers start out as compulsive porn users (e.g., addicts) who seek out more and more extreme/illegal content over time as they become desensitized. If that's true, I could see viewing AI-generated CSAM as a step on the path to "real" CSAM. If someone thinks the AI-generated CSAM is safe and harmless viewing, they'd be more likely to take that step, I would guess, and end up where the harm is not arguable.

(Not an expert!)
I had to go to dealers to get weed. I was never interested in coke or heroin. I could have gotten it. I didn't.
 
Upvote
28 (30 / -2)

Amarillo3

Wise, Aged Ars Veteran
159
Subscriptor
For better or worse, there IS no definition. The Supreme Court itself defined porn as "I'll know it when I see it." It's subjective turtles stacked all the way down.
edit: So is a picture of a teenager on the beach porn? Or just someone getting a tan? There is no specific line that can be drawn, because the same image can be sexual to some and non-sexual to others. Hence, when you look at any "legal definition," it's, well... subjective. Yes, there are cut-and-dried cases of obvious sexual activity (read: RAPE), but then many other cases where it's not. It's up to the investigator to rule on it as judge and jury.
This isn't an argument against making CSAM illegal, just pointing out that it can be incredibly subjective at times.

This is the "slippery slope" fallacy, which is well... a logical fallacy. I like eating meat, but chicken is boring. I'll try more exotic meats. 6 months later I'm a cannibal.....
What I'm talking about is an addiction or a compulsion. Is meat addiction a thing?
 
Upvote
-9 (4 / -13)
I didn't write the article; Ms. Belanger outlined what she views as specific harms to real people. If you disagree with her, you should tell her. I'm just quoting what she wrote.
Ms. Belanger is obviously wrong if you take into account pretty much any sociological research. E.g., BDSM is a harm-free way to (consensually) release violent inhibitions that would otherwise be bad. I can't see any argument why the causation wouldn't be the same with CSAM.
But that's not exactly my point. I'm more concerned with the CSAM definition in relation to teens. Not even as a porn category, but as thousands of ruined teen lives from becoming criminals over (stupid, but still) jokes... That's for everyone who still remembers what being a teen feels like ;)
 
Upvote
30 (31 / -1)

shockdoktor

Smack-Fu Master, in training
5
Look, we can all agree that CSAM is abhorrent, and I want to be very clear here that I am a survivor of childhood sexual abuse in which CSAM was created when I say what I say next. As disgusting as CSA is, it is also objectively true that prohibition makes most problems worse. Want fewer abortions? Uphold women's bodily and financial autonomy, teach human sexuality and birth control early and often, and watch the abortion rate decline like we have these last thirty years. Want to reduce teen pregnancy, or even teen sexual activity? Change the school messaging from abstinence to healthy sexuality, make porn ubiquitous, and watch the pregnancy rate plummet like we have for the last thirty years. Want to reduce the harms of drug trafficking and the toll of destroyed lives via drugs? Legalize, regulate, establish needle exchanges and drug testing centers, take the savings from enforcement and put it into science-based treatment and universal mental health care, and watch the "drug problem" evaporate. These are objective facts.

It is the case that in every one of the instances above, arguments about "normalizing" the behavior were used to support prohibitionist and incarcerationist viewpoints. These arguments have been proven wrong time and time again; there is no reason to expect that CSA+M is any different.

The stigma around acknowledging children's early sexuality means that the overwhelming majority of victims do not seek help, either as children or as adults (the average age of acknowledging CSA is 48), and feel lifelong shame that leads them to drug abuse, suicide, and increased rates of a host of other mental health problems. The stigma around the condition of being attracted to children is so powerful that men (the overwhelming majority of abusers) do not seek help for the condition from a mental health professional. Addicts without clinical and community support are an order of magnitude more likely to fall back into old habits, and people attracted to children are no different. So the stigma itself - the reflexive disgust we all feel when we countenance this condition - leads to increased CSA.

The stigma around CSAM is so powerful that there is precious little actual scholarship about its effects. In the 90s it was theorized that CSAM actually reduces sexual abuse by providing addicts an outlet for their urges. Certainly there was no evidence linking CSAM consumption with increased rates of CSA back then (that may have changed with the advent of dark web CSAM forums, more on those later). I don't think this was ever really researched, as the idea was obviously problematic for many: those images weren't victimless. But even that was not as black-and-white as you might think. Personally, if the photos taken of me as a child are still circulating, I hope that they are helping to keep another child safe by sublimating an addict's urges. After all, I did nothing wrong. I have nothing to be ashamed of. If given the chance to allow my photos to be used in a treatment program or study, I would unequivocally accept. The incredible stigma and illegality around CSAM also force users into online communities, where in many cases the door only opens when you provide evidence of having abused a child. Think about that: CSAM prohibition directly leads to CSA, in a real, tangible way, while the "normalization" argument is purely theoretical. Further, these communities do the work that male-dominated illegal subcultures like gangs/cartels, prisons, and radical religious communities do in the real world, radicalizing recruits by normalizing increased violence and younger and more vulnerable victims.

There are no good options here. But this is absolutely a mental health issue being forced into a frankly medieval prohibition-and-punishment context that doesn't work at all to better public health. Why would we imagine it would work here?
 
Upvote
66 (66 / 0)
As I said, there are hills to die on, and hills not to die on. Fighting banning vaccinations "for the children" will unequivocally be a more worthy death hill than banning ersatz child porn "for the children". Not a difficult concept.
Disagree; this tacitly endorses thought crimes by saying we should just let it happen.
 
Upvote
10 (10 / 0)

Num Lock

Ars Centurion
374
Subscriptor
What I've read is that some CSAM viewers start out as compulsive porn users (e.g., addicts) who seek out more and more extreme/illegal content over time as they become desensitized. If that's true, I could see viewing AI-generated CSAM as a step on the path to "real" CSAM. If someone thinks the AI-generated CSAM is safe and harmless viewing, they'd be more likely to take that step, I would guess, and end up where the harm is not arguable.

(Not an expert!)
This makes no sense. By that logic, the people who are forced to view CSAM to take it down, investigate, and prosecute offenders would all become pedophiles by exposure to the content.

What seems more likely to me is that some percentage of the population is predisposed to be into CSA, and some subset of that group is going to decide to move past consumption to action. Consuming AI-generated CSAM is absolutely a red flag, and if it provably helps catch these people before they harm a real child, or finds abusers who might otherwise not be caught, I lean toward criminalizing it.
 
Upvote
-11 (4 / -15)

42Kodiak42

Ars Scholae Palatinae
807
There are a lot of feelings about this, but I would propose considering: what's the point of making CSAM illegal? If it's to protect children from harm, then why this?
The story says one model was trained on real CSAM, which means the rest were not. Are we making CSAM illegal because people don't like it (a very bad way to pass laws), or to protect people? If no one was hurt in the creation of most of it (all the models NOT trained on real CSAM), then is it bad even if most people dislike it?

Reminder - on the whole "making it illegal because most disapprove of it" thing: many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.

edit: It's impossible to prove a negative. The concept of "prove it WASN'T trained on real CSAM" is impossible. But the same can be said about anything: prove you haven't murdered someone with that gun you own and hidden the body. It sounds preposterous, but it's the same logic. Hence why law is about proving, not disproving.
My general approach has been to examine the practical challenges and effects of both stances, and there are very good practical reasons to take a stance of "if it looks real, treat it like it's real."

The biggest practical reason legalizing it is a bad idea is that asking both law-abiding citizens and law enforcement to differentiate between real images and images made to look as realistic as possible is only going to get more difficult and unreliable as technology moves forward, and allowing things that look real but aren't makes abiding by the law much harder for citizens trying to enjoy the full extent of their freedoms. Law enforcement is going to be a lot more capable and motivated when it comes to obtaining proof that an image is actually real illegal material. And some law-abiding citizens are going to be sent to court for theoretically legal activities anyway; "innocent until proven guilty" just means that twelve people in a row need to believe flawed evidence to send someone to jail.

The next big reason is that generators may provide a means to "launder" illegal materials (including the significantly worse activities that may create those materials) for distribution with plausible deniability. That plausible deniability creates a massive problem if someone who would prefer to follow the law believes it, because then they start partaking in an illegal activity (and possibly funding further illegal activities) without realizing it. Without going into what I think would happen in each scenario, there aren't really any good outcomes in any combination of the perpetrator, law enforcement, or a jury buying into that plausible deniability.

The worst downside of "if it looks real, treat it as though it were" is that sometimes a generator will unexpectedly produce material that is illegal by that definition, and even then, something that's deleted 15 seconds after it's created and never spoken about isn't going to be found by police. Law-abiding citizens who would like to enjoy fake images are going to miss out as well, and it can make criminals out of people who otherwise would have followed the law.

But it gives us a law that's easy to follow and easy to test, and sometimes that's more important than theoretically allowing everything that doesn't cross our moral standards.

Edit: Since you do mention the difficulties of "prove it wasn't" explicitly, I think I should be a little more explicit about what can go wrong with that. Someone who wants to enjoy fake images would need court-admissible proof that those materials are fake to safely enjoy them, even if this isn't explicitly stated in the law. Not because it's their duty to prove their innocence to a jury, but because if those materials were actually real, the police might come across evidence that they're real and illegal that the image enjoyer didn't find. Juries aren't going to buy the excuse of "I didn't know it was illegal" if the prosecution has material evidence that the images were real. Having court-admissible proof that their activities were within the bounds of the law keeps them from unknowingly partaking in illegal activities in the first place, and it helps a lot if the prosecution brings up evidence that seems to prove the person guilty through some logical fallacy that nobody in the defense, prosecution, or jury recognizes - because that has actually happened in a courtroom before.
 
Last edited:
Upvote
4 (6 / -2)
So none of you read the article detailing the specific harms to actual people, huh?
Like others have said, this is typical government propaganda using moral-outrage, slippery-slope style. While we can't prove this has harmed anyone and you can't prove us wrong, we just KNOW that someone who plays violent video games will become violent. Or in this case, it has to be true, as no one would dare argue it's false! This is just more big government becoming helpful Big Brother. We know there was a crime somewhere. No one would argue against it: "protecting the children," human trafficking, equity, and all sorts of other nonsense are used to take away people's freedoms. Just like we need CSAM scanning of all text messages to protect the children...
 
Upvote
12 (16 / -4)

TimeWinder

Ars Scholae Palatinae
1,763
Subscriptor
As I said, there are hills to die on, and hills not to die on. Fighting banning vaccinations "for the children" will unequivocally be a more worthy death hill than banning ersatz child porn "for the children". Not a difficult concept.
We contain multitudes. We don't have to choose between these.
 
Upvote
7 (8 / -1)
Hmm, I kind of have mixed feelings on this; initially my gut said "good, get them." But thinking for three more seconds... I think the focus should remain on real-life abuse, since those are the people causing actual physical and mental harm. This harms no one, and in a way it may let people with this issue redirect their pedo energy into something harmless???

I just watched a movie with a CGI nuclear bomb. Should we arrest all the artists who worked on that too, since it normalizes nuclear weapons and mass destruction? We should also arrest a good chunk of the anime community next.
Yeah, if nothing else, this is a diversion of resources away from extremely victimizing crimes to victimless ones.

Police resources are, for better and for worse, limited and zero-sum. Please spend them on rescuing and protecting actual children.
 
Upvote
12 (16 / -4)

AnonyGuy

Seniorius Lurkius
9
The stigma around CSAM is so powerful that there is precious little actual scholarship about its effects. In the 90s it was theorized that CSAM actually reduces sexual abuse by providing addicts an outlet for their urges.
Thanks for everything you wrote. I tend to agree with all of it and find it very informative.

This part here is really critical. There absolutely should be more study on this without judgment and instead a focus on the impact.

While I tend to agree with the theory, I think we should be very careful about simply transferring what works elsewhere to this. For example, I've read articles about the impact of porn, and the TL;DR is that a lot of guys who would've gone out and date-raped stayed home and jerked off to porn instead.

However, when it comes to CSAM, or AI CSAM, legalizing it doesn't just mean allowing it to exist; it also means allowing it to be promoted. This is a bit different from porn in that porn (not all of it) depicts normal sexual behavior between consenting adults (there are other laws about non-consensual porn). Promoting normal sexual behavior between consenting adults is one thing, but promoting child sexual abuse is very different.

One can theorize about this one way or the other, but we should definitely put other issues aside and study the impact it has on those attracted to CSAM as well as on the victims of it.
 
Upvote
6 (6 / 0)
This makes no sense. By that logic, the people who are forced to view CSAM to take it down, investigate, and prosecute offenders would all become pedophiles by exposure to the content.

What seems more likely to me is that some percentage of the population is predisposed to be into CSA, and some subset of that group is going to decide to move past consumption to action. Consuming AI-generated CSAM is absolutely a red flag, and if it provably helps catch these people before they harm a real child, or finds abusers who might otherwise not be caught, I lean toward criminalizing it.
deleted
 
Last edited:
Upvote
-2 (0 / -2)

julesverne

Ars Scholae Palatinae
1,153
...there's growing consensus globally that, in general, AI-generated CSAM harms kids by normalizing child sex abuse through the increased prevalence of CSAM online.
This is as tiresome and predictable as it is unscientific.
"Normalising" is the codeword for increased danger for children. There is, however, little evidence that porn, CSAM or otherwise, increases sexual crimes. On the contrary, multiple studies suggest that consumption of pornography actually decreases the prevalence of sexual violence. This is of course not a plea for permissiveness when it comes to CSAM which itself is a crime. But the "normalization" trope needs to be rebutted.

As far as AI CSAM is concerned, the term is an oxymoron. If it's AI, it's not strictly CSAM. Even if studies have proven that training data usually contain isolated CSAM elements, the generation of the images is built on the bulk of the data, namely normal images. Real CSAM isn't needed to create an AI facsimile. So, again, the arguments put forward are unscientific.

Should "AI CSAM" be criminalized? As distasteful as it is, the weight of evidence suggests it shouldn't be. Will the data influence policy? Of course not. Too tempting is the "think-of-the-children" trophy. When the topic is highly emotional, logic has always had a tough time.
 
Last edited:
Upvote
15 (19 / -4)