> Can't wait for all the "no one is being hurt by my decision to generate AI CSAM locally and not share it! This is thought crime!" apologists that are so fond of these articles.

If you do that locally and don't share it, police have no way of finding out...
There are a lot of feelings about this, but I would propose considering: what's the point of making CSAM illegal? If it's to protect children from harm, then why this?
The story says one model was trained on real CSAM, which means the rest were not. Are we making CSAM illegal because people don't like it (a very bad way to pass laws), or to protect people? If no one was hurt in the creation of most of it (all the models NOT trained on real CSAM), then is it bad even if most people dislike it?
Reminder: on the whole "make it illegal because most disapprove of it" thing, many people disapprove of Jews or Muslims in America. Should they be illegal? There needs to be a logical reason for laws, not illogical cultural ones.
edit: It's impossible to prove a negative. The demand to "prove it WASN'T trained on real CSAM" can never be satisfied. But the same can be said about anything: prove you haven't murdered someone with that gun you own and hidden the body. It sounds preposterous, but it's the same logic. Hence why law is about proving, not disproving.
There are hills to die on, and hills not to die on. Trying to block the rounding up and murdering of racial or religious minorities is a righteous, and meaningful, hill to die on. CSAM hills, in any form, seem much less meaningful, and probably difficult to get far up the righteousness bell curve. I agree that one can certainly conceive of photorealistic child pornography that involved no prior injury to any child, but nobody can guarantee a future trajectory for its consumers, and just as being drunk in a motor vehicle is an offense even though one probably would not have gone on to splat a few pedestrians, I can see a reasonable argument for not legalizing any form of CSAM, regardless of how "antiseptic" its origin was. And as for the camel's-nose argument, all but total morons should have noticed by now that the tent has become a camel garage.
> So none of you read the article detailing the specific harms to actual people, huh?

It's bullshit. What's next? BDSM?
The trouble is, "for the children" is the most common battle cry for taking away countless freedoms and spying on you and everyone else. Don't forget, trans people are becoming illegal "for the children". We MUST stop ANY blanket "for the children" rallying cry, because it's NEVER for the children, ever.
For better or worse, there IS no definition. The Supreme Court itself defined porn as "I know it when I see it". It's subjective turtles stacked all the way down.
I'm not quite sure about the definition of CSAM, but I suspect some anime lovers might get arrested!
This is the "slippery slope" fallacy, which is well... a logical fallacy. I like eating meat, but chicken is boring. I'll try more exotic meats. 6 months later I'm a cannibal.....What I've read is that some CSAM viewers start out as compulsive porn users (e.g., addicts) who seek out more and more extreme/illegal content over time as they become desensitized. If that's true, I could see viewing AI-generated CSAM as a step on the path to "real" CSAM. If someone thinks the AI-generated CSAM is harmless, they'd be more likely to take that step, I would guess, and end up where the harm is not arguable.
(Not an expert!)
I had to go to dealers to get weed. I was never interested in coke or heroin. I could have gotten them. I didn't.
What I'm talking about is an addiction or a compulsion. Is meat addiction a thing?
edit: So is a picture of a teenager on the beach porn, or just someone getting a tan? There is no specific line that can be drawn, because the same image can be sexual to some and non-sexual to others. Hence, when you look at any "legal definition", it's, well... subjective. Yes, there are cut-and-dried cases of obvious sexual activity (read: RAPE), but there are many other cases where it's not. It's up to the investigator to rule on it as judge and jury.
This isn't an argument against making CSAM illegal, just pointing out that it can be incredibly subjective at times.
This is the "slippery slope" fallacy, which is well... a logical fallacy. I like eating meat, but chicken is boring. I'll try more exotic meats. 6 months later I'm a cannibal.....
> I didn't write the article; Ms. Belanger outlined what she views as specific harms to real people. If you disagree with her, you should tell her. I'm just quoting what she wrote.

Ms. Belanger is obviously wrong if you take into account pretty much any sociological research. E.g., BDSM is a harm-free way to (consensually) release violent inhibitions that would otherwise be bad. I can't see any argument why it's not the same causation with CSAM.
What you described is the "slippery slope": someone does something innocuous, but wants more and different, then more and different, and bam, they are doing something terrible.
> As I said, there are hills to die on, and hills not to die on. Fighting the banning of vaccinations "for the children" will unequivocally be a more worthy death hill than banning ersatz child porn "for the children". Not a difficult concept.

Disagree; this is tacitly endorsing thought crimes by saying to let them happen.
Training a generative AI model to produce CSAM seems like action quite outside the range of mere "thought". Maybe find a better metaphor.
This makes no sense. By that logic, the people who are forced to view CSAM to take it down, investigate it, and prosecute offenders would all become pedophiles through exposure to the content.
My general approach has been to examine the practical challenges and effects of both stances, and there are very good practical reasons to take the stance of "if it looks real, treat it like it's real".
Like others have said, this is typical government propaganda using moral outrage, slippery-slope style: we can't prove this has harmed anyone and you can't prove us wrong, but we just KNOW! that someone who plays violent video games will become violent. Or, in this case, it has to be true because no one would dare argue it's false! This is just more big government becoming helpful Big Brother: we know there was a crime somewhere, and no one would argue against it. "Protect the children", human trafficking, equity, and all sorts of other nonsense are used to take away people's freedoms. Just like we need CSAM scanning of all text messages to protect the children...
We contain multitudes. We don't have to choose between these.
> Hmm, I kind of have mixed feelings on this. Initially my gut said "good, get them", but thinking for three more seconds... I think the focus should remain on real-life abuse, since those are the people causing actual physical and mental harm. This harms no one, and in a way may let people with this issue redirect their pedo energy into something harmless???
> I just watched a movie with a CGI nuclear bomb. Should we arrest all the artists who worked on that too, since it normalizes nuclear weapons and mass destruction? Should we also arrest a good chunk of the anime community next?

Yeah, if nothing else, this is a diversion of resources away from extremely victimizing crimes to victimless ones.
> The stigma around CSAM is so powerful that there is precious little actual scholarship about its effects. In the 90s it was theorized that CSAM actually reduces sexual abuse by providing addicts an outlet for their urges.

Thanks for everything you wrote. I tend to agree with all of it and find it very informative.
What seems more likely to me is that some percentage of the population is predisposed to be into CSA, and some subset of that group is going to decide to move past consumption to action. Consuming AI-generated CSAM is absolutely a red flag, and if it provably helps catch these people before they harm a real child, or helps find abusers who might otherwise not be caught, I lean towards criminalizing it.
> ...there's growing consensus globally that, in general, AI-generated CSAM harms kids by normalizing child sex abuse through the increased prevalence of CSAM online.

This is as tiresome and predictable as it is unscientific.