So if legalized porn reduces rapes, as studies show, how do we figure out whether this existing allows for less abuse of kids, or whether it spawns long-term interest?
Cartoon CSAM has been legal in Japan for decades, and Japan has a lower CSA rate per capita than the USA.
There are some brain studies showing that the area of the brain responsible for caring for children sits right next to the part responsible for sexual pleasure. The studies suggest there may be crossed signaling between these adjacent regions that causes someone to be a pedophile. These people don't experience sexual pleasure without thinking about kids. It's literally a disability.
My opinion is that we don't know whether removing AI-generated CSAM would make things worse for real-life kids or not. But flat-out banning it without proper research would be irresponsible.
I think the whole argument is moot. AI image generation is available to pretty much everyone, and it's impossible to control what people are doing with it.
Maybe if it's self-hosted, but if the AI is hosted by someone else… I imagine it would be as easy as keywords being reported/flagged.
Self-hosting is trivial these days: any modern NVIDIA card will do, and there are hundreds of models available online.
Thanks for the thoughts on this. The way people were only downvoting originally, without giving any actual explanation why, had me thinking it was just dumb to ask.