*Trigger Warning* AI used to make CSAM?!

nerograves
*Trigger Warning* I know AI is a hot-button topic, but this is about something else related to AI. Apparently Grok AI is being used by some users to make CSAM. For me, I accepted that the photos my perp took of me (and possibly other kids) after the molestations are likely floating around out there. I don't know what he did with them, as all my school did was let him work through his contract term and leave. But to know that some users are using certain AI tools to make this type of imagery without any regulations pisses me off.

It especially irritates me after I learned these tools don't just amalgamate vectors to make mishmash images or videos; they can now recreate someone's entire likeness (still with noticeable flaws) and turn it into NSFW content (without the consent of the person in the photo) or CSAM.

I work with AI, so I knew this kind of thing was possible, but having no regulations against some of these tools (some tools filter this out immediately) and allowing people to just make this is bullshit. Like, what the hell!

Grok's 'sexy' settings: https://www.businessinsider.com/elon-musk-grok-explicit-content-data-annotation-2025-9

More recent news: https://www.theverge.com/news/853191/grok-explicit-bikini-pictures-minors

Sorry for the language; this was more of a rant to get this off my chest. But this just bothers the hell out of me.
 
Some states are ahead of the curve & have already enacted laws barring AI generated CSAM. Would be nice if some of these companies/developers would put some sort of morals ahead of the almighty dollar though.
 
Some states are ahead of the curve & have already enacted laws barring AI generated CSAM. Would be nice if some of these companies/developers would put some sort of morals ahead of the almighty dollar though.
Thanks for sharing that. I believe it's only a few states that don't criminalize it yet; fortunately, the majority of states do. It just irritated me, because a backend filter would block the ability to generate CSAM in the first place. But like you mentioned, some of these companies/developers put money ahead of morals.

I guess for some of them, trying to justify billions in debt against the low profits they have outweighs morality.
 
I would rather people jerk off to AI kids versus real kids.

But, here's the thing with AI. If you are going to use it, read the fine print and terms of service. It's important to know where they are pulling content from and what they do with the info you dump into it. So many people are using it to create fun AI slop and don't realize that every photo you upload can potentially "train" the model.
Big text for emphasis:
It's up to us as users to be responsible with it just as much as it is for the companies to put up safeguards.

For what it is worth, though, most models already disallow sexual content. I've used AI in a therapeutic way, and I am unable to create anything that refers to or implies sexual abuse or trafficking. It will not generate the content and will flag it for human review. At that point you can make a case for why you want the content to be generated, but I didn't get that far. I just canceled the request and pivoted the project.
 
The EU is, I think, the first to make laws explicitly for AI use. I am not always okay with how they handle things that should be free to use, and one wrong use doesn't make the whole thing a problem. Like the internet: I feel I have the right to use it as anonymously as possible, for my safety and privacy, even if some use it for bad stuff.
But when it comes to AI, I see how “old” I am starting to get. I just don't trust it, also because, even if I am good at spotting details when many aren't… that AI stuff is getting incredibly good. For now the tells are often minimal, like something off in the movements or a missing “human” look.
 
I would rather people jerk off to AI kids versus real kids.
And here is where thinking this makes me feel uneasy. I think many pedos would never touch a child and use online stuff to blow off steam, be it real CSAM, stories, or now this AI stuff. I ask: if someone is aware of having that “preference” and knows that it is wrong, ESPECIALLY acting out on it, could AI-created material even be helpful?
The real criminals won't stop… and neither will the real abuse. When they make laws to control online CSAM, I ask myself if they really don't know that most predators would never record something and risk getting caught. Mine were already that intelligent…
 
It's up to us as users to be responsible with it just as much as it is for the companies to put up safeguards.
True, users need to be responsible just as much as companies. It's why I tell some people I know to be careful uploading images to LLMs, as that data will be retained to train the model. Where I work, we keep historical data for about five or so years to train the models for forecasting. We anonymize it, but that data is kept for fraud prevention and analytics.
 
And here is where thinking this makes me feel uneasy. I think many pedos would never touch a child and use online stuff to blow off steam, be it real CSAM, stories, or now this AI stuff. I ask: if someone is aware of having that “preference” and knows that it is wrong, ESPECIALLY acting out on it, could AI-created material even be helpful?
The real criminals won't stop… and neither will the real abuse. When they make laws to control online CSAM, I ask myself if they really don't know that most predators would never record something and risk getting caught. Mine were already that intelligent…
I think we need to be clear here that people who jerk off to any sort of CSAM are "real" criminals too. It is illegal. Full stop. Just because a pedo is not touching children does not mean he is not a criminal.

But, I do understand what you are saying. AI might be a tool to prevent the need to create content using real children.
 
Just because a pedo is not touching children does not mean he is not a criminal.
But, I do understand what you are saying. AI might be a tool to prevent the need to create content using real children.
I did not really think about CSAM being created for any need other than to “have a video of what I did” and, in some cases, “can't wait to show my buddy.” I hope the sarcasm in what I say comes through. But the thought that it is largely produced to be sold, to make money, seeing in those children nothing more than a tool, a thing needed for the product to be made… I can't wrap my head around this. And writing this out now, after having read too many stories in here, and after having this gift of deep conversations with people who have been through hell and are now helping me get over my hell? My heart has been with each and every one of you since I got here.

What I want to say is that a man who does not rape a woman is not a criminal. Even if he thinks day after day about how good it must be to rape one. He watches rape porn but would never do it. So he is not a criminal.
I have read a lot about what this attraction to kids is, or could be; with so many different studies and views and opinions, it is not clear. And I think many of you know that a man abusing a child does not mean he is sexually attracted to the child. Very often it is a power thing. I am sure my perps love women, yet…

You are absolutely right: CSAM is illegal, and the law makes it a crime to consume. My take is that the law is not always aligned with what is right. Whoever is convicted for consuming it did not do the horrible things the producer did, nor did he probably pay for it. What confuses me is seeing how many years the producers, the abusers, get after damaging human children for life.
Therefore my logical take on all of it together is this: IF a written, computer-generated, or drawn story could be enough for the large majority of those who are attracted to children to never touch a child in all their life, would that make the prohibition of it more harmful than useful?

(Getting a little too philosophical here, sorry guys. I just get excited when discussing with people who make great points and listen, not just talk.)
 
I'm going to choose the "block this thread" option on this one. It and the discussion are giving me the creeps so bad my phone is shaking out of my hand. No reason for any child porn of any kind, and no justified reason for watching. Billy
If you find you need to look at or create child porn, you may have been abused, but you are not ready to be here!
 
I did not really think about CSAM being created for any need other than to “have a video of what I did” and, in some cases, “can't wait to show my buddy.” I hope the sarcasm in what I say comes through. But the thought that it is largely produced to be sold, to make money, seeing in those children nothing more than a tool, a thing needed for the product to be made… I can't wrap my head around this. And writing this out now, after having read too many stories in here, and after having this gift of deep conversations with people who have been through hell and are now helping me get over my hell? My heart has been with each and every one of you since I got here.
I don't want to call you naive, but CSAM is not just home videos of sessions with boys shot by creepy dudes in a basement. It's a huge, organized industry. Be thankful you can't wrap your head around it.

What I want to say is that a man who does not rape a woman is not a criminal. Even if he thinks day after day about how good it must be to rape one. He watches rape porn but would never do it. So he is not a criminal.
Correct. A man who jacks off to rape porn involving ADULTS is not a criminal. Rape porn is a type of fetish, and because there is a market for it, it is produced. The key word here is "produced." It uses actors to simulate rape storylines and scenarios. The actors are all consenting adults. Are there videos of actual rape? Probably. But if you are getting content from legal porn websites, they do a really good job of filtering that content out.

Again, to be clear, it is entirely different when there are underage kids in the video. A person who watches videos of children having sex IS A CRIMINAL. This is NOT a gray area. This is the law. It does not matter if the person watching never touches kids in real life. IT IS ILLEGAL AND THE PERSON WATCHING IS A CRIMINAL.

You are absolutely right: CSAM is illegal, and the law makes it a crime to consume. My take is that the law is not always aligned with what is right. Whoever is convicted for consuming it did not do the horrible things the producer did, nor did he probably pay for it. What confuses me is seeing how many years the producers, the abusers, get after damaging human children for life.
Child porn is an industry. It is made for the consumers, not the producers. There's a saying: "don't eat where you shit." Those who are involved in the production of child porn (I'm talking the actual industry side, not the pervy guy in the basement with a camcorder) are businessmen. They are not in it to get their rocks off. They are not in it because they are attracted to kids. They are in it because there's big money to be made. Boys are the product. They are selling the product. The product wouldn't need to be sold if there were not a consumer wanting to buy it.

Therefore my logical take on all of it together is this: IF a written, computer-generated, or drawn story could be enough for the large majority of those who are attracted to children to never touch a child in all their life, would that make the prohibition of it more harmful than useful?
That's the debate. And as I said before, I would rather someone jack off to videos of robot kids than real kids.
 
Thanks for the intelligent and respectful discussion. I see, though, the need to keep myself out of these sensitive topics, because I really feel bad when a thought process of mine does harm to anyone.
I hope at least that it is clear that I am by no means in favour of any production of CSAM.
I also know that it is the LAW that makes something criminal or not. That was not what I wanted to say.
And sorry to anyone who got triggered.
 
I have no idea what Grok or CSAM is, or any of the other abbreviations.
Well, Grok in this case is an AI chatbot on Twitter (now X). CSAM is an abbreviation for Child Sexual Abuse Material. Most AI chatbots block/restrict Not Safe For Work (NSFW) or CSAM image-generation prompts. Apparently, though, some users use Grok to make these images. It's not specific to Grok AI alone, but the National Center for Missing & Exploited Children and police have flagged a troubling rise.
 
Any use of CSAM is a criminal offense. It's the equivalent of trading in stolen property. I don't care if it's a one-off home-made job or professionally shot; criminals are profiting from its use. And the children suffered, and many continue to suffer their entire lives. What's worse is that it's almost in perpetuity... it doesn't ever go away.

As for AI as a means of preventing the need for real abuse, BS. The models were built by crunching this stuff. Any AI output will always be guilty of those original crimes.
 
Child porn is an industry. It is made for the consumers, not the producers. There's a saying: "don't eat where you shit." Those who are involved in the production of child porn (I'm talking the actual industry side, not the pervy guy in the basement with a camcorder) are businessmen. They are not in it to get their rocks off. They are not in it because they are attracted to kids. They are in it because there's big money to be made. Boys are the product. They are selling the product. The product wouldn't need to be sold if there were not a consumer wanting to buy it.
I would like to say something in regard to this caricature.
There's a famous line in a Chris Farley movie: "If you want a good T-bone 🥩, you can stick your head up a bull's ass to find the best, or you can take the butcher's word for it!"
That's a dream told to a child! It's the equivalent of saying that bankers don't touch money or that butchers are vegan! They prefer the opera and only give to the best charities, too. Peace, Billy
 