General Discussion
'I Want That Sweet Baby': AI-Generated Kids Draw Predators On TikTok And Instagram (Forbes)
https://www.forbes.com/sites/alexandralevine/2024/05/20/ai-generated-kids-tiktok-instagram-social-media-child-safety-predators/
Images of AI children on TikTok and Instagram are becoming magnets for many with a sexual interest in minors. But when this content is legal and depicts fake people, it falls into a messy, troubling gray area.
By Alexandra S. Levine, Forbes Staff
-snip-
They're pictured in lace and leather, bikinis and crop tops. They're dressed suggestively as nurses, superheroes, ballerinas and French maids. Some wear bunny ears or devil horns; others, pigtails and oversized glasses. They're Black, white and Asian, blondes, redheads and brunettes. They were all made with AI, and they've become magnets for the attention of a troubling audience on some of the biggest social media apps in the world: older men.
-snip-
"If this is AI-generated, does it make me bad to say she's hot as everything put together?" another TikToker wrote on a slideshow of fully clothed little girls in Spider-Man costumes. "She's not real, but I've got a great imagination."
-snip-
"Looks tasty. Do you do home delivery?" "The perfect age to be taken advantage of." "I want that sweet baby." "Can you do a test where she jumps out of my phone into my bed?" said others on TikTok and Instagram. Forbes found hundreds of posts and comments like these on images of AI-generated kids on the platforms from 2024 alone. Many were tagged to musical hits, like Beyoncé's "Texas Hold 'Em," Taylor Swift's "Shake It Off" and Tracy Chapman's "Fast Car," to help them reach more eyeballs.
Child predators have prowled most every major social media app, where they can hide behind screens and anonymous usernames, but TikTok's and Instagram's popularity with teens and minors has made them both top destinations. And though platforms' struggle to crack down on child sexual abuse material (or CSAM) predates today's AI boom, AI text-to-image generators are making it even easier for predators to find or create exactly what they're looking for.
-snip-
Unfortunately, the images aren't illegal yet.
Report any you see anyway.
The National Center for Missing and Exploited Children told Forbes these AI images should be taken down because the AI models creating them have been trained on CSAM, child sexual abuse material. That includes Stable Diffusion and Midjourney. (See https://www.forbes.com/sites/alexandralevine/2023/12/20/stable-diffusion-child-sexual-abuse-material-stanford-internet-observatory/ .)
TikTok and Instagram did remove the images, videos, accounts and comments that Forbes alerted them to, telling Forbes the removed material violated their policies.
But before Forbes contacted them, one of the accounts had 80,000 followers and some posts with half a million views. Most of the followers of that account appeared to be older men.
So please report any of these posts you see, and ask your friends on those platforms to report them.
ProfessorGAC
(65,828 posts)
The first link is paywalled, but I got plenty from your snip.
Was this being done to snare abusers?
usonian
(10,208 posts)highplainsdem
(49,279 posts)highplainsdem
(49,279 posts)Posts like those the article described - where the comments on these AI images of sexualized children often get private DM responses from people offering what might be strictly illegal - encourage pedophiles.
ProfessorGAC
(65,828 posts)Like I said, weirder & weirder.
highplainsdem
(49,279 posts)Igel
(35,443 posts)I distinctly remember the law from nearly 30 years ago saying images that were or appeared to be of minors engaged in sexually explicit acts or conduct.
I missed the 2002 SCOTUS ruling that struck down the "appears to be" provision as unconstitutional, which took ignoring the first dozen or so links my preferred search engine provided me to locate.
I stand corrected. Given Ashcroft, apparently these do pass constitutional muster. Reprehensible, but a lot of speech deemed protected is reprehensible.