The rise in child sexual abuse material (CSAM) has been one of the darkest Internet trends, but after years of covering CSAM cases […] I’ve never seen anyone who, when arrested, had three Samsung Galaxy phones filled with “tens of thousands of videos and images” depicting CSAM, all of it hidden behind a secrecy-focused, password-protected app called “Calculator Photo Vault.” Nor have I seen anyone arrested for CSAM having used Potato Chat, Enigma, nandbox, Telegram, TOR, and Web-based generative AI tools/chatbots.

[…] not only did he use all of these tools to store and download CSAM, but he also created his own, in two disturbing varieties. First, he allegedly recorded nude minor children himself and later “zoomed in on and enhanced those images using AI-powered technology.” Second, he took the imagery he had created and then “turned to AI chatbots to ensure these minor victims would be depicted as if they had engaged in the type of sexual contact he wanted to see.” In other words, he created fake AI CSAM, but using imagery of real kids.

The material was allegedly stored behind password protection on his phone(s) but also on Mega and on Telegram, where Herrera is said to have “created his own public Telegram group to store his CSAM.” He also joined “multiple CSAM-related Enigma groups” and frequented dark websites with taglines like “The Only Child Porn Site you need!” Despite all these precautions, his home was searched and his phones were seized […] he was eventually arrested on August 23.