The rise in child sexual abuse material (CSAM) has been one of the darkest Internet trends, but after years of covering CSAM cases, I’ve found that few of those arrested show deep technical sophistication. (Perhaps this is simply because the technically sophisticated are better at avoiding arrest.)
Most understand that what they are doing is illegal and that password protection is required, both for their devices and online communities. Some can also use tools like TOR (The Onion Router). And, increasingly, encrypted (or at least encrypted-capable) chat apps might be in play.
But I’ve never seen anyone who, when arrested, had three Samsung Galaxy phones filled with “tens of thousands of videos and images” depicting CSAM, all of it hidden behind a secrecy-focused, password-protected app called “Calculator Photo Vault.” Nor have I seen anyone arrested for CSAM who used all of the following:
Potato Chat (“Use the most advanced encryption technology to ensure information security.”)
Enigma (“The server only stores the encrypted message, and only the user’s client can decrypt it.”)
nandbox [presumably the Messenger app] (“Free Secured Calls & Messages”)
Telegram (“To this day, we have disclosed 0 bytes of user data to third parties, including governments.”)
TOR (“Browse Privately. Explore Freely.”)
Mega NZ (“We use zero-knowledge encryption.”)
Web-based generative AI tools/chatbots
That’s what made this week’s indictment in Alaska of a heavy vehicle driver for the US military so unusual.
According to the government, Seth Herrera not only used all of these tools to store and download CSAM, but he also created his own—and in two disturbing varieties. First, he allegedly recorded nude minor children himself and later “zoomed in on and enhanced those images using AI-powered technology.”
Second, he took the imagery he had created and “turned to AI chatbots to ensure these minor victims would be depicted as if they had engaged in the type of sexual contact he wanted to see.” In other words, he created fake AI CSAM—but using imagery of real kids.
The material was allegedly stored behind password protection on his phone(s) but also on Mega and on Telegram, where Herrera is said to have “created his own public Telegram group to store his CSAM.” He also joined “multiple CSAM-related Enigma groups” and frequented dark websites with taglines like “The Only Child Porn Site you need!”
Despite all the precautions, Herrera’s home was searched and his phones were seized by Homeland Security Investigations; he was eventually arrested on August 23. In a court filing that day, a government attorney noted that Herrera “was arrested this morning with another smartphone—the same make and model as one of his previously seized devices.”
Caught anyway
The government is cagey about how, exactly, this criminal activity was unearthed, noting only that Herrera “tried to access a link containing apparent CSAM.” Presumably, this “apparent” CSAM was a government honeypot file or web-based redirect that logged the IP address and any other relevant information of anyone who clicked on it.
In the end, given that fatal click, none of the “I’ll hide it behind an encrypted app that looks like a calculator!” technical sophistication accomplished much. Forensic reviews of Herrera’s three phones now form the primary basis for the charges against him, and Herrera himself allegedly “admitted to seeing CSAM online for the past year and a half” in an interview with the feds.
Since Herrera himself has a young daughter, and since there are “six children living within his fourplex alone” on Joint Base Elmendorf-Richardson, the government has asked a judge not to release Herrera on bail before his trial.
Source: https://arstechnica.com/tech-policy/2024/08/military-driver-arrested-for-creating-fake-ai-sex-abuse-images-but-using-real-kids-he-knew/
Publish date: 2024-08-29