When Elon Musk bought Twitter in 2022, he declared that combating the surge of child sexual abuse material on the platform was his number one priority. Then, last year, he said X would fight child exploitation “more than any other platform.” But the facts don’t seem to back him up.
In fact, it appears that the opposite is true.
Hailey McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation, told CBN News that the spread of illegal content on X is “worse than ever.”
Her comments follow a bombshell NBC News report detailing a rise in seemingly automated X accounts flooding certain hashtags with hundreds of posts per hour, each promoting the sale of illegal child sexual abuse material (CSAM), or child pornography.
The apparent proliferation of CSAM on X comes after the platform defaulted on its payments to Thorn, a nonprofit that provides tech companies with services to identify and remove child sexual abuse content.
A representative for Thorn told NBC News that the nonprofit was forced to terminate its contract with X after months of nonpayment for its services. X, for its part, claims its safety engineering team has “built cutting-edge systems to further strengthen their enforcement capabilities” against CSAM.
It remains unclear, however, what those systems specifically do.
In the first half of 2024, the National Center for Missing & Exploited Children received 370,588 reports from X regarding child abuse content, and the platform suspended approximately 2.8 million accounts for child safety violations.
X CEO Linda Yaccarino told Congress in January 2024 that the platform suspended 12.4 million accounts in 2023 for violating its child sexual exploitation policies.
McNamara emphasized that much of the issue is concentrated in hashtags, the specific phrases bad actors use to steer predators toward illegal material.
“This has been a long-standing problem,” she said. “Unfortunately, it seems to be getting worse… Obviously, we know that CSAM sharing happens on the dark web. But it’s happening in public spaces.”
McNamara said X “deliberately allows” pornographic content to be shared and has become a “hub” of explicit sexual material, yet “does not take steps to confirm the age, consent, or identity of people in porn videos on the platform.”
“If X wants to act like a porn website and allow this kind of thing, they should be regulated in the same way,” she said.
A key element in effectively combating CSAM on sites like X is a technology known as “hash matching,” a leading strategy for tracking child abuse material across the internet.
According to CometChat, hash matching has proven to be “the only truly scalable solution today for identifying and blocking known CSAM content.” The technology works by converting known illegal videos and images into unique digital fingerprints (or “hashes”) stored in a database. A platform such as X then runs every upload through an algorithm that compares its hash against that database, removing flagged CSAM the moment it is uploaded and potentially stopping the content from spreading further.
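To illustrate the workflow, here is a minimal sketch in Python. The database contents and function names are hypothetical, and it uses a plain SHA-256 digest for simplicity; production systems rely on perceptual hashes such as Microsoft’s PhotoDNA or Meta’s PDQ, which still match an image after it has been resized or re-encoded.

```python
import hashlib

# Hypothetical fingerprint database of known illegal content, of the kind
# maintained in practice by organizations such as NCMEC. A real system
# would hold millions of perceptual hashes, not placeholder entries.
KNOWN_CSAM_HASHES = {
    "placeholder_hash_1",
    "placeholder_hash_2",
}

def fingerprint(file_bytes: bytes) -> str:
    """Convert an uploaded file into a fixed-length digital fingerprint.

    SHA-256 is used purely for illustration: it only catches byte-for-byte
    copies. Deployed systems use perceptual hashing (PhotoDNA, PDQ) so that
    cropped or re-compressed copies still match.
    """
    return hashlib.sha256(file_bytes).hexdigest()

def screen_upload(file_bytes: bytes) -> bool:
    """Return True if the upload matches a known fingerprint and should be
    blocked (and reported) before it is ever published."""
    return fingerprint(file_bytes) in KNOWN_CSAM_HASHES

# Every upload is hashed and checked at ingest time, so a match is
# removed the moment it arrives rather than after users report it.
upload = b"...bytes of an uploaded image..."
if screen_upload(upload):
    print("Blocked: matches known illegal content.")
else:
    print("No match in the known-hash database.")
```

The crucial limitation, as McNamara notes below, is that this approach can only catch content whose fingerprint is already in the database.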
Hash matching is undoubtedly a much-needed tool in the arsenal for protecting children, as well as adult victims of sextortion, so-called “revenge porn,” and even deepfakes. But it is only part of the solution.
McNamara explained that the recycled, previously shared content hash matching can track is an “infinitely small” part of the CSAM problem on sites like X.
“Most of what’s out there is new. It’s kids who are being abused today, and the video is shared online, so the system misses it,” she said of the technology many platforms rely on.
“It doesn’t seem like they’re using it or prioritizing it,” she added of X.
Joshua Bloom, a former porn star who has since become a Christian and an advocate for biblical sexual ethics, told CBN News that platforms like Musk’s X should be “responsible for housing what’s on their website.”
Bloom backs a handful of remedies, including the U.S. Supreme Court decision upholding a Texas statute that requires age verification to access porn online.
Bloom and McNamara both praised the Take It Down Act for requiring sites to remove known CSAM and non-consensual intimate imagery (NCII) within 48 hours of a valid request.
However, Bloom argued the law needs to go further. Ultimately, he wants to see the repeal of Section 230(c)(1), which states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
“It allows them to have immunity,” he explained. “They say, ‘Hey, we didn’t post it, so even if we profit directly from it, we shouldn’t be held responsible for it. It’s on our hard drive, we’re making money from it, but we didn’t put it there.’ Well, you have to be responsible for it.”
Abolishing or significantly revising Section 230, he continued, would be “the next step in the right direction.”
“It’s insane that sites like X are not held responsible for hosting hundreds of thousands of pieces of child porn,” Bloom said. “Literally, anyone else in the world would go to prison, certainly anyone in the U.S. would go to prison, for what’s on their site every day. But because of Section 230, they are not responsible.”