Cybercrime authorities in Paris raided the offices of Elon Musk's X as part of an investigation into alleged offenses including illegal data extraction and the production and possession of child pornography.
Additionally, the French public prosecutor’s office announced that both Musk and former chief executive Linda Yaccarino have been ordered to appear at a hearing in April, according to the BBC.
The raid, which Musk called a “political attack,” followed an analysis released by the UK-based Internet Watch Foundation (IWF), which found a 26,000% spike in videos and images depicting AI-generated child sexual abuse material (CSAM).
In an interview with CBN News, Benjamin Bull, general counsel for the National Center on Sexual Exploitation (NCOSE), offered a blunt assessment of Musk's handling of CSAM on X (formerly Twitter).
When Musk first bought Twitter, he promised to tackle rampant child pornography on the social media platform, even writing that “ending child exploitation will be a top priority” in 2022. Nevertheless, this problem is still prevalent on the site and seems to be made worse by Grok.
“He’ll say whatever it takes to get through this,” Bull said of Musk. “If it suits his purpose and he’s under intense scrutiny, he’ll say whatever it takes to present a winning position. He’s done nothing to remove CSAM from his platform. If anything, it’s made the problem worse.”
Bull went on to argue that Musk's on-again, off-again relationship with President Donald Trump, along with multibillion-dollar investments by big tech companies like Meta, the parent of Facebook and Instagram, has significantly clouded how the issue of online child abuse is addressed.
In 2024, House Speaker Mike Johnson blocked the Kids Online Safety Act (KOSA) in the House after it successfully passed the Senate. The bill, co-sponsored by Sens. Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.), faced scrutiny from some conservatives, including Johnson and House Majority Leader Steve Scalise (R-La.), who feared the proposal could inadvertently lead to internet censorship and government overreach. They argued that the law could force powerful technology companies to over-moderate content to avoid lawsuits and could result in government officials interpreting "harmful content" through an ideological lens in the future.
Bull sees it differently.
Johnson's decision to block a vote on KOSA came around the same time Meta announced plans to build a $10 billion data center in Louisiana. Notably, when pressed on Fox News' "Fox & Friends" about a possible quid pro quo, the speaker denied any such deal.
Bull said Johnson "wanted to do the right thing" but was being forced into "hardball politics."
One of the most alarming aspects of CSAM and revenge porn, especially material created entirely with artificial intelligence, is its persistence: once it is on the internet, it is there forever. Author and activist Laila Mickelwait told CBN News that porn sites like Pornhub are virtual crime scenes housing images and videos of CSAM, sextortion, and non-consensual content.
“Once it’s published online, it can be downloaded and re-uploaded. … Once it’s published on the Internet, there’s a very good chance it will never be removed again,” Bull explained.
Bull called the issue “extremely dangerous” and called on the Federal Trade Commission and U.S. law enforcement agencies to take similar action as the Paris cybercrime division.
Donna Rice Hughes, CEO of Enough Is Enough, similarly sounded the alarm in an interview with CBN News, saying that with advances in AI, "you don't need anything" other than a person's image or likeness to victimize that person.
“You can take a picture of anyone’s child in a Christmas photo, and you can deepfake it. So you can actually take off that child’s clothes, you can actually age that child, you can make that child do whatever you want sexually,” she warned. “And that’s what’s happening with AI.”
In addition to the moral problems and sinfulness of pornography, especially its illegal content that victimizes children, Hughes pointed out that even artificially generated CSAM could very well lead to the abuse of real children as predators’ desires escalate.
"It just fuels that desire," she said. "Data and research shows that when someone uses child abuse material, they get hooked on it and then their desire to take action against real children increases. That's just a fact. Everything fuels it."
Hughes and Bull both called on Americans to call their representatives and advance legislation on these issues.
Bull also urged parents to actively protect their children from the dangers associated with accessing smart devices, suggesting that children should not actually use smartphones until at least their late teens.
You can watch the full conversations with Hughes and Bull in the video above.
As the number of voices facing censorship from big tech companies continues to grow, sign up for FaithWire’s daily newsletter and download the CBN News app, created by our parent company, to get the latest news from a distinctly Christian perspective.
