When Elon Musk purchased Twitter in 2022, he claimed the proliferation of child sexual abuse material on the platform was his top priority. Then, last year, he said X does more to fight child exploitation “than any other platform by far.” But the facts don’t seem to back him up.
In fact, it appears the opposite is true.
Haley McNamara, senior vice president of strategic initiatives and programs at the National Center on Sexual Exploitation, told CBN News the spread of illicit content on X is “worse potentially than ever before.”
Her comments follow a bombshell NBC News report detailing the rise in seemingly automated X accounts flooding specific hashtags with hundreds of posts per hour, each advertising the sale of illegal child sexual abuse material (CSAM) — or child pornography.
The apparent propagation of CSAM on X comes as the platform has defaulted on its payments to Thorn, a nonprofit providing tech companies with services to identify and handle child sexual abuse content.
A representative for Thorn told NBC News it was forced to terminate its contract with X after months of nonpayment for its services. X, for its part, is claiming its “safety engineering team [members] … have built state of the art systems to further strengthen our enforcement capabilities” against CSAM.
What specifically it’s doing, though, remains to be seen.
In the first half of 2024, the National Center for Missing and Exploited Children received 370,588 reports of child abuse content from X and nearly 2.8 million accounts were suspended for child safety reasons.
The CEO of X, Linda Yaccarino, told Congress in January 2024 that the platform “suspended 12.4 million accounts for violating our [child sexual exploitation] policies” in 2023.
Much of the problem, McNamara highlighted, centers on hashtags, specific phrases used by bad actors to drive predators toward illicit material.
“This has been a longstanding problem,” she said. “Unfortunately, it seems to be getting worse. … Obviously, we know the sharing of CSAM happens on the dark web. But it’s happening in open spaces, publicly. Hashtags are very public.”
McNamara went on to say X “intentionally allows the sharing of pornographic content” and has become “a hub” for explicit sexual material, but has taken no steps “to verify the age, consent, or identity of the people in pornographic videos on their platform.”
“If X wants to act like it’s a pornography website and allow this kind of thing, then they should be regulated that same way,” she said.
A major component in effectively combating CSAM on sites like X is the use of AI and a technology known as “hash matching,” the leading strategy for tracking child abuse material on the internet.
According to CometChat, “hash matching” has proven to be “the only truly scalable solution we have today for identifying and blocking known CSAM content.” The technology works by converting illicit videos and images into individualized digital footprints (or “hashes”). Platforms then compare the hash of each new upload against a database of hashes previously flagged as CSAM, removing matching content the moment it is uploaded and, presumably, stopping it from spreading further.
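At its core, the approach the paragraph above describes can be sketched in a few lines. This is a simplified illustration, not any platform’s actual implementation: production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 used here for brevity only matches byte-identical files. All function and variable names are hypothetical.

```python
import hashlib

# Hypothetical in-memory stand-in for a database of hashes that human
# reviewers have previously confirmed as illicit content.
known_flagged_hashes: set[str] = set()

def fingerprint(content: bytes) -> str:
    """Convert an upload into an individualized digital footprint (a hash)."""
    return hashlib.sha256(content).hexdigest()

def flag_known(content: bytes) -> None:
    """Record a hash after review confirms the content should be blocked."""
    known_flagged_hashes.add(fingerprint(content))

def should_block(content: bytes) -> bool:
    """Check an upload against the flagged-hash database at upload time."""
    return fingerprint(content) in known_flagged_hashes

flag_known(b"previously reviewed file bytes")
print(should_block(b"previously reviewed file bytes"))  # an exact copy is caught
print(should_block(b"never-before-seen file bytes"))    # novel material slips through
```

The second check illustrates the limitation McNamara raises below: hash matching can only catch content that has already been identified once, so newly created material passes through unrecognized.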
While “hash matching” is undoubtedly a much-needed tool in the arsenal for protecting children — as well as adult victims of sexual exploitation, so-called “revenge porn,” and even deepfakes — it is only part of the solution, according to McNamara.
She explained that the circulation of previously shared content, which is what “hash matching” tracks, is an “infinitesimally small” part of the issue of CSAM on sites like X.
“Most of what’s out there is novel — it’s a child who is being abused today, and that video is being shared online — and so their systems are missing it,” she said of the tech many platforms are using, noting there are technologies available that can identify and track nascent content “with greater than 90% accuracy.”
“It just doesn’t seem like they’re using it or prioritizing it,” she added of X.
Joshua Broome, a former pornography star who has since become a Christian and an advocate for a biblical sexual ethic, told CBN News platforms like Musk’s X ought to be “held liable for housing what’s on their websites.”
Broome was addressing a handful of issues, speaking out in support of the U.S. Supreme Court’s ruling in favor of Texas’ statute requiring age verification to access pornography online, the recently enacted Take It Down Act, and social media platforms’ use of Section 230 to skirt responsibility for distributing both CSAM and non-consensual intimate imagery (NCII).
Both Broome and McNamara praised the Take It Down Act for requiring sites to remove known CSAM and NCII within 48 hours of a valid request.
Broome argued, though, that the law needs to go further. Ultimately, he would like to see the elimination of Section 230(c)(1), which states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
“It allows them to have immunity,” he explained. “They’re saying, ‘Hey, we didn’t post it, so we shouldn’t be held liable for it, even if we benefit from it directly. It’s on our hard drive, we’re making money off of it, but we didn’t put it there; we’re just taking our cut.’ Well, you need to be held responsible for that.”
Repealing or significantly retooling Section 230, he continued, “is the next step in the right direction.”
“For sites like X to not be held liable for having hundreds of thousands of pieces of child pornography on it is insane,” Broome said. “Literally, anyone else in the world would go to prison — most certainly anyone in the United States would go to prison — for stuff that’s on their site each and every day. Yet, because of Section 230, they’re not held liable.”