- The Washington Times - Updated: 6:56 p.m. on Thursday, April 9, 2026

Social media has made a mess of child exploitation reporting, spawning a massive increase in reports but leaving the major technology companies struggling to turn over the kinds of details investigators need to track down culprits and rescue victims, according to a report released Thursday by the Senate’s most senior lawmaker.

Some tech giants overreported, flooding investigators with “innocuous” images of children or adults and distracting them from valid cases. Others underreported, flagging images but failing to turn over actionable information to identify who was involved or where the exploitation was taking place.

Meta was faulted for failing to take action in instances where children contemplated suicide in chat logs.



Amazon AI Services submitted more than 1 million reports to the National Center for Missing and Exploited Children, but none included locations or suspects, rendering them useless to investigators.

“As a result of this deficient reporting, zero reports submitted by Amazon AI Services were actionable when made available to law enforcement in 2025,” NCMEC told Sen. Charles E. Grassley, president pro tempore of the Senate and chairman of its Judiciary Committee.

The Iowa Republican called the findings “disturbing” and fired off letters to tech companies saying he was “alarmed.”

“I am concerned that some companies have not provided NCMEC and law enforcement with sufficient data needed to protect kids and prosecute suspected predators,” he wrote.

Getting reports right is crucial. Missing a real instance of child exploitation hurts victims, but false positives overwhelm investigators, who then waste resources tracking down bogus leads.

NCMEC said eight businesses accounted for 17 million reports to its CyberTipline, or 81% of the total.

Some tried to cooperate but struggled with what they sent. Others seemed largely indifferent.

The companies, dubbed Electronic Service Providers, are required by federal law to submit reports of apparent child sexual exploitation to NCMEC. The organization then passes the reports along to law enforcement in the places where the abuse is believed to be taking place.

Federal law does not dictate what those reports must contain. Without location or suspect information, they can be all but useless to investigators.

Mr. Grassley said Amazon’s systems were intentionally designed not to collect information about users.

Grindr, a social networking app catering to LGBTQ users, appeared uncooperative in NCMEC’s data.

It made 111,334 tip line reports last year, but most lacked location information. The problem became worse over time. In 2024, 35% of Grindr’s reports had locations; by 2025, that had dropped to just 4%.

Even 93% of reports Grindr deemed “high priority” lacked location information.

When NCMEC tried to coax the company into doing better, the organization said, “Grindr is generally unresponsive or provides passive responses.” The company also declined an invitation to NCMEC’s tip line roundtable in November.

In a statement to The Washington Times, the company said it is part of several efforts to protect children, including the Tech Coalition, which battles online child sex abuse, and use of Safer by Thorn, which scans user-uploaded images to spot child sex abuse material, or CSAM as it’s known in the industry.

“Grindr is exclusively for adults aged 18 or over, and we take preventing CSAM with the utmost seriousness, including maintaining a substantial moderation team to identify and ban accounts at the device level if they appear to discuss topics related to minors, and deployment of AI and machine learning technology to proactively identify and ban similar accounts,” the company said.

Discord was dinged for both overreporting and underreporting. It would flag content depicting adults or animal abuse as child exploitation, even as it “frequently” failed to provide locations or account information for the reports it did make.

NCMEC said Discord has done better in 2026 at limiting reports of adult content and nonpertinent gore and violence.

Meta was by far the biggest player, with nearly 11 million reports from across its many platforms, or more than half the traffic on the CyberTipline last year.

Nearly 1.2 million of those reports last year were related to sex trafficking and online child enticement. Meta often failed to turn over enough information to give investigators leads. Some 28% of Facebook reports and 36% of Instagram reports were closed by police because they lacked the information needed to open an investigation.

Stephanie Otway, a Meta spokesperson, said the company is “committed” to improvements.

“Child exploitation is a horrific crime, and we work tirelessly to protect children from it and bring the criminals involved to justice,” she said. “We will continue making refinements to improve our reporting process.”

TikTok was cited as an overreporter, sending content featuring adults or clothed children, which NCMEC said “causes major issues” in sorting out valid cases. When NCMEC raised the issue, the organization said, TikTok responded that it had other priorities.

Snap, X.AI and Roblox rounded out the eight top platforms identified by NCMEC.

• Stephen Dinan can be reached at sdinan@washingtontimes.com.

Copyright © 2026 The Washington Times, LLC.
