
New Mexico Attorney General Raúl Torrez filed for injunctive relief against Meta on Thursday, seeking sweeping court-ordered changes to how the company operates its platforms for children. Meta responded by threatening to pull Facebook, Instagram, and WhatsApp from the state entirely.
“Meta is showing the world how little it cares about child safety,” Torrez said Thursday. “Meta’s refusal to follow the laws that protect our kids tells you everything you need to know about this company and the character of its leaders.”
Ahead of the bench trial set to begin May 4, Meta responded to Torrez’s statement on Thursday.
“Despite Attorney General Torrez’s claims, the State’s demands are technically impractical, impossible for any company to meet and disregard the realities of the internet,” the company said in a statement to Fortune. “In targeting a single platform, the State ignores the hundreds of other apps teens use, leaving parents without the comprehensive support they actually deserve.”
“While it is not in Meta’s interests to do so, if a workable solution to Attorney General Torrez’s demands is not reached, we may have no choice but to remove access to its platforms for users in New Mexico entirely.”
Torrez dismissed the threat as a “PR stunt” and said Meta’s argument about technical capability doesn’t hold: “For years the company has rewritten its own rules, redesigned its products, and even bent to the demands of dictators to preserve market access. This is not about technological capability. Meta simply refuses to place the safety of children ahead of engagement, advertising revenue, and profit.”
An undercover operation
The confrontation this week is the latest chapter in a case that began with a fake teenage girl.
In 2023, investigators from the New Mexico Department of Justice created a social media profile posing as a 13-year-old, and found the account was almost immediately flooded with images, messages, and targeted solicitations from adults seeking to exploit a child. The investigators said no algorithm flagged the contact and no safety system caught it.
The undercover operation became the foundation of a lawsuit accusing Meta of making false or misleading statements about platform safety, enabling child sexual exploitation through deliberate design choices, and intentionally engineering its apps to addict young users. To get around Section 230, the federal statute that has long shielded platforms from liability for user-generated content, New Mexico prosecutors used a state consumer protection law to pursue claims against the company.
In March 2026, a Santa Fe jury found Meta liable for 75,000 violations of New Mexico’s Unfair Practices Act and ordered the company to pay $375 million in civil penalties, the maximum allowed under state law. New Mexico became the first state in the nation to win at trial against a major technology company for endangering children.
The six-week trial featured Meta’s own internal documents, in which employees calculated that Zuckerberg’s 2019 decision to roll out end-to-end encryption on Facebook Messenger by default would compromise the company’s ability to detect and report approximately 7.5 million child sexual abuse material cases to law enforcement. One Meta researcher had flagged as many as 500,000 child exploitation cases daily across Facebook and Instagram.
Injunctive relief
When the new bench trial begins on May 4, Chief Judge Bryan Biedscheid will hear the state’s public nuisance claim and decide whether to grant injunctive relief that would fundamentally restructure how Meta operates for users under 18 in the state.
On age verification, Meta would be required to block children under 13 from its platforms, delete their existing accounts and data, and link every minor’s account to a guardian account. On exploitation prevention, adults not directly connected to a minor could not message that minor. Meta also would not be allowed to recommend minor accounts to adult users, and any adult found to have engaged in child sexual exploitation would face a permanent one-strike ban, blocking them from creating new accounts from the same device, IP address, or phone number.
End-to-end encryption for users under 18 would be eliminated. Recommendation algorithms for minors would be required to optimize for what the state calls “integrity” rather than engagement. The state is also requesting a ban on infinite scroll, autoplay, and push notifications during school and sleep hours, and a hard monthly cap of 90 hours of platform access for minor users.
Lastly, the state is requesting the reinstatement of undercover accounts on Meta’s platforms and a court-appointed Child Safety Monitor, funded entirely by Meta, to oversee compliance for a minimum of five years. The monitor would have the authority to investigate Meta’s internal systems, receive confidential reports from Meta employees, and publish regular public reports.
Meta’s defense
A Meta spokesperson pushed back on both the scope of the demands and the strategy behind the upcoming case: “The New Mexico Attorney General’s focus on a single platform is a misguided strategy that ignores the hundreds of other apps teens use daily. Rather than providing comprehensive protections, the state’s proposed mandates infringe on parental rights and stifle free expression for all New Mexicans. Regardless, we remain committed to providing safe, age-appropriate experiences and have already launched many of the protections the state seeks, including 13 safety measures this past year.”
Meta has sought to delay or stop the case entirely, first claiming Section 230 immunity and later seeking a postponement of the bench trial, but the court denied the requests each time.
More than 40 state attorneys general have filed lawsuits against Meta over child safety. The Children’s Online Privacy Protection Act was passed in 1998 and has not been meaningfully updated since, even as the FTC promises a newly revamped COPPA 2.0. Federal legislation on platform liability for minors, age verification, and addictive algorithms has stalled repeatedly.

