Taped depositions from Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri were played in a New Mexico courtroom this week, as jurors heard arguments over whether harms to children are an unavoidable byproduct of platforms used by billions of people or a preventable outcome of product design choices.
The case pits New Mexico Attorney General Raúl Torrez against Meta, with the state alleging the company prioritized profit and engagement over child safety. Meta disputes the claims, pointing to years of investment in safety measures as well as newer protections introduced for teens.
Trial Focuses on Scale, Safety, and Accountability
During the recorded testimony, Zuckerberg said that on platforms serving billions of people, a small fraction of users will engage in criminal behavior, and that Meta works to stop it while acknowledging its systems are not perfect. Meta's apps, including Facebook, Instagram, and WhatsApp, collectively reach billions of monthly active users, underscoring the scale at the center of the dispute.
Prosecutors argued that internal company findings and past product decisions showed the company understood the risks to young users yet did not consistently prevent harmful interactions. The jury also heard testimony that family members of Meta employees had received inappropriate messages on Instagram, which prosecutors cited to illustrate the breadth of the problem.
Recommendations Flagged as a Driver of Risk
A major theme at trial was how account recommendation tools can connect adults and minors. Prosecutors presented evidence that the People You May Know feature had been identified internally, in a large share of cases examined in earlier years, as a key pathway by which bad actors could discover potential victims.
Meta said it uses behavioral signals to detect potentially suspicious adult accounts and limit their ability to find or interact with teens. Mosseri said Meta has technology to identify accounts showing warning signs, such as patterns of being blocked by younger users, and then restricts those accounts from reaching teen profiles through recommendation features.
According to Meta, in 2025 it used those signals to identify more than 265 million Facebook accounts and more than 135 million Instagram accounts that had shown potentially suspicious behavior, and proactively limited their interactions with teens.
Encryption Debated as Privacy Versus Protection
Jurors also heard about Meta's decision to expand end-to-end encryption on Facebook Messenger, which prevents anyone other than the sender and recipient from reading messages. Prosecutors said child safety groups had warned that encryption could reduce the ability to detect abuse, and that reports to authorities declined after the encryption changes. Zuckerberg said users value privacy and that encryption is something people want.
Meta said it can still act when users report harmful content in encrypted chats, though under end-to-end encryption the company does not by default store readable message content on its servers.
Teen Accounts and Remaining Gaps
The trial also revisited changes Meta rolled out in September 2024, when it introduced Teen Accounts, which automatically place users under 18 into stricter settings on Instagram, Facebook, and Messenger. These settings include private profiles by default and tighter limits on who can message teens.
Prosecutors highlighted internal discussions suggesting that protections were not always applied consistently in earlier periods, including audits indicating that recommendations could still surface teen accounts to some adults. The record presented at trial also referenced research pointing to gaps in protections, such as exposure to harmful content through recommendations and hashtags, and instances where safety features did not work as intended.
Mosseri said the company aims to address serious issues and noted that connecting billions of people means both positive and negative outcomes can occur. The state is challenging that framing in court, arguing that more harm could have been prevented through stronger design, enforcement, and guardrails.
The trial began in early February and is expected to run for about seven weeks.
