Harms to children, such as sexual exploitation and detriments to mental health, are inevitable on Meta’s platforms, the company’s CEO Mark Zuckerberg and Instagram leader Adam Mosseri said in taped depositions played at a trial in New Mexico on Tuesday and Wednesday.
“I just think if you’re serving billions of people, the unfortunate reality is that some very small percent of them are going to be criminals, and we should work as hard as we can to stop that activity from happening,” said Zuckerberg. “I don’t think that the standard for our platforms would be that you should assume that it will ever be perfect.”
Meta’s apps, which include Facebook, Instagram, and WhatsApp, are among the most popular in the world, with billions of monthly active users between them.
The trial has set the social media giant against New Mexico’s attorney general, who alleges that Meta’s platforms put profits and user engagement over child safety. Raul Torrez has accused the company of knowingly enabling predators to use Facebook and Instagram to exploit children. Meta disputes the allegations, citing changes it has introduced, including teen accounts with default protections that debuted in 2024. The trial, which began in early February, is expected to last about seven weeks.
“We have strict, longstanding rules against child exploitation and have invested billions to fight it, both through proactive detection technology and safety features designed to prevent harm,” said a Meta spokesperson.
“We provide industry-leading transparency, regularly sharing data on how much violating content we remove and how much we miss. No system can ever be perfect, and we’ve never claimed to be.”
Jurors were shown recorded depositions of Zuckerberg and Mosseri filmed between March and July last year. The jury also heard that family members of Meta employees had experienced sexual solicitation on Instagram.
Prosecutors also presented evidence that the company estimated in 2020 that 500,000 children were receiving sexually inappropriate communications on Instagram each day, including grooming, in which adults attempt to build relationships with minors for sexual purposes.
In a statement, a Meta spokesperson said the detection technology the company used at the time was deliberately broad and cautious, and as a result the count included interactions that were not in fact inappropriate.
The company identified the “People you may know” algorithm – which recommends accounts for users to connect with – as a main driver of these interactions, with the tool used to discover victims in 79% of identified cases in 2018. At the time, about 30% of adults whose accounts were disabled for targeting children had returned to the platform and resumed that behavior, the court heard.
Jurors heard that Zuckerberg authorized end-to-end encryption for Facebook Messenger in 2023 despite warnings from child safety groups Thorn and the National Center for Missing and Exploited Children (NCMEC) that the move could pose risks to children. In a taped deposition played at trial, he said the privacy encryption affords users was a more pressing issue. End-to-end encryption prevents anyone other than the sender and intended recipient from viewing messages by converting text and images into unreadable ciphertext that is decoded only on the recipient’s device. The content is not stored on Meta’s servers.
A company spokesperson added that Meta can still review and take action on encrypted messages if they are reported by a user.
Child safety groups and law enforcement have warned that encrypting Messenger enables predators to share child sexual abuse imagery without detection. Earlier in the trial, a law enforcement officer testified that reports of child sexual abuse material from the platform decreased following encryption.
“I think that end-to-end encryption messaging services are what people want,” said Zuckerberg in a taped deposition filmed in March 2025. “They really care about privacy.”
Mosseri said in his deposition that the company has “developed technology that allows us to find accounts that have shown potentially suspicious behavior, for example, an adult account that might have been blocked by another young person, and to stop those accounts from interacting with young people’s accounts”.
“We use a range of signals to identify adults who have shown potentially suspicious behavior and avoid recommending these accounts to teens through Facebook’s ‘People you may know’ and Instagram’s ‘Accounts you should follow’ features,” said a Meta spokesperson.
“In 2025, we used these signals to identify more than 265 million Facebook accounts and more than 135 million Instagram accounts that had shown potentially suspicious behavior, and proactively prevent them from finding, following or interacting with teens.”
An internal presentation discussed at trial stated that Instagram’s wellbeing safety team had found the platform did not always prevent teen accounts from being recommended to potential violators, and vice versa. A December 2022 internal audit showed Meta continued to recommend minor accounts to some adults.
In September 2024, Meta introduced Teen Accounts, which automatically place users under 18 into stricter settings on Instagram, Facebook and Messenger, including making profiles private by default and limiting who can message them. Researchers have identified gaps in those protections, including exposure to harmful videos through hashtags or recommendations and instances in which safety features did not work as intended.
“I certainly want to address any problem that’s even remotely as severe as something like sexual solicitation … Any negative action that happens offline, also to a certain degree, happens online,” said Mosseri. “We’re connecting billions of people. That is going to mean good and bad things happen.”