Landmark trial in New Mexico to decide whether Meta misled users about children's safety risks

SANTA FE, N.M. — Closing arguments were being held Monday in a landmark trial in New Mexico where social media conglomerate Meta is accused of misleading its users about how safe its platforms are for children.

Jurors will take up the case after the arguments and six weeks of testimony from scores of witnesses, including local teachers, psychiatric experts, state investigators, top Meta officials and whistleblowers who left the company.

The case in New Mexico state court is among the first to reach trial in a wave of litigation involving social media platforms and their impacts on children.

New Mexico prosecutors have accused Meta — which owns Instagram, Facebook and WhatsApp — of prioritizing profits over safety in violation of state consumer protection laws. They have raised concerns about the safety of the company's complex algorithms and of a variety of messaging features and settings.

“It’s clear that young people are spending too much time on Meta's products, they’ve lost control,” prosecution attorney Linda Singer told the jury in closing statements. “Meta knew that and it didn’t disclose it.”

Prosecutor says trial evidence shows Meta failed to enforce its minimum user age

Singer said testimony and evidence at trial showed Meta’s algorithms had been recommending sensational and harmful content to teenagers, and failing to truly enforce its minimum user age of 13.

“The safety issues that you’ve heard about in this case weren’t mistakes. ... They were a product of a corporate philosophy that chose growth and engagement over children’s safety,” Singer said. “And young people in this state and around the country have borne the cost.”

Attorneys for Meta dispute the claims and say the company incorporates protections for teenagers and weeds out harmful content, while also acknowledging that some potentially harmful posts get past its safety nets.

Meta attorney insists the company has disclosed risks of its platforms

Meta attorney Kevin Huff told the jury Monday that the company “disclosed to the world that its safeguards are not perfect, and that some bad content and bad actors get onto its service.”

“Common sense also says that parents and teens know that there is bad content on the internet, and on Facebook and Instagram specifically,” he added. Huff noted that the social media company has disclosed risks of its platforms in its user agreements, on its website, in ads and on television.

“Wherever it could get its message out, Meta was disclosing risk to the public,” Huff said.

Singer urged jurors to impose a civil penalty of more than $2 billion against Meta, based on the maximum $5,000 penalty per violation on two counts of consumer protection violations, and an estimated 208,700 monthly users of Meta platforms under the age of 18 in New Mexico.
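The prosecution's requested penalty follows directly from the figures cited in court. A rough sketch of that arithmetic (the exact method of counting violations was not specified at trial, so this simply multiplies the stated numbers):

```python
# Figures cited by the prosecution in closing arguments.
PENALTY_PER_VIOLATION = 5_000       # maximum penalty per willful violation
COUNTS = 2                          # two consumer protection counts
UNDERAGE_MONTHLY_USERS = 208_700    # estimated NM monthly users under 18

total = PENALTY_PER_VIOLATION * COUNTS * UNDERAGE_MONTHLY_USERS
print(f"${total:,}")  # $2,087,000,000 — just over the $2 billion requested
```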

“Over the course of a decade Meta has failed over and over again to act honestly and transparently, failed to act to protect young people in this state,” Singer said. “It is up to you to finish this job.”

Huff called the state’s request for penalties “a shocking number” and said prosecutors failed to provide any examples of teenagers who chose to use Instagram because of a false understanding of its risks.

“Even though teens are aware of the risks, they continue to use Instagram because they enjoy Instagram,” Huff said.

A second phase of the trial will follow with a judge deciding whether Meta created a public nuisance and should be on the hook financially to fund programs to address alleged harms to children.

Company's attorneys say the state has cherry-picked evidence to support its case

Attorney General Raúl Torrez filed suit in 2023, accusing Meta of creating a marketplace and “breeding ground” for predators who target children for sexual exploitation and failing to disclose what it knew about those harmful effects. State investigators created social media accounts posing as children to document online sexual solicitations and the response from Meta.

Meta's attorneys have accused prosecutors of cherry-picking evidence and conducting a shoddy investigation.

Meta executives emphasized at trial that the company continuously improves safety and addresses compulsive social media use without infringing on free speech or censoring users.

But prosecutors said Monday that public assurances about safety from Meta executives, including founder Mark Zuckerberg and Instagram head Adam Mosseri, often didn't square with the company's internal studies and communications.

“It was included in Meta’s internal research — again this was research that didn’t get disclosed by Meta — one-in-three teens experienced problematic use,” Singer said. “They knew these kids were struggling with problematic use — again, addiction.”

A jury assembled from residents of Santa Fe County, including the politically progressive state capital city, will decide whether Meta violated the state's Unfair Practices Act on two counts, including “unconscionable” trade practices.

A finding of willful violations would open the way for possible fines of up to $5,000 per violation. Prosecutors say that could add up to billions of dollars.

Tech companies have been protected from liability for material posted on their social media platforms under Section 230, a 30-year-old provision of the U.S. Communications Decency Act, as well as by First Amendment protections.

Prosecutors say New Mexico is not seeking to hold Meta accountable for content on its platforms, but rather its role in pushing out that content through complex algorithms that proliferate material that can be addictive and harmful to children.

In California, a jury is already sequestered in deliberations over whether Meta and YouTube should be liable for harms caused to children using their platforms. That bellwether case could influence how thousands of similar lawsuits against social media companies play out.