Meta knowingly misled children and teens about safety, NM jury finds

Meta CEO Mark Zuckerberg leaves after testifying in a landmark trial over whether social media platforms deliberately addict and harm children, Wednesday, Feb. 18, 2026, in Los Angeles. (Damian Dovarganes/AP)

New Mexico’s attorney general says the case he brought against Meta could lead to changes to social media platforms.

A New Mexico jury found Tuesday that the parent company of Facebook and Instagram knowingly hid what it knew about child exploitation on its social media platforms, prioritizing profits over safety.

New Mexico Attorney General Raúl Torrez said the verdict will likely be the first of many to come.

“There’s an opportunity now to hold these companies accountable in a way that, frankly, has never existed before,” Torrez said.

5 questions with Raúl Torrez

What did Meta know about child sexual exploitation on Facebook and Instagram?

“Meta’s known for years that this is a dangerous product and addictive product and a product that actually facilitates predatory behavior on the platform. They’ve known that because of their own internal documents. Safety experts have been raising red flags and recommending specific changes. But again and again, executives, including [Mark] Zuckerberg, have chosen and prioritized profits over safety.

“And because of that, the jury in New Mexico found them liable, found that their actions were willful, and rendered a $375 million judgment in our favor, and it’s the first, but not the last time that I think juries across this country are going to demand real accountability from Meta and social media companies writ large.”

What should parents know about the dangers that underage users face on these apps?

“First, they need to understand that the business model that drives this company is one that’s really centered on engagement, and it’s engagement at all costs. They want to keep kids connected to the platform. They should also understand that the same things that connect people with their likes and interests also connect predators on the platform with kids that are on the platform.

“And so both of those things together should give parents real concern about having kids accessing these platforms in ways that, frankly, at this point, are not safe. And they should also understand that the company hasn’t been honest about the dangers that they know exist in these spaces.”

A state undercover operation created a fake social media profile of a 13-year-old girl. What did that investigation reveal?

“What we were trying to do through the establishment of an undercover account was to recreate the experience of a young girl on the platform. We wanted to see for ourselves what the response would be. And frankly, it was shocking.

“They were immediately inundated with solicitations for sex, requests for graphic material, and an explosive growth in the following of predominantly men from all over the world.

“And what’s even more shocking is that in response to that growth on that particular account, rather than raising a red flag, the company sent information to the account about how they could grow their user base and how they could monetize that following. That’s a clear example of what’s wrong at the core of this company. They have continued to prioritize engagement over safety, and we found that very clearly through our undercover investigation.”

What do you hope comes out of a case like this?

“I think this is the first crack in the dam. You know, Meta and other social media companies have been hiding behind Section 230 for years… Section 230 was created in a totally different era in the early days of the internet, and has become a liability shield that they’ve hidden behind for years and years, because this is effectively a product liability case.

“Brought under our consumer protection laws, there’s an opportunity now to hold these companies accountable in a way that, frankly, has never existed before. And it’s going to be a blueprint for what comes next in terms of our own case.

“We will be back in front of the district court in May on our public nuisance claim. We’ll be arguing that Meta has created a public nuisance, and in addition to financial penalties, we’ll be asking for injunctive relief: specific changes to the design features of the platform and the algorithm, elimination of end-to-end encryption for minors, and an independent monitor to really enforce those features and those changes. What it will also do is potentially create a roadmap or a blueprint for how this company could change its product offerings, not only around the country, but around the world.”

What does the penalty of $375 million mean for a company valued at $1.5 trillion? Is it effectively just a slap on the wrist?

“We don’t have any illusions that the fine that was rendered by the jury [Tuesday] is sufficient deterrence to change their behavior. But if you think about the relative size of New Mexico and the user base here, if similar types of actions are brought in other jurisdictions, which I anticipate they will, the potential penalties will add up really quickly, and the financial impact on the company’s bottom line will add up considerably.

“But I also think that the wave of litigation is going to prompt congressional action. I think it’s going to prompt Congress to really reexamine Section 230 and some of the stalled legislation that’s been pending there for quite some time. And when that happens, I think you’re going to realize a sea change in the regulatory space in terms of what we expect of social media companies and the safety features built into their platforms.”

This interview was edited for clarity.

____

Wilder Fleming produced and edited this interview for broadcast with Michael Scotto. Grace Griffin produced it for the web.

This article was originally published on WBUR.org.

Copyright 2026 WBUR

Scott Tong
Wilder Fleming