
New Mexico lawsuit accuses Meta of creating ‘breeding ground’ for child predators

By Clare Duffy, CNN

New York (CNN) — New Mexico Attorney General Raúl Torrez has accused Meta Platforms of creating a “breeding ground” for child predators on Facebook and Instagram in a lawsuit filed Tuesday, the latest in a string of legal actions related to alleged harms to young users caused by the social media giant.

Meta allegedly exposes young users to sexual content and makes it possible for adult users they don’t know to contact them, putting children at risk of abuse or exploitation, according to the complaint, filed in New Mexico state court.

“Meta’s business model of profit over child safety and business practices of misrepresenting the amount of dangerous material and conduct to which its platforms expose children violates New Mexico law,” the complaint, which also names Meta CEO Mark Zuckerberg as a defendant, states. “Meta should be held accountable for the harms it has inflicted on New Mexico’s children.”

Meta has faced growing scrutiny over the impact of its platforms on young users in recent years. The social media giant has been sued by various school districts and state attorneys general in lawsuits related to youth mental health, child safety and privacy. Former Facebook employee-turned-whistleblower Arturo Bejar also told a Senate subcommittee last month that Meta’s top executives, including Zuckerberg, ignored warnings for years about harms to teens on its platforms.

The company last month also sued the Federal Trade Commission in an effort to prevent regulators from reopening its landmark $5 billion privacy settlement from 2020 and from banning it from monetizing the user data of children.

Meta strongly denied claims that its platforms put children at risk.

“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” Meta spokesperson Nkechi Nneji said in a statement, adding that Meta has removed hundreds of thousands of accounts, groups and devices for violating its child safety policies.

The company said in a blog post earlier this month that it has launched technology to proactively detect and disable accounts displaying suspicious behaviors, and that it formed a Child Safety Task Force to improve its policies and practices around youth safety. Meta also says it offers some 30 safety and well-being tools to support teens and families, including the ability to set screen-time limits and the option to remove like counts from posts.

New Mexico’s investigation

As part of its investigation, the attorney general’s office created a number of sample Instagram accounts registered to minors as young as 12 years old. Those accounts were able to search for and access explicit “sexual or self-harm content,” including “soft-core pornography,” the complaint states.

In one case, the complaint alleges, a search for porn was blocked on Facebook and returned no results, but the same search on Instagram yielded “numerous accounts.”

Photos of young girls posted to Instagram regularly produced “a stream of comments from accounts of adult males, often with requests that the girls contact them or send pictures,” the complaint alleges, adding that it identified adult accounts that followed multiple pages with photos of children.

“After viewing accounts that showed sexually suggestive pictures of girls, Instagram’s algorithms directed investigators to other accounts with images of sexual intercourse and sexualized images of minors,” the complaint states.

Investigators identified dozens of accounts sharing sexualized images of children, including photos of young girls in lingerie and images suggesting that children were “engaged in sexual activity,” the complaint alleges. In some cases, such accounts appeared to be offering child sexual abuse material for sale, it claims.

The lawsuit also alleges that Meta’s safety measures are falling short, making it easier for people to find sexualized images of children.

“An Instagram search for Lolita, with literary roots connoting a relationship between an adult male and teenage girl, produced an Instagram warning flagging content related to potential child sexual abuse,” the complaint states. “However, the algorithm also suggested alternative terms like ‘lolitta girls,’ which yielded content without a warning.”

The lawsuit seeks to fine Meta $5,000 for each alleged violation of New Mexico’s Unfair Practices Act and an order enjoining the company from “engaging in unfair, unconscionable, or deceptive practices.”

The-CNN-Wire™ & © 2023 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.
