New Mexico sues Snapchat over sexual predation of minors

The state attorney general says the app’s design facilitates the trade in explicit images and extortion of children.

Snapchat says its messages and photos disappear, but New Mexico argues that they can be preserved, making the site a uniquely attractive destination for adults seeking sexually explicit images of children. (Kevork Djansezian/Getty Images)

Snapchat is a “breeding ground” for predators seeking to collect sexually explicit images of children and extort them, the state of New Mexico argues in a lawsuit filed Wednesday evening against the popular social app’s owner, Snap.

The lawsuit alleges that the disappearing messages and photos that have long differentiated Snapchat from its competitors made it a uniquely attractive destination for the ugliest conversations on the internet. During a months-long undercover investigation, the New Mexico Department of Justice said it surfaced evidence that the app recommends accounts held by strangers to underage Snapchat users, who are then contacted and urged to trade sexually explicit images of themselves.

“While failing to prevent, identify, or protect even young children on its platform, Snap introduced them to the equivalent of an adults-only show for which they were not only the inappropriate audience, but often the main object,” said the lawsuit, which was viewed exclusively by The Washington Post.

The probe involved standing up a decoy account named “Sexy14Heather,” using AI-generated images to simulate a 14-year-old girl. The lawsuit alleges the fake account received a flood of recommendations to contact accounts seeking to trade sexually explicit messages and photos, including from users with names such as “teentradevirgin” and other sexual phrases.

The suit brought by New Mexico Attorney General Raúl Torrez says Snap violated state law against deceptive trade practices, misleading the public about how safe the platform was so it could continue to profit off young users. The lawsuit alleges the company made a series of misleading claims and omissions about the proliferation of illicit content on Snapchat, even as its own internal research showed young users frequently encountered harmful content.

The lawsuit catapults Santa Monica, Calif.-based Snap to the center of a growing international effort to address the online proliferation of child sexual abuse material, or CSAM. Amid an intensifying debate about how to protect children online, enforcers are focusing on the role apps that blend messaging and social networking play in fostering illegal content.

Snap spokesman Russ Caditz-Peck said the company was reviewing the complaint and planned to respond to the claims in court.

“We share Attorney General Torrez’s and the public’s concerns about the online safety of young people and are deeply committed to Snapchat being a safe and positive place for our entire community, particularly for our younger users,” Caditz-Peck said in a statement.

The company in June released a blog post detailing ways it was improving safety on the platform, including by expanding warnings when strangers add a teen on the app and making it easier to block someone a user no longer wishes to contact.

Last week, in a stunning display of enforcers cracking down, French prosecutors indicted the CEO of the popular messaging service Telegram on charges that included complicity in the distribution of child sex abuse images.

Regulators have long scrutinized the role that social networks play in facilitating sexual abuse of children, and last year, the New Mexico attorney general brought a similar lawsuit against Meta and CEO Mark Zuckerberg. Simultaneously, law enforcement agencies around the world have warned that encrypted apps provide a haven for pedophiles and other criminals because they make it impossible for investigators to access the contents of their messages. The recent actions against Snap and Telegram indicate that enforcers are casting a wider net and increasingly looking at hybrid services, which present unique risks.

The New Mexico investigators also found a link between Snap and Telegram, finding accounts on the United Arab Emirates-based messaging app that were distributing explicit images of young people captured on Snap. Investigators’ searches on Telegram surfaced accounts that were sharing sexually explicit images of young women.

Snapchat is uniquely popular among teens, with half of all teenagers in the United States reporting using the app every day. Torrez, a Democrat, told The Post in an interview that his office saw many cases where young people — especially boys — fell victim to “sextortion” schemes on Snap because they believed the disappearing nature of the messages made it a safe place to share explicit images. Snap notifies users when a screenshot or screen recording is taken of a message, but there are many third-party applications and workarounds to evade that feature.

Many young Snapchat users, unaware of how easy it is to preserve the images, are extorted when the predator saves the picture and threatens to release it to their school or family.

“There’s a notion that content that is exchanged on the platform disappears forever,” Torrez said. Snap “has created this false sense of security in the way that it’s marketed the platform … and that makes it so dangerous, especially for young ones.”

During its investigation, the New Mexico Department of Justice uncovered a vast network of sites on the dark web focused on sharing stolen, non-consensual, sexual images from Snapchat, compiling what the lawsuit referred to as a “virtual yearbook of child exploitation.” Investigators found more than 10,000 explicit photos and videos apparently linked to Snapchat, including examples of minors younger than 13 being sexually assaulted. Snapchat was by far the most common source of the CSAM posts investigators found on sites on the dark web.

One of those sites included a handbook for how to engage in extortion, including tips for taking over a victim’s account. The guide describes Snapchat as “the best app for sextortion, since the perpetrator can take screenshots without the victims being aware.”

The New Mexico investigators’ accounts also received recommendations for accounts apparently focused on meeting children or collecting explicit images of them, including one with the username “ilikekids60” and the tagline “Kids under12” followed by a heart-eye emoji. Snap also surfaced accounts with sexually explicit names in response to investigators’ queries for terms associated with child sexual abuse.

The lawsuit alleges that these features facilitate not just child exploitation, but also other illicit activities including drug sales and gun sales. Torrez said his office has opened a probe into the role of social media platforms in trafficking these illicit goods.

The suit is narrowly focused on violations of New Mexico state law, but Torrez said he plans to share his office’s findings with other states as well as the U.S. Justice Department. New Mexico is not among the more than 40 states and territories that last year banded together to sue Meta for building addictive features.

State enforcers have been playing a heightened role in checking the power of large tech companies, following years of flailing efforts to pass updated online safety laws in Congress. In July, the Senate passed a landmark bill to protect children online, but the bill has not been taken up by the House.

“Most voters in this country — Republicans and Democrats — are becoming aware of the threat that this poses, not only to mental health but also the physical well-being of their children,” Torrez said. “Eventually that is going to force a much more difficult and harder conversation among policymakers in D.C. to take concrete action.”