Federal regulators have for the first time banned a digital platform from serving users under 18, accusing the app — known as NGL — of exaggerating its ability to use artificial intelligence to curb cyberbullying in a groundbreaking settlement.
The complaint alleged that NGL tricked users into paying for subscriptions by sending them computer-generated messages that appeared to come from real people, then offering a service for as much as $9.99 a week to reveal the senders’ identities. People who signed up received only “hints” about those identities, whether the messages were real or not, enforcers said.
After users complained about the “bait-and-switch tactic,” executives at the company “laughed off” their concerns, referring to them as “suckers,” the FTC said in an announcement.
NGL, internet shorthand for “not gonna lie,” agreed to pay $5 million and stop marketing to kids and teens to settle the lawsuit, which also alleged the company violated children’s privacy laws by collecting data from kids under 13 without parental consent.
The settlement marks a major milestone in the federal government’s efforts to tackle concerns that tech platforms are exposing kids to noxious material and profiting off it. And it’s one of the most significant actions by the FTC under Chair Lina Khan, who has dialed up scrutiny of the tech sector at the agency since taking over in 2021.
“We will keep cracking down on businesses that unlawfully exploit kids for profit,” Khan (D) said in a statement. NGL could not be reached for comment.
NGL’s popularity has exploded, with its user base topping 200 million; only a year after its 2021 launch, it was at one point the most downloaded product on Apple’s app store. The platform lets users anonymously respond to questions from friends and social media contacts and markets itself as a place where people can play games such as “never have I ever.”
But it’s one of several anonymous messaging services whose pervasiveness among young people has triggered alarm from children’s safety advocates, who say the companies have failed to take adequate steps to prevent cyberbullying and other harmful activities on their products.
In October, child safety group Fairplay and parent activist Kristin Bride filed a complaint urging the FTC to investigate allegations that the app’s parent company, NGL Labs, illegally marketed itself to children using unfair and deceptive trade practices.
Bride’s 16-year-old son Carson died by suicide in 2020 after facing cyberbullying on two separate anonymous messaging services, Yolo and LMK. Bride has said that Carson’s last search on his phone was for ways to uncover who had been harassing him anonymously online.
“It was extremely concerning to learn that a new anonymous app, NGL, hit the market and found a way to further monetize their dangerous product by charging vulnerable teens for useless hints regarding who is sending them the messages,” Bride said in a statement last year.
The agency added that it “received invaluable assistance from Fairplay and social media reform advocate Kristin Bride” in the case.
As part of the deal, NGL will be required to prevent users from accessing the app if they indicate they are under 18 and to delete any data it obtained from young children unless a parent signs off on it. The company will also be barred from making misrepresentations about its ability to filter out cyberbullying or about the sender of messages on its app.
While limited to one company, the settlement represents one of the FTC’s most forceful actions to better protect children online under Khan.
The agency last year struck a record $520 million settlement with Epic Games, maker of the popular “Fortnite” video game series, over allegations the company violated children’s data privacy laws and tricked players into making unwanted purchases. But the settlement stopped short of imposing any prohibitions against marketing to those under 18.
The FTC has separately proposed a sweeping plan to bar Facebook and Instagram parent company Meta from monetizing the data of children and teens under 18, but the plan has yet to be implemented pending a series of legal challenges from the tech giant. The agency proposed the restrictions as an update to its historic $5 billion privacy settlement with the company.
The FTC is also considering broadening its enforcement of the landmark Children’s Online Privacy Protection Act. Under the proposed rulemaking, platforms would be required to turn off targeted ads to children under 13 by default.