The Department of Justice sued TikTok and its parent company, ByteDance, on Friday for allegedly violating a children’s online privacy law.
The agency accused the popular social media app of allowing children under the age of 13 to create accounts, collecting data on those children and failing to comply with parents’ requests to delete the accounts and information.
TikTok’s actions violate the Children’s Online Privacy Protection Act (COPPA), as well as a 2019 settlement agreement reached when the app was known as Musical.ly, according to the lawsuit.
“To put an end to TikTok’s unlawful massive-scale invasions of children’s privacy, the United States brings this lawsuit seeking injunctive relief, civil penalties, and other relief,” the filing reads.
The Justice Department alleges that TikTok “knowingly allowed children under 13 to create accounts” on the platform and “collected extensive personal information” without notifying their parents or getting their consent.
When parents asked TikTok to delete their children’s accounts and the associated data, the company obstructed those requests and failed to comply, the lawsuit claims.
“Parents must navigate a convoluted process to figure out how to request deletion of their child’s account and information,” the DOJ alleged, adding, “Even if a parent succeeded in submitting a request to delete their child’s account and information, [TikTok] often did not honor that request.”
The lawsuit followed an investigation by the Federal Trade Commission (FTC), which in 2019 obtained a consent order against TikTok over earlier alleged COPPA violations.
“TikTok knowingly and repeatedly violated kids’ privacy, threatening the safety of millions of children across the country,” said FTC Chair Lina M. Khan. “The FTC will continue to use the full scope of its authorities to protect children online—especially as firms deploy increasingly sophisticated digital tools to surveil kids and profit from their data.”
In a statement, TikTok said, “We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed.”
“We are proud of our efforts to protect children, and we will continue to update and improve the platform. To that end, we offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screentime limits, Family Pairing, and additional privacy protections for minors,” the company continued.
Updated at 2:18 p.m.