(The Hill) -- More than a dozen states and the District of Columbia sued TikTok on Tuesday, alleging the platform exploits and harms young users while "deceiving" the public about these dangers.
California Attorney General Rob Bonta and New York Attorney General Letitia James led the coalition of 14 attorneys general, each of whom filed suit in state court alleging violations of state consumer protection laws.
Bonta said a national investigation into TikTok found that the platform "cultivates social media addiction to boost corporate profits."
The investigation was launched in March 2022 by a bipartisan coalition of attorneys general from various states including New Jersey, California, North Carolina and Kentucky.
"TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content,” Bonta wrote.
“When we look at the youth mental health crisis and the revenue machine TikTok has created, fueled by the time and attention of our young people, it’s devastatingly obvious: Our children and teens never stood a chance against these social media behemoths," he continued.
TikTok's business model allegedly prioritizes maximizing young users' time on the platform through its algorithm, which determines what users see on the app's "For You" page. This boosts the platform's revenue through targeted advertising, the suits alleged.
The social media platform is further accused of deploying "manipulative features" to keep young users hooked, including its beauty filters, push notifications, temporary stories and live streams.
TikTok's "autoplay" feature, which continuously plays new and temporary posts, along with its "endless/infinite scroll," are also mentioned in the suits.
All the while, TikTok allegedly deceives users by claiming it prioritizes user safety through various tools, community guidelines and content moderation features, the attorneys general said.
"In truth, such features and efforts do not work as advertised, the harmful effects of the platform are far greater than acknowledged, and TikTok does not prioritize safety over profit," Bonta's office wrote in a release.
A TikTok spokesperson told The Hill the company "strongly disagrees" with the claims, describing them as "inaccurate and misleading."
"We're proud of and remain deeply committed to the work we've done to protect teens and we will continue to update and improve our product," the spokesperson wrote.
"We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16," the spokesperson continued.
TikTok has worked with the attorneys general over the past two years, the spokesperson said, adding that it is "incredibly disappointing" that the states chose to file the suits rather than continue working together.
Other jurisdictions behind the suits include Illinois, Louisiana, Massachusetts, Mississippi, Oregon, South Carolina, Washington and the District of Columbia.
The new cases build on previous suits filed against TikTok by the attorneys general of Utah, Nevada, Indiana, New Hampshire, Nebraska, Arkansas, Iowa, Kansas and Texas.
Other social media platforms, including Facebook and Instagram parent Meta, have been sued over similar allegations that the companies' business models harm youth mental health.
The lawsuits follow a separate crackdown by Congress amid national security concerns over TikTok's China-based parent company, ByteDance.
The platform could face a ban in the U.S. after President Biden signed legislation in April that set a timeline for ByteDance to sell the platform or see it prohibited from U.S. app stores and networks.
ByteDance has contended divestment is practically impossible, meaning that the law effectively amounts to a nationwide ban of the video-sharing platform.
The Justice Department sued TikTok, ByteDance and their affiliates in August for alleged violations of the Children's Online Privacy Protection Act, which bans website operators from knowingly collecting or using personal information from children under 13 without parental consent.