A group of state attorneys general from both political parties and disparate regions of the country has opened an investigation into the popular social media app TikTok and how it markets itself to young users.
Citing possible violations of state consumer protection laws, the group wants to determine whether the app is designed and marketed to keep children and teens checking in constantly, to the detriment of their mental and physical health. A key question in the investigation is whether the social media giant knew about the harms in advance.
TikTok has an estimated 1 billion monthly users and is wildly popular with teens and younger children.
In a separate case, the state of Texas in February announced its investigation into alleged abuse of children’s privacy and culpability in online trafficking by TikTok.
Critics also charge that TikTok, which Beijing-based ByteDance owns, could be supplying private information to the Chinese government.
Indiana Attorney General Todd Rokita levels particularly harsh criticism against TikTok, saying the app has surpassed Google worldwide but is banned in India and restricted in its home country of China. Part of Rokita’s criticism is aimed at the Chinese government as he says, “by definition, if you’re a Chinese company, then the (Chinese Communist Party) is a part of it.”
Rokita questions whether the app is “grooming our kids” for drugs, alcohol and porn and notes fully one-third of its users are between 10 and 19 years old.
California Attorney General Rob Bonta says there are many ways that TikTok can harm young Americans, such as “anxiety, depression, suicidal ideation or body image.”
For its part, TikTok says it appreciates the states’ concerns over the safety of young people. “We care deeply about building an experience that helps to protect and support the well-being of our community,” the company said in a statement after the announcement of the investigation.
The investigation comes at a time when many critics are challenging social media companies over potentially harmful practices and what many see as selective censorship.
Last year, a US Senate committee investigated Facebook and its photo-sharing app Instagram for possible adverse effects on young users, especially girls who “felt bad about their bodies” and said Instagram made the problem worse. Republican Sen. Marsha Blackburn of Tennessee was a leader in the Facebook inquiry and said the company knew of Instagram’s negative impact on teens, especially girls.
“Facebook knew this. They were watching. They were monitoring,” Blackburn charges, adding that the search for profits made the company “hesitant” to restrict access for children.