TikTok will come under regulatory scrutiny in the US once again as part of a new investigation into the impacts that TikTok content can have on the mental health and wellbeing of younger users.
As per CNBC:
“TikTok is under investigation by a bipartisan group of state attorneys general to determine if the popular short-form video platform’s design, operations or promotion to young users negatively affects their physical or mental health. The AGs are seeking to find out if the short-form video app violated state consumer-protection laws.”
The investigation will examine how TikTok entices young users, what content it displays to them, and how those factors influence behavior and response, as well as whether TikTok knowingly puts youngsters at risk through its recommendation systems.
The announcement comes just a day after US President Joe Biden again put the focus on social media's negative impacts, calling out the harms caused by social apps in his annual State of the Union address.
“We must hold social media platforms accountable for the national experiment they’re conducting on our children for profit. It’s time to strengthen privacy protections, ban targeted advertising to children, demand tech companies stop collecting personal data on our children.”
The new TikTok probe won’t be looking at data collection specifically, but it could form another element in a broader push against social media apps, and their negative impacts on younger audiences.
The same coalition of AGs also launched a similar investigation into Instagram last November.
What will that mean for TikTok?
It’s hard to say, especially since the Instagram probe is still in progress, leaving no precedent to indicate the likely findings and recommendations. But it could result in new restrictions for younger users, and potentially a change in the age limit for access to these apps, along with stricter enforcement of any such rules, and penalties for violations.
That’s a difficult area in itself, because online age verification systems are generally not highly sophisticated, and can easily be side-stepped by increasingly web-savvy youngsters. The platforms are doing more to address this - Instagram added compulsory age checks last year, along with a new process that defaults teen users into private accounts and restricts ad targeting capacity for younger audiences. But there are still concerns about these platforms' impacts, and with Facebook whistleblower Frances Haugen in attendance at the State of the Union address, it does seem that this will be a high-priority focus over the coming year.
For its part, TikTok says that it’s doing all it can to protect its primarily young audience.
Responding to the news, TikTok provided this statement (via Axios):
“We care deeply about building an experience that helps to protect and support the well-being of our community, and appreciate that the state attorneys general are focusing on the safety of younger users. We look forward to providing information on the many safety and privacy protections we have for teens."
But at the same time, social media is now a critical part of how we connect and interact, even more so after two years of pandemic restrictions. Is it possible to create a system that offers adequate protection, while also facilitating connection across such a broad group?