Unmasking TikTok's Influence on Youth Mental Health: A Comprehensive Investigation

Project Overview

In an era where social media is intertwined with daily life, especially among youth, our study, led by Amnesty International, sought to understand the impact of TikTok's algorithm on mental health. Specifically, we set out to uncover how personalized content feeds, particularly TikTok's 'For You' page, may contribute to self-harm and suicidal ideation among young users.

Teaming up with the Algorithmic Transparency Initiative and Amnesty International, we conducted a two-part audit of TikTok's content recommendation system. This involved setting up 40 automated 'sock puppet' accounts simulating young users' behaviors, alongside a series of manually personalized accounts, registered as 13-year-olds in Kenya, the Philippines and the USA, that rewatched the mental health-related videos they were recommended.
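
The report does not publish the audit tooling itself, but the automated arm of the study can be pictured as a simple browser-automation loop. The sketch below is a hypothetical illustration using Playwright: a persona account scrolls the 'For You' feed and lingers on (rewatches) any video whose description matches mental health-related keywords. The selectors, keywords and timings here are assumptions for illustration, not the study's actual code.

```python
# Hypothetical sketch of one automated "sock puppet" session.
# Selectors, keywords and timings are illustrative assumptions;
# the actual audit tooling is not published in the report.
from playwright.sync_api import sync_playwright

KEYWORDS = {"depression", "anxiety", "selfharm", "sad"}  # assumed trigger terms

def looks_mental_health_related(description: str) -> bool:
    text = description.lower()
    return any(word in text for word in KEYWORDS)

def run_session(max_videos: int = 100) -> list[dict]:
    log = []
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        # Assumes a persona account is already logged in via a stored profile.
        page.goto("https://www.tiktok.com/foryou")
        for i in range(max_videos):
            # Read the visible video's description (the selector is an assumption).
            desc = page.locator("[data-e2e='video-desc']").first.inner_text()
            related = looks_mental_health_related(desc)
            # Simulate engagement: linger on matching videos, skip past the rest.
            page.wait_for_timeout(30_000 if related else 3_000)
            log.append({"index": i, "description": desc, "flagged": related})
            page.keyboard.press("ArrowDown")  # advance to the next recommendation
        browser.close()
    return log
```

Logging every recommendation alongside the simulated engagement is what lets an audit like this quantify how quickly, and how heavily, a feed tilts toward a given category of content.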

Among the recommendations served to a manually personalized account located in the Philippines, the first video tagged #depresionanxiety [sic], showing a young boy in distress, was suggested within the first 67 seconds of scrolling through recommended content on the ‘For You’ page. From minute 12 onwards, 58% of the recommended posts related to anxiety, depression, self-harm and/or suicide and were categorized as potentially harmful for children and young people with pre-existing mental health concerns.

In the US-based manual experiment, the fourth video shown was tagged #paintok and featured text reading “when you realize you’ve never been put first your entire life but instead are just that person that fills a void in other people’s lives until they don’t need you anymore”. From the 20th video onwards (less than three minutes in), 57% of the videos were related to mental health issues, with at least nine posts romanticizing, normalizing or encouraging suicide within a single hour.

The Kenyan account in the manual experiments saw the slowest progression towards a feed filled with depressive content. However, once that point was reached (20 minutes into the experiment), 72% of the videos recommended in the next 40 minutes related to mental health struggles, with at least five references to suicidal thinking or the content creator’s death wish. Not a single mental health-related video was produced by a mental health care professional or recognized mental health organization.
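
To make figures like these concrete: the share of harmful content in a time window is simply the fraction of hand-labeled videos recommended inside that window. Here is a minimal sketch, assuming each observed video carries a timestamp and a reviewer-assigned label; the field names and labeling scheme are invented for illustration.

```python
# Sketch of the rate calculation behind figures such as "72% of the videos
# recommended in the next 40 minutes related to mental health struggles".
# Field names and the labeling scheme are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class ObservedVideo:
    seconds_into_session: float   # when the video appeared in the feed
    mental_health_related: bool   # manual label assigned by reviewers

def harmful_share(videos: list[ObservedVideo], start_s: float, end_s: float) -> float:
    """Fraction of videos in [start_s, end_s) labeled mental-health-related."""
    window = [v for v in videos if start_s <= v.seconds_into_session < end_s]
    if not window:
        return 0.0
    return sum(v.mental_health_related for v in window) / len(window)

# e.g. the Kenyan account: share over the 40 minutes after the 20-minute mark
# rate = harmful_share(session_log, start_s=20 * 60, end_s=60 * 60)
```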

Our findings reveal a stark and troubling reality: TikTok's algorithms can lead young users down dangerous paths, exposing them to content that glorifies self-harm and suicidal thoughts. This study has ignited a vital conversation about the ethical responsibilities of social media platforms and their impact on youth mental health.

The urgency of implementing robust oversight and regulatory frameworks for social media platforms is clear. Our research advocates for the development of safer digital environments, ensuring that the mental well-being of the youngest and most vulnerable users is prioritized in the design and operation of these influential platforms.

Amnesty International has a dedicated page about the overall “Driven into Darkness” project here. There is also an open letter to the European Parliament addressing the addictive design of online services, including social media platforms, which can be accessed here. Those who want to take an active part in making platforms like TikTok safer can sign Amnesty International's petition here.