
21st October 2025, 09:15:36 UTC

This report contains sensitive content, including references to self-harm and suicide.

New Amnesty International research has found that TikTok’s ‘For You’ feed is pushing French children and young people who engage with mental health content into a spiral of depressive, self-harm and suicide-related content.

The research, ‘Dragged into the Rabbit Hole’, highlights TikTok’s ongoing failure to address its systemic design risks affecting children and young people.

“Our technical research shows how quickly teenagers who express an interest in mental health-related content can be drawn into toxic rabbit holes. Within just three to four hours of engaging with TikTok’s ‘For You’ feed, teenage test accounts were exposed to videos that romanticized suicide or showed young people expressing intentions to end their lives, including information on suicide methods,” said Lisa Dittmer, Amnesty International’s Researcher on Children and Young People’s Digital Rights.

“The testimonies of young people and bereaved parents in France reveal how TikTok normalized and exacerbated self-harm and suicidal ideation up to the point of recommending content on ‘suicide challenges’.”

TikTok’s ‘For You’ feed is a personalized stream of short videos that recommends content based on a user’s viewing behaviour.

Amnesty International researchers set up three teen accounts, two female, one male, registered as 13-year-olds based in France to manually examine the algorithmic amplification of content in TikTok’s ‘For You’ feed. Within five minutes of scrolling and before signaling any preferences, the accounts encountered videos about sadness or disillusionment.

Watching these videos rapidly increased the amount of content related to sadness and mental health. Within 15 to 20 minutes of starting the experiment, all three feeds were almost exclusively filled with videos related to mental health, with up to half containing depressive content. Two accounts had videos expressing suicidal thoughts within 45 minutes.

Additional experiments were conducted with the Algorithmic Transparency Institute using automated test accounts registered as 13-year-olds in France. They found that TikTok’s recommender system more than doubled the share of recommended sad or depressive content when the accounts’ watch histories included such videos.

The research was conducted in France, where TikTok is regulated under the European Union’s Digital Services Act (DSA), which since 2023 has required platforms to identify and mitigate systemic risks to children’s rights.

French lawmakers are currently debating gaps in social media regulation, and this research adds to Amnesty International’s prior evidence that TikTok has not addressed systemic risks tied to its engagement‑based business model.


Impact on young people

Despite risk mitigation measures announced by TikTok since 2024, the platform continues to expose vulnerable users to content that normalizes self-harm, despair and suicidal ideation.

Testimonies from young people with depression and from affected or bereaved parents reveal the extent of the risks and harms of TikTok’s business model for the mental and physical health of already struggling youth.

“There are videos that are still burnt into my retina,” said Maëlle, 18, describing how she was drawn to depressive and self-harm content in TikTok’s ‘For You’ feed in 2021. Over the next three years, her mental health continued to decline and her struggles with self-harm deepened as she became consumed by harmful online content.

“Seeing people who cut themselves, people who say what medication to take to end it, it influences and encourages you to harm yourself.”

While the report focuses on the amplification of harmful content, the collected testimonies also point to TikTok’s failures in content moderation. According to the research participants, content inciting self-harm or suicide has not been removed from the platform despite repeated reports from young people and their families.

For example, Amnesty researchers found two videos of the “lip balm challenge” in the feed of a manually managed test account in the summer of 2025. The social trend supposedly began as a challenge to guess the scent of a lip balm on another person. The idea evolved into a different version encouraging people to use up a piece of their lip balm every time they felt sad, and to self-harm or attempt suicide when the lip balm was finished.

“For these platforms, our children become products rather than human beings. They use our children as products with an algorithm and a filter bubble, using their emotions to captivate them. The algorithm captures your interests, which is not normal. They intrude into the child’s private life. But children have rights,” said Stéphanie Mistre, mother of 15-year-old Marie Le Tiec, a French child who fell into TikTok’s spiral of depressive content and ended her life in 2021.


Urgent and binding measures to make TikTok safe

This research demonstrates TikTok’s failure to address the systemic risks that its platform’s addictive design poses to young people. The company is failing to live up to its responsibility to respect human rights in line with the UN Guiding Principles on Business and Human Rights and is not fulfilling all of its DSA obligations.

“This new evidence of clear DSA violations by TikTok must be urgently factored into the European Commission’s ongoing investigation. Binding and effective measures must be taken to force TikTok to finally make its application safe for young people in the European Union and around the world,” said Katia Roux, Advocacy Officer at Amnesty France.

TikTok’s disregard for systemic harms linked to its engagement‑driven model raises serious DSA compliance concerns and underscores the need for stronger regulatory and platform accountability measures to protect children and vulnerable users.

Amnesty International shared its key findings with TikTok. The company did not respond.


Background

In 2023, Amnesty International published two complementary reports, Driven into the Darkness: How TikTok Encourages Self‑harm and Suicidal Ideation and “I feel exposed”: Caught in TikTok’s surveillance web, highlighting abuses suffered by children and young people using TikTok.

Further help on issues covered in this report can be found in Amnesty International’s guide Staying Resilient While Trying to Save the World (Volume 2): A Well-being Workbook for Youth Activists.