Families File Lawsuit Over Harmful Content
Seven French families have filed a lawsuit against TikTok, accusing the platform of exposing their teenage children to harmful content that the families link to two of the teens’ suicides. They claim TikTok’s algorithm repeatedly showed the teenagers videos promoting suicide, self-harm, and eating disorders.
Laure Boutron-Marmion, the lawyer representing the families, told broadcaster franceinfo that TikTok’s recommendation system played a key role. The algorithm, she explained, is designed to keep users engaged, and it allegedly pushed disturbing videos to the teenagers.
Legal Action and Accountability
The families are taking joint legal action at the Créteil judicial court. Boutron-Marmion described it as the first case of its kind in Europe. The lawsuit aims to hold TikTok accountable for the content its algorithm promotes to minors.
“The parents want TikTok to face responsibility for exposing their children to harmful material,” Boutron-Marmion said. She emphasized that TikTok, as a commercial platform that markets its service to minors, should be held responsible for the potential harm caused by its algorithm and content recommendations.
This lawsuit follows growing scrutiny of TikTok’s content moderation practices and its impact on young users. Similar lawsuits have been filed against Meta, the parent company of Facebook and Instagram, in the United States. Those lawsuits accuse Meta of fostering addictive behaviors in children and worsening mental health issues.
TikTok has not commented on the specific allegations in this case. However, the company has previously highlighted its commitment to child safety. TikTok CEO Shou Zi Chew told U.S. lawmakers earlier this year that the platform had implemented stronger safety measures to protect young users. Despite these claims, critics argue that TikTok must do more to protect vulnerable teenagers from harmful content.