A U.S. appeals court has revived a lawsuit over the death of 10-year-old Nylah Anderson, who died after attempting the “Blackout Challenge” she encountered on TikTok. Her mother had sued the platform, claiming that TikTok’s algorithm promoted the deadly challenge to her daughter, leading to her death.
The “Blackout Challenge,” a dangerous trend that circulated on TikTok, dares participants to hold their breath until they nearly pass out. Although U.S. federal law typically shields internet companies from liability for content posted by users, the court found that TikTok’s algorithm may have played a direct role in recommending the challenge to Nylah, which could make the platform accountable for her death.
Judge Patty Shwartz, writing for a three-judge panel, indicated that while Section 230 of the 1996 Communications Decency Act provides immunity for user-generated content, it does not protect companies when their own algorithms actively promote harmful content. The ruling narrows the protections usually afforded under Section 230, suggesting that TikTok may bear responsibility for the tragic consequences of its algorithmic recommendations.