TikTok's viral “blackout challenge” is linked to at least 20 child deaths

In the past few years, TikTok has faced backlash over viral challenges trending on the app.


TikTok is the world's fastest-growing social media app, with over 3.9 billion downloads. Last week it raised the minimum age for live streaming from 16 to 18, aiming to improve community safety. The platform's minimum age is 13, but internal data revealed in 2020 that more than a third of TikTok's users were 14 or younger – including many below the allowed age – suggesting that the age verification process is unreliable.

These viral challenges include ideas you might never imagine, such as the "Benadryl challenge," which encourages users to take large amounts of the antihistamine to hallucinate. Recently, the "blackout challenge," in which people choke themselves until they pass out, made headlines after being linked to at least 20 children's deaths in the past year – 15 of them children under 12.

More and more deaths are being linked to the deadly challenge. Families of the deceased children have filed multiple lawsuits – including one over the death of 10-year-old Nylah Anderson – alleging that TikTok is liable because its algorithm recommended the challenge to young children.

This year, TikTok filed a motion to dismiss Nylah's family's lawsuit, claiming it has "no legal duty of care to protect against third-party depictions of dangerous activity." The judge dismissed the case, agreeing that the video platform couldn't be sued even if it had recommended the challenge to children. But in October, the US Supreme Court agreed to hear a case involving the same law TikTok cites to avoid liability for its content. The decision could affect future lawsuits against the platform.

Key comments:

"The last thing in the world these companies want to do is stand up in front of a jury and explain to them why their profits were more important than life," said Matthew Bergman, a product liability lawyer who founded the Social Media Victims Law Center in Seattle.

"They have these solutions ready to go. They could implement them almost immediately. They choose not to. That's serving a business interest, not a safety interest," said Marc Berkman, CEO of the Organization for Social Media Safety.

"Archie had the TikTok app. In the last few weeks [before his injury] he kept making out he was dizzy, that he could make himself pass out. He'd never caused me any alarm by putting anything around his neck or anything like that, so this was a very new thing. For him to all of a sudden start that at the age of 12 years old, he's seen it somewhere, and the only thing I can think of is TikTok," said Hollie Dance, the mother of Archie Battersbee, a 12-year-old boy who died in August.

"As we make clear in our community guidelines, we do not allow content that encourages, promotes, or glorifies dangerous challenges that might lead to injury," a TikTok spokesperson said. "Though we have not seen [the Benadryl challenge] trend on our platform, we actively remove content that violates our guidelines and block related hashtags to further discourage participation."