It takes only ten minutes after creating an account on China's TikTok app for the platform's algorithm to begin pushing suicide videos to accounts registered as 13-year-old children.
The Chinese app's recommendation algorithm is so aggressive that, within ten minutes, it will begin pushing suicide videos to a young TikTok user who signals he is sexually frustrated, according to research published Tuesday by corporate accountability group Ekō and shared with VICE News.
The researchers set up nine new TikTok accounts and listed their age as 13 — the youngest age at which users can join the platform — then mimicked so-called "incels," or "involuntary celibates," an online community of "young men who formed a bond around their lack of sexual success with women," according to VICE News.
The researchers found that after viewing just ten videos on "incel"-related topics, the accounts' "For You" pages were all filled with similar content within minutes.
One test account was shown a video featuring a clip of Jake Gyllenhaal — whose films have reportedly been popular among the "incel community" — in which the actor holds a rifle in his mouth and says, "Shoot me. Shoot me in the fucking face."
The video also included text that read, "Get shot or see her with someone else?"
Most commenters expressed support for the suggested suicide. Others lamented their loneliness, with many saying they felt "dead inside." One commenter even said he planned to kill himself within the next four hours.
The Jake Gyllenhaal clip, which has since been deleted, had garnered over 2.1 million views, more than 440,000 likes, 7,200 comments, and over 11,000 shares.
“Ten minutes and a few clicks on TikTok is all that is needed to fall into the rabbit hole of some of the darkest and most harmful content online,” Maen Hammad, Ekō campaigner and co-author of the research, told VICE News.
“The algorithm forces you into a spiral of depression, hopelessness, and self harm, and it’s terribly difficult to get out of that spiral once the algorithm thinks it knows what you want to see,” Hammad added. “It’s extremely alarming to see how easy it is for children to fall into this spiral.”
TikTok, which has replaced Instagram and Facebook as the de facto social media platform for teenagers in the United States, is known for pushing content harmful to children and young adults — content that has, in some cases, even resulted in injury and death.
Earlier this month, the University of Massachusetts warned its students about a new TikTok drinking trend that has resulted in 28 ambulance calls to off-campus parties in the area. The trend involves students mixing a "blackout rage gallon" of alcohol, flavoring, and other ingredients.
Earlier this year, a 12-year-old girl in Argentina died after participating in the "choking challenge" first popularized on the Chinese app. Her death was even captured on a video call as her classmates watched her attempt the deadly challenge.
Last summer, a 14-year-old and a 12-year-old in the UK reportedly died after attempting the same TikTok challenge.
Last September, the FDA warned parents of a deadly new TikTok challenge that involves children cooking chicken in NyQuil, “presumably to eat.”
Another TikTok challenge, in 2020, urged users to take large doses of the allergy medication Benadryl (diphenhydramine) to induce hallucinations. The challenge led to reports of teens being rushed to the hospital and, in some cases, dying.
You can follow Alana Mastrangelo on Facebook and Twitter at @ARmastrangelo, and on Instagram.