TikTok, YouTube Algorithms ‘Brazenly’ Recommend Dangerous Videos to Children: Study
TikTok, YouTube and Instagram employ algorithms that “brazenly” recommend videos that encourage dangerous and even deadly behavior to children, according to a new report.
The report by the nonprofit organization Fairplay found that all three platforms violated their own policies by recommending videos encouraging dangerous behavior when a teen user searches for a video with risky behavior.
Researchers in the study opened accounts on the three platforms while purporting to be a 14-year-old male. They then searched for videos depicting “car surfing” and “train surfing” – dangerous stunts in which individuals ride or stand on the top of (or hold onto the side of) cars and trains in motion.
The report found that after a search for “car surfing” on TikTok, 80 percent of the videos surfaced depicted people surfing atop cars, while 60 percent of the videos surfaced after a “train surfing” search depicted that stunt.
For YouTube, 60 percent of the videos were “car surfing” videos after a search for that phrase. A search for “train surfing” led to a page where 90 percent of the videos were videos depicting that dangerous behavior.
For Instagram, the figures were 28 percent for “car surfing” and 84 percent for “train surfing.”
“Algorithms across these platforms brazenly recommend slews of videos that applaud risky actions,” Fairplay said. “With each recommended video, all three platforms violate their own codes of conduct, which pledge to flag or remove content that glorifies dangerous acts and to disable the accounts behind it. Worse, kids and teens have been severely injured and killed attempting these challenges.”
The report quoted a mom, Joann Bogard, whose 15-year-old son died in 2019 while taking part in an online challenge, the so-called “choking game.”
“No child should ever be hurt because an algorithm pushed them to dangerous and harmful videos,” Bogard said.
Fairplay is urging Congress to pass a bill, the Kids Online Safety Act (S. 3663), that would better protect minors. According to Fairplay, the law would require platforms to “act in children’s best interest” and “mitigate against harms arising from the promotion of self-harm and other matters that pose a physical threat to a minor,” and “make dangerous challenges easier to avoid by allowing minors to opt out of algorithms that recommend them.”
“This legislation is urgently required to hold platforms accountable and to ensure that they adequately safeguard young people who use their services,” Fairplay said.
Photo courtesy: Sara Kurfess/Unsplash
Michael Foust has covered the intersection of faith and news for 20 years. His stories have appeared in Baptist Press, Christianity Today, The Christian Post, the Leaf-Chronicle, the Toronto Star and the Knoxville News-Sentinel.