Study: TikTok and YouTube Keep Pushing Dangerous Videos to Teen Users

A new study from the nonprofit Fairplay found that even though social media platforms like TikTok and YouTube say they have policies against hosting videos with risky content, their algorithms still deliver such videos to teen users. If you're still of the mind that these companies have the safety and best interests of their users at heart, this study might force you to rethink that conviction.

Fairplay built its own bots and registered them on TikTok, YouTube and Instagram as 14-year-old boys. It then had those bots search for "car surfing" and "train surfing" — basically, videos of people lying on top of moving vehicles and seeing how long they can stay on. Obviously, this is not safe behavior for anyone — let alone a young teen — and it has resulted in the injury and death of teens who attempted it. But the algorithms delivered page after page of results of their peers doing just that anyway.

“Algorithms across these platforms brazenly recommend slews of videos that applaud risky actions,” Fairplay said in a statement. “With each recommended video, all three platforms violate their own code of conduct, which pledges to flag or remove content and disable accounts that glorifies dangerous acts. Worse, kids and teens have been severely injured and killed attempting these challenges.”

Fairplay is pushing for legislation called the Kids Online Safety Act, which would legally require social media companies to act in the best interests of their young users.

© 2023 RELEVANT Media Group, Inc. All Rights Reserved.
