Instagram’s Dark Side: Algorithm Linked to Vast Pedophile Network

Instagram, one of the most popular social media platforms worldwide with over two billion monthly active users, is facing allegations that its recommendation algorithms have facilitated a “vast pedophile network.”

Researchers from Stanford University and the University of Massachusetts Amherst shared their findings with the Wall Street Journal, claiming that Instagram’s algorithms have promoted the sale of child sexual abuse material (CSAM) by allowing users to seek out illicit content using specific hashtags.

The researchers discovered that accounts using hashtags like #pedowhore, #preteensex, #pedobait and #mnsfw (minors not safe for work) were connecting pedophiles and guiding them to content sellers through recommendation systems that catered to their niche interests. Disturbingly, some accounts claimed to be run by the children featured in the abusive material and offered to sell pedophilic content through menus that included photos and videos of children engaging in self-harm, enduring sexual abuse and performing sexual acts with animals.

“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests,” the researchers said.

Further investigation by the Stanford Internet Observatory revealed that certain accounts even encouraged buyers to request specific illicit acts. The analysis also uncovered evidence suggesting that, for a fee, some children were made available for in-person meet-ups.

In response to the report, a spokesperson from Meta, the parent company of Instagram, acknowledged the seriousness of child exploitation and pledged to take action.

“Child exploitation is a horrific crime,” the statement from Meta said. “We’re continuously investigating ways to actively defend against this behavior.”

Meta further stated it has taken down 27 pedophile networks in the last two years and is working to remove more. Additionally, the company said it actively removes accounts linked to users who buy and sell CSAM. In January alone, Meta removed 490,000 accounts violating child safety policies.

Previously, Instagram allowed users to view content associated with CSAM after displaying a notice cautioning that the material could contain images of child sexual abuse and cause extreme harm; users were then given the choice to access support resources or view the content anyway. Following the Wall Street Journal’s report, Instagram disabled that option.

Meta has established an internal task force to address these issues and actively investigate methods to combat child exploitation on its platform.

The allegations against Instagram and the subsequent response from Meta highlight the urgent need to tackle the issue of child sexual abuse and exploitation online. The findings from the researchers’ investigation underscore the importance of developing robust measures to detect and prevent the dissemination of harmful content involving minors.
