TikTok and Bumble Are Cracking Down on Revenge Porn

TikTok and Bumble are implementing new strategies to keep users safe from revenge porn, and it’s about time.

Revenge porn is a form of abuse where someone’s intimate photos and videos are used against them — often by an ex — to extort, manipulate or simply ruin their life. Research shows it affects as many as 1 in 12 U.S. adults, and TikTok and Bumble are the latest apps to take responsibility for keeping their users and their photos safe from this abuse.

The apps will now detect and block any images that have been submitted to StopNCII (Stop Non-Consensual Intimate Image Abuse). The tool allows people who suspect they are victims of revenge porn to submit their intimate photos, which it converts into digital fingerprints stored in a secure database. The site then works with platforms like TikTok and Bumble to scan for matching images on their platforms and pull them if they’re found.

Meta began working with StopNCII back in 2017 and has since helped a reported 12,000 users remove 40,000 photos and videos from Facebook and Instagram. Considering TikTok reportedly has over a billion global users and Bumble was the second most downloaded dating app this year, it’s about time these platforms stepped up to the plate.

Social media platforms have spent the last several months cracking down on pornographic content on their sites. Instagram has twice suspended Pornhub’s account on its platform, and TikTok is slowly enacting policies to protect the mental health of its younger users. It seems many of these platforms are finally understanding their role in creating safer digital environments for all users.
