Revenge porn is a form of abuse where someone’s intimate photos are used against them, usually by an ex, to extort, manipulate, or just make their life a living hell. In the age of the Internet, it’s everyone’s worst nightmare, and TikTok and Bumble are trying to do something about it.
According to a report from Bloomberg, TikTok and Bumble are the latest internet entities to step up and take responsibility for keeping their users, and their users’ photos, safe from abuse. Considering that TikTok reported hitting a billion global users last fall and that Bumble was the second most downloaded dating app in 2022, it’s about damn time. TikTok and Bumble did not immediately return Gizmodo’s request for comment on the initiative.
TikTok and Bumble will now, apparently, track and block any images reported through Stop Non-Consensual Intimate Image Abuse’s website, StopNCII.org. StopNCII lets anyone who suspects they’re the subject of revenge porn submit intimate photos of themselves through the site’s tool, which converts each image into a digital fingerprint called a hash. Only that hash, not the photo itself, enters StopNCII’s secure database and gets shared with TikTok and Bumble, which will then scan their platforms for matching images and pull them down if found.
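To be clear about what’s actually being shared: the hash here is a perceptual fingerprint of the image, not the image itself, designed so that resized or recompressed copies still match. Meta has open-sourced PDQ, a perceptual hashing algorithm built for exactly this kind of matching, though StopNCII’s exact implementation isn’t detailed in the report. As a rough illustration of the idea, here’s a minimal sketch using the open-source Python imagehash library (the file names and match threshold are hypothetical):

```python
# Minimal sketch of perceptual image hashing, the general technique behind
# "digital fingerprints" like StopNCII's. This is NOT StopNCII's actual
# implementation; it uses the open-source `imagehash` library for illustration.
#
#   pip install Pillow imagehash

from PIL import Image
import imagehash

# Hash an original image and a re-encoded copy of it. Perceptual hashes,
# unlike cryptographic ones, stay similar when an image is resized,
# recompressed, or lightly edited.
original = imagehash.phash(Image.open("photo.jpg"))         # hypothetical file
reuploaded = imagehash.phash(Image.open("photo_copy.jpg"))  # hypothetical file

# Subtracting two hashes yields the Hamming distance between them;
# a small distance means the images are almost certainly the same.
distance = original - reuploaded
MATCH_THRESHOLD = 5  # illustrative cutoff; platforms tune this in practice

if distance <= MATCH_THRESHOLD:
    print(f"Match (distance {distance}): flag for removal")
else:
    print(f"No match (distance {distance})")
```

A cryptographic hash like SHA-256 would change completely if a single pixel did, which is why matching systems like this rely on perceptual hashes instead: they let a platform catch re-uploads without ever holding a copy of the original photo.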
This workflow was pioneered by Facebook, now Meta, which began testing a similar program in Australia in 2017. Since then, Instagram and Facebook have been running a souped-up version of the tracking system, which has reportedly aided 12,000 users and flagged 40,000 photos and videos. Meta also recently announced a more nuanced version of the reporting tool for minors as part of the company’s efforts to protect underage users from “suspicious adults.”
Revenge porn affects an estimated 1 in 12 U.S. adults, and while these tools help, framing them as a win fails to reckon with the way these companies have built and fostered, intentionally or not, digital infrastructure where sharing intimate images against the will of those in them is easy. TikTok, Meta, and even Bumble are only now grappling with the need to better police the digital environments they helped create.