
Bumble open sources its Private Detector AI to counter unsolicited nudes

The dating app has also helped pass laws that impose legal consequences for sending unsolicited nudes.

While online dating apps are great for meeting new people, receiving inappropriate photos can be a distressing experience. To counter this, Bumble has been shielding its users from lewd images since 2019. The feature, known as Private Detector, examines photos sent by matches and flags objectionable material. Although it is primarily intended to detect unwanted nude photographs, it can also flag shirtless selfies, which are likewise prohibited on Bumble. When a match sends a flagged photo, the app blurs it and gives the recipient the choice of viewing it, blocking the image, or reporting the sender.
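To make the flow concrete, here is a minimal sketch of the blur-then-choose logic described above. All names (the threshold value, `lewd_probability`, the action set) are illustrative assumptions, not Bumble's actual API:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical operating threshold; Bumble's real cutoff is not public.
BLUR_THRESHOLD = 0.5

class RecipientAction(Enum):
    VIEW = "view"
    BLOCK = "block"
    REPORT = "report"

@dataclass
class IncomingPhoto:
    sender_id: str
    lewd_probability: float  # score produced by the classifier

def deliver_photo(photo: IncomingPhoto) -> dict:
    """If the model flags the photo, blur it and surface the recipient's
    three choices instead of showing the raw image."""
    if photo.lewd_probability >= BLUR_THRESHOLD:
        return {"blurred": True, "options": [a.value for a in RecipientAction]}
    return {"blurred": False, "options": []}
```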

Bumble says Private Detector was trained on extremely large datasets, with the negative samples (images containing no lewd content) carefully curated to cover edge cases and other parts of the body, such as legs and arms, so that they are not misflagged as abusive. Iteratively adding samples to the training set to replicate real-user behavior or to probe misclassifications proved a fruitful exercise, one the dating app has repeated across its subsequent machine learning projects. And even though the downstream goal is framed as a binary classification problem, nothing prevents data scientists from defining finer-grained concepts (or labels) during annotation and merging them back into the two target classes just before training, as sketched below.
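The following sketch illustrates that label-merging idea. The fine-grained concept names and the choice of which concepts count as positive are hypothetical assumptions for illustration; only the general pattern (rich annotation labels collapsed to a binary target before the training epochs) comes from the article:

```python
import numpy as np

# Hypothetical fine-grained concepts used during annotation.
FINE_LABELS = ["nude", "bare_chest", "arms", "legs", "safe_portrait"]
# Assumption: which concepts map to the positive (lewd) class.
LEWD_CONCEPTS = {"nude", "bare_chest"}

def merge_to_binary(fine_labels: list[str]) -> np.ndarray:
    """Collapse per-image concept labels into the 0/1 target that the
    downstream binary classifier is actually trained on."""
    return np.array(
        [1.0 if lbl in LEWD_CONCEPTS else 0.0 for lbl in fine_labels],
        dtype=np.float32,
    )

# Annotators keep the rich labels; the training loop sees only binaries.
y_fine = ["nude", "arms", "safe_portrait", "bare_chest"]
y_binary = merge_to_binary(y_fine)  # -> [1., 0., 0., 1.]
```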

Bumble’s Data Science team has now published a white paper explaining the technology behind Private Detector and released an open-source version of it on GitHub for commercial use, distribution, and modification. The dating app hopes the feature will be adopted by the wider tech community to make the internet a safer place, and that other digital firms will adapt it and add features of their own to improve online safety and accountability in the fight against abuse and harassment.
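For teams evaluating the release, a minimal inference sketch follows. It assumes the open-sourced checkpoint is a TensorFlow SavedModel with a callable default signature mapping a preprocessed image batch to a lewd-content probability; the local path, input size, normalization, and output shape below are assumptions, so consult the GitHub repository's README for the preprocessing the released model actually expects:

```python
import tensorflow as tf

MODEL_DIR = "saved_model/private_detector"  # hypothetical local path
IMAGE_SIZE = 480  # assumed input resolution

def load_image(path: str) -> tf.Tensor:
    """Read a JPEG and shape it into a 1-image batch."""
    raw = tf.io.read_file(path)
    img = tf.image.decode_jpeg(raw, channels=3)
    img = tf.image.resize(img, (IMAGE_SIZE, IMAGE_SIZE))
    img = tf.cast(img, tf.float32) / 255.0  # assumed normalization
    return tf.expand_dims(img, axis=0)

model = tf.saved_model.load(MODEL_DIR)
# Assumption: the SavedModel is directly callable; if not, use
# model.signatures["serving_default"] with its documented input name.
probability = model(load_image("incoming_photo.jpg"))
print(f"Flag for blurring: {float(probability[0][0]) > 0.5}")
```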

According to Bumble’s VP of member safety, Rachel Haas, “Open-sourcing this feature is about remaining firm in our conviction that everyone deserves healthy and equitable relationships, respectful interactions, and kind connections online.”


In recent years, Bumble has waged a campaign against cyberflashing in both the UK and the United States. Whitney Wolfe Herd, the app's founder and CEO, helped pass HB 2789, a Texas law that makes sending unsolicited nude photographs illegal. Since then, the dating app has aided the passage of similar legislation in Virginia (SB 493) and California (SB 53).

Bumble has also been pushing for the criminalization of cyberflashing in England and Wales, and in March 2022 the government confirmed it would criminalize the practice under its proposed online safety rules, with offenders facing up to two years in prison.

These new laws are a first step toward accountability and real consequences for a common form of harassment that leaves victims, most of them women, feeling upset, violated, and vulnerable online.


Preetipadma K
