Dating app Bumble has unveiled a new feature, ‘Deception Detector,’ which uses AI to identify and block spam, scams, and fake profiles. The tool aims to enhance user safety by proactively addressing malicious content before users encounter it.
During testing, Deception Detector automatically blocked 95% of accounts flagged as spam or scams, and user reports of such content fell by 45% within the first two months.
The launch of this AI-powered tool aligns with internal research from Bumble, which found that fake profiles and scam risks are top concerns for users engaging in online dating. Deception Detector works alongside Bumble’s human moderation team, underscoring the company’s commitment to user safety.
CEO Lidiane Jones emphasized the app’s dedication to building equitable relationships and empowering women, stating that Deception Detector is the latest innovation in ensuring genuine connections within their community. Trust, particularly in the era of AI, is considered paramount in creating a secure online dating environment.
Recent data from the Federal Trade Commission (FTC) reported that romance scams incurred losses of $1.3 billion in 2022, with a median reported loss of $4,400. While dating apps are commonly used by romance scammers, social media platforms’ direct messages remain a prevalent starting point, accounting for 40% of reported scam initiations.
Deception Detector follows Bumble’s earlier AI-driven safety feature, ‘Private Detector,’ introduced in 2019. The ‘Private Detector’ automatically blurs and labels nude images in chats, offering users the choice to view the image or report the user.
Bumble continues to leverage AI to enhance user safety, extending its application to Bumble For Friends, the platform’s dedicated app for finding friends. That app recently introduced AI-powered icebreaker suggestions to facilitate engaging conversations among users, further showcasing Bumble’s commitment to using technology for a safer and more authentic online experience.