It’s been nearly 10 years since swiping left became the gesture of choice for millions of daters with the advent of Tinder. Since that app first came on the scene Sept. 12, 2012, it seems every single application has tried to suss out the best way for users to burn through potential dates at a pace that can set fire to the most well-intentioned thumbs.
A decade on, what so many of these apps still struggle with is dealing with bot, spam, and scam accounts. Gizmodo has previously reported on the thousands of people who told the FTC they were scammed through Tinder during the pandemic. Innocent folks looking for love reported being scammed out of tens of thousands of dollars, convinced to hand over credit card info, or even threatened when they refused to pay. It took Tinder until last year to introduce ID verification to most of the app’s global user base, and it remains voluntary for the majority of users. The idea of being deceived on a dating app even got a worldwide premiere through the popular Netflix documentary The Tinder Swindler.
Bots and scammers are rife on dating apps. A 2017 study published by researchers at the University of Southern California pointed out that it’s especially difficult to determine whether a user is a bot, since there are few ways to actually vet a profile without interacting with it. These accounts often seem more legit than not, with original pictures and other social media accounts. Scammers are even more difficult, since even when you knock off one predatory account, they can easily come back onto the platform with an entirely different identity.
Well, one dating app had a novel approach to dealing with scammers and bots on its platform—turning them against each other. In an August blog post, Brian Weinreich, co-founder of video-centric dating startup Filter Off, said that when a suspected scammer first signs up for the site, they’re put in a so-called “Dark Dating Pool” away from other users. The dev said his small team flooded the pool with GPT-3-based chatbots and collected the most hilarious examples of scammers attempting to scam a being with no compassion or love (sorry, but AI simply isn’t there yet).
Weinreich wrote that all chats are encrypted, and that the team “errs on the side of caution” when putting users in the dark dating pool, which could mean some potential scammers slip through. In a Wednesday interview with TechCrunch, Weinreich said they used algorithms that sort accounts based on how scam users most often sign up for the app. Funnily enough, these scammers will apparently try to scam each other, arguing back and forth over who should send a $40 gift card.
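Filter Off hasn’t published its detection code, but the idea described above—score a new signup against patterns common to scam accounts, and quietly route high scorers into a pool where every “match” is a chatbot—can be sketched in a few lines. Everything below is illustrative: the heuristics, thresholds, domain list, and function names are all assumptions, not the app’s actual logic.

```python
# Illustrative sketch only — NOT Filter Off's actual detection code.
# Routes suspected scam signups into a "dark pool" where they only
# ever match with chatbots instead of real users.
from dataclasses import dataclass

@dataclass
class Signup:
    email: str
    bio: str
    signup_seconds: float  # time taken to complete the signup flow

# Hypothetical signals, loosely inspired by the article's mention of
# sorting accounts by how scammers most often sign up: throwaway email
# domains, template bios, and suspiciously fast form completion.
SUSPICIOUS_DOMAINS = {"tempmail.test", "throwaway.example"}
TEMPLATE_PHRASES = ("god-fearing", "gift card")

def scam_score(s: Signup) -> int:
    score = 0
    if s.email.split("@")[-1].lower() in SUSPICIOUS_DOMAINS:
        score += 2
    if any(phrase in s.bio.lower() for phrase in TEMPLATE_PHRASES):
        score += 2
    if s.signup_seconds < 10:  # whole form filled in under 10 seconds
        score += 1
    return score

def route(s: Signup, threshold: int = 3) -> str:
    # A high threshold "errs on the side of caution": borderline accounts
    # stay in the main pool, so some scammers will slip through.
    return "dark_pool" if scam_score(s) >= threshold else "main_pool"
```

The key design point is that routing is invisible to the scammer: from inside the dark pool the app looks and works normally, which is what keeps them chatting with bots for weeks instead of creating a fresh account.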
“We have probably over 1,000 scammers that I know of that are actively talking to just bots,” Weinreich told TechCrunch.
Though Gizmodo couldn’t independently verify much of what the developer is claiming, reading through these posted chat logs between bot and scammer is like watching the Aliens vs. Predator of the scummy dating scene. The bots are so chock-full of canned autoresponses that even a basic question produces repetitive or straight-up contradictory replies. How does a bot respond to “Are you on WhatsApp?” Well, it first says “no,” then “no,” then “no,” and finally “yes.”
Here are a few of the best snippets we found from when a scummy scammer met their match against the most obstinate opponent imaginable: a derelict AI bot.