You’ll Never Go Broke Correctly Estimating The Internet’s Desire For Deepfake Nudes
Cheap thrills? Perhaps not so much. A report from Joseph Cox for 404 Media not only points out there’s no such thing as a free lunch — or, in this case, a free AI-generated deepfake nude — but that this is exactly the sort of thing Vice Media would have been all over if it hadn’t been burnt to the ground by a string of executives whose only “talent” was steadily increasing their take-home pay.
It has sex, Russian crime, and — comically enough — a seemingly direct contradiction of assertions made by the US Department of Justice.
Anyway, the upshot is this: if you’re looking for a free AI nude generator, perhaps you should just stop looking, unless you like having your Bitcoin portfolio siphoned into someone else’s pockets while malware distributed by Russian criminals wanders around your computer or device looking for any other information that might be valuable.
Multiple sites which promise to use AI to ‘nudify’ any photos uploaded are actually designed to infect users with powerful credential stealing malware, according to new findings from a cybersecurity company which has analyzed the sites. The researchers also believe the sites are run by Fin7, a notorious Russian cybercrime group that has previously even set up fake penetration testing services to trick people into hacking real victims on their behalf.
Well, you get what you pay for, I guess. Free AI-generated nudity isn’t worth the asking price, not when cybercriminal groups are running the shop. But it’s exactly the sort of thing that’s always going to work because people looking for free software allegedly capable of “nudifying” any photo are the sort of people who aren’t really going to be doing a whole lot of upper brain thinking when initiating downloads.
That’s the expected outcome of setting up a digital honeypot promising nudity you won’t find elsewhere on the ‘Net. And it worked, according to the security researchers who looked into these sites. The cross-section of people willing to click through on questionable sites offering rare nudes also contains plenty of people who’ve plunked down a lot of real money to obtain funds that aren’t quite as tactile: cryptocurrency.
“The deepfake AI software may have an audience of mostly men with a decent amount [of income] who use other AI software or have crypto accounts,” Zach Edwards, senior threat analyst at cybersecurity firm Silent Push, told 404 Media in an online chat.
There’s a sucker logging on every minute. All Fin7 had to do was give them what they wanted. Which is, according to Zach Edwards, “the bleeding edge of creepy.” The victims get none of the nudes, and the cybercriminals get all the cryptocurrency and financial info they can carry away.
But while all of that is kind of funny and completely expected, the researchers digging into these sites were able to trace this back to Fin7, which means people shouldn’t believe everything that falls out of the DOJ’s mouth.
The news also shows that Fin7 is alive despite the U.S. Department of Justice saying last year that “Fin7 as an entity is no more.”
No criminal entity is ever completely dead. The DOJ should know this, because it has watched history repeat itself for decades. Crime-fighting is whack-a-mole. Claiming total victory is a move guaranteed to make you eat your words. Crime can be fought. It can’t be defeated. And whatever you don’t kill (which is all of it) will either go further underground or come back even stronger.
The only good news to report is that some of these sites (most of which seem to offer the same “nudifying” service and trace back to Fin7 operations) are no longer accessible. Most of the domains were hosted by Hostinger, which immediately blocked what it could once it was notified by 404 Media.
The usual rules apply and internet users ignore them at their own peril. If the offer is too good to be true, it’s because it isn’t. Expecting a free service to use AI to strip the clothes off any picture you happen to possess is just the horniest of wishful thinking. And agreeing to download anything from a site you know nothing about other than its vague promise of “nudifying” is just asking for international cybercriminal trouble. Keep it (and your digital wallet) in your pants, creeps.