Beware the AI celebrity clones peddling bogus ‘free money’ on YouTube
Online scammers are using AI voice-cloning technology to make it appear as if celebrities like Steve Harvey and Taylor Swift are encouraging fans to fall for medical benefit scams on YouTube. 404 Media first reported on the trend this week. These are just some of the latest examples of scammers harnessing increasingly accessible generative AI tools to impersonate famous people for quick financial gain, often targeting economically vulnerable communities.
404 Media was contacted by a tipster who pointed the publication toward more than 1,600 videos on YouTube in which deepfaked celebrity voices, as well as non-celebrities, push the scams. Those videos, many of which remained active at the time of writing, reportedly amassed 195 million views. The videos appear to violate several of YouTube’s policies, particularly those covering misrepresentation as well as spam and deceptive practices.
YouTube did not immediately respond to PopSci’s request for comment. The scammers try to trick viewers by pairing chopped-up clips of celebrities with voiceovers created by AI tools mimicking the celebrities’ own voices. Steve Harvey, Oprah, Taylor Swift, podcaster Joe Rogan, and comedian Kevin Hart all have deepfaked versions of their voices appearing to promote the scam.
Some of the videos don’t use celebrity deepfakes at all, instead relying on a recurring cast of real humans pitching variations of a similar story. The videos are often posted by YouTube accounts with misleading names like “USReliefGuide,” “ReliefConnection,” and “Health Market Navigators.” “I’ve been telling you guys for months to claim this $6,400,” says a deepfake clone impersonating Family Feud host Steve Harvey.
“Anyone can get this even if you don’t have a job!” That video alone, which was still on YouTube at the time of writing, had racked up over 18 million views. Though the exact wording varies by video, the scams generally follow a basic template. First, the deepfaked celebrity or actor addresses the audience, alerting them to a $6,400 end-of-year holiday stimulus check provided by the US government via a “health spending card.” The celebrity voice then claims anyone can apply for the stimulus so long as they are not already enrolled in Medicare or Medicaid.
Viewers are then usually instructed to click a link to apply for the benefits. Like many effective scams, the videos also introduce a sense of urgency, trying to convince viewers the bogus deal won’t last long. In reality, victims who click through are often redirected to sites with names like “secretsavingsusa.com” that are not actually affiliated with the US government.
Reporters at PolitiFact called a signup number listed on one of those sites and spoke with an “unidentified agent” who asked them for their income, tax filing status, and birth date: all sensitive personal data that could be used to commit identity fraud. In some cases, the scammers reportedly ask for credit card numbers as well.
The scam appears to use confusion over real government health tax credits as a hook to reel in victims. Numerous government programs and subsidies do exist to assist people in need, but generic offers of “free money” from the US government are generally a red flag. The falling cost of generative AI tools capable of producing passable mimics of celebrities’ voices makes these scams all the more convincing.
The Federal Trade Commission (FTC) warned of this possibility in a blog post last year, citing examples of fraudsters using deepfakes and voice clones to engage in extortion and financial fraud, among other illegal activities. A study published in PLOS One last year found that deepfake audio can already fool human listeners nearly 25% of the time.
The FTC declined to comment on this recent string of celebrity deepfake scams. This isn’t the first case of deepfake celebrity scams, and it almost certainly won’t be the last. Hollywood legend Tom Hanks recently apologized to his fans on Instagram after a deepfake clone of himself was spotted promoting a dental plan scam.
Not long after that, CBS anchor Gayle King said scammers were using similar deepfake methods to make it seem as if she were endorsing a weight loss product. More recently, scammers reportedly combined an AI clone of pop star Taylor Swift’s voice with real images of her using Le Creuset cookware to convince viewers to sign up for a kitchenware giveaway.
Fans never received the shiny pots and pans. Lawmakers are scrambling to draft new laws or clarify existing legislation to address the growing issue. Several proposed bills, like the Deepfakes Accountability Act and the No Fakes Act, would give individuals more power to control digital representations of their likeness.
Just this week, a bipartisan group of five House lawmakers introduced the No AI FRAUD Act, which attempts to lay out a federal framework protecting individuals’ rights to their digital likenesses, with an emphasis on artists and performers. Still, it’s unclear how likely any of these are to pass amid a flurry of new, quickly devised AI legislation entering Congress.