Several stars from the Hallmark Channel recently issued a warning to fans, urging them to be cautious of scammers impersonating them on social media.
In a video shared on the Hallmark Channel’s Instagram page, Tyler Hynes, Jonathan Bennett, Tamera Mowry, BJ Britt and other Hallmark regulars alerted viewers to an “industrywide” problem on social media involving impersonation scams.
“Fake (social media) accounts are impersonating actors, reaching out to fans directly with misleading messages,” the group said. “It’s very important that you know. It’s important that you know we will never contact you personally to ask for financial help, donations or to meet up. If you receive a message like this, it’s a scam.”
The group urged fans who receive fraudulent messages from Hallmark impersonators to immediately block the scammer account and report it to the social media platform.
To close the message, the group encouraged viewers to stay safe and stay connected.
“We love our Hallmark family — and that means doing our part to raise awareness around scammers, some who are impersonating Hallmark stars. Learn how to spot the signs and protect yourself,” the caption of the post reads.
Hynes, who has starred in more than a dozen Hallmark movies, previously issued a warning against AI scammers asking for money and other favors through fake social media accounts.
“Please, beware of scammers,” he wrote in an August social media post. “Please, never send anyone money. Please, look after the vulnerable. Please, use this to show others that who you THINK you are talking to is not me or other actors.”
He added, “It is not just our community. It is everywhere. Awareness is everything. And AI is making [it] even less pleasant with voice, video and pictures being very realistic.”
Scammers take advantage with celebrity deepfakes
Artificial intelligence has empowered scammers to impersonate celebrities and other public figures by manipulating real images or audio to generate a deepfake version of that person.
Typically, scammers use an AI-generated likeness of a celebrity to ask for money, promote a product or endorse a cause. Brad Pitt, Taylor Swift, Tom Hanks, Jamie Lee Curtis, Scarlett Johansson, Johnny Depp, Gayle King and Elon Musk have all been recent targets of deepfakes.
Last year, a deepfake of Swift endorsing Le Creuset cookware products circulated on the internet. Months before, a deepfake of Hanks promoting “some dental plan” also made the rounds.
In January, Depp issued a warning that scammers were using AI to target his fanbase with requests for money or personal information.
“Sadly, it has been brought to my attention that online scammers are intensifying their efforts to target my fans and supporters,” Depp wrote in a social media post.
He added, “Today, AI can create the illusion of my face and voice. Scammers may look and sound just like the real me. But, neither I, nor my team, will ask you for money or your personal information.”
Also in January, a French woman who believed she was in a romantic relationship with Brad Pitt willingly gave more than 830,000 euros ($855,000) to scammers posing behind a deepfake version of the “Ocean’s 11” actor.
“It’s awful that scammers take advantage of fans’ strong connection with celebrities, but this is an important reminder to not respond to unsolicited online outreach, especially from actors who have no social media presence,” a representative for Pitt said in a statement, per The Hollywood Reporter.
With the rise of celebrity deepfakes, around 400 performers and former SAG-AFTRA president Fran Drescher signed on in support of legislation moving through Congress called the No Fakes Act, which aims to protect public figures’ image and voice from unauthorized deepfakes.
“The No Fakes Act is supported by the entire entertainment and media industry landscape, from studios, record labels, broadcasting companies and technology companies to unions, workers and artist advocacy groups,” SAG-AFTRA wrote in a statement. “It is a milestone achievement to bring all these groups together for the same urgent goal.”
The bill has not been passed.