
Scammers Are Using the Face of the CEO of HeyGen, an AI Company That Lets You Create a ‘Digital Twin,’ in Crypto Phishing Video

As of Thursday, the video of Joshua Xu, HeyGen co-founder and CEO, had more than 209,000 views.


Jody Serrano

Editor in Chief at Xataka On. Before joining Webedia, I was a tech reporter at Gizmodo and The Messenger. In recent years, I've been especially interested in Twitch, streamers, and Internet culture.

The CEO of HeyGen, an AI video company that lets its users create their own “digital twins” to send off to multiple Zoom meetings, interviews, and coaching sessions, is the deepfake star of a recent crypto scam video on YouTube with more than 209,000 views.

The "teacher" named “Chao.” Spotted by BoingBoing, the video appears to show HeyGen chief Joshua Xu, a former Snap engineer who co-founded the AI video company, who introduces himself as a web3 developer named Chao. The fake Xu goes on to say that he’s going to show users how to create an Ethereum sniping bot using ChatGPT.

“It already brings me and my students passive income every day, which is amazing,” the scammer says.

The video's description contains links that appear to be part of a phishing scam (one of them looks like a legitimate site until you click on it and see that the URL is wrong). While the video was spotted by BoingBoing this week, it was posted three weeks ago, on Sept. 27. The YouTube channel has 10,000 subscribers but fewer than 3,000 views across the five videos it's published, all of which are about crypto and digital assets.

The tells. Even if you don’t look at this video under a magnifying glass like I did, the proof that it’s an AI scam is pretty obvious. First off, Chao claims he’s an AI developer who shared this video on his private channel but has decided to share it more broadly to “gain support.”

Second, the comments look fishy too. The majority are overly positive and praise the video’s creator, and while there are some questions, none of them get legitimate answers.

Finally, let’s focus on the image and the voice. The video plainly rips off the clip on HeyGen's website in which Xu explains the company's products, which include video translation and crazy-realistic AI avatars. (Notably, that video of Xu is actually his avatar, created using HeyGen's tech.) His clothes and even the background are the same. As for the voice, it’s clearly robotic. As a bonus, the thumbnail the scammer uses for the video is literally copied and pasted from HeyGen's website.

The scammers didn't even bother to erase from their screenshot the mic Xu wears in the video showing off HeyGen's various offerings.
A screenshot of the real Joshua Xu from a video on HeyGen's homepage.

The rise of celebrity deepfakes. This obviously isn’t the first time someone has created a deepfake of someone influential or famous. Just last week, tech YouTuber Marques Brownlee called out a company for using an AI copy of his voice to sell their product.

“It's happening. There are real companies who will just use an AI-created rip of my voice to promote their product,” Brownlee said in a post on Threads. “And there's really no repercussions for it other than being known as this scummy shady company that is willing to stoop that low to sell some product.”

Former President Donald Trump shared deepfake images of Taylor Swift and her fans endorsing him in August, writing “I accept!” alongside them. Swift went on to endorse Vice President Kamala Harris, specifically stating that she was motivated to speak out after seeing the AI images of her falsely supporting Trump.

HeyGen response. I reached out to HeyGen and Google for comment on the deepfake video of Xu but didn’t immediately receive a response. It’s also not clear whether the video was created using HeyGen’s tech, although it would be a violation of its terms and acceptable use policy.

“We also believe that AI generated avatars and videos depicting these avatars should only be created or shared with expressed permission from those individuals,” the company says on its website. “In cases where we find out an individual's image, likeness or voice is used without their permission, we will remove the relevant content and take appropriate action against the user that engaged in the unauthorized use.”

Back in June, HeyGen raised $60 million, which increased its valuation to $500 million. One of the company’s biggest clients in recent months has been McDonald’s. To promote its Grandma McFlurry, the fast-food chain used HeyGen’s tech to create an intergenerational ad campaign for families who speak different languages. Customers were asked to record video messages that HeyGen then translated into their grandma’s own language while maintaining the speaker's voice and likeness.

Cutting through the noise. With the rise of AI, this won’t be the last deepfake we see of someone influential or famous. However, it shines a light on a problem regular people face in a world where these tools are common: not everyone has the power to strip a fake of its legitimacy.

While people like Brownlee, Swift, and countless other celebrities can push back against what the fake copies of them say or do, it’s not so easy for everyone else. There’s no doubt that deepfakes made without consent are terrible (if you want to send your digital twin off to multiple Zoom meetings, that’s on you), but being able to do little or nothing about them is just as bad.

Image | HeyGen

Related | Amazon, Google, and Microsoft Are Investing in Nuclear Power to Secure the Future of AI. It’s Not Going to Be Easy
