These Latest Celebrity Deepfakes Show Just How Advanced Scamming Has Become

As long as the internet has existed, this advice has held true: “Don’t believe everything you see on the internet.” Whether it’s a personal blog, a tweet, a YouTube video, or a TikTok, anyone can say anything online, and it’s hard to know whether they’re right or wrong (or even whether they’re telling the truth).

But we’re at a tipping point in internet literacy: Generative AI has reached the scary point where it can imitate a celebrity’s likeness and make the “clone” say whatever its creator wants. For those in the know, these deepfakes may not be convincing yet, but what about the average social media user? It seems we are quickly approaching the point where the general public will start to believe these scam videos are real, and that is a scary thought.

There have been three high-profile examples in the past week or so alone. The first is Tom Hanks: the actor posted a screenshot on his Instagram of a video promoting a “dental treatment plan,” featuring a spokesperson who looked like Tom Hanks if he’d had his teeth reinstalled. The video itself isn’t publicly available, and Hanks hasn’t shared it or named the company behind it, but it appears someone made a deepfake of Tom Hanks to sell a dental product. Even from the screenshot, you can tell there’s something “off” about his face, and if we could see the video itself, it would likely have that uncanny valley look so many AI deepfakes share.

But Hanks was far from the only celebrity to face a deepfake problem this week. Gayle King, co-anchor of CBS Mornings, posted a deepfake to her Instagram account, saying people kept sending her the video and that she had nothing to do with it. This time, she shared the original clip the deepfake was based on: an innocent video of King discussing Bruce Springsteen’s appearance on her show:

Careful observers will be able to tell the first video is fake: the lip movements don’t match the audio, and while the voice resembles King’s, it sounds stilted, like a bad actor reading from a script. Those of us attuned to the current limitations of deepfakes and generative AI will notice the telltale signs immediately, but I’m not sure everyone will see it as a fake right away.

Third, internet sensation MrBeast is currently going viral thanks to deepfake ads shown to TikTok users, including me. In the ad, “MrBeast” congratulates the viewer on being one of 10,000 users chosen to win an iPhone 15 Pro for just two dollars. Lucky you!

The video in question is one of the best I’ve seen, though it has obvious flaws. Whoever made it took care to keep MrBeast expressive as he talks, making the whole interaction feel more natural. I think it could fool some people, but it isn’t 100% there: watch the video knowing it’s fake, and the flaws reveal themselves.

Even if it seems obvious to you that these examples are fake, remember what Marques Brownlee says: this is the worst this technology will ever be. Deepfakes will only continue to improve, heading toward the ultimate goal of being indistinguishable from the real thing.

If you’ve heard the AI song covers that have gone viral across the internet, you know how good this technology is getting. The voice actor behind SpongeBob’s Plankton should rightfully be concerned about how convincing these covers are, like Plankton’s cover of “Beggin’.” Frank Sinatra may have died when Dua Lipa was two, but this AI cover of him singing “Levitating” is a bop.

This technology can even translate your speech and dub it in your own voice in real time. Plenty of apps offer this feature, and even Spotify is testing it to translate podcasts in the hosts’ voices.

Currently, even the best deepfakes still have accuracy problems that give them away. But what happens when a bad actor can create a fake MrBeast ad that most people buy into? Imagine a truly convincing MrBeast telling young, impressionable fans directly, “All you have to do to enter my giveaway is enter your banking information so I can transfer your winnings directly to you.” Or perhaps the “contest” lives inside a “MrBeast” app that actually installs malware on your device.

Of course, there are more frightening scenarios to consider. We are approaching next year’s presidential election. How advanced will deepfake technology be by November 2024? Will anyone open TikTok before heading to the polls and watch a deepfake of Joe Biden declaring that his ultimate goal is to jail his political enemies? Or a deepfake of Donald Trump calling on his supporters to show up to the polls armed?

Be careful when watching videos online.

Social media companies need to be more proactive about taking down these fake videos before they spread, but we also have a role to play. We need to be careful, now more than ever, as we scroll and click around the vast internet. Just because you see a video of a “celebrity” saying something or endorsing a product doesn’t make it real; not anymore. Carefully check the account that posted it: if it’s supposedly a real person, the account should be verified (unless we’re talking about a platform like X, whose verification is useless; there, treat every post as fake until proven otherwise).

While we wait for deepfakes to get really good, there are still plenty of red flags that will tell you something isn’t legit. First, eye and mouth movements tend to look strange. Watch the MrBeast video: though its creators tried their best to make him expressive, his eyes are oddly blank in the first half. And while they did a decent job matching the lip movements, many deepfakes still can’t.

Many of these videos are also deliberately low quality, because higher resolution exposes how sloppy they are. A deepfake starts from a real video of a person, celebrity or not; the target’s face is then layered on top of that footage and manipulated to the creator’s liking. Doing this in high resolution without blending issues is difficult, so at higher quality you can see where the layers clip into each other.

A healthy dose of skepticism goes a long way on the Internet. Now that generative AI is taking over, turn up the skepticism as much as possible.
