After the rampant spread of misinformation during the 2016 presidential election, experts warned about a new threat for 2020: deepfakes.
Fake news stories can easily be fact-checked. But deepfakes? Experts feared the technology could become so advanced it would be difficult to tell a real video from a fake one.
Maybe you’ve seen those deepfake celebrity videos on YouTube, like the clip below. The creator used AI to digitally stitch Arnold Schwarzenegger’s face onto comedian Bill Hader. That’s achieved through deep learning — training a computer with lots of data until it learns to make a piece of manipulated content look real.
The celebrity interviews were fun. But there was a dark side to early deepfakes. Clips of female celebrities superimposed into pornographic movies were so widespread that Reddit was forced to ban them. And some women were the victims of nonconsensual deepfakes on social media.
The U.S. Department of Defense readied itself for a deluge of deepfakes spreading political misinformation. But the flood never came.
“We joke that we’ve never really seen a real deepfake in the wild,” said Alan Duke, editor-in-chief of LeadStories, an official Facebook fact-checking partner. “Everybody says it’s coming. We’ve been getting ready. We’ve been watching for it. But, we’ve not found one with any misinformation impact.”
“The thing I’m worried about is ‘cheapfakes,’” explained Duke.
“Cheapfakes,” as misinformation researchers call them, are manipulated media created with basic editing techniques. They can be made with photo and video software as simple as Windows Movie Maker.
Take this viral video of Joe Biden from the first presidential debate. All the editor did was zoom in on Biden’s shirt and claim a crease was actually a hidden wire, feeding the candidate information.
Deepfakes take time, resources, and skill to create. Anyone can make a cheapfake.
The Rock versus…Hillary Clinton
“BUDDY, YOU HAVE JUST LOST ME!” read one Facebook comment left on Dwayne “The Rock” Johnson’s Facebook page in September. “Now you are just The Pebble!”
The angry and disappointed reactions from Trump’s supporters came in quickly the day the pro wrestler-turned-Hollywood actor made an extremely rare political endorsement, publicly backing Joe Biden for president.
Multiple Trump supporters shared a video in the comments under Johnson’s Facebook post. Several of them asked how he could say those things about Hillary Clinton in 2016 and then back a Democrat in 2020.
But he rarely gets political, and certainly didn’t comment on Clinton in 2016. Clicking on the video explained everything.
It starts with The Rock in a 2013 appearance on WWE Raw. He’s sitting on a stool in the middle of the ring with a guitar in hand. Wrestling fans know this as the night he serenaded pro wrestling heel Vickie Guerrero with a cover of Eric Clapton’s “Wonderful Tonight.”
“You abuse all your power, waste everybody’s time,” sings The Rock in the more PG moments of the song. “B-iatch, you look horrible tonight.”
The clip posted to Johnson’s Facebook page is edited. Unlike the original, it never cuts to Guerrero’s distraught, over-the-top performance. Instead, it shows Hillary Clinton standing at a podium in 2016.
The video is clearly a parody, made to look like The Rock is singing to Hillary. Yet some Trump supporters used the video as evidence Johnson had turned his back on them.
“I wasn’t thinking people were going to think it was real,” said Jason, who only wanted to give his first name, and goes by the handle Skitz4Twenty on YouTube. “I was just like ‘Man, this is going to make people laugh. Cool!'”
He made the video in 2016. That year, he made a number of comedy videos by cutting up existing political footage, like Trump’s campaign ads. However, The Rock video is the one that went viral. He remembers excitedly calling his friends when it hit 15,000 views. Today, it has 3.5 million.
Back when it first came out, Jason said, wrestling fans reached out and gave him props for the funny edit. They knew the source material. However, once it entered the orbit of Trump supporters, things changed.
“Trump supporters believed [the video] was real,” he told me. “They were kind of getting riled up. At first I tried commenting on people’s comments. I was like ‘Dude, this isn’t real, like chill.’ And then I just gave up. It was too much.”
The cheapfake revolution
Cheapfakes were everywhere after the 2016 election. But there was a huge uptick this year.
“Doctored video views have grown twenty times since last year,” said JC Goldenstein, founder of CREOpoint, a brand protection firm that tracks disinformation.
He said manipulated media focusing on politics attracted 120 million views across the major social media platforms between September 2019 and October 2020. Overall, Goldenstein said, 60 percent of all deepfakes and cheapfakes over the last year were political.
LeadStories’ Alan Duke pointed out a recent piece of misinformation that has spread on Facebook. It’s a simple Photoshopped image from someone who used the warp tool to try to prove that Michelle Obama is actually a man. (Certain conservative corners of social media, like Facebook, really believe this.)
He also brought up the Nancy Pelosi clip that was slowed down to make her appear drunk, an editing tactic sometimes used by late-night comedy shows. Yet, after LeadStories and other organizations debunked it, Duke still received calls and emails from people arguing that it was real. Even President Trump shared it.
These were all cheapfakes, created with tools that require little to no skill. No deepfake technology necessary.
“Real deepfakes…that’s something we haven’t seen,” Duke told me. “The deepfakes we’ve seen have basically been experiments or demonstrations.”
CREOpoint’s Goldenstein provided me with access to the firm’s internal database, which tracks hundreds of pieces of manipulated media and the coverage they receive. Looking at the videos on the list, it seemed pretty clear: cheapfakes are being used to spread misinformation. Deepfakes are being used to show off what the technology can do.
Creating a deepfake
A few weeks ago, the major U.S. television networks refused to air a set of TV ads during the first presidential debate. They featured Russian President Vladimir Putin and North Korea’s Kim Jong-un discussing America’s eroding democracy. The videos are incredibly realistic works of fiction from nonprofit RepresentUS.
RepresentUS worked with deepfake expert John Lee to create the ads.
“In terms of how photo real it looks, you can achieve that with a deepfake, assuming you have a broad enough data set,” he explained.
Deepfake creators need to sift through thousands of images to find photos with the right quality and angles to provide their computer with information useful enough to make a convincing deepfake.
“I don’t think [the technology] is quite there yet because there’s still a lot of limitations and certain things just don’t work well,” explained Lee, who noted how matching mouth movements and voices can still be tricky. Profiles are tough too, which is why most deepfakes are front-facing.
“If you have the patience and the drive and the time to really sit through a few thousands of different images, you could get close to those types of results,” he continued. “It takes a long time and a lot of work to curate the data to the acceptable images that will make it work.”
Deepfakes, especially realistic-looking ones, are very labor-intensive. And they can be quite expensive, too. Lee uses free, open-source software to create his deepfakes. But he had to spend $5,000 on graphics cards that could handle the RepresentUS project.
People will believe anything
Deepfakes can be costly and time-consuming. Why make that investment when the cheaper option works just as well?
“I think that the most poisonous type of misinformation out there is the meme,” Duke said. “Anybody can create one in 30 seconds and it’ll go viral if they put it in the right place. It’s very effective in spreading misinformation.”
And you don’t even need to purposefully spread misinformation to create it.
You’ve probably seen Vic Berger’s videos. The video editor has worked with the popular comedy duo Tim & Eric, but came into his own during the 2016 election with his hilarious viral political video edits for the now-defunct humor site Super Deluxe.
“I’m not making Trump say things he really isn’t saying,” Berger explained. “I do try to have that responsibility.”
Most people who view Berger’s videos realize they’re satirical, he said. But Trump supporters still accuse him of spreading misinformation.
In 2016, Trump walked off stage after a debate without shaking the moderator’s hand. Berger thought it was a funny moment and created a video for Vine. Trump fans were upset that he didn’t include a later moment when Trump came back on stage to talk to the moderator.
“I just put a Price is Right sad trombone or whatever in there,” said Berger. “I just thought I was making dumb videos to make people laugh.”
And that’s what makes some of these “cheapfakes” so effective. Not only are they easy to make, but anything could be a cheapfake if the person watching thinks it’s real. It doesn’t matter if the intent was to misinform or make people laugh.
When I told Jason that his video featuring The Rock was being shared on Facebook, a platform he didn’t upload it to, four years after he created it, he was surprised. But it explained the uptick in new comments on his YouTube video.
“There were some people [in the comments] saying ‘You sold out, Hollywood did this to you,’” Jason explained. “I was like, ‘God, these idiots, what the hell?’”
I told him that one of those Facebook re-uploads had 11 million views, more than three times the original, and was being shared by Trump supporters on Johnson’s Facebook Page. Jason was floored.
“I didn’t know it was getting posted on The Rock’s page,” he said. “I think he’s a good guy. I’m not trying to have him come over here and kick my ass.”
Source: Mashable