Deepfakes trick your brain – just like text

Deepfake technology can swap one person’s face onto another’s in a video, making it appear that the first person did something they never did. New research, published July 6 in the open-access journal PLOS ONE, shows that such doctored recordings can plant false memories. In the study, participants watched Will Smith star as Neo in The Matrix, Brad Pitt and Angelina Jolie portray Jack and Wendy Torrance in a 2012 remake of The Shining, and Chris Pratt take the lead in a 2013 version of Indiana Jones.

These movies don’t exist, yet half of those who viewed the altered clips remembered them as real films. That’s a significant proportion, and it will come as no surprise to anyone suspicious of the technology. But it turns out deepfakes are actually no more misleading than simple text descriptions.

The potential harm of deepfakes ignited debates as soon as the first videos appeared on Reddit in 2017. Concerns about our ability to tell what’s real from what’s not spread fast. And for good reason. YouTube channel Diep Nep demonstrated the point neatly with a clip called ‘This is not Morgan Freeman’ – starring Morgan Freeman, without Morgan Freeman.

This is not Morgan Freeman by Diep Nep on YouTube

That video took a lot of skill to make – including impressive voice acting. But that was two years ago. Today, thanks to progress in AI, anyone can create deepfakes with just a phone and an app. Reface, a popular and easy-to-use option, has been downloaded from Google Play over 100 million times and holds a 4.8 out of 5 star rating on Apple’s App Store. Users happily swap their faces into music videos, movies, and celebrity clips for entertainment. And if you want to change your voice, Speechify can help. It’s all great fun.

Until it isn’t. Things turn quickly when your face appears in a video without your consent. It’s even worse when that video is pornography. That happened to Harry Potter star Emma Watson and reality TV contestants Zara McDermott and Georgia Harrison. But it happens to regular people, too. Deepfake creators can be motivated by money or by revenge after a break-up, and the damage to victims is profound. Still, the problem isn’t just one of technology.

The authors of this new study, led by Gillian Murphy at University College Cork in Ireland, explain, “While deepfakes are of great concern for many reasons, such as non-consensual pornography and bullying, the current study suggests they are not uniquely powerful at distorting our memories of the past. Though deepfakes caused people to form false memories at quite high rates in this study, we achieved the same effects using simple text. In essence, this study shows we don’t need technical advances to distort memory, we can do it very easily and effectively using non-technical means.”


The research team used four genuine movie remakes (Charlie and the Chocolate Factory, Total Recall, Carrie, and Tomb Raider) and four fabricated ones (The Shining, The Matrix, Indiana Jones, and Captain Marvel). The 436 participants initially believed they were taking part in an online survey about opinions on recent movie remakes. Each saw six films – four real and two fake. Three of the movies were presented as video clips accompanied by text descriptions; the others as text descriptions alone. Participants indicated whether they had seen the original film and the particular remake. They were then told that some of the videos and descriptions may have been manipulated. With this new information in hand, they rated each film on a scale from 0 (definitely not real) to 100 (definitely real).

Three in four participants (75%) recalled the fictitious Captain Marvel deepfake as real. However, those who only read the text description remembered the film at nearly the same rate (70%). On top of that, 4 in 10 actually thought the remake was better than the original.

Fig 1: Participant memories for the fictitious movies (image by researchers)

The other fakes created fewer false memories. Still, the doctored clips and the text descriptions had the same effect. One takeaway could be that we overestimate how much deepfakes can distort our memories. The researchers suggest, however, that we instead underestimate how easily memory can be influenced even without technological manipulation.

Deepfakes may not be as powerfully misleading as suspected, but the technology still raises interesting social and cultural questions. Asked about the possibility of replacing actors, participants liked the idea of swapping out actors they don’t like or having their favourite star in every movie. Most, however, volunteered drawbacks. Would actors be treated and paid fairly? What about consent from the actor or the creative director? Doesn’t that count? And would diversity suffer?

Also, customising films could damage the shared social experience. Watching and talking about movies creates cultural ties, and personalisation erodes that connection. Not to mention, too much choice is bad. Figuring out what to watch on Netflix can be daunting already, and selecting actors as well could be cumbersome. One participant summarised, “I like movies to escape. Not to select variables.”


Deepfake technology has risks and opportunities, and human nature demands we focus on the former when faced with change. To add some perspective, the authors remind us that we once feared ‘that the written word would destroy human memory and that lightbulbs may cause blindness’. Yet here we are.

As we begin to understand how deepfakes impact people and society, we can think more freely about positive applications. The authors highlight how David Beckham spoke nine languages in a campaign against malaria, and how LGBT individuals in Russia talked openly about persecution while keeping their identities hidden. The technology also helps people whose voices are affected by ALS sound more like themselves again. And it lets us witness events that should have happened but didn’t – like the speech John F. Kennedy was due to give on the day of his assassination.

JFK Unsilenced by The Times

Source: PLOS ONE via EurekAlert!


For more about new developments in AI, click here.

