Researcher Explains the Dangers of Deepfake Videos

We’ve all enjoyed Deepfake videos—videos that show people doing or saying things they never actually did or said. For instance, there’s a Deepfake video that substitutes Nicolas Cage into famous movies like The Matrix and Avengers: Endgame. But like so many things, while Deepfake videos have a light side, they also have a very dark side: they can be used for sinister purposes such as harassment and spreading false information.

WIRED’s Tom Simonite spoke with Sam Gregory, the program director for the human rights nonprofit WITNESS, about the dangers of Deepfake videos.

Deepfake videos are video and audio manipulations based on artificial intelligence. Most people think of them as face swaps, but the technology goes beyond that. It can make it appear that you said something you never said by manipulating how your lips move in a video, and it can even alter how you appear to move. While many of these videos may seem harmless, many others are causing real harm.

The vast majority of harmful Deepfakes are nonconsensual sexual imagery, often a celebrity’s face superimposed onto pornography.

We’re not quite at the point where just anyone can create a Deepfake video, but the barrier is falling. Code is available online, and video and audio manipulation techniques are steadily improving. As these videos become easier to make, more will start appearing, and not just for entertainment. Gregory notes that the political world is particularly vulnerable to fake videos swaying public opinion.

Growing concern about these videos is pushing platforms and researchers to develop ways of identifying which videos are Deepfakes and which are not. The harder problem is distinguishing malicious manipulations from innocent ones, since some videos truly are harmless and the line between the two is difficult to define. Gregory suggests that detection is a better tool than blocking all manipulated videos outright. What is perhaps most important, however, is learning which sources of information you can trust.

Deepfakes are only going to get better, so learning to spot one will become increasingly important. Awareness is the first step.

Check it out