16 September 2019
Actress Scarlett Johansson (inset), a victim of so-called deepfake videos, has lamented that protecting oneself ‘from the Internet and its depravity is basically a lost cause.’ Photos: Reuters

The danger of ‘deepfakes’

The 1987 film ‘The Running Man’ saw Arnold Schwarzenegger play an ex-cop who is framed for murder by means of doctored television footage.

In a dystopian world of food shortages and economic collapse (set roughly in today’s era), criminals are sent to their deaths in a maze-like game show, hunted by stalkers brandishing high-tech weaponry.

While very entertaining, it seemed so far-fetched at the time that one could not help but laugh at the spectacle. In 2019, however, the whole thing looks remarkably prescient, given the rise of ‘deepfakes’: AI-generated video and images that are hard to tell from the real thing.

A recent NVIDIA report highlighted how AI can generate fake faces that are scarily realistic. Kyle McDonald, an artist who works with code, has pointed out specific cues for spotting fakes, which include the following:

• Straight hair that looks like paint
• Background text that is indecipherable and overall surreal
• A noticeable amount of asymmetry
• Odd-looking teeth, noses and messy hair, along with unusual gender presentation
• Iridescent color bleed or rendering that looks like paint
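Some of these cues lend themselves to rough automated checks. The following is a minimal, hypothetical Python sketch of just one of them, the asymmetry cue: it scores left-right facial asymmetry from 2D landmark points. It assumes the landmarks have already been extracted by some face-landmark detector, and the coordinates and threshold below are invented purely for illustration.

import numpy as np

def asymmetry_score(left_pts, right_pts, midline_x):
    """Mean distance between left-side landmarks and the mirror image
    of their right-side counterparts across a vertical midline."""
    mirrored_right = right_pts.copy()
    mirrored_right[:, 0] = 2 * midline_x - mirrored_right[:, 0]
    return float(np.mean(np.linalg.norm(left_pts - mirrored_right, axis=1)))

# Illustrative (x, y) landmarks: an eye corner and a mouth corner on each side.
left = np.array([[40.0, 60.0], [45.0, 110.0]])
right = np.array([[82.0, 61.0], [78.0, 112.0]])

score = asymmetry_score(left, right, midline_x=60.0)
print("asymmetry score:", round(score, 2), "(suspicious)" if score > 5.0 else "(plausible)")

A high score would merely flag an image for closer inspection; practical detection tools combine many such signals.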

While experts can still spot such fakes, ordinary people who are not expecting this kind of forgery are easily taken in.

The problem now extends to video. Researchers at the University of Washington demonstrated this with a fake Barack Obama video produced using AI technology. They described their process:

“Trained on many hours of his weekly address footage, a recurrent neural network learns the mapping from raw audio features to mouth shapes. Given the mouth shape at each time instant, we synthesize high quality mouth texture, and composite it with proper 3D pose matching to change what he appears to be saying in a target video to match the input audio track.”

In other words, the algorithms learn patterns in the audio and video recordings of a person, allowing elements to be swapped. Images and videos of this kind are increasingly known as deepfakes, and they tend to escape the normal scrutiny of fact-checking.
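To give a concrete, if simplified, sense of what such a pipeline involves, here is a short hypothetical sketch in Python (PyTorch) of its core step: a recurrent network that maps per-frame audio features to a small set of mouth-shape parameters. The layer sizes, dummy data and single training step are placeholders for illustration, not the University of Washington team’s actual code.

import torch
import torch.nn as nn

class AudioToMouth(nn.Module):
    def __init__(self, audio_dim=28, hidden_dim=128, mouth_dim=18):
        super().__init__()
        # The LSTM reads a sequence of per-frame audio feature vectors...
        self.rnn = nn.LSTM(audio_dim, hidden_dim, batch_first=True)
        # ...and a linear head predicts mouth-shape parameters for each frame.
        self.head = nn.Linear(hidden_dim, mouth_dim)

    def forward(self, audio_seq):        # audio_seq: (batch, frames, audio_dim)
        hidden, _ = self.rnn(audio_seq)  # (batch, frames, hidden_dim)
        return self.head(hidden)         # (batch, frames, mouth_dim)

model = AudioToMouth()
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy batch: 4 clips, 100 frames each, paired with target mouth shapes
# that would normally come from tracked video of the same speaker.
audio = torch.randn(4, 100, 28)
target_mouths = torch.randn(4, 100, 18)

optimizer.zero_grad()
loss = loss_fn(model(audio), target_mouths)
loss.backward()
optimizer.step()

Once trained on many hours of footage, the predicted mouth shapes would drive the texture synthesis and compositing steps the researchers describe.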

The Washington Post reports that the practice has been extended to the creation of fake pornography used to blackmail ordinary people. A high-profile victim of this practice is actress Scarlett Johansson, who told the paper she worries that “it’s just a matter of time before any one person is targeted”.

Johansson’s face has reportedly been superimposed onto numerous graphic sex scenes, often packaged as “leaked” footage and viewed up to 1.5 million times on a popular pornography site.

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she reportedly said. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause. . . . The Internet is a vast wormhole of darkness that eats itself.”

– Contact us at [email protected]

RC

EJI contributor