Enter the creepy and unsettling world of deepfakes

This week's episode of 'The Orville' has a character creating a bogus video to fake a crime. But the truth is that fake videos are already here. And they are deeply disturbing.


This week's episode of The Orville has Lt Talla investigating a sinister fake video. Source: SBS VICELAND

Like many Internet technologies, 'deepfake' video has its roots in pornography. It started with canny online enthusiasts creating videos that inserted celebrities' faces onto the actors in porn films. But very quickly, the technology was applied to other things. 

In an era where 'fake news' is a buzzword, trust in news and information is at an all-time low. As technology has evolved and the Internet has crept further into our daily lives, we've been exposed to content from both trusted and illegitimate actors. Everyday media consumers need sharper skills to distinguish 'true' from 'false', but many of us have trouble keeping up. Now deepfakes are about to really mess with viewers' minds.

This week's episode of The Orville plays with the idea of faking a video. While it is presented as far-future technology, similar technology exists today and is rapidly improving with every faked video.

How does a deepfake work?

The scariest thing about deepfakes is how easily they can be created. With just a few lines of code readily found on GitHub, anyone can produce a deepfake using a machine learning system. The system learns the facial movements of one person, then maps them onto a different person's face in a video. 
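The common approach behind these tools is a pair of autoencoders that share one encoder: the encoder compresses any face into a pose-and-expression representation, and each identity gets its own decoder trained to rebuild that person's face from it. The swap happens when a frame of person A is encoded but decoded with person B's decoder. A minimal sketch of that data flow, with stand-in functions instead of real neural networks (all names here are illustrative, not from any particular tool):

```python
# Sketch of the shared-encoder / per-identity-decoder deepfake design.
# Each stage is a dummy stand-in so the data flow is visible; a real
# system trains deep convolutional networks for both stages.

def shared_encoder(face_pixels):
    # Compresses any face into a pose/expression representation.
    # (Stand-in: a single averaged value instead of a learned latent.)
    return {"expression": sum(face_pixels) / len(face_pixels)}

def make_decoder(identity):
    # Each identity gets its own decoder, trained to reconstruct
    # that person's face from the shared representation.
    def decoder(latent):
        return f"{identity} face with expression {latent['expression']:.2f}"
    return decoder

decode_a = make_decoder("person_A")
decode_b = make_decoder("person_B")

# The swap: encode a frame of person A, decode with person B's decoder,
# producing B's face wearing A's expression.
frame_of_a = [0.2, 0.4, 0.6]          # dummy pixel values
latent = shared_encoder(frame_of_a)
swapped = decode_b(latent)
print(swapped)                         # person_B's face, person_A's expression
```

The key design point is that the encoder never knows whose face it is looking at, so expressions transfer cleanly between the two decoders.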

A number of software tools exist to create deepfakes, including one called 'FakeApp', which was originally built using open-source software developed by Google.

Notable fakes

Comedian Jordan Peele recorded the following video for BuzzFeed, using his impersonation of former US President Barack Obama, as a demonstration of what can be done with deepfake technology.
And when Jennifer Lawrence's face was replaced with Steve Buscemi's, it gave birth to a nightmare none of us ever imagined.
In recent days, a well-known deepfake creator called 'derpfakes' published this video of the late Robin Williams (who voiced the Genie in the 1992 animated film Aladdin) performing in the upcoming live-action Aladdin film, replacing its actual star Will Smith:

Finding a solution

Video carries an implicit claim to truth. While most of us understand that video can be altered and manipulated through the way it is staged, framed and edited, it is still hard to shake the belief that what we are seeing has some element of truth. What we see on TV and online heavily shapes our understanding of the world. For example, it's often cited that US citizens became disillusioned with the Vietnam War because, for the first time, footage of the atrocities of war was beamed into US homes via television. Previously, all video seen by mainstream viewers was government-approved footage. 

The idea of losing that implicit faith in the video we see has spurred a number of developers into finding a solution. One promising tool is Amber, designed to run in the background on video recording devices. Amber periodically creates a cryptographic signature which is then recorded to a public blockchain. If someone creating a deepfake from the same video clip tries to use Amber, it will recognise the pre-existing cryptographic signature and reveal potential tampering.
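The underlying idea can be sketched in a few lines. This is an assumed workflow, not Amber's actual code: hash chunks of the video as it is recorded and publish the hashes; later, re-hash the file and compare, so any altered chunk stands out.

```python
# Hedged sketch of fingerprint-and-verify video integrity checking
# (illustrative only; Amber's real implementation is not public here).
import hashlib

def fingerprint(chunks):
    """Return a SHA-256 hash per chunk. In the real system these would
    be signed and written to a public blockchain at record time."""
    return [hashlib.sha256(c).hexdigest() for c in chunks]

def find_tampered(chunks, recorded_hashes):
    """Re-hash a (possibly edited) video and report mismatched chunks."""
    return [i for i, (c, h) in enumerate(zip(chunks, recorded_hashes))
            if hashlib.sha256(c).hexdigest() != h]

original = [b"frame-block-0", b"frame-block-1", b"frame-block-2"]
ledger = fingerprint(original)          # published when the video is shot

edited = [b"frame-block-0", b"FAKED!!!!!!!", b"frame-block-2"]
print(find_tampered(edited, ledger))    # -> [1]
```

Anchoring the hashes to a public blockchain matters because it timestamps them: a faker can hash their altered clip too, but they cannot backdate it to before the original was recorded.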

In an interview, Amber research consultant Josh Mitchell commented on the way it could be used to preserve the integrity of footage recorded from police bodycams: "The fact that there’s nothing protecting that evidence from a malicious party is worrying, and manufacturers don’t seem very motivated to do anything. So if we have a provable, demonstrable prototype, we can show that there are ways to ensure that all parties have faith in the video and how it was captured."

This isn't a permanent solution, as the technology will quickly improve to account for such tells, but for now a viewer can often spot a deepfake by watching how often the person in the video blinks. On average, people blink once every 2-10 seconds, but deepfakes rarely reproduce this because usable images of notable people with their eyes closed are rare.
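The blink test can be phrased as a simple heuristic. This is an illustrative simplification, not a production detector, and it assumes an earlier eye-tracking step has already labelled each frame as eyes open or closed:

```python
# Blink-rate heuristic: flag clips whose subject blinks implausibly
# rarely. Humans blink roughly every 2-10 seconds, so even a cautious
# threshold of ~3 blinks per minute catches the "never blinks" fakes.

def count_blinks(eyes_open):
    # A blink is an open -> closed transition between adjacent frames.
    return sum(1 for prev, cur in zip(eyes_open, eyes_open[1:])
               if prev and not cur)

def looks_suspicious(eyes_open, fps=30, min_blinks_per_min=3.0):
    minutes = len(eyes_open) / fps / 60
    rate = count_blinks(eyes_open) / minutes
    return rate < min_blinks_per_min

# 60 seconds of video in which the subject never blinks once:
no_blinks = [True] * (30 * 60)
print(looks_suspicious(no_blinks))   # -> True
```

As the article notes, this tell is already fading: once creators train on footage that includes closed eyes, blink rates normalise and the heuristic stops working.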

Listen to the ORVILLELAND podcast this week for a discussion about the show and about the ethics of deepfakes:
You can watch the episode of The Orville that kicked off this discussion now at SBS On Demand:

Published 16 February 2019 8:56am
By Dan Barrett
