
Since the 2016 US Presidential election, fake news has been a major talking point, playing a significant role in all aspects of our lives. Its increasing proliferation on the internet calls for greater vigilance and verification of sources; the alternative is visibly on display in many of our own families’ WhatsApp forwards. While the content itself is largely harmless gossip and scammy news designed to make Indians feel patriotic or to push a particular ideology, its impact is not to be taken lightly. Fake news channelling mob mentality through such means has already claimed several lives in India, often by exploiting communal, political, and other rifts.

If text weren’t bad enough, the age of digital manipulation brings with it a myriad of opportunities to produce fake news (or ‘false propaganda’, as some have termed it) in multimedia form: photos and videos, two highly influential forms of media consumption. Those who followed the news a few years back may recall the incident of a news channel doctoring video ‘evidence’ to claim that students from a particular university had turned a protest violent first, contrary to what the students themselves claimed.

Despite the level of realism some of these efforts have achieved, there remain ways to distinguish what’s real from what’s not. But what if that weren’t the case? That brings us to the focus of this piece: the impact of tools, made possible by advances in computer vision and AI, that produce stunningly realistic videos capturing events that never happened: the deep fake.

What is a Deep Fake?

A deep fake is a doctored video (most often with audio) that is highly realistic and usually hard to identify as fake without close inspection. Its popularity arose primarily from the source of innovation for all things video and audio – the pornography industry – where it was used to make fake videos featuring the faces of famous actresses. Despite being banned on major websites like Reddit, Twitter and PornHub, such videos remain easy to make and find.

They are named ‘deep’ after the deep learning algorithms that make them possible. Feed in real audio or video of a specific person (the more data, the better), and the software learns patterns in their speech and movement. A new element, such as someone else’s face or voice, is then composited into the video, and a deep fake is born. They are easier to make than they might seem at first. Breakthroughs from academic researchers working on this kind of machine learning have even reduced the amount of source video required to create one. A few freely available apps, such as FakeApp, can be used to create such fakes, albeit of low quality. In August 2018, researchers at CMU revealed an unsupervised algorithm that could accurately reproduce not only facial features but also weather patterns. At Stanford University, researchers have manipulated head rotation, eye gaze and eye blinking, producing computer-generated videos that are hard to distinguish from reality.
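To make the “learn patterns, then swap in a new face” step concrete, here is a heavily simplified sketch of the shared-encoder/two-decoder autoencoder idea commonly used for face swapping. This is an illustrative assumption, not any particular tool’s API: real systems use deep convolutional networks trained on thousands of face crops, while plain linear maps and made-up dimensions stand in for them here.

```python
import numpy as np

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64   # a flattened 64x64 grayscale face crop (illustrative)
LATENT_DIM = 128     # shared latent space for pose and expression

# One encoder is trained on crops of BOTH people, so the latent code
# captures pose and expression rather than identity.
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01

# Each person gets their own decoder, which learns to paint that
# person's identity back onto any latent pose/expression code.
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01

def encode(face):
    """Map a face crop to its shared latent code."""
    return W_enc @ face

def decode(latent, W_dec):
    """Reconstruct a face from a latent code with one person's decoder."""
    return W_dec @ latent

# Training (omitted here) would minimise reconstruction error so that
# decode(encode(face_a), W_dec_a) ≈ face_a, and likewise for person B.

# The swap itself: encode a frame of person A, but decode it with B's
# decoder, yielding B's face wearing A's pose and expression.
frame_of_a = rng.standard_normal(FACE_DIM)
swapped = decode(encode(frame_of_a), W_dec_b)
```

The key design choice is the shared encoder: because both people pass through the same latent space, swapping decoders transfers identity while preserving the motion learned from the source footage.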

So what’s the deal with Deep Fakes, really? Impacts considered. 

An observant reader will already have noticed several potentially far-reaching consequences of having such technology freely available. Below, we explore these impacts, categorised as economic, social, and political.

The Economic

In film-making, deep fakes could cut actor fees substantially: lesser-known actors could be cast and given a digital makeover. At least one recent instance of replicating an actor’s face is Rogue One: A Star Wars Story, which featured a digital recreation of an actor who had died years earlier. It is likely that more actors will go the way of the late Robin Williams¹, restricting the use of their likeness by legal means. In business and world affairs, related technology could break the language barrier on video conference calls by translating speech and simultaneously altering facial and mouth movements so that everyone appears to be speaking the same language.

The Social

The damage from today’s fake news pales in comparison to the harm that could come from deep fakes. Deep fakes can not only convince people of things that are not real but also undermine their confidence in what is. This would affect everything in our society, from the rule of law to how journalism is done. Fake videos could show public officials taking bribes. Videos of public officials uttering racial epithets could spark violent protests reminiscent of the ‘Bloody Sunday’ of March 7, 1965. Videos of officials engaging in adultery could have catastrophic consequences. Soldiers could be shown murdering innocent civilians in a war zone, precipitating waves of violence and even causing strategic harm to a war effort.

Fakes might depict emergency officials “announcing” an impending missile strike on Los Angeles or an emerging pandemic in New York, provoking panic and worse. A false video might convincingly depict U.S. officials privately “admitting” a plan to commit this or that outrage overseas, exquisitely timed to disrupt an important diplomatic initiative. And given a political climate in which outlandish statements are the norm, believing even odder, more outrageous claims backed by video “proof” isn’t as unlikely as we might like to think.

The Political

Source: www.iflscience.com

Politicians and other government officials could appear in locations where they never were, saying or doing horrific things that they never did. Fake videos could place them in meetings with spies or criminals, triggering public outrage, criminal investigations, or both. We still don’t know whether the “pee tape”² even exists, and yet there is so much buzz about it. Now imagine someone producing a clip that swaps another person’s face with Trump’s: it could further strain the already tense relations between the superpowers.

A fake audio clip might “reveal” criminal behaviour by a candidate on the eve of an election. We have already seen the influence of social media in the 2016 US Presidential election, with details of Hillary Clinton’s private email server brought before the public, and in the 2014 Indian Lok Sabha elections, in which an opposition party leader was turned into a meme to undermine his candidacy for Parliament. We can only contemplate the consequences of deep fakes in such a climate. A fake video might portray a Pakistani official doing or saying something so inflammatory as to cause riots in India, potentially disrupting diplomatic ties or even motivating a wave of violence.

Is it all bad and no good?

The ethics of developing such technologies are debatable. As discussed earlier, the technology has plenty of potential for legitimate use. We all enjoy the funny face-swap memes made with similar techniques, and dubbed movies could feel far more authentic than before. The thing to keep in mind about technology is that it cuts both ways.

“There is no option to make a device secure against bad guys and insecure against good guys”

Edward Snowden

Given the current state of the technology, it seems that before long deep fakes could be used to perpetrate fraud on a scale that invites global disaster. Peddled through social media, they could spread with unprecedented success, harming individuals, communities, or nations.

Whether or not to continue work on this is a call that the developers themselves must make. It’s a question that applies specifically here, but also generally to the wider community working on anything potentially disruptive.

Loki replaced by a deep fake of Nicolas Cage
Source: www.abc.net.au
Abhigyan Ghosh


  1. “Robin Williams Restricted Exploitation of His … .” The Hollywood Reporter, 30 Mar. 2015, https://www.hollywoodreporter.com/thr-esq/robin-williams-restricted-exploitation-his-785292. Accessed 19 Mar. 2019.
  2. The Cut, Apr. 2018, https://www.thecut.com/2018/04/donald-trump-pee-tape.html.
Categories: Perspectives