Hyderabad: Brace for the menace of deepfakes in 2020
Hyderabad: Come 2020, ‘deepfakes’ could become the country's next biggest fake news problem. The term refers to audio or video clips that impersonate real people: eerily lifelike fabrications generated using deep-learning artificial intelligence (AI) algorithms.
There have already been many (in)famous examples of deepfakes around the world since 2017. The best known was Buzzfeed’s deepfake of former US president Barack Obama. Film director Jordan Peele’s voice and facial movements were superimposed on Obama’s face, making him appear to say “President Trump is a total and complete dips**t”. While Buzzfeed posted the video as a public service announcement against fake news, it made one thing clear: deepfakes had arrived and were here to stay.
Since then, deepfake experiments have been conducted on many prominent figures, from Facebook CEO Mark Zuckerberg to Prime Minister Narendra Modi.
The technology behind deepfakes is complex and requires enormous computing power. AI algorithms are trained on large data sets of images and recordings of a particular person in order to create a convincing fake.
Public figures such as politicians and movie stars are hence especially at risk of being ‘deepfaked’, since videos and pictures of them are easy to find.
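To see why so much data and computing power is needed, consider the architecture commonly used for face-swap deepfakes: a single shared encoder trained alongside two decoders, one per person. The sketch below is purely illustrative; the "networks" are stand-in random linear maps, and all names (`shared_encoder`, `decoder_b`, `swap_face`) are hypothetical. A real system uses deep convolutional networks trained on thousands of frames.

```python
import random

DIM = 8  # toy feature size standing in for a real image representation

def make_layer(seed):
    """Return a toy 'network': a fixed random linear map over DIM floats."""
    rng = random.Random(seed)
    weights = [[rng.uniform(-1, 1) for _ in range(DIM)] for _ in range(DIM)]
    def apply(vec):
        return [sum(w * x for w, x in zip(row, vec)) for row in weights]
    return apply

shared_encoder = make_layer(seed=0)  # learns features common to both faces
decoder_a = make_layer(seed=1)       # trained to reconstruct person A's face
decoder_b = make_layer(seed=2)       # trained to reconstruct person B's face

def swap_face(frame_of_a):
    """The deepfake trick: encode person A's frame, decode with B's decoder."""
    latent = shared_encoder(frame_of_a)
    return decoder_b(latent)

frame = [0.5] * DIM               # stand-in for one video frame of person A
fake_frame_of_b = swap_face(frame)
print(len(fake_frame_of_b))       # → 8
```

The key point the sketch conveys is that the swap happens at inference time, which is why a large archive of a target's photos and videos, easily available for celebrities and politicians, is the main prerequisite.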
One of the most prominent cases of deepfakes has been in pornography. Websites such as “mrdeepfakes.com”, “adultdeepfakes.com” and “porndeepfakes.com” currently have deepfakes of popular actresses like Priyanka Chopra and Deepika Padukone.
Deepfake porn is used not just for titillation; such clips have also been deployed to discredit people.
A recent target was journalist Rana Ayyub, who was attacked with such a clip after she campaigned for justice for the Kathua rape victim in Jammu and Kashmir.
A deepfake of Ayyub was circulated widely across WhatsApp and even Twitter. She said the video snowballed after it was shared by a BJP leader. Ayyub has since stopped posting regularly on Facebook.
It will take some time for deepfakes to actually become a problem in India, experts said. Jency Jacob, managing editor of the fact-checking website BoomLive, said, “I don’t think deepfakes are that big a problem yet. They might not be a big challenge right now, but we can’t say they won’t be in the future. Deepfake technology has not been mastered yet.” The current challenges, Jacob said, are morphed images and videos shared out of context.
“However, we are indeed seeing ‘cheapfakes’,” he added. The term refers to videos manipulated using rudimentary editing techniques rather than AI.
For instance, one video was slowed down to make Delhi Chief Minister Arvind Kejriwal appear inebriated. Another technique is to crop or re-caption a video, such as CCTV footage. One CCTV clip of children being kidnapped in Karachi was edited in such a way as to fuel rumours of child kidnapping in India.
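The slowed-down video is the simplest cheapfake of all: duplicating every frame halves the apparent speed, which can make speech sound slurred. The toy sketch below (function and variable names are illustrative; real manipulation is done in an ordinary video editor) shows just the frame arithmetic involved.

```python
def slow_down(frames, factor=2):
    """Repeat each frame `factor` times, stretching the playback duration."""
    return [frame for frame in frames for _ in range(factor)]

clip = ["f1", "f2", "f3"]  # stand-in for decoded video frames
slowed = slow_down(clip)
print(slowed)  # → ['f1', 'f1', 'f2', 'f2', 'f3', 'f3']
```

No AI, no training data, and no computing power to speak of, which is why fact-checkers expect cheapfakes to remain the more common threat for now.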