
Deepfakes and the Erosion of Trust: A Societal Dilemma?


As the world continues to embrace the seemingly limitless possibilities of technology, a new digital-age phenomenon has emerged, blurring the line between what is factual and what is fake like never before. Welcome to the era of deepfakes: a fascinating journey into a technological world where artificial intelligence (AI) breathes life into deception, altering the very definition of authenticity.

Have you seen the widely circulated photos of President Akufo-Addo seated on a bed in a private jet alongside Serwaa Broni? What about the viral image of a tarred road cutting through a closed-canopy mountain forest, reported online as having been taken in Adaklu in the Volta Region of Ghana? If your answer is “yes”, then you have witnessed a deepfake.

Simply explained, a deepfake is an image or recording (video or audio) that has been convincingly altered or manipulated to misrepresent someone as doing or saying something that was not actually done or said. For Cheres & Groza (2023), the term refers to the use of artificial intelligence (AI) to create fake videos, audio, or text, and it can be used for good or bad purposes.

Unlike in the past, when deepfakes were difficult to create, the advent of AI has escalated the technology’s sophistication, making it far easier to create images, audio, and videos that appear real.

From whichever angle it is viewed, deepfake technology has become a common vehicle for comic content (such as memes, satire, and parodies), fake news, political propaganda, child or celebrity sexual material, revenge porn, hoaxes, financial fraud, and much more. In politics, for instance, alleged videos of politicians and other government officials taking bribes or holding “secret” conversations pop up from time to time.

The proliferation of deepfakes shows how easy they have become to create and spread: all it takes is an app like HeyGen, FaceApp, or Zao and a reliable internet connection.

Many experts believe that deepfakes are here to stay, and that the technology brings a plethora of positives as well as negatives.

On the positive side, the technology gives educators the opportunity to deliver information in more compelling ways than traditional means such as reading and lectures. Educators can now show recreated videos of historical speeches or scientific discoveries to help learners understand historical and scientific concepts in greater detail.

In the production space, deepfake technology allows for relatively cheap and accessible video production. A classic example is a movie scene in which the actors appear to be discussing a particular topic when, in reality, the original footage had nothing to do with it.

On the negative side, like any other technology, deepfakes can be used to cause a broad spectrum of serious harms, many of which are amplified by networked information systems. In recent times, deepfakes have emerged as a powerful tool to exploit and sabotage individuals and organizations: voices are doctored and lies are told in an attempt to gain financial or other benefits.

Beyond the individual level, deepfakes can harm society in a variety of ways. A fake video of a politician taking a bribe, or a doctored tape of election officials planning to manipulate an election, can distort democratic discourse in a country. Such cases show how deepfakes can be used to deepen social divisions, which often result in violence.

As the capacity to produce deepfakes continues to proliferate, individuals and organizations will increasingly face a dilemma: how do we validate the videos or audio we receive, and what can be done to ameliorate these harms?

According to experts, Ghana does not have the required technical infrastructure (research, technology, AI and robotics) or physical infrastructure (organizations, security experts, and surveillance cameras) to combat deepfakes; hence, our focus should be on prevention rather than cure.

On the detection side, there are specialized tools and software that can be used to curb the phenomenon. Reverse image search, for instance, can be used to trace the origin of an image or video frame and determine whether it is fake.
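To illustrate the idea behind such checks, the sketch below compares a suspect image against a known original using a perceptual hash, which can match a picture even after it has been resized or re-encoded. It is a minimal example rather than a full reverse image search: the file names are placeholders, the distance threshold is an assumption, and it relies on the Python Pillow and ImageHash libraries.

```python
# Minimal sketch: compare a suspect image with a known original using a
# perceptual hash. A small Hamming distance suggests the two files are
# versions of the same picture; a large distance suggests a different or
# heavily altered image. File names and the threshold are placeholders.
from PIL import Image
import imagehash

original_hash = imagehash.phash(Image.open("original_photo.jpg"))
suspect_hash = imagehash.phash(Image.open("suspect_photo.jpg"))

distance = original_hash - suspect_hash  # Hamming distance between hashes
print(f"Hash distance: {distance}")

if distance <= 10:  # assumed threshold; tune for the images at hand
    print("Likely the same underlying photo (possibly re-encoded or resized).")
else:
    print("Substantially different; further verification is needed.")
```

A commercial reverse image search service works at web scale, but the underlying principle of fingerprinting an image and comparing it against known material is similar.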

Additionally, forensic analysis software can be used to examine audio, images, and videos. Such software allows one to test for frequency, speed, and facial landmarks, as well as trace the possible location, owner, and origin of the material in question.
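As a simple first-pass illustration of such provenance checks, the sketch below reads the EXIF metadata embedded in an image file, which may reveal the camera model, capture time, GPS coordinates, or editing software used. The file name is a placeholder and the example assumes the Python Pillow library; metadata can be stripped or forged, so what it shows (or fails to show) is only a hint, not proof.

```python
# Minimal sketch: print an image's EXIF metadata (camera, timestamp,
# software) as a first-pass clue to its origin. Metadata can be missing
# or forged, so treat the output as a hint, not proof. File name is a placeholder.
from PIL import Image
from PIL.ExifTags import TAGS

image = Image.open("suspect_photo.jpg")
exif_data = image.getexif()

if not exif_data:
    print("No EXIF metadata found (it may have been stripped).")
else:
    for tag_id, value in exif_data.items():
        tag_name = TAGS.get(tag_id, tag_id)  # map numeric tag IDs to readable names
        print(f"{tag_name}: {value}")
```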

Another way to fight deepfakes is to ensure strict enforcement of Section 208 of the Criminal Code of Ghana. The law explicitly provides that “any person who publishes or reproduces any statement, rumor, or report which is likely to cause fear and alarm to the public or to disturb the public peace, knowing or having reason to believe that the statement, rumor, or report is false, is guilty of a misdemeanor.”

Efforts to promote media and information literacy (MIL) among Ghanaians must also be strengthened so that individuals can create and engage meaningfully with media content and become critical thinkers and consumers of news. The government can partner with CSOs such as Penplusbytes, currently a pacesetter in building capacity around MIL, and with educational institutions such as the University of Media, Arts and Communication (GIJ and NAFTI) and the Department of Communication Studies of the University of Ghana to equip Ghanaians with media and information literacy skills.

In conclusion, deepfake technology undoubtedly has the potential to alter many parts of society, bringing with it complex and multifaceted challenges. To reap its benefits while protecting society, Ghana must adopt proactive strategies that combine legislative and technological measures to confront this emerging threat in the ever-evolving digital ecosystem.

Let us keep in mind that AI is only a tool in the hands of humans. The way we use it will determine whether it can improve humanity’s lives or act as an extremely dangerous weapon.

This article is part of Penplusbytes’ (www.penplusbytes.org) informational series ahead of Global Media and Information Literacy (MIL) Week 2023. You can reach us at info@penplusbytes.org.