The Reality of Political Deepfakes

Deepfakes are spreading rapidly across the internet and especially on social media platforms. With advancements in AI image generation, these images are becoming increasingly difficult to identify with the naked eye.

In the past, one could easily spot a deepfake video by noticing obvious glitches, but that’s no longer the case, making this technology more dangerous than ever.

The term “deepfake” has become commonplace with the rise of AI. If you’re unfamiliar with it, I’ve written an article on deepfake nudes that you can check out. Essentially, a deepfake is a fabricated picture or video of a person doing something they never actually did.

This could be consensual or non-consensual. Deepfakes use machine learning algorithms to learn a person’s likeness from their photos or videos and then swap it with that of another person.
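The swap mechanism can be sketched in a few lines. This is a toy illustration of the classic deepfake design (a shared encoder with one decoder per person) using random linear maps in place of trained networks; all weights and sizes here are stand-ins, not any real tool’s architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(face, w_enc):
    # Compress a face vector into a shared, person-neutral latent code
    # (expression, pose) that both decoders understand.
    return w_enc @ face

def decoder(latent, w_dec):
    # Reconstruct a face from the latent code using a person-specific
    # decoder, which paints on that person's appearance.
    return w_dec @ latent

face_a = rng.random(16)        # stand-in for person A's face pixels
w_enc = rng.random((4, 16))    # shared encoder weights (toy values)
w_dec_b = rng.random((16, 4))  # decoder "trained" only on person B

latent = encoder(face_a, w_enc)     # A's expression/pose, person-neutral
swapped = decoder(latent, w_dec_b)  # rendered with B's appearance

print(swapped.shape)
```

In the real technique, the encoder and decoders are deep networks trained on many images of each person; encoding person A and decoding with person B’s decoder is what produces the face swap.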

Deepfakes have the power to warp your thinking, stances, and ideologies. For example, imagine coming across a picture online of Trump hugging Black people. It looks very real, but the scene seems unlikely. [Editor’s Note: Mr. Trump has associated with known white supremacists. For example, the time he hosted Nick Fuentes at Mar-a-Lago.]

It’s so convincing that you might not realize it’s fake. If you see many such pictures, it could influence your decision on whether or not to vote for Trump if he ever runs again.

While deepfakes can be used for benign purposes like comedy or raising awareness, they have mostly been used for nefarious ends since their inception.

Examples of Political Deepfakes

There are numerous real-life examples of political deepfakes on the internet. This issue isn’t limited to the US; we’ve seen shameless use of deepfakes against political opponents during elections worldwide.

  1. Deepfakes during Nigeria’s 2023 elections: A deepfake audio clip was posted on Facebook implicating presidential candidate Atiku, his running mate Okowa, and Sokoto State governor Tambuwal in a plot to rig the election. The post gained over 500 reactions and 800 shares, and in the comments almost no one seemed to suspect a deepfake. One commenter wrote, “A presidential candidate and his vice stooping this low? This is quite unfortunate. They even call themselves by name. These guys are dull sha.” PRNigeria later debunked the audio as a deepfake, and verification with several online tools confirmed this. Even so, imagine the thousands of people who heard it just before the elections and believed it. There were also deepfake videos of Hollywood celebrities endorsing another candidate, Peter Obi. Many Nigerians hold the opinions of Westerners in high regard, so the videos were made to boost the candidate’s ratings.
  2. Deepfake-triggered coup attempt in Gabon: In December 2018, a video of Gabon’s president, Ali Bongo, was released after a long public absence. Rumors were circulating that the president was gravely ill, although his media team confirmed he had suffered a stroke but was now stable and in good health. Gabonese citizens on X (then Twitter) were not entirely convinced by the video. They pointed out inconsistencies, from the president’s odd facial expressions (or lack thereof) to the camera angles. Convinced that all was not well with the president, many had good reason to believe the video was a deepfake. A week after the video was posted, there was a coup attempt in Gabon. Although it failed, many have pointed out how fear of deepfakes can fuel misinformation that escalates into violence.
  3. Rana Ayyub: Rana is an Indian journalist who was targeted with deepfake porn after she spoke out against gang rape. The video quickly went viral and was shared more than 40,000 times. Her personal details, including her home address and phone number, were exposed online, earning her numerous death and rape threats and endangering her life. Rana’s case is not a political deepfake meant to disrupt elections or cause political unrest, but it is still a threat to democracy: it shows the lengths to which deepfakes can be used to silence a person advocating for human rights in a democratic society.
  4. Trump’s photos with Black people: Photos of Trump with Black people quickly went viral on X. In the photos, generated by avid Trump supporters, Trump appears on friendly terms with Black voters. Though most people do not believe the photos are real, a few may still think they are, swaying their decisions toward Trump in the forthcoming elections.

Effects of Political Deepfakes

The adverse effects of political deepfakes are real, both online and in real life. They include:

Erosion of Trust: Political deepfakes lead to a significant erosion of trust in media and public figures as people become skeptical about the authenticity of what they see and hear. It gives politicians a quick avenue to dismiss controversial things they do as deepfakes. We used to blame Photoshop. Now, AI.

Manipulation of Elections: Deepfakes can be used to spread misinformation and manipulate public opinion during elections, potentially swaying voters’ decisions based on falsified evidence. This is especially worrying in developing countries where digital illiteracy abounds.

Destabilization of Diplomatic Relations: Fake videos or audio recordings of political leaders making inflammatory statements could exacerbate tensions between nations, leading to diplomatic crises or even conflicts, since there’s no concrete way of knowing if a video is a deepfake.

Undermining Democracy: Deepfakes have the potential to undermine the democratic process by creating confusion, spreading disinformation, and delegitimizing elected officials. A good example is the case of Rana Ayyub.

Personal and Professional Reputational Damage: People targeted by political deepfakes can suffer severe damage to their personal and professional reputations, as false content can be difficult to debunk and may spread rapidly. One tech expert, Farid, reported that at least six politicians from developing countries came to him to analyze explicit deepfake videos of themselves.

Increased Polarization: They can increase social and political polarization by reinforcing existing biases and creating further divisions within society. Political deepfakes can also feed confirmation bias, as in the case of defeated Nigerian presidential candidate Atiku, whom many Nigerians were already prepared to believe would rig the elections. The winner, President Tinubu, has himself been accused of rigging the election.

Challenges for Law Enforcement: Law enforcement agencies may struggle to identify and prosecute those responsible for creating and disseminating political deepfakes, leading to challenges in maintaining public safety and order. When Rana Ayyub reported her deepfake case to the Indian police, it was dismissed because powerful political figures were behind the video.

Impact on Journalism: Journalistic integrity may be compromised as the credibility of media sources comes into question due to the prevalence of deepfake technology.

We have come to the conclusion that political deepfakes wreak more havoc than good in society. Along with their cousins, deepfake porn images and videos, they should be cast out to the realm of Dark GPT. They can be used to create false narratives, sway decisions, cyberbully, and cause unrest.

There’s no sure way to spot a deepfake. But if something doesn’t feel right, hold off on jumping to conclusions. We can only hope that in the near future, better tech will be developed to identify political deepfakes and eliminate them.
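One weak verification signal that exists today (only a signal, not a reliable detector) is perceptual hashing: if a suspicious image claims to be a known, authentic photo, comparing perceptual hashes of the two can flag alterations. Below is a toy average-hash sketch on made-up 8-value “images”; the pixel values are invented for illustration, and a real pipeline would hash full images with a library and compare against a trusted original:

```python
def average_hash(pixels):
    # pixels: flat list of grayscale values.
    # Bit is 1 where the pixel is brighter than the image's mean.
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    # Count of differing bits; larger distance = less similar images.
    return sum(a != b for a, b in zip(h1, h2))

original = [10, 200, 30, 180, 90, 15, 220, 60]   # "authentic" photo (toy)
tampered = [10, 200, 30, 180, 90, 15, 20, 160]   # last regions altered

d = hamming(average_hash(original), average_hash(tampered))
print(d)
```

A small distance suggests the images match; a large one suggests manipulation. Its obvious limitation is that it requires access to the genuine original, which is exactly what’s missing for most political deepfakes.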