
Concern Grows as ‘Deepfakes’ Spread Misinformation

ADF STAFF

Odd videos began circulating on social media in Burkina Faso in January, showing a diverse group of people urging Burkinabe to support the military junta.

“Hello to the African people and particularly to the Burkinabe people. My name is Alisha and I’m a Pan-Africanist,” a female figure said in the video. “I appeal to the solidarity of the Burkinabe people, and the people of Burkina Faso to effectively support the authorities of the transition.”

A second video showed four more people with the same message. All of them appeared stiff, with lips that moved out of sync with their words. They mispronounced “Burkina Faso” and “Burkinabe.”

There was a reason for the unusual appearance: None of them was real.

They were a form of synthetic media known as deepfakes: videos generated or altered by software to create fictional people or convincing forgeries of real ones. Users can make the characters say anything.

The use of deepfake technology to spread disinformation is a global problem. Its use in Africa is rising as internet access expands across the continent.

Experts around the world have shared concerns that the technology is developing into a more powerful and sophisticated misinformation tool. In some African countries, malicious actors could wield deepfake technologies to sow distrust, chaos and instability.

“As technology’s power increases, so the ability to cause harm increases exponentially,” Stellenbosch University Data Science Professor Johan Steyn told Forbes Africa magazine.

“From both a legal and a government ethics point of view, it’s a massive problem and I don’t know how it is going to be regulated. How do you present evidence to a court of law when you cannot confirm if a video or voice is authentic? There’s almost no way of proving deepfakes are authentic.”

The deepfake videos that pro-junta activists in Burkina Faso spread across Facebook, WhatsApp and Twitter were created using Synthesia, a video generator that uses artificial intelligence, according to VICE News.

Synthesia refused to reveal who made the videos but said that it had banned the user.

The use of deepfakes is rising, but many people aren’t familiar with the concept.

Security software company KnowBe4 recently surveyed 800 people between the ages of 18 and 54 in Botswana, Egypt, Kenya, Mauritius and South Africa to gauge their awareness of deepfakes.

The results showed that “51% of respondents said they were aware of deepfakes, while 28% were not, and 21% were unsure or had a little understanding of what they were.”

Nearly three-quarters of respondents (74%) said they had believed an email, direct message, photo or video shared with them was real and later discovered it was fake.

Anna Collard, senior vice president of Content Strategy at KnowBe4 Africa, warned that deepfake technology has become so sophisticated that most people would find it difficult to spot fakes.

“These deepfake platforms [such as ChatGPT and Stable Diffusion] are capable of creating civil and societal unrest when used to spread mis- or disinformation in political and election campaigns and remain a dangerous element in modern digital society,” she told South African news website News24.

“This is cause for concern and asks for more awareness and understanding among the public and policymakers.”

In one notorious 2018 case in Gabon, a suspected deepfake video nearly toppled the government.

President Ali Bongo, who had been out of the country for more than two months receiving treatment for a stroke, delivered his customary New Year’s address to calm growing speculation about his health.

Instead of appearing live, Bongo delivered a recorded two-minute video. He appeared stiff, rarely blinked and spoke in rhythms and patterns noticeably different from those of his previous speeches.

Some critics and political rivals claimed the video was a deepfake, although independent analysts said it was likely authentic.

A week after the video broadcast, Gabon’s military attempted an unsuccessful coup — the country’s first since 1964 — and cited the strange video as proof that something was amiss.

“We did not know what was happening,” a journalist told The Washington Post about the atmosphere of uncertainty in the country. “There was not a lot of official information, which was scary.”

Attempts by internet platforms to curb misinformation in Africa are in early stages.

“Platforms have to come up with solutions,” said Julie Owono, executive director of the digital rights organization Internet Without Borders. “The more time they take to tackle issues the worse things will get.”

Regardless of countermeasures, deepfake technology continues to evolve. As it does, its potential to erode credibility grows with it.

“We’re living in a world where we cannot trust almost anything,” Steyn said.
