University of Virginia Professor Emeritus Deborah G. Johnson believes that reputation is the commodity that wins elections. But this commodity is threatened by deepfakes.
Deepfakes, which first appeared online in 2017, are fabricated videos in which sophisticated AI studies and compiles existing media to depict specific people, often public figures, doing and saying things that never happened. These videos can be detrimental in an election year: footage of candidates can be doctored to show behavior Johnson describes as “unseemly and inappropriate” in order to deter voters.
“The thing that worries me the most there, is that if it were done late in the election cycle, even though you might be able to debunk it, if it’s late in the election, there isn’t enough time to get the word out,” Johnson said. “Even if there is enough time, once it’s out there circulating online, you can’t possibly catch every place that it’s been to debunk it.”
Johnson will discuss deepfakes, election security and more in her lecture “Integrity of Cybersecurity and Digital Ethics” at 10:45 a.m. EDT Thursday, July 23, on the CHQ Assembly Video Platform, as part of the Week Four Chautauqua Lecture Series theme of “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.”
“First of all, I want to make people aware of the possibility of these deepfakes being out there, and distorting the information that they’re getting,” Johnson said. “Second of all, I wanted to kind of go into some detail about the ethical implications (of) why this is so dangerous and worrisome. I also want to suggest that there are some possibilities and possible strategies for combating the negative effects.”
While Johnson will be discussing the sinister uses of this technology, she acknowledges that deepfakes can be used for good. Johnson believes the technology has legitimate applications in parody and entertainment, historical reenactment, and dubbing speeches into different languages.
“The main ethical (dilemma of deepfakes) is deception. But, if it’s not used in a deceptive way, then it’s not necessarily bad,” Johnson said.
Through her presentation, Johnson said she hopes to inform the audience about deepfake technology so that they can form their own opinions about future policy to regulate or combat it.
“I want (the audience) to have a kind of healthy dose of skepticism about what they see in media, social media, and other kinds of media — but at the same time, I don’t want them to be so skeptical that they don’t believe anything they see,” Johnson said. “I think we need everybody to be thinking about how to manage these things. I think there are particular policies that we want people to get behind — whether it’s pressuring platforms to look out for deepfakes, ban them, or label them, or other kinds of legislation or policies.”
Johnson began researching and teaching the ethics of tech and engineering in the early 1980s, when she taught philosophy to classes composed mostly of engineering students at Rensselaer Polytechnic Institute. At the time, computing was a hot topic that she wove into her lessons to interest her students. Along the way, she started to notice unaddressed ethical dilemmas in the field, so she began to shape her research and career around the subject.
Even as technology evolved, her approach to ethics did not necessarily change. Johnson said that tech ethics is not based on technology, but rather social values.
“Trained as a philosopher, you think of ethics and morality as this thing that’s separate from the world, and there’s right and wrong. I think over the years it’s become clearer and clearer to me that it’s about social values more than right and wrong,” Johnson said. “I now realize that you can’t do the ethical issues without really understanding the relationships between technology and society. Technologies are always embedded in some social context. When you’re doing ethics, you’re really navigating the social context as much as the technology.”
This program is made possible by “The Lincoln Ethics Series” funded by the David and Joan Lincoln Family Fund for Applied Ethics and by the Miller-Beggerow Fund in honor of Cornelia Chason Miller.