
Carnegie Mellon University panel talks importance of ethical AI, evolving views on computer science

David Danks, head of the Department of Philosophy at Carnegie Mellon University, said that if artificial intelligence is used in the United States to prevent crime or make more arrests, it would most likely target communities of color because of the country’s history of systemic racism, a history that has drawn protests in recent weeks.

“Unfortunately, I think, hopefully we all now recognize there’s a very long history of law enforcement and governments using technologies for surveillance and control, rather than purely for safety and support,” Danks said.

Though data and code are essentially ones and zeros on a screen, Danks said that technology embodies the values its creator prioritizes.

“If you have a hammer, it embodies the value that everything is a nail. And, in practice, most of these AI systems aim to optimize. They tend to be high performing at some task,” Danks said. “Somebody is evaluating this task, whether it’s predicting where a crime will occur or predicting where an arrest might occur.”

Danks is the L.L. Thurstone Professor of Philosophy & Psychology at Carnegie Mellon University. He talked with Jennifer Keating and Illah Nourbakhsh at 10:45 a.m. EDT Wednesday, July 22, on the CHQ Assembly Video Platform as the third part of Week Four’s theme, “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.” Keating is a Senior Lecturer and Writing in the Disciplines Specialist in the Department of English at the University of Pittsburgh, and Nourbakhsh is the K&L Gates Professor of Ethics and Computational Technologies at CMU and director of the Community Robotics, Education and Technology Empowerment (CREATE) Lab.

Nourbakhsh started the lecture by delving into three words that are needed to understand the way AI influences humanity: agency, equity and power. Agency refers to how much people rely on AI to make decisions for them. An example is the Boeing 737 Max, whose automated systems can make decisions as well as, or better than, pilots well over 99% of the time.

“The problem is what happens in that .001% of the time when the AI system flying the airplane does a worse job than the pilot, and the pilot doesn’t know what’s going on because they’ve lacked agency,” Nourbakhsh said. “It’s been taken away from them.”

Equity refers to bias within AI, such as when credit approval is handled by an algorithm that learns from past human decisions; because those decisions carried racial bias, the AI does, too. Power refers to the enormous power AI brings to companies, such as advertisers, whose AI can do a much better job than any single person could.

“It changes the dynamics of democracy, when our ability is based on information that’s highly mediated, and therefore not balanced. And that’s the fundamental question of power relationships,” Nourbakhsh said. “I can make the world better, but in doing so we can actually exacerbate inequities that can exacerbate unequal power relationships.”

Keating then talked about power and political strife, and how information gathering affects the way society thinks. She cited Anna Burns’ award-winning novel Milkman, which shows how powerless people can feel when their government seems omnipotent, and said that predictive policing measures can quash forms of resistance against a government. She questioned what the apartheid-era government of South Africa would have done with modern technology to suppress the movement to end racial segregation.

“What if today’s technological advancements had been applied to such a movement?” Keating said. “What level of scale could have pushed down this already vulnerable population to quell any form of resistance or rebellion?”

Nourbakhsh said that in the past, computer science was mostly a theoretical field and was treated like math: “We never thought in the design of the curriculum that we were having influence on society. So we didn’t bother with the question.”

His students at Harvard were excited about Google’s motto of “Don’t be evil,” and about the prospect of becoming millionaires, but a transition happened as the company and others in the technology industry grew.

“What I started seeing is graduates of our programs in the early 2000s actually emailing me back and saying, ‘Could you come and do a lecture to my group at Google on ethics, because we’re getting into some spaces that I’m uncomfortable with?’” Nourbakhsh said.

He said that his current students are not interested in becoming millionaires or organizing the world’s information.

“What’s going on is a fundamental recognition, especially by millennials,” Nourbakhsh said, “that the code they write changes and influences society; it changes our relationships in society.”

He said Barbara Grosz, Higgins Professor of Natural Sciences at Harvard University, devotes a week of her course to students debating issues around computer science and AI, such as the pros and cons of facial recognition.

Danks said that it is important to educate the next generation about the challenges of AI, but that the industry and the government need to improve now. To achieve this, he said organizations need to recognize that ethical AI and technology are not products, but processes.

“You can’t just build cool tech, and then slap some ethics on at the end. That isn’t how it works,” Danks said. “It’s never going to achieve the goals that you really have in mind.”

He said that some companies, like Facebook, have had to “rebuild while your ship is out to sea,” changing course and questioning whether the technology they create embodies their own values.

“When I’ve worked with companies,” Danks said, “oftentimes they come to me and say, ‘We need ethical help.’ And my response is to say, ‘No, you know what you should do. You just have to decide that you’re willing to do it.’”

Danks said this does not just apply to corporations; each individual must try to engage with technology that advances their own values and interests. He said several years ago he realized that social media platforms were not supporting his own personal interests.

“They weren’t empowering me, but rather were empowering those companies, and so I made a deliberate decision to simply remove myself,” Danks said. He also said that the decision of which technology to use is different for everybody, and that many people use social media effectively. “But it’s important for us to reflect on what it is that we want from the world and engage with technology in that way, so that we can all have more ethical and societally responsible technologies.”

The lecture then shifted to a Q-and-A session with Matt Ewalt, vice president and Emily and Richard Smucker Chair for Education. His first question was how teachers can create a learning environment that is respectful of students who arrive wanting meaningful change in society, while also challenging their assumptions and biases.

Nourbakhsh said it is important to understand the mindsets of younger generations. Many millennials and members of Gen Z are disappointed in previous generations, who they feel have failed on economic equality, climate change and other issues.

“That’s what used to drive them, not just a hopefulness to make the world better, but a fear that the world is basically political hell in a handbasket,” Nourbakhsh said. “And it’s going down and they need to solve it and resolve it, so that it doesn’t go downhill that badly.”

Younger generations are effective at creating social movements that are leaderless, he said, such as Black Lives Matter, “without a normalized sensibility that says, ‘Here’s the five steps we want to sell the older generations on.’”

Danks said that many students have a hunger to bridge the gap with people who disagree with them, especially older generations, and are frustrated that there are no places to do so. When these conversations do occur, he said, students often feel they are not talking through their differences and issues but are “typically just being talked at rather than engaged with.”

Ewalt’s last question was what takeaway the panelists would like the audience to leave with.

Keating returned to an important point from the beginning of the lecture: that people need to be aware of their relationship with technology.

“(We need to be) conscious of those relationships with these tools, these machines, but also highly sensitive to the ways in which that might be affecting your interactions with other individuals, or collectives,” Keating said.

Tags: Carnegie Mellon University, Chautauqua Lecture Series, David Danks, Illah Nourbakhsh, Jennifer Keating

The author: Nick Danlag

This is Nick Danlag’s second season at the Daily reporting the morning lecture recap. He worked remotely last year but loved waking up each day in Las Vegas to learn more about Chautauqua through his reporting. From Mount Laurel, New Jersey, Nick earned a creative writing degree from Eckerd College in St. Petersburg, Florida. As editor-in-chief of his student newspaper, The Current, he loved helping the staff develop their voices.