
St. John’s University professor Noreen Herzfeld warns against serving technology instead of humans


For centuries, technology simply amplified humans’ physical abilities. A hammer amplifies the force of an arm. A telescope helps us see something far away. But Noreen Herzfeld said modern technologies like computers and artificial intelligence programs are different: they extend the human mind.

“While technologies generally reflect and refract our purposes, amplify our natural abilities, they can also get away from us, embodying a power or purpose of their own,” she said.

Herzfeld, the Nicholas and Bernice Reuter Professor of Science and Religion at St. John’s University and the College of Saint Benedict, delivered a lecture posing the question: “Tool, Partner, or Surrogate: How Autonomous Should Our Technology Be?” The lecture, released at 2 p.m. EDT Thursday, July 23, on the CHQ Assembly Video Platform, continued the Week Four Interfaith Lecture Series theme, “Ethics in a Technologically Transforming World?”

Herzfeld’s academic background spans degrees in computer science and mathematics from The Pennsylvania State University and a Ph.D. in theology from the Graduate Theological Union in Berkeley. She has contributed to four books on technology and religion as an author and editor.

Herzfeld pre-recorded her lecture on July 12 in Collegeville, Minnesota, and participated in a Q-and-A with Maureen Rovegno, Chautauqua Institution’s director of religion, on the day it was released. Rovegno relayed audience questions submitted through the www.questions.chq.org portal and on Twitter with #CHQ2020.

The purpose of developing any technology has always been to alter a condition or an environment in a way that makes life easier or more comfortable, whether by taming the elements, disease or predators. Most recently, this has played out in technological responses to COVID-19.

Herzfeld said machine learning was used to screen more than 6,000 existing drugs that had already passed clinical trials to see if they could be repurposed to fight COVID-19. Google’s DeepMind team trained a neural network to predict protein structures associated with the virus to help develop a vaccine.

Technology can be useful, but it can also reshape the society that created it.

“Often we’re the ones who have to bend to technology, not vice versa,” Herzfeld said.

Technology can also alter its environment. Herzfeld cited German existentialist Martin Heidegger, who observed that a craftsman constructing a chair out of wood does not change the inner nature of the wood. A genetically engineered bacterium, by contrast, is something new to the “natural order.”

Herzfeld looked to the biblical book of Genesis to explain the relationship between God’s creation and human creation, which are linked because God created humans in his image. Genesis 1 also gives humans dominion over everything in nature.

“We, too, are destined to be creators because we are in God’s image,” Herzfeld said.

But for Herzfeld, some biblical scholars take the Genesis mandates of being made in God’s image and holding dominion over the Earth too far when they cast humans as God’s deputies on Earth. Negative consequences can arise in human relationships, she said, as when Cain kills his brother Abel in Genesis 4. And technological advances in agriculture led to hubris and the construction of the Tower of Babel, which further divided people.

Herzfeld said that creation, both by God and by humans, involves three relationships: God’s relationship with himself, God’s relationship with humans as his creation, and humans’ relationships with one another. The story of Noah’s ark is an example of technology used to successfully augment these relationships, between Noah and his family, with the animals, and in his covenant with God.

“Human nature is only completely full when we are in relationship with God and one another,” Herzfeld said.

The human creation that comes closest to emulating God’s hierarchical relationship with humans is artificial intelligence and robotics. Herzfeld said she is not sure this is what we want, and she lauded the Amish for their careful consideration of the technologies they do bring into their communities.

“Contrary to popular conception, the Amish have not ‘stopped the clock,’” Herzfeld said. “They accept some technologies and reject others.”

The Amish use phones, but don’t install one in every home because it would discourage face-to-face conversations. They use refrigerated tanks to store milk on their farms, but don’t install a refrigerator in every kitchen.

For every piece of technology they consider, the Amish ask if it provides tangible benefits — but also if it would hamper the relationships in the community.

Herzfeld said another way to look at how artificial intelligence could work for humans is to reframe the technology as “intelligence augmentation,” a term coined by Douglas Engelbart. “Artificial intelligence” suggests a surrogate that takes over human tasks, in keeping with humans’ God-given dominion over the world, while “intelligence augmentation” describes a tool under human supervision and control.

Though artificial intelligence programs can execute human decisions, most can’t reason about those decisions.

“A machine with true agency would have a further ability to reason independently about its own actions and unpredictably change course should it consider those actions unethical or in violation of some overarching value or intention,” Herzfeld said.

Philosophers Michael and Susan Anderson have three rules for determining whether a robot or program is a moral agent: it must not be under the direct control of another agent or user, it must choose to interact with its environment, and it must fill a social role or relationship that carries responsibilities. A robot health caregiver, Herzfeld said, fulfills the first two criteria, but is not aware of its responsibility to the patient.

But there are also robots and programs with high autonomy in settings with serious consequences and moral implications. In warfare, lethal autonomous weapons systems run on algorithms, making life-and-death decisions without real-time human control.

For example, the Aegis Weapon System is a naval air defense system used by the United States, Australia, Japan, Norway, the Republic of Korea and Spain. It searches for threats in the air, on the surface and underwater, and guides missiles to intercept them. It can choose where and when to fire on its own.

Another example is the Kargu-2, a 15-pound multicopter drone that tracks and engages targets using facial recognition technology. The drones can operate in swarms of 20 led by one head drone, which can act on its own or be operated by a human.

Herzfeld said Turkey has ordered 500 of the drones, which could be given attack capabilities.

Drones provide cost-efficient benefits for commanders: they work without getting tired, don’t need to be paid and can operate in harsh conditions. But Herzfeld said the costs to the larger human community are comparable to those of nuclear weaponry, which forced ethicists and theologians to redefine the ethics of war. It is unclear whether a drone’s lack of emotion would make war atrocities more or less likely, or whether a drone could shrug off human control and turn on the person it was previously taking orders from.

“A species with multiple ways of destroying itself or its environment has to grow smart,” Herzfeld said. “It has to have the wisdom not to do so. Essentially, it becomes a race between the development of technology and the development of morality. If technology wins, we lose.”

Tags: AI, Chautauqua Institution, drone warfare, ethics, Interfaith Lecture Series, Noreen Herzfeld, St. John’s University, Week Four

About the author: Chloe Murdock

This is Chloe Murdock’s first season reporting for The Chautauquan Daily. She hopes to visit Chautauqua in the future, but in the meantime she covers news on Chautauqua’s Interfaith Lecture Series. Chloe is a rising senior at Miami University studying journalism and international studies. When she isn’t leading The Miami Student magazine or writing for The Miami Student newspaper, Chloe enjoys practicing martial arts.