Moments of lucidity, and, if it were possible, infinitely more questions, arise when absorbing the quiet brilliance of Braden Allenby.
At 4 p.m. Monday in the Hall of Philosophy, Allenby, President’s Professor and Lincoln Professor of Engineering and Ethics at Arizona State University, will deliver his second lecture this summer, titled “Modern War and the Rise of Modern Warriors,” as part of Chautauqua Institution’s Lincoln Applied Ethics Series.
During Allenby’s previous lecture, “The Human as Design Space: Toward Human Version 2.0,” he started by reciting a brief passage from Shakespeare’s Hamlet, beginning with, “What a piece of work is a man ….”
Though Allenby took some interpretive liberties, his key point was that man is a “piece of work”: something that is constructed, for the sake of the argument, by God. One should also note, Allenby said, that man constructs machines.
Therefore, if man is made by God and machine is made by man, then by Allenby’s logic, man, too, is a kind of machine.
That’s one of the concepts Allenby discusses and questions as he applies it to modern warfare.
How are we supposed to think about it?
If Allenby were to sell bumper stickers summing up his whole outlook on the subject, they would probably take up the windshield.
“Designer warriors are an important and particularly human part of a rapidly evolving military, security and environment,” Allenby said. “Without understanding how and why designer warriors are so important, you really don’t understand the world that you’re living in.”
Today, the use of drones in military engagements has become increasingly prevalent. Allenby said a major part of drones’ appeal stems from their ability to significantly reduce casualties. However, they do not reduce the demand for people.
“It takes a lot of people to keep one drone in the air,” Allenby said. “What you want to do is you want to develop autonomous machines because it allows you to rely on less people.”
Some people may believe autonomous machines are a thing of the far future. The fact is, they already exist.
“We already have robots that are empowered to identify targets, track targets and take them out, including human targets, without any human interference,” Allenby said.
The example Allenby cited was in the Korean Demilitarized Zone. Also known as the DMZ, it is a 160-mile-long, 2.5-mile-wide buffer of no-man’s land between North and South Korea, where South Korea has already deployed a lethal autonomous device.
Some may argue that the complexity and mystery of such powerful machines mean they could potentially initiate an uprising, a familiar theme of the “Terminator” films. Allenby acknowledges that risk, but it doesn’t stop him from trying to understand the topic further.
“There’s always the chance that something goes wrong with robotic systems. Anybody who’s ever given a talk and had their computer fry knows exactly what technology can do. Some people say you should never deploy these technologies because that might be a problem,” Allenby said. “I think the response to that is not can we make the technology perfect, because we know we can’t. The response to that is, can we make the technology significantly better than the humans that are performing that function? And if we are, then isn’t that a good thing?”