When COVID hit, local churches turned to lifelong Chautauquan Zach Stahlsmith for online transition

When church services in Western New York started to transition to online platforms as COVID-19 shuttered places of worship this spring, local pastors looked to Zach Stahlsmith as their gateway to the internet. Which felt great, he said, only he didn’t have all the answers.

Recording with a phone seemed easiest, but the video turned out pixelated, the sound muffled. Facebook Live was crashing, every attempt a “hit or miss.” Stores were closing one by one, the servers out of stock.

Perhaps, Stahlsmith said, it would always be about the “weighing.” 

“I think what we all wanted was perfection right out the door,” he said. “We wanted familiarity, but what we found was that we couldn’t have it all. The priority became to pick what we couldn’t live without and do without the rest of it.”

Stahlsmith has worked as an audio-visual technician at Chautauqua Institution for the past seven years — as a member of the Woods Crew, he is a summer staple zipping across the grounds in his golf cart. A recent graduate of Colgate Rochester Crozer Divinity School, Stahlsmith has also presided over two Week Four morning devotional services. In early March, however, he extended his assistance beyond the Institution, this time to local churches looking to reinvent their services as the COVID-19 pandemic “took its toll,” Stahlsmith said. 

Some pastors had too much equipment, others not nearly enough, according to Stahlsmith. Depending on what was available, he helped with YouTube uploads, Facebook livestreams and even radio broadcasts.

“It just felt nice to be able to do something for other people,” Stahlsmith said. “I am still trying to enter into the audio-visual field in a full-time capacity, so it has also been fun for me to learn all of these things, while doing it in a way that’s not just to serve me.” 

The Post-Journal published an article on March 28 detailing Stahlsmith’s work over the previous month, and he said dozens of pastors reached out to him as a result. He has since coordinated with First Covenant Church in Jamestown, Bemus Point United Methodist Church, Park United Methodist Church in Sinclairville, Gerry Free Methodist Church in Gerry, Ross Mills Church of God in Falconer, Findley Lake United Methodist Church in Findley Lake and Hurlbut Memorial Community United Methodist Church in Chautauqua.

“You don’t want to have a really long service, no matter what church or what their usual tradition is,” he said. “You are basically watching a TV show — you have about 30 minutes before people turn it off. A lot of pastors had to completely reinvent what they’ve been doing for years, even decades.” 

The adjustments — shorter services and empty pews — induced a lot of fear, he said. 

“Fear was a huge thing,” Stahlsmith said. “(The pastors) didn’t know what to do and they didn’t know how to do it. There are a lot of pastors who are not technologically literate, so the concept of not being able to reach out to people directly was terrifying.”

In mid-April, the outreach calmed down as most churches found their “own rhythms,” so when the Institution’s programming started in late June, Stahlsmith said he looked for ways to carry his off-season knowledge into a new unknown.

“The live-streaming and audio techniques were relatively the same, but I also had to learn brand new things, like how to build sets,” he said. “I was lucky this time because I could work with a team instead of doing it all on my own.”

The ups and downs — late announcements, canceled lectures and files in the wrong folders — have felt the same from place to place: “defeating,” Stahlsmith said. But he has no plans to ruminate this time around; instead, he’s been working on a blooper reel to release at the end of the season. 

“If we can’t laugh at ourselves, what can we do?” he said. 

The transition hasn’t been an easy one for everyone involved. Stahlsmith said he understands the frustrations this “baptism by fire” has created for the community, and is constantly working to provide the best possible programming to Chautauquans.  

“This is all new for us, too,” he said. “We are building this from the ground up. Things are not going to work perfectly all the time. There have been a lot of patient people, but of course you can’t make everyone happy.”

Nonetheless, there is “still so much to look forward to,” he said. 

“That weighing will continue, probably forever,” Stahlsmith said. “It’s not perfect, but it’s possible. It’s not our normal, but it’s better than nothing. I know, on the other side of this summer, we will be better than when we came in.”

St. John’s University professor Noreen Herzfeld warns against serving technology instead of humans

For centuries, technology simply amplified humans’ physical abilities. A hammer amplifies the force of an arm. A telescope helps us see something far away. But Noreen Herzfeld said the difference in modern technology like computers and artificial intelligence programs is that they extend the human mind.

“While technologies generally reflect and refract our purposes, amplify our natural abilities, they can also get away from us, embodying a power or purpose of their own,” she said.

Herzfeld, the Nicholas and Bernice Reuter Professor of Science and Religion at St. John’s University and the College of Saint Benedict, delivered a lecture that posed the question: “Tool, Partner, or Surrogate: How Autonomous Should Our Technology Be?” The lecture, at 2 p.m. EDT Thursday, July 23, on the CHQ Assembly Video Platform, returned to the Week Four Interfaith Lecture Series theme, “Ethics in a Technologically Transforming World?”

Herzfeld’s academic background spans degrees in computer science and mathematics from The Pennsylvania State University and a Ph.D. in theology from The Graduate Theological Union at Berkeley. She has contributed to four books on technology and religion as an author and editor.

Herzfeld pre-recorded her lecture on July 12 in Collegeville, Minnesota, and attended a Q-and-A with Maureen Rovegno, Chautauqua Institution’s Director of Religion, on the day the lecture was released. Rovegno relayed audience questions submitted through the www.questions.chq.org portal and on Twitter with #CHQ2020.

The purpose of developing any technology, Herzfeld said, has always been to alter a condition or change an environment in a way that makes some action easier — taming the elements, disease, predators — and to make life more comfortable. Most recently, this has played out in responses to COVID-19.

Herzfeld said machine learning was used to test over 6,000 existing drugs that had already passed clinical trials to see if they could be repurposed to fight COVID-19. Google’s DeepMind team trained a neural network to predict protein structures associated with the virus to help develop a vaccine.

Technology can be useful, but it can also re-shape the society it was created in. 

“Often we’re the ones who have to bend to technology, not vice versa,” Herzfeld said.

Technology can also alter its environment. Herzfeld cited German existentialist Martin Heidegger, who said that when a craftsman constructed a chair out of wood, he didn’t change the inside of the wood to create the chair. But a genetically engineered bacterium is new to the “natural order.”

Herzfeld looked to the Christian book of Genesis to explain the relationship between God’s creation and human creation, which are linked because God created humans in his image. Genesis 1 also gives humans dominion over everything in nature.

“We, too, are destined to be creators because we are in God’s image,” Herzfeld said.

But for Herzfeld, some biblical scholars take the Genesis mandates — humans made in God’s image and given dominion over Earth — too far when they cast humans as God’s deputies on Earth. Negative consequences can arise in human relationships, she said, as when Cain kills his brother Abel in Genesis 4. And the development of agriculture thanks to technology led to hubris and the construction of the Tower of Babel, which further divided people.

Herzfeld said that creation, both by God and humans, has three relationships. There is God’s image of himself, God’s relationship with humans as his creation, and humans’ relationships with one another. The story of Noah’s ark is an example of technology used to successfully augment these relationships between Noah and his family, the animals and his covenant with God.

“Human nature is only completely full when we are in relationship with God and one another,” Herzfeld said.

The creation of artificial intelligence and robots is the closest humans have come to emulating God’s hierarchical relationship with humanity. Herzfeld said she is not sure this is what we want, and she lauded the Amish for their careful consideration of which technologies they bring into their communities.

“Contrary to popular conception, the Amish have not ‘stopped the clock,’” Herzfeld said. “They accept some technologies and reject others.”

The Amish use phones, but don’t install one in every home because it would discourage face-to-face conversations. They use refrigerated milk tanks, but don’t install one in every kitchen.

For every piece of technology they consider, the Amish ask if it provides tangible benefits — but also if it would hamper the relationships in the community.

Herzfeld said another way to look at how artificial intelligence could work for humans is to reframe the technology as “intelligence augmentation,” a term coined by Douglas Engelbart. Artificial intelligence suggests a surrogate taking over human tasks of exercising God-given dominion over the world, while intelligence augmentation describes a tool that stays under human supervision and control.

Though artificial intelligence programs can execute human decisions, most can’t reason about those decisions.

“A machine with true agency would have a further ability to reason independently about its own actions and unpredictably change course should it consider those actions unethical or in violation of some overarching value or intention,” Herzfeld said.

Philosophers Michael and Susan Anderson offer three criteria for determining whether a robot or program is a moral agent: It must not be under the direct control of another agent or user, it must choose to interact with its environment, and it must fulfill a social role or relationship that carries responsibilities. In the example of a robot health caregiver, Herzfeld said it fulfills the first two, but is not aware of its responsibility to the patient.

But there are also robots and programs with high autonomy in settings with serious consequences and moral implications. In warfare, lethal autonomous weapons systems operate on algorithms, making life-and-death decisions without synchronous human control.

For example, the AEGIS Weapon System is a naval air defense system used by the United States, Australia, Japan, Norway, the Republic of Korea and Spain. It searches for targets and guides missiles in the air, on the surface and underwater, and it chooses where and when to fire on its own.

Another example is the Kargu-2, a 15-pound multicopter drone that tracks and engages targets using facial recognition technology. The drones can operate in swarms of 20 led by one head drone, which can operate on its own or be directed by a human.

Herzfeld said that Turkey has ordered 500, and the drone could have attack capabilities.

While drones provide cost-efficient benefits for commanders — they don’t get tired, don’t need to be paid and can work in harsh conditions — Herzfeld said the costs to the larger human community are comparable to those of nuclear weaponry, which redefined the ethics of war for ethicists and theologians. It is unclear whether a drone’s lack of emotion would make war atrocities more or less likely, or whether a drone could shrug off human control and turn on the person it had been taking orders from.

“A species with multiple ways of destroying itself or its environment has to grow smart,” Herzfeld said. “It has to have the wisdom not to do so. Essentially, it becomes a race between the development of technology and the development of morality. If technology wins, we lose.”

Deborah Johnson, former professor at the University of Virginia, discusses the danger of deepfakes for institutions and how society can defend against disinformation

A video circulated around the internet of former President Barack Obama calling President Donald Trump a “dipshit” and making references to the movies “Black Panther” and “Get Out.” 

“Now, you see, I would never say these things. At least not in a public address, but someone else would. Someone like Jordan Peele,” Obama appeared to say in the video. “This is a dangerous time. Moving forward, we need to be more vigilant with what we trust from the internet.”

This video was a “deepfake,” a video created using synthetic media technologies, a type of artificial intelligence, to show people saying and doing things they never did. The video of Obama was created, said Deborah Johnson, professor emerita at the University of Virginia, by taking a picture of the former president and superimposing it on a video of Peele talking. The AI then made Obama’s face match the movements of Peele’s, making it look like Obama was talking.

Johnson recently retired as the Anne Shirley Carter Olsson Professor of Applied Ethics in the University of Virginia’s Department of Engineering and Society. She authored one of the first textbooks on computer ethics in 1985. She spoke at 10:45 a.m. EDT Thursday, July 23, on the CHQ Assembly Video Platform as the fourth part of Week Four’s theme of “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.” Johnson discussed multiple examples of deepfakes, the potential harm they can cause to individuals and institutions, and how they are applied for entertainment and education — but also how society must take action to defend against this kind of disinformation.

Johnson is a philosopher who looks at ethical implications of digital technologies, going into this field because she “saw that they were having this enormous effect on our world.”

An example she shared was a video in which Adele’s face was doctored by AI to make it appear she was talking about other deepfakes, such as ones impersonating Kim Kardashian and Arnold Schwarzenegger. The video ended with her saying she was not Adele, but media studies expert Claire Wardle. Another example of a deepfake was of Mark Zuckerberg, CEO of Facebook, talking about how one man can control billions of people’s data.

Johnson said the most frequent use of deepfakes is in pornography, mainly taking the face of a female celebrity and superimposing it on a pornographic video. She also said the technology is available online, making the process very easy — meaning a person does not have to be an expert to create a deepfake.

Deepfakes are not solely used for misinformation and pornography, Johnson said, but also for entertainment and education. An example of a deepfake used to entertain is a video of actor Burt Reynolds as James Bond. Reynolds was considered for the role in the 1970s, and the creator of the deepfake wanted to see what that scenario would have looked like.

Johnson said a handful of museums are also using deepfakes to recreate historic events, and also have videos of dead artists talking about their work. One possibility in deepfakes, she said, is having speeches by political leaders being translated into multiple languages and making it seem like the person is speaking in the listener’s language.

“Traditionally, we have relied upon our senses, especially hearing and seeing, to tell us what’s real and true,” Johnson said. “If I see someone directly doing something or saying something, I believe that the person said that and did that.”

Some people may argue that technology has mediated information for hundreds of years, with inventions like the printing press, photographs and radio, but Johnson said deepfakes provide “a much greater opportunity for mischief.”

“I think the most worrisome thing about deepfakes is this idea that’s referred to as amplification,” Johnson said. “It’s the idea that you’ve got this deepfake that can spread across the globe very quickly and very broadly. So it has much more power than a single person telling you a lie.”

She said there are three categories of harm caused by deepfakes: harm to viewers, harm to reputations and harm to institutions.

When people make deepfakes and generate other fake news, Johnson said, they are intentionally misleading the public for their own goals and undermining viewers’ autonomy.

“When someone gives us information that’s false, they’re undermining that autonomy; in a kind of classic term, they’re using us as a means to their end,” Johnson said. “They’re manipulating us to do their work and not allowing us to think the way we think and vote the way we want to vote.”

Reputation is especially important during elections, when Johnson said “you win or lose because of your reputation.” She said that while deepfakes may seem like a clear case for defamation claims, courts do not usually entertain defamation claims around political speech, because they do not want to interfere with the election process.

“They have this rule against not interfering with (political) speech, not because they think it’s not harmful — they know it is harmful. But they are worried about how to do it,” Johnson said. “They’re worried that it would be too hard and too political to try to draw the line between what is considered a lie and what is considered the truth.”

Deepfakes and other forms of disinformation harm institutions such as the electoral process, and Johnson said if people do not trust the process, they do not trust the outcome. 

“It hurts everyone. It hurts the winners as well as the losers,” Johnson said.

She said remedies to the problems caused by deepfakes exist, such as educating the public on media literacy and teaching people how to spot videos that spread disinformation. Two other strategies are protecting individuals targeted by deepfakes, like having a part of a political staff dedicated to keeping track of information on the internet, and creating technology that can decipher if something is true or false. 

The last way of defending against deepfakes is by stopping them from spreading. California, for example, bans the use of deepfakes in elections, but Johnson said she is not sure how much this is enforced and some people may believe banning deepfakes interferes with free speech. An alternative, Johnson said, is social media sites labelling the video as questionable or as not containing valid information. 

“We have a set of technologies that are capable of a good deal of mischief, to put it mildly,” Johnson said. “And not only should we be aware of them, we should all be trying to figure out how to get the benefits of these synthetic media technologies, without letting the technology undermine the integrity of our oral and visual experience.”

The lecture then transitioned to a Q-and-A session with Emily Morris, vice president of marketing and communications and chief brand officer at Chautauqua Institution. Morris asked how fast the development of detection tech is happening compared to the development of deepfake tech.

Johnson said it is hard to evaluate the pace, but detection technology is advancing. She said another detection strategy is having the creators of the technology behind deepfakes tell the public where to look for flaws in a video.

“I always think about this as kind of an arms race. You have the new technology and they counter,” Johnson said. “Then they change the way they do it and then you have to get a new counter.”

Morris asked what’s next on Johnson’s agenda.

Johnson said she has recently started working on accountability in AI; even the creators of an artificial intelligence system, she said, sometimes do not understand how the algorithm arrived at a decision.

“Who is then responsible for decisions that are made by artificial intelligence? I’ve always, for a long time, thought we’re being set up to accept that nobody is responsible, which I don’t accept,” Johnson said. “I think it’s the people who design the artificial intelligence systems and test them, and then put them out there for use, knowing that they don’t really understand how they work.”

Second Opera Invasion to be a virtual take on Opera Open Book

This season’s second virtual Opera Invasion will be a Choose Your Own Adventure of sorts.

“All of our events this summer are available on-demand, (so) whether or not you actually (watch) an event live doesn’t really matter,” said Steven Osgood, Chautauqua Opera’s general and artistic director. “However, this one does (matter): If you’re not there you can still watch it later, but you’re not going to be part of the decision making.”

Chautauqua Opera’s Opera Invasion #2: Opera Open Book will air at 10 a.m. EDT Friday, July 24, on the CHQ Assembly Virtual Porch.

Opera Open Book is an event that has taken place every year since the invasions started in 2016. It began as a way for the Opera to showcase the expansive repertoire of its Young Artists.

“When they come into an audition, they bring in three or four or five pieces that they’re offering,” Osgood said. “While they may get to choose the first piece they sing, they certainly don’t get to pick the second or third pieces — that’s entirely up to the audition panel. They have to have this library of pieces that are not only good show pieces for them vocally, but that they can sing at the drop of a hat.”

In past Opera Open Books, a few Young Artists took over Odland Plaza while Chautauquans lined up to get into the Amphitheater for an evening performance. Each Young Artist came prepared with five songs from their collections, and Osgood let audience members choose what they wanted to hear out of the “menu” of songs, which the Young Artists would immediately perform.

“It allows our audience to really appreciate how nerve-wracking the experience (of auditioning) might be,” he said.

Due to the limits and variation of technology available to the Young Artists, they will not be performing live, but that element of spontaneity will still be present.

The Young Artists have submitted pre-recorded videos performing arias from their audition repertoire. Using the Virtual Porch’s live polling and Q-and-A features, Osgood will survey viewers on what they want to hear.

“We’ll have all of the arias ready to go, we’re not telling the audience what they are, but I will ask the audience questions that lead to the decision of what the first piece will be, what the second piece will be (and so forth),” Osgood said. “(I’ll say), ‘Do you want to hear a tenor next or a mezzo soprano next?’ And we’ll take the voting and once that voting is in, I’ll say, ‘OK, it’s a tenor, from the tenor arias that are remaining there’s two English and one in French, do you want French or English?’”

Osgood recommends that Chautauquans who want to help pick the first two arias tune in at least five minutes early. He’s only a little nervous about the event’s many moving parts.

“Of all the events we’re doing, this one is purposefully built to test the capacity for us to use this (virtual) platform, and for me not to mess things up,” he said. “It’s going to be partly fun just to see how many times I flirt with disaster in the course of the hour.”

Carnegie Mellon University panel talks importance of ethical AI, evolving views on computer science

David Danks, head of the Department of Philosophy at Carnegie Mellon University, said if artificial intelligence is used in the United States to prevent crime or arrest more people, the AI would most likely target communities of color, given the country’s history of systemic racism — a history that has driven protests in recent weeks.

“Unfortunately, I think, hopefully we all now recognize there’s a very long history of law enforcement and governments using technologies for surveillance and control, rather than purely for safety and support,” Danks said.

Though data and codes are essentially ones and zeros on a screen, Danks said that technology embodies the values that the creator prioritizes. 

“If you have a hammer, it embodies the value that everything is a nail. And, in practice, most of these AI systems aim to optimize. They tend to be high performing at some task,” Danks said. “Somebody is evaluating this task, whether it’s predicting where a crime will occur or predicting where an arrest might occur.”

Danks is the L.L. Thurstone Professor of Philosophy & Psychology at Carnegie Mellon University. He talked with Jennifer Keating and Illah Nourbakhsh at 10:45 a.m. EDT Wednesday, July 22, on the CHQ Assembly Video Platform as the third part of Week Four’s theme of “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.” Keating is a Senior Lecturer and Writing in the Disciplines Specialist in the Department of English at the University of Pittsburgh, and Nourbakhsh is the K&L Gates Professor of Ethics and Computational Technologies at CMU and director of the Community Robotics, Education and Technology Empowerment (CREATE) Lab.

Nourbakhsh started the lecture by delving into three words that are needed to understand the way that AI influences humanity: agency, equity and power. Agency refers to how much people rely on AI to make decisions for them — an example of this is a Boeing 737 Max, which can make decisions as well as, or better than, pilots around 99% of the time.

“The problem is what happens in that .001% of the time when the AI system flying the airplane does a worse job than the pilot and the pilot doesn’t know what’s going on because they’ve lacked agency,” Nourbakhsh said. “It’s been taken away from them.”

Equity refers to bias within AI, such as when credit approval is done by an algorithm that learns from decisions by humans in the past — but those decisions had racial bias, so the AI does, too. Power refers to the large amount of power AI brings to companies, such as advertisers who can use AI that can do a much better job than any single person. 

“It changes the dynamics of democracy, when our ability is based on information that’s highly mediated, and therefore not balanced. And that’s the fundamental question of power relationships,” Nourbakhsh said. “AI can make the world better, but in doing so we can actually exacerbate inequities that can exacerbate unequal power relationships.”

Keating then talked about power and political strife, and how information gathering affects the way society thinks. She cited Anna Burns’ award-winning novel Milkman, which showed how powerless people can feel when their government seems omnipotent. She said that predictive policing measures can squash forms of resistance against the government. Keating questioned what the government in South Africa during apartheid would have done with modern technology to suppress the movement to end racial segregation.

“What if today’s technological advancements had been applied to such a movement?” Keating said. “What level of scale could have pushed down this already vulnerable population to quell any form of resistance or rebellion?”

Nourbakhsh said that in the past, computer science was mostly a theoretical field and was treated like math: “We never thought in the design of the curriculum that we were having influence on society. So we didn’t bother with the question.”

His students at Harvard were excited about Google’s former motto, “Don’t be evil,” and the prospect of becoming millionaires, but a transition happened as the company and others in the technology industry grew.

“What I started seeing is graduates of our programs in the early 2000s actually emailing me back and saying, ‘Could you come and do a lecture to my group at Google on ethics, because we’re getting into some spaces that I’m uncomfortable with?’” Nourbakhsh said.

He said that his current students are not interested in becoming millionaires or organizing the world’s information.

“What’s going on is a fundamental recognition, especially by millennials,” Nourbakhsh said, “that the code they write changes and influences society; it changes our relationships in society.”

He said Barbara Grosz, Higgins Professor of Natural Sciences at Harvard University, devotes a week to having her students debate issues around computer science and AI, such as the pros and cons of facial recognition.

Danks said that it is important to educate the next generation about the challenges of AI, but that the industry and the government need to improve now. To achieve this, he said organizations need to recognize that ethical AI and technology are not products, but processes.

“You can’t just build cool tech, and then slap some ethics on at the end. That isn’t how it works,” Danks said. “It’s never going to achieve the goals that you really have in mind.”

He said that some companies, like Facebook, have had to “rebuild while your ship is out to sea.” This rebuilding means they had to change and question if the technology they create embodies their own values.

“When I’ve worked with companies,” Danks said, “oftentimes they come to me and say, ‘We need ethical help.’ And my response is to say, ‘No, you know what you should do. You just have to decide that you’re willing to do it.’”

Danks said this does not just apply to corporations; each individual must try to engage with technology that advances their own values and interests. He said several years ago he realized that social media platforms were not supporting his own personal interests.

“They weren’t empowering me, but rather were empowering those companies, and so I made a deliberate decision to simply remove myself,” Danks said. He also said that the decision of which technology to use is different for everybody, and that many people use social media effectively. “But it’s important for us to reflect on what it is that we want from the world and engage with technology in that way, so that we can all have more ethical and societally responsible technologies.”

The lecture then shifted to a Q-and-A session with Vice President and Emily and Richard Smucker Chair for Education Matt Ewalt. His first question was how teachers can create a learning environment that is respectful to students who start out wanting meaningful change in society, while also challenging assumptions and biases. 

Nourbakhsh said it is important to understand the mindsets of younger generations. Many millennials and those in Gen Z are disappointed in previous generations, whom they feel have failed in terms of economic equality, climate change and other issues. 

“That’s what used to drive them, not just a hopefulness to make the world better, but a fear that the world is basically political hell in a handbasket,” Nourbakhsh said. “And it’s going down and they need to solve it and resolve it, so that it doesn’t go downhill that badly.”

Younger generations are effective at creating social movements that are leaderless, he said, such as Black Lives Matter, “without a normalized sensibility that says, ‘Here’s the five steps we want to sell the older generations on.’”

Danks said that many students have a hunger to bridge the gap between people who disagree with them, especially older generations, and are frustrated that there are no places to do so. When these conversations do occur, he said students often feel that they are not talking through their differences and issues, rather they are “typically just being talked at rather than engaged with.”

Ewalt’s last question was what takeaway the panelists would like the audience to leave with.

Keating went back to an important point in the beginning of the lecture, that people need to be aware of their relationship with technology.

“(We need to be) conscious of those relationships with these tools, these machines, but also highly sensitive to the ways in which that might be affecting your interactions with other individuals, or collectives,” Keating said.

School of Music Instrumental students to feature fantasy flair in second recital

ILLUSTRATION BY MADELINE DEABLER/DIGITAL EDITOR

Fantasies are fanciful, seemingly untethered in their rhythm, form and inspiration. The musical term, more commonly known as “fantasia,” has consistently signified freedom, an absence of structure. Of all the aforementioned f-words — fanciful, free, form — one School of Music student said the foremost is “flair.”

“For fantasies, you don’t have to play like the composers and musicians who came before you,” said Joseph Brozek, a trumpeter and student from Northwestern University. “It’s not about emulating someone else’s sound, it’s about crafting your own.”

Brozek will bring his flair to the second School of Music instrumental recital of the season at 7 p.m. EDT Thursday, July 23, on the CHQ Assembly Virtual Porch. Joining him are cellist Katsuaki Arakawa, flautists Chi Ting and Lauren Scanio, and violinist Jason Hurlbut.

Brozek chose Jean-Baptiste Arban’s “Fantaisie Brillante” for the performance. Arban was a 19th-century French composer credited with proving the trumpet could be a solo instrument by developing its virtuoso technique.

“Fantaisie Brillante” was rearranged for a soloist by John Philip Sousa in 1885. 

“The history of this selection is so important to know, because this piece helped put the trumpet’s sound on the map as a solo instrument,” Brozek said. “The composition showed all of the trumpet’s strengths and I think people began to develop a new appreciation for it after that.” 

The variations span a breadth of musical styles. The piece is written in theme-and-variations form, meaning it opens with an introduction, then presents a melody, or theme, followed by variations of that melody. The variations in “Fantaisie Brillante” employ triplet groupings, quadruple 16th-note groupings and, finally, triple tonguing, a technique used when a wind musician must play a fast passage in groups of three, Brozek said.

“It features many different techniques across the instrument and many different emotional shifts, too,” Brozek said. “From both a musical and a technical perspective, this piece gives a vast array of a lot of different things an audience can listen for.”

But that’s all “trumpet talk,” he said. At the end of the day, Brozek said what’s most important is ensuring each variation has “flair and personality.”

“That’s really my goal — to get my unique touch across,” he said. 

Although he has had his eye on the piece for “quite some time,” Brozek said he learned it in its entirety online. 

“Learning this piece and diving into it deeply was actually very effective over Zoom,” he said. “I was very fearful of the limits the online transition could cause for us, but frankly, it’s all working itself out.” 

Lauren Scanio, flautist and student from The Juilliard School, will perform Georges Hue’s 1913 “Fantasie.” Hue’s “Fantasie” was set as a competition piece for the end-of-the-year exams at the Paris Conservatoire and displays the virtuosity of the modern Boehm flute. It also served as an homage to Claude-Paul Taffanel, French flautist and founder of the French Flute School.

“This one is a favorite, so much so that flautists like myself are playing it 100 years later,” Scanio said. “That says something.” 

As a classic “French Romantic piece,” Scanio said the piece includes long lyrical lines and technical passages that require a masterful implementation of dynamics and tone. 

“You can tell Hue really knew the flute when he wrote it, because it flatters the instrument well — both in the virtuosic opening and ending, but also in the middle there are some beautiful long phrases that are aria-like,” she said. “It’s golden, really.”

Though a classic, Scanio said there is a lot of room for variance in different “colors, dynamics and vibrato.”

“There are a lot of opportunities for variance in a fantasy like this; no two people play it exactly alike,” she said. “The challenge would be to come up with a way to play it that no one has heard before. I would like to think I’ve done that.”

Faith-based ethics: Southern Baptist Convention’s Jason Thacker weighs in on how technology can be used for both good and evil

Whether people regard artificial intelligence as a tool that can turn into either a future friend or a threat, Jason Thacker said the technology is already here.

“The reality is that AI is everywhere in our society, and if you don’t believe me I dare you to say something like, ‘Hey Siri,’ or, ‘Hey Alexa,’ because ultimately something around you will likely light up,” Thacker said. “Whether it’s wearable tech like a watch or smartphones, something around you is connected to the cloud and connected to these AI systems.”

Thacker delivered his lecture, “The Age of AI: Artificial Intelligence and the Future of Humanity,” at 2 p.m. EDT on Tuesday, July 21, on the CHQ Assembly Video Platform. The lecture, which shares its title with Thacker’s book, was part of the Week Four Interfaith Lecture Series theme, “Ethics in a Technologically Transforming World?”

Maureen Rovegno, Chautauqua’s director of religion, led the subsequent Q-and-A while the audience submitted questions through the www.questions.chq.org portal and through Twitter with #CHQ2020.

“Your faith-based ethics give you that joyful kind of optimism that counteracts the pessimism that many people have started to feel — that technology is taking over areas and directions that have shadow sides to them,” Rovegno said.

Thacker is Chair of Research in Technology Ethics at The Ethics and Religious Liberty Commission of the Southern Baptist Convention. He said his work with the ERLC revolves around the message in the Book of Matthew, specifically Matthew 22.

“The ethical system in Matthew 22 is more robust than any challenges we will face and any innovations that will come,” Thacker said.

He holds a Master of Divinity from The Southern Baptist Theological Seminary and is now pursuing a Ph.D. in Ethics and Public Theology at the seminary. In the meantime, he writes articles on technology ethics for the ERLC site and has also written for Christianity Today, The Gospel Coalition and other online sites. His work with the ERLC’s document, “Artificial Intelligence: An Evangelical Statement of Principles,” was featured on Slate.

Thacker said that new technology is not actually bringing up new questions.

“Questions that are posed by today’s technologies like artificial intelligence actually aren’t new at all,” Thacker said. “This is because AI doesn’t really cause us to ask new questions of humanity, per se, but to ask age-old questions of new opportunities. It’s the same old vices and sins and proclivities we’ve always dealt with in humanity, but with new opportunities before us.”

Thacker said that artificial intelligence poses two questions. What does it mean to be human? And what is the role of technology in our lives?

What being human means, for Thacker, is based on the fact that in Christianity, God created humans in his likeness and image. Others, however, have claimed that religions are no longer necessary to guide ethics because of science’s progress.

“Many secular folks will caricature faith as believing in something without fact or knowledge, but fail to see that science itself has some faith to it,” Thacker said. “Because we’re not only able to explain what is seen, we have to identify the design behind it.”

Thacker cited Professor John Lennox of Oxford, who said that past philosophers and scientists — including Galileo and Isaac Newton — believed that God created the laws of nature, which drove their scientific inquiries and major breakthroughs.

As an ethicist, Thacker thinks about his work in a similar way.

“I simply cannot buy into this role of autonomous thinking untethered from any natural law framework of the world with a creator god at the center,” Thacker said. “I simply find it unsustainable and unconvincing ethically, as well as unsustainable with the pursuit of truth and the way this world works and what it means to be human.”

Thacker looks to the Book of Genesis to answer the old questions. In the story of creation, God created everything and made humans separate from the rest of the world, including animals, in his image to take dominion over all things and be stewards of the world. Thacker believes technology allows for humans to fulfill this role.

“This creativity and these abilities to make things to aid us in our role as image-bearers is the core of what technology is,” Thacker said. “ … But these tools and technologies are made by fallible and sinful human hands, and they are quick to show the brokenness of this world.”

Thacker said that tools can be used to lord power over others and dehumanize them, and therefore dishonor God. It reflects Cain’s sin in the Bible, when he used his strength, given by God in order to work, to kill his brother Abel instead.

Along with the promise of what technology can do, it can also open up new ways to hurt others.

“It expands what is possible for us to do, and ultimately we are the ones who are responsible for it, not the tools themselves,” Thacker said.

Pulling morality from the Bible allows Christians to remain unfettered by changing social mores, Thacker said. Christians instead seek to love God and their neighbor as fellow image-bearers.

Thacker said that technologies that follow a natural order, aligning with protecting God’s creation, fulfill proper uses.

“There are so many God-honoring benefits in these technologies,” Thacker said.

Thacker said technology, including technology used in warfare, has uses in protecting people as fellow image-bearers, regardless of differences.

“And so that’s why I’ll stand up for Uighur Muslims in China, just like I’ll stand up for the person next door or the elderly lady down the street or the baby in the womb,” Thacker said. “It’s because of this concept of human dignity in the image of God.”

As a Christian, Thacker is not afraid of any negative outcome from humans’ misuse of technology because of the message in the Book of Revelation, the final book in the New Testament, which describes Jesus returning to Earth to save believers once again.

“I don’t fear killer robots, or massive job loss, or catastrophic downturn, because I know that my God is reigning and ruling and holding the entire universe together in his hand,” Thacker said. “There is nothing that will catch my God off-guard, and nothing that will stop his plans for this world.”

School of Music voice students to grapple with love and loss in second recital’s art songs

ILLUSTRATION BY MADELINE DEABLER/DIGITAL EDITOR

Strung to the melodies of the world’s most well-known art songs are stories of war and peace; love and loss. In this week’s Voice Program recital, the poetry behind a French song will transport audiences to the battlefield, an English piece to face the regret of time one can’t get back, and an Italian song on a journey of following love no matter the cost.

“You have to sit with it — the meanings between the lines,” said Gabriela Linares, a School of Music student. “Sometimes the poems are so complex they mean different things to different people. Finding what’s personal to you takes a song from good to great.”

Eight students from the Chautauqua School of Music Voice Program will perform in their second recital of the season at 7 p.m. EDT on Wednesday, July 22, on the CHQ Assembly Virtual Porch. 

Cesar Parreño, a student from the Juilliard School, will perform two French pieces, starting with Olivier Messiaen’s 1936 “Paysage,” which translates to “landscape.”

“It’s about the beauty of a landscape you used to go to with your loved one,” Parreño said. “Wherever she may have gone to, you wish you could be there with her, so you spend your time remembering the beauty of the earthly places you have been together.”

However, Parreño said the piece is not sad in tone, but rather laced with somber notes and hints of hope. 

“The music is sweet and tender and a little bit joyful,” he said. “There is a sense that you know you’ll be with your loved one again, someday, in time.”

Next is Francis Poulenc’s “C,” the first piece from his 1943 “Deux Poemes de Louis Aragon.” Aragon, a French poet, is considered to be a founder of the surrealist movement. Written during World War II, “C” mixes sentiment, nostalgia, and passion into a heartrending statement.

“It’s a very abstract poem,” he said. “It has a very somber, ghost-like sound to it. The ambience is impossible to explain, you have to listen to feel the depth.”

Matthew Goodheart, a student from Cincinnati College Conservatory of Music, will perform three pieces. The first two are in Italian, including Francesco Paolo Tosti’s “Ideale,” a “must have” in any tenor’s repertoire. “Ideale” is the most emotional of his three songs and details the emptiness a person feels when love is gone, according to Goodheart.

“It is about someone that you love, how close you were and how you followed that person for as long as you can remember and now they are gone,” he said. “You learn they were the ideal person for you, but you can no longer have them.”

The somber tone of the song is coupled with words closely associated with happiness, including “rainbow,” “peace,” and “radiance.” Goodheart said despite the word choice, the song is “quite sad” throughout.  

“There is really no redeeming happy quality to it,” Goodheart said. “It’s the hardest of the songs that I sing, too, because you have to find the fine line between using your emotion and singing it well on a technical level.”

Next is Franz Schubert’s “Halt!” from “Die schöne Müllerin,” a song cycle based on poems by German poet Wilhelm Müller. 

“It’s very high energy the whole time and extremely fast paced, but I love it because it sits in a place in my voice that I feel really comfortable singing,” he said.

Next is Michael Head’s “Sweet Chance that Led my Steps Abroad,” based on a poem by William Henry Davies, “A Great Time,” first published in 1914.

“It’s about thinking about things that have happened in your life that are so great, but maybe in the moment you didn’t realize just how great they are,” he said. “Looking back, though, you realize that moment will never happen again, making an ordinary moment feel special.”

Linares, a student from Oberlin Conservatory, will perform “When I am Laid in Earth,” an aria from the opera Dido and Aeneas by English composer Henry Purcell. In the opera, the characters grapple with themes of death and leaving love behind. This particular aria comes at the end of the opera when one character flings herself toward death after being abandoned by her true love. 

She will also perform Antonin Dvořák’s “Zigeunerlieder,” Op. 55, set to a collection of poems, Gypsy Melodies, by Czech poet Adolf Heyduk. The poems are grounded in Slovak and Czech folk verse. Linares said there is an emphasis throughout the piece on freedom as something to be valued above all else — an emphasis she feels connected to within her “uncertainty of identity.”

“Dvořák was comparing his people to gypsies and how they were trying to make the best of their situation, yet could never grasp not being able to have their own land,” she said. “That really connected with me because I am from Puerto Rico and my people feel alienated all the time as we go back and forth between feelings of belonging in the U.S.”

In passionate themes of war, peace, love and loss, Linares said connections can be made “spanning centuries.” According to her, those connections are where the “magic happens.”

“When you find what you connect with, you can convince the audience you are telling your own story, not someone else’s through a song that doesn’t belong to you,” Linares said. “That’s when they believe in it the way you believe in it.”

Avett Brothers bassist Bob Crawford joins Bishop Gene Robinson for a conversation on ‘Faith on Stage’

The Avett Brothers’ 2012 song “Live and Die” begins with the lyrics, “All it’ll take is just one moment and / you can say goodbye to how we had it planned.”

The band’s upright bass player Bob Crawford is credited as a songwriter on “Live and Die,” off the Avett Brothers’ album The Carpenter, and Gene Robinson has a feeling he knows the exact moment that changed Crawford’s plans forever.

“His faith journey took on a real serious note,” said Robinson, Chautauqua Institution’s Vice President for Religion, of the moment Crawford’s daughter, Hallie, experienced her first seizure. Doctors subsequently found a tumor in the 2-year-old’s brain.

“His whole life changed at that moment,” Robinson said. “(I’m curious about) how it changed his perception of God, his relationship with God, and what it’s like to be in the public eye and go through something like that when many people in the world are watching and listening.”

Robinson and Crawford will interview each other in a live Skype conversation at 2 p.m. EDT Wednesday, July 22, on the CHQ Assembly Video Platform. It’s a departure from the Week Four Interfaith Lecture Series theme, but the program, titled “Faith on Stage,” has been long in the works. With the thrice Grammy-nominated band originally set to perform July 22, 2020, on the Amphitheater stage, Robinson and Vice President of Performing and Visual Arts Deborah Sunya Moore worked together to bring band members to the 2 p.m. platform as well, to be interviewed about their faith journeys for Crawford’s podcast, “The Road to Now.” With COVID-19 postponing the Avetts’ third Amp performance to Aug. 4, 2021, Robinson and Crawford moved forward with the 2020 program they still could hold.

“We’re going into it with a wonderful spirit,” Robinson said. “I’ll be interviewing him, he’ll be interviewing me, and even our own audiences will learn something new about each of us.”

“The Road to Now” podcast, co-hosted by Ben Sawyer, brings together “historians, politicians, journalists and artists to the table for conversations that illuminate the map that brought us to where we are today.” An offshoot called “RTN Theology” focuses on faith, art, religion, and theology. But one word sticks out for Robinson in the podcast’s name, a word he thinks relates back to those opening lines of “Live and Die” — “now.”

“The name of the podcast speaks to this, that the only moment you can be sure of is now,” Robinson said. “(Those lyrics are) what happened to him with that experience. You can say that about so many things, that change everything. It struck me as a life lesson that we have these plans and we pretend that they’re all going to happen until they don’t.”

Crawford took time off from the Avett Brothers’ 2011 tour to be with his family while his daughter underwent chemotherapy. Robinson will ask for an update on Hallie in their conversation; in 2017, the Sun Sentinel reported she had been in remission since 2013.

“It’s my impression that we come closer to God and our relationship with God deepens almost completely when we are at our wits’ end,” Robinson said. “Being in an extreme situation, we are more able to apprehend God. I’m looking forward to asking Bob his perspective on that.”

Crawford has spoken with several media outlets about how his family’s experiences have deepened his faith and shaped his work. The band’s 2016 album True Sadness is one indication, written as band members were dealing with death, divorce, and illness.

“My daughter lost the right side of her brain. She’s severely disabled. It’s never going to be OK. My wife and I are constantly working through it,” Crawford told the Sun Sentinel. “But she’s such a joy to be around, you know? So I think that true sadness is where we walk through life, feeling the sweetness of joy. We experience that while also suffering a little bit, feeling the pain and fragility of life.”

To put that pain and fragility into a song, Robinson said, while living in the public eye, is admirable.

“My own experience is that there’s the public Gene and the private Gene, and yet my public face is a face related to religion,” said Robinson, the first openly gay bishop in the history of Christendom — his appointment to lead the Episcopal Diocese of New Hampshire was met with controversy and death threats, ultimately resulting in his being put under FBI protection in the days leading up to his consecration in 2003. “I think for any public person, you just have to preserve (yourself), and not lose track of who you actually are. And one of the things I admire about Bob in particular, and the Avett Brothers in general, is that they take that really personal stuff and put it into their music. It’s one thing to have someone in Nashville write a great song and then you record it; it’s another thing to not only write your own, but to have it come out of your own lives.”

In an interview with the Sioux City Journal, Crawford said he was “comfortable where life is, for all the tragedy and upheaval.”

“This life we live, I don’t know how you can handle it without God,” he said. “We’re all kind of over our heads.”

This program is made possible by “The Lincoln Ethics Series,” funded by the David and Joan Lincoln Family Fund for Applied Ethics & the Carnahan-Jackson Religious Lectureship. 

Chautauqua Theater Company takes to the virtual stage for workshop of Heather Raffo’s ‘Tomorrow Will Be Sunday (working title)’

In a time when virtual connection is more important than ever, a play originally written for the stage has been transformed into an online collage of content, whether that be monologues shot in an apartment, scenes filmed with an iPhone on a suburban street or a piece of virtual art.

Tomorrow Will Be Sunday (working title), a piece by award-winning playwright and performer Heather Raffo, explores the idea of invisible connections in a suspenseful thriller that follows people on the move around the world and the strings that tie them together.

Chautauqua Theater Company will introduce its second New Play Workshop project, Tomorrow Will Be Sunday (working title), at 8:15 p.m. EDT Wednesday, July 22, on the CHQ Assembly Virtual Porch.

The play was developed through a McKnight Residency at The Playwrights’ Center in Minneapolis and is currently still in the workshop phase, with a premiere date yet to be set. Its time at Chautauqua and the NPW program is funded by the Roe Green Foundation.

Director Jenny Koons feels especially connected to the content of the play, and has enjoyed her collaboration with Raffo. 

“Heather talked a lot about the invisible web and network that connect people around the world, both economically and interpersonally, and the ways that people move,” Koons said. “It really resonated, because for the past few years I have worked on projects that have done a lot of research into the refugee crisis, and this moment of time that we are in where mass numbers of humans are moving around the world for different reasons.”

Originally intended to follow a traditional format, Tomorrow Will Be Sunday (working title) has been transformed by both Koons and Raffo in response to the transition onto CTC’s virtual platform. The play will be a series of scenes, monologues and images that do not seem connected at first but begin to fit together as the narrative expands.

“Once you start to step back, you start to see connections,” Koons said. “There’s something about it that feels really kaleidoscopic and is trying to embrace the sprawl of globalization and the myriad of ways that we are connected to each other.”

Each of the 34 CTC Conservatory members collaborating on the project brings something unique to the play, which Koons believes has helped shape the production into the format she called a collage. 

“It is less a single narrative and more attempting to zoom way out, to see a lot of people instead of just a few,” she said. 

Though Koons has enjoyed the process of directing a play virtually, she has found it difficult to recreate the element of closeness that would be present in a traditional NPW.

“The thing that is challenging for all of us in this moment is how to build community and intentional gathering spaces in this time that feels so deeply isolating,” Koons said. “In olden times, we would have been in a room of 40 people, which has an energy and a spirit and a breath to it which is palpable, and establishes a level of trust and play and playfulness that is harder to conjure in a virtual space.”

However, the shared struggle of reaching out across the internet has inspired much of the production and workshop process, especially when it comes to Raffo’s analysis of global connection. 

“In the last four months, so many of the ideas about invisible ties, networks or connections are suddenly so glaringly visible in how we are all, as a planet, encountering this moment in time,” Koons said. 

Every aspect of the play has been shaped by these challenges, said Koons, who believes this is an entirely unique production that could not be recreated in different circumstances. 

“Regardless of what it is that we share, it will be such a capsule of this moment of time,” Koons said. “In a way, the play could only happen like this in this moment.”

It’s never too early: Young Readers program aims to engage Chautauqua’s youth with weekly themes

Exploring the world and collecting new and exciting experiences is one of the best parts of being a child or young adult, but with the world on lockdown, many young people find themselves confined to a single space, unable to venture out into the great unknown — at least physically.

This is where reading comes in. 

At a time when so many traditional avenues of discovery are stifled, books are a way to hold an entire universe in one hand, reading a way to live a hundred lives from the comfort (or confinement) of home. This is something that the Chautauqua Literary and Scientific Circle understands, and the CLSC Young Readers program is designed to open the doors of literary exploration for Chautauqua’s youth. 

CLSC Young Readers creates a forum for discussion and discovery through a program — this year exclusively online — that aims to explore the Chautauqua weekly themes through reading. Each Friday, the “YR Further Reading list,” which previews the selected book for the coming week, is posted on the CHQ Clubhouse social media accounts. On Tuesday, the YR Discussion prompts are posted, followed by the Creative Activity on Wednesday. 

The books are chosen by Manager of Community Education Karen Schiavone, each one corresponding to the weekly theme. 

Week One’s theme was “Climate Change: Prioritizing Our Global and Local Response,” and the selected book was Cast Away: Poems for Our Time, by Naomi Shihab Nye, a collection of poems that discuss the planet and the connections each person has to it. Week Two was themed “Forces Unseen: What Shapes Our Daily Lives,” and the featured book was Scary Stories for Young Foxes, by Christian McKay Heidicker, a collection of thrilling middle-grade adventure stories. 

Week Three, themed “Art and Democracy,” was accompanied by the selection of the book Little Women by Louisa May Alcott, a literary classic following four young women experiencing life in New England during the Civil War. Currently, Week Four is featuring Cog by Greg Van Eekhout. The story, about robots and a boy built to learn, goes hand-in-hand with the week’s theme: “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.” 

“It is a goal of ours to select books that kids may not have yet been exposed to through school or home, with the idea that they’ll be able to discuss them with their families, especially since their adult family members may be reading CLSC selections that also relate to the Chautauqua weekly themes,” Schiavone said. 

In most cases, the books chosen are geared toward children ages 9 to 14, though Schiavone said there are often children participating who are older or younger. However, “sometimes there is a book that is so compelling that we feel must be on the list that may fall outside that typical age range,” Schiavone said.

The CLSC Young Readers program was moved online along with the rest of Chautauqua’s programming for the 2020 season, which has challenged the traditional format of the program, but has also opened the program to a wider audience — something that Schiavone believes is important, especially in the current global and social climate. 

“In any year, reading is so important for so many reasons, particularly during the summer when learning loss is a risk,” Schiavone said. “This year, I think it’s even more important for youth to engage this way, as reading is an opportunity to get away from the screen and also to have a chance to talk to family and other youth online about current events as they relate to our book selections.”

Schiavone looks forward to the discovery fostered by reading, and is glad to be able to provide Chautauqua’s youth with the chance to delve deeper into the literary side of Chautauqua. 

“We’ve always said that Chautauqua is a community of readers, and it’s never too early to engage kids as well,” Schiavone said.

Dr. Gerard Magill presents technology’s current and future moral conundrums in lecture

The fall of Adam and Eve by “playing God” could mirror how human civilization eventually falls, Dr. Gerard Magill said in his lecture at 2 p.m. EDT Monday, July 20. 

Magill connected the Biblical story of the fall of Adam and Eve from the Garden of Eden to the possible ends of the world in his lecture “Technology, Ethics, & Imagination” on the CHQ Assembly Video Platform. His presentation was the first in the Week Four theme for the Interfaith Lecture Series, “Ethics in a Technologically Transforming World?”

Magill brings his expertise to multiple roles in ethics at Duquesne University in Pittsburgh. Since 2007, he has held the Vernon F. Gallagher Chair for the Integration of Science, Theology, Philosophy, and Law at the university, where he is a tenured professor in the Center for Healthcare Ethics. Magill is also a board member for Duquesne’s Carl G. Grefenstette Center for Ethics in Science, Technology, and Law. He is credited as an author, co-author or editor on 10 books on medical ethics, often with an added layer of religious morality.

Perceiving any future outcome requires imagination, Magill said in his lecture, because the “image” only exists in theory until it takes place in reality. He used the situation of considering someone for marriage as an example.

“We get this image: ‘I can spend my life with this person. This is mesmerizing me. And I get the data and the rationality and the reasonableness (to support this),’” Magill said. “So you come to the conclusion, ‘Yes, we should marry. I’m sure about this.’”

Magill said a person imagines a future with someone just as people imagine God. But when it comes to technology, people can set logic and reasoning traps for themselves.

“What we assume to be normal is the result of major breakthroughs that past civilizations would have been mesmerized by,” Magill said.

The existence of technology that borders on playing God, Magill said, does not always mean it should be used. But people can also use new technology to protect life, God’s creation.

“Save the life that can be saved,” Magill said multiple times throughout his lecture.

He used “maternal and fetal” conflicts as an example of a difficult ethical situation. In a hypothetical situation where a doctor discovers that a womb is cancerous and would kill the mother before the baby could be born, there is a difference between foresight and intent.

“(The mother and the doctor) foresee with assuredness that the baby will die, but it cannot survive out of the womb,” Magill said. “But that knowledge has got nothing to do with intent. Because the mum knows in advance that the baby will die does not mean the mum wants the death. It’s a very important ethical distinction.”

Magill used another example of a woman who was against abortion but had pulmonary hypertension — a condition that is more common in women and can be fatal. Faced with life-threatening complications, she agreed to let the doctor evacuate her womb at 11 weeks.

“They could have tried to save both, but it’s almost certain that both would have died,” Magill said.

The debates surrounding frozen embryos, embryo adoption, in-vitro fertilization and surrogate motherhood provide another ethical challenge. Frozen embryos become a question of rescue ethics because they carry the potential to create children who could be adopted by a third party.

Magill then switched gears to describe the conundrums in end-of-life situations created by technology. He began with the case of Terri Schiavo, who fell into a permanent vegetative state after cardiac arrest. For 15 years, her parents and her husband Michael fought in court over her parents’ wishes to keep her alive indefinitely with a feeding tube, while her husband said that she would not have wished to stay alive in that way. Eventually the court ruled to allow her to be removed from life support.

Magill said there are thousands of patients like Schiavo who can be kept alive for years. He cited the Christian prayer for a peaceful death, to die quietly in one’s sleep, which stands in conflict with modern technology’s ability to extend life past this point. But technology can also allow the outcome of a peaceful death through managing disease until the end.

“It’s simply about removing technology at the correct point to let the body slip away peacefully, without suffering, without pain and with the dignity of the family being there,” Magill said.

Magill then switched back to technology that can control the beginning of life: human genomics.

About 20 years ago, hundreds of millions of dollars funded the sequencing of the first human genome. Today, going to a facility to sequence your own genome takes five hours and $1,000. Magill said the process will take less and less over time, and could eventually cost only $100.

Currently, hospitals track 27 traits in newborn babies to ensure proper healthcare, though it’s possible to map a baby’s entire genome. Magill said that Americans are more resistant to full sequencing, while Europeans are already doing it because of what he called a “solidarity mindset.” By collecting the full genomes of a mass population, doctors can detect early-onset conditions and disease likelihood, while also locating treatment solutions within the same genome.

Making health decisions for minors, who are not legally allowed to make their own decisions until they turn 18, can also become thorny. Magill used the example of a young teen who wanted to stop her cancer treatments but lost her case in court against her parents. When she turned 18, she decided to continue the treatments after all.

But the case of the Nash family brings up the ethics of making decisions on behalf of an embryo. Their child, Molly, was projected to die early from Fanconi anemia. To save her, they chose to implant via IVF an embryo without Fanconi anemia traits, specifically to birth a second child, Adam. The umbilical cord blood from the pregnancy was donated to Molly, and Adam would later donate stem cells to her.

Magill said this could easily become a difficult situation. The younger sibling had to donate blood over time and later donate spinal fluid, which is painful.

“What if he said, ‘I can’t do this for my sister anymore’?” Magill said.

Magill continued with CRISPR, a gene-editing tool that works like a computer’s find-and-replace shortcut: it first finds the part of the DNA it is looking for in every cell of the body, then replaces it with a desired trait. A doctor in China, against the wishes of his own country, the United States and the United Kingdom, used CRISPR to edit the genes of two young girls to protect them from HIV. The doctor altered other genes in the process and is now in jail for “playing God.”

But the newest platform for ethical discourse, Magill said, is data science. Artificial intelligence uses algorithms to collect huge amounts of data for targeted marketing and other purposes. And AI is already transitioning into machine learning, where computers are able to speak, train and guide each other without humans. Eventually, they will be able to work thousands of times faster than a computer chip.

The use of 5G technology allows for machines to collect data at record speeds and reach conclusions faster. China is moving the fastest in implementing 5G, and its citizens already use devices — instead of currency — to pay for goods and services. China reaps massive amounts of data this way.

Pharmaceutical companies are using data science and genomic science to develop a vaccine for COVID-19. Through genomic science, scientists were able to quickly find a possible trait for a vaccine, while computational models — not doctors — deduced possible solutions. 

But health data, which is normally protected by healthcare organizations, reaches Google, Facebook, Apple and other companies through their partnerships with those organizations. While these huge corporations work for free to help with algorithms, their minimum requirement is access to 100% of the data, with nothing anonymized. Magill said these companies in turn use it for themselves.

“If (machines) begin to do things for us — supposedly ‘for our welfare’ — we are now talking about in the not-so-distant future the concept of transhumanism, human enhancement,” Magill said.

However, he did say this would likely not happen in this generation.

“We are in the portal from the old world, where it was kind of straightforward medicine — ‘I went to see the doctor, got the stethoscope, got the injection, got the surgery’ — through this portal to machine learning that is going to be able to do things and suggest things and be able to treat the somatic and the general level, thereby changing the species and moving it forward,” Magill said.

After 500 million years of history leading up to this technology, Magill said it is now possible to end the human species within 100 years of gene editing going too far.

But Magill shifted the topic again to the matter of the eventual vaccine for COVID-19, which he said was not as vicious as other viruses but still a threat. 

“This is not about me getting sick and dying,” Magill said. “This is about populations getting sick and dying. These can be stark numbers.”

He said the rise in pandemics is directly linked to pollution, which allows viruses and other problems to arise. As early as 2050, rising sea levels will force thousands to move out of Eurasia, sending an even larger number of people into Europe than during its recent struggle to support Syrian refugees fleeing war.

“It’s not just genomics that can kill us, it’s not just algorithms that can threaten us,” Magill said. “It’s also just the air we breathe, the planet we live in. We can pollute it to the point of destroying it.”

Considering all the ways humankind could die, Magill said humankind could fall the same way Adam and Eve did.

“If we killed ourselves in these ways, the planet would recover,” Magill said. “… It’s just that it would survive without humanity. And maybe that’s the story of the fall. Not of the past, but of the future.”

Rana el Kaliouby, CEO of Affectiva, discusses the need for ethical practices in creating artificial intelligence

When she was studying for her Ph.D. at Cambridge, Rana el Kaliouby, CEO of Affectiva, realized she was spending more time with technology than with any human being. 

“I realized that this machine was emotion blind. It had absolutely no idea how I was feeling,” el Kaliouby said. “It took actions or decisions that were not at all congruent with my emotional and mental state, but perhaps even worse, it was the main mode of communication I had with my family back home.”

El Kaliouby began to think of the possibility of devices understanding emotions the same way humans do. With artificial intelligence becoming more mainstream, such as in self-driving cars and in assisting in health care, she said the emphasis with AI is on efficiency, “and there is no consideration for the human elements.”

“I really want to kind of bring that balance. I want to marry the IQ and the EQ in our devices. … This has the power not only to reimagine our relationship with technology and human-machine interfaces,” el Kaliouby said, “but, more importantly, reconnect humans in more powerful ways and bring empathy into the equation in terms of human-to-human connection and communication.”

El Kaliouby is the CEO of Affectiva, an emotion recognition software analysis company, and author of the memoir Girl Decoded: A Scientist’s Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology. She presented her lecture “Humanizing Technology with AI” at 10:45 a.m. EDT Tuesday, July 21, on the CHQ Assembly Video Platform as the second part of Week Four’s theme of “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.” El Kaliouby discussed how artificial intelligence can be used to improve human communication and make the world safer, and how these aspirations can only be attained through ethical practices and collecting diverse information.

El Kaliouby grew up in the Middle East, and studied computer science at The American University in Cairo.

“At the time, I got so fascinated by the role technology plays in connecting people and how it changes the way we connect and communicate, and that’s been a common thread across my research, and my work over the last 25-plus years,” el Kaliouby said.

Ninety percent of human communication is nonverbal, which she said includes facial expressions, gestures and vocal intonations. El Kaliouby said people have long researched nonverbal communication, from Guillaume-Benjamin-Amand Duchenne and Charles Darwin to, more recently, Paul Ekman, who in the ’70s mapped every facial muscle into code. For example, she said, when people smile, they activate the zygomatic muscle, and when they furrow their brow, they use the corrugator muscle.

Becoming a verified face reader requires hundreds of hours of training, so el Kaliouby and her team use computer vision and machine learning to automate that process. They provide the algorithm, or computer, with hundreds of thousands of examples of people smiling, smirking or frowning, and the AI looks for similarities. Her team started with facial expressions, then added vocal intonations, then activities like eating, drinking and sleeping. 

“The more data you give it, the better it becomes,” el Kaliouby said. “The more diverse data you give it, the better, more robust and more accurate it becomes.”
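What el Kaliouby describes is, at its core, supervised learning: show a model many labeled examples and let it find the patterns that separate one expression from another. Below is a minimal sketch of that idea using scikit-learn, with randomly generated feature vectors standing in for real facial measurements; the feature setup, numbers and model choice are illustrative assumptions, not a description of Affectiva’s actual system.

```python
# Minimal sketch of training a "smile" classifier from labeled examples.
# The features are synthetic stand-ins for facial-landmark measurements
# (e.g., mouth-corner lift, brow position); label 1 = smiling, 0 = not.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_examples = 5000
features = rng.normal(size=(n_examples, 8))
# Invent a labeling rule so the synthetic data has learnable structure.
labels = (features[:, 0] + 0.5 * features[:, 1]
          + rng.normal(scale=0.5, size=n_examples) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# Fit the classifier on the labeled examples and check held-out accuracy.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The same pattern scales to the diversity point in the quote above: the broader the range of faces in the training examples, the less the learned boundary depends on any one group.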

One of the applications of this technology is helping people with autism communicate. El Kaliouby said that these individuals often avoid looking at the face altogether because they find it too overwhelming.

“So they’re completely missing out on that 90% of communication we’re talking about, which impacts their ability to make friends if they’re in school, their ability to keep jobs if they’re adults, so it has a lot of dire consequences,” el Kaliouby said.

After earning her Ph.D., el Kaliouby worked at MIT and created a device, similar to Google Glass, that functioned as a real-time coach and helped children on the autism spectrum understand facial expressions. The device turned the process into a game, giving the children points for looking at faces. The work remains a research project, but el Kaliouby said they found that the children were improving at communication.

El Kaliouby and one of her coworkers, Rosalind Picard, left MIT in 2009 and founded Affectiva. They realized that there were many applications for this technology, such as detecting driver drowsiness, and using facial and vocal biomarkers to detect stress, depression, suicidal intent and Parkinson’s disease. 

“Imagine if we can detect all of that just off of your cell phone. Very transformative, but at the same time, we recognize that this data is very personal,” el Kaliouby said. “There is a lot of potential for abuse.”

She said Affectiva set up core values. First, they would reject any business they felt did not understand what the technology should be used for, even if that meant turning revenue away. Second, any person who chose to give Affectiva data would be compensated. Third, the company would focus on ethical practices and make sure their algorithms are not biased.

“This is really important for me. This is the biggest issue right now in the space of artificial intelligence. It’s not that the robots are going to take over the universe,” el Kaliouby said. “It is that we are just building bias into these systems and then deploying them at scale unintentionally, but with really dire consequences.”

Affectiva has collected 9.5 million facial videos from 90 countries. These videos include people of different genders, ages, ethnicities, and even people wearing face masks and hijabs.

El Kaliouby said a few years after starting Affectiva, one of their principles was tested. The company was running out of money and was two months away from missing payroll when a venture arm of an intelligence agency offered $40 million in funding to research lie detection, surveillance and security.

“I remember going back home one night and just kind of imagining what the world would look like if we took that money and Affectiva pivoted to working on this,” el Kaliouby said. “I just really didn’t feel that that was in line with why we started Affectiva. Our start was in autism and bridging this communication gap between people, between companies.”

In 2020, the company started an international program where they pair young people with Affectiva employees. 

“These kids are asking awesome questions, and really kind of challenging us around what we’re building and how we’re building all this technology. That gives me a lot of hope in the future,” el Kaliouby said.

The lecture then shifted to a Q-and-A session with Chief of Staff and Vice President of Strategic Initiatives Shannon Rozner. The first question was how Affectiva retrained algorithms to analyze people wearing facemasks.

El Kaliouby said that people rely on the lower half of their face for communication, particularly because that is where the mouth is. She said when people genuinely smile out of joy, or a Duchenne smile, they activate muscles around their eyes, causing wrinkles like crow’s feet.

“What we’re seeing is that when you do cover the lower half of your face, you need to exaggerate some of your expressions so that they can manifest in the upper half of the face, but also use things like head gestures and hand gestures that can accentuate some of these nonverbal signals,” el Kaliouby said.

Rozner then asked if el Kaliouby could explain the process of collecting data, from the initial gathering to analysis by the AI.

El Kaliouby said that for training an algorithm to detect a smile, they have two ways of gathering data. One is having people from all over the world watch a video on a laptop, and, with the person’s permission, use the camera to record their reaction. The other way is to have people put dash cameras in their cars and record their daily commute for a few weeks. 

Then the videos go to human annotators, who watch them in slow motion and label the parts that contain smiles. The annotators’ findings serve as the correct answers, or validation data set: if three out of five annotators find that a person is smiling, then the person is most likely smiling. El Kaliouby said the AI’s findings are tested against the annotators’, and the process is repeated for different expressions and emotions, and even for activities like eating or drinking.
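As a rough illustration of the labeling rule described above, here is a small sketch of majority-vote consensus and a comparison against hypothetical model predictions; the votes and predictions are invented for the example and do not come from Affectiva.

```python
# Sketch of the "3 out of 5 annotators" consensus rule and of scoring a
# model against that consensus. All numbers here are made up.
from collections import Counter

def consensus_label(votes, threshold=3):
    """Return the label chosen by at least `threshold` annotators, else None."""
    label, count = Counter(votes).most_common(1)[0]
    return label if count >= threshold else None

# Each row: five annotator votes for one video frame (1 = smiling, 0 = not).
annotator_votes = [
    [1, 1, 1, 0, 1],   # clear consensus: smiling
    [0, 0, 1, 0, 0],   # consensus: not smiling
    [1, 0, 1, 0, 1],   # 3 of 5 say smiling -> counted as smiling
]
ground_truth = [consensus_label(v) for v in annotator_votes]

# Hypothetical model predictions for the same frames, scored against consensus.
model_predictions = [1, 0, 0]
agreement = sum(p == t for p, t in zip(model_predictions, ground_truth)) / len(ground_truth)
print(ground_truth, f"model agreement: {agreement:.2f}")
```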

“The repertoire of things we can train the machine is endless. I find that really exciting,” el Kaliouby said.

The final question was what el Kaliouby’s ideal world in 50 years looked like, and what the average person’s role in creating that world might be.

She said the power of consumers is monumental, and that people need to choose companies that are committed to the ethical development of AI. El Kaliouby said that people being educated about AI will go a long way, and help create not only a more productive and automated future, but a more empathetic and human one, too. 

“I really hope in 50 years, we have rebuilt our technology in a way that gives us a sense of connection,” el Kaliouby said. “Not the illusion of a connection, but the real sense that we are connected across borders and across our differences. I’m excited about that.”

Nick Thompson, editor-in-chief of Wired, discusses idealized origins of the internet and the current contrasting reality

The online world started to take shape in 1990, when computer scientist Tim Berners-Lee invented the World Wide Web, and many people felt the technology would stop the censorship and oppression seen in the Soviet Union and prevent another Cold War. 

“The internet seemed to make life more efficient, allowed people to connect with their friends and their high school exes,” said Nick Thompson, editor-in-chief of Wired. “It made a lot of people a lot of money. The most sophisticated argument about why the internet was a good thing was that it created the possibility for non-zero (or win-win) human interactions.”

As the ways people can communicate become more complex, he said, everyone becomes better off. The internet was seen as the next step of this process, and Moore’s law, which states that the power of computer systems doubles about every year and a half, shows how much progress the technology makes.
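As a back-of-the-envelope illustration of that doubling, the growth factor after t years is 2^(t/1.5); the 3-, 15- and 30-year horizons below are arbitrary examples, not figures from the lecture.

```python
# Compounding of "doubling every 18 months": growth factor = 2 ** (years / 1.5).
for years in (3, 15, 30):
    print(f"after {years:2d} years: ~{2 ** (years / 1.5):,.0f}x the original power")
```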

Thompson is the editor-in-chief of Wired, an editor of newyorker.com, a co-founder of the multi-media publishing company the Atavist, and the author of The Hawk and the Dove: Paul Nitze, George Kennan, and the History of the Cold War. He presented his lecture “The Tech Boom, Backlash, and Boomerang” at 10:45 a.m. EDT Monday, July 20, on the CHQ Assembly Video Platform as the first part of Week Four’s theme of “The Ethics of Tech: Scientific, Corporate and Personal Responsibility.” He discussed the idealized beginning of the internet and how those ideals of promoting democracies have been called into question in recent years, spurring on debates and “reckonings” around issues of privacy, authoritarianism and truth.

Thompson graduated in 1997 from Stanford University and his first job was with CBS, “where I was actually fired within 60 minutes of arriving. The only employee ever to work at CBS for less than an hour, I believe.” He then went to West Africa, where he was kidnapped by drug lords and eventually was released. The third thing Thompson did after college was join an open source software computer company, whose main aim was to combat Microsoft and monopolies. 

Over time, Thompson started to doubt that the internet was a proponent of liberal democracies and increased moral understanding. In the Arab Spring 10 years ago, for example, protesters used Facebook to organize, but the same technology caused rifts in the protests because social media also amplifies the angriest and loudest voices.

“Maybe this is just pushing society, or maybe it is actually amplifying some of our worst tendencies,” Thompson said. “Maybe it is not making places like Egypt better, (maybe) it is actually making authoritarian governments worse.”

Thompson also said the internet was used to subvert democracy in the 2016 presidential election, with trolls from Macedonia creating fake news websites for ad money and with the Russian disinformation campaign.

“This has led to the election of someone whose fundamental philosophy of how the world works is the antithesis of the fundamental philosophy of the people who built the platforms of how the world should work,” Thompson said.

The 2016 election led to many debates and reckonings about technology, including within companies themselves. Thompson said that people who work at Facebook started to question the platform they had built, the unintended effect of spreading misinformation and what they could do about it.

Another question was whether phones, computers and the internet were helping society. 

“The smartest people in the world made these with the best technology and the most money,” Thompson said. “But did they make these devices to enrich our lives or just suck away our attention?”

Thompson cited a study showing that many people regretted using apps like Facebook or Reddit for long periods of time, but enjoyed apps they used for less time, such as Evernote.

He said the third question about the role of technology in democracies was what was real. Thompson said that when people realized how much disinformation and fake accounts existed on Facebook and Twitter, “the internet seemed evermore … a place where you couldn’t really trust who was who.”

“Think about places you’ve been where you can trust, where you can put your suitcase down and you know that it’s not going to be stolen. Where you can buy a ticket to something and know that it will work at the door. Well, those societies work,” Thompson said. “(A society does not work when) we can’t really believe that somebody is who they say they are.”

People also debate artificial intelligence. Thompson said that computers can analyze every chess match in history and create new, creative ways of playing that humans have never considered. But AI’s success is dependent on how much data it has and what’s in the data, which is where problems occur. 

“For example, you train an AI system for criminal justice, you train it on historical sentencing data, and it will be racist,” Thompson said. “You train an AI on how it should rearrange things in your home, and you train it on historical photographs, and it will learn to identify women doing one kind of task, and men doing another kind of task.”

Thompson then discussed authoritarianism and privacy in relation to modern technology. China, which he used as an example, has a huge advantage in AI because the biggest technology companies are state-owned and the government can collect any data it wants because there are not many privacy restrictions. Certain cities in China also have a system called a social credit score, which Thompson said is calculated based on whether a person pays fines and bills on time, as well as scores of their friends and their political allegiances. This score can determine whether a person is allowed to buy certain goods, such as bus tickets. 

Thompson said that all these recent debates and reckonings have led to deeper conversations about making the tech industry better, “one (in which) the technology is enriching our lives, not making them harder. Where we have trust that the outcomes are fair and just. The data we are collecting leads to artificial intelligence that actually makes us live longer, productive or healthy lives.”                                                                

The lecture then shifted to a Q-and-A session with Matt Ewalt, vice president and Emily and Richard Smucker Chair for Education. The first question was if there were any troubling topics or themes that emerged in the last two weeks that relate to Thompson’s talk.

On July 15, many popular verified Twitter accounts were hacked, such as those belonging to Barack Obama, Bill Gates and Jeff Bezos, and tweeted out a Bitcoin scam. He said that the hacker was able to get access through a single control panel.

“We’re very lucky that the person who perpetrated this was seemingly just doing a Bitcoin scam. But what’s interesting about it is why was Twitter security so lax,” Thompson said. “How come to reset Barack Obama’s email, you just needed one person to have access to a control panel? You didn’t need two, or three, or 10?”

Ewalt’s next question was how much screen use was healthy for children, and if eliminating screen time may make children unprepared for a future with a larger emphasis on technology. 

Thompson has three boys, ages 12, 10 and 6, and he views technology like the food pyramid. Activities at the top are commonly viewed as bad, like using a phone at the dinner table and violent video games. The next level down is more ambiguous, because lessons can be taught there, such as through instructional YouTube videos. Thompson said the base of the pyramid is made up of videos and activities that are “genuinely good.”

“For whatever reason, my 6-year-old wants to learn German. Well, not for whatever reason, he’s obsessed with the Barcelona goalie (Marc-Andre) Ter Stegen and he wants to be able to talk to him,” Thompson said. “So we use a German learning app called Memrise, which is gamified, and we learn German phrases in the morning together, which is hilarious. And no doubt, good for him.”

He said that each parent needs to have a conversation with their children about screen time. His 10-year-old son really wanted a PlayStation, which would be near the top of the pyramid as a platform for violent video games. But his son was also feeling left out of his friend group. Thompson bought his sons a PlayStation and tries to keep limits on how long they can play, has them play games that stretch their imagination and even plays with them himself. 

The last question was how Thompson would advise consumers to stay informed, think about their decisions and help shape society.

Thompson said he does not always live up to this, but every word typed and link clicked changes Google’s algorithm in a small way, so he has “the sense of the internet and technology becoming something like a collective global consciousness. In a way, Google is the repository of human thought.”

“I do think that each of us has a certain obligation in how we act, how we teach our children to act, and how we talk to our friends about how we act,” Thompson said. “So we all play a role in shaping this sort of phenomenal amazing thing that is (the) digital internet.”

2020 CVA School of Art Students and Emerging Artists Exhibition unites work of artists participating from around the country

Ashlyn Diaz sat in her Brooklyn apartment, conflicted. It was early June, and outside her window thousands of New Yorkers were marching.

Diaz, an artist who recently earned her BFA in drawing from the University of Florida, felt the groundswell of anger and activism that had swept across the country since the May 25 police killing of George Floyd. She felt pulled to join the protests, both to advocate for her own rights as a Black woman and to stand in solidarity with others, but as the danger of police brutality loomed large in her mind, the nebulous and invisible threat of disease was just as strong.

“I found myself in a vulnerable space and not really sure how to advocate or participate in the modes of resistance taking place,” she said. “There’s this threat to health in going outside and being in large crowds (right now), but at the same time there’s this threat to my own life in not advocating for changes that need to take place.”

With black poster board and acrylic paints on hand, Diaz got to work.

“I relied on using my voice in the way that I knew how,” she said. “I resorted to creating something.”

Diaz’s piece “Unarmed” is one of 38 paintings, sculptures, tapestries, video pieces and more featured in the 2020 Chautauqua Visual Arts School of Art Students and Emerging Artists Exhibition. The exhibition will open virtually at 10 a.m. EDT Monday, July 20, on the CVA website and will be available to view through a 3D virtual tour until Aug. 19.

Chautauquans on the grounds can view the exhibition in person starting at 1 p.m. Tuesday, July 21, in the Gallo Family Gallery of the Strohl Art Center.

The exhibition, curated by Susan and John Turben Director of CVA Galleries Judy Barie, Sydelle Sonkin and Herb Siegel Artistic Director of the Visual Arts Sharon Louden, and the School of Art’s core faculty, features work from nearly all Students and Emerging Artists enrolled in this year’s virtual School of Art. The intergenerational cohort is made up of artists from across the country, ranging from 21 to 64 years old and working with a diverse array of mediums.

This year, for the first time, all 38 participants were awarded full funding to attend the program, thanks to CVA’s partnerships with more than 30 institutions and arts organizations.

According to Diaz, “Unarmed” is a self-portrait insofar as she used her own body as a reference, but her intention is that it represents something much more universal about the experience of being Black in America.

“The idea of going outside (right now) is like, what sense of protection do I have in being out there?” she said. “(But) at the same time, what sense of protection do I have in not being out there? I feel like any Black person can relate to that position of being totally stripped of protection and safety.”

Artist Quinn A. Hunter’s work similarly examines Black identity, through an explicitly female lens. Hunter recently graduated from Ohio University with her MFA in sculpture and expanded practice. She is currently living and working in Athens, Ohio.

Her piece, “34 Hours of Negotiation Between the World and Me,” is a woven tapestry created with artificial hair extensions. The tapestry’s black- and brown-striped pattern is meant to evoke cornrows, in more ways than one.

“They operate as what we think of as cornrows in the sense of farming, but also cornrows in the way that Black people tend to braid their hair close to their scalp,” she said.

“Negotiation” refers to the struggle for Black women to escape the stereotypes and cultural narratives thrust upon them.

“We don’t necessarily get to define who we are as much as the world gets, unfortunately, to define us, and we (have) to operate in that culturally defined space,” Hunter said. “(Identity) is not necessarily a statement; it’s a negotiation.”

Her use of braided, artificial hair is a reference to the politicization of Black hair, another way Black women are stereotyped.

“I have natural hair, meaning my hair is the way it grows out of my head,” Hunter said. “For most cultures that isn’t necessarily a political stance, but because of the Black Panther movement, having an afro is seen as this radical political stance.”

For her, the labor expended in its creation is as essential to the piece as the tapestry itself. It took 36 hours to create the work; therefore, 36 hours of negotiation took place.

“I am talking a lot about the erasure of Black women from spaces, and how weaving hair is trying to re-inscribe their labor back into place that has already been erased,” she said. “For me, this labor is redoing the labor that has been done by the Black women before me.”

Los Angeles-based photographer, video and performance artist Mathew Chan also explores identity in his work, although he uses a significantly different, and newer, medium: deepfake technology.

Deepfake technology is used to manipulate videos by superimposing different faces onto the bodies of a video’s original subjects. The technology gained notoriety in 2017, when amateur creators began making deepfake pornography by placing celebrity faces on porn actors.

“The idea of using that technology for my own practice happened last year,” Chan said. “It’s an amalgamation of a lot of processes that I work with already, like photography, appropriation and collage.”

In his video piece, “Draw me like one of your French girls (Chop Suey),” Chan took a scene from James Cameron’s “Titanic” and superimposed his face over actor Leonardo DiCaprio’s.

This is part of an ongoing video series where he is inserting himself, an Asian man, into the white leading-man roles of popular films. With this project, Chan seeks to subvert narratives about masculinity and race portrayed in popular movies.

“(Growing up), there was (almost) never a film where there was a male protagonist played by an Asian male lead,” he said. Even in action films starring Jet Li or Jackie Chan, the main characters were never portrayed in the same way as white male action stars.

“Their masculinity was, in a way, taken away from them; (they) were kind of neutered because they were never given a romantic interest or pursuit,” he said. “In every Bond movie — I don’t even need to explain it — but whenever we have an Asian male lead, that aspect of the character is withheld, and I think that’s a purposeful thing, that’s a purposeful choice by Hollywood.”

By inserting himself into a romance like “Titanic,” Chan is seeking to see himself represented in a narrative that was absent when he was growing up.

“It’s no longer Leonardo; it’s me now, and even seeing myself in the scene, ‘acting it out’ is kind of surreal,” he said. “It’s a very weird feeling.”

In the scene he chose, the poor artist Jack is drawing a portrait of upper-class Rose. Instead of subtitles, throughout the video a recipe for chop suey flashes across the screen. Chan added this to comment on the historically transactional relationship between Asian Americans and white America.

“Chop suey was created for Chinese Americans to sell to white patrons,” he said. “It parallels the scene to me. They both represent this idea of transaction between patron and laborer. There’s a socioeconomic and a racial relationship that I found interesting to pair together.”

Sculptor Katie Shulman graduated this spring with her MFA from Syracuse University. She uses dyed bed sheets, knitted bra straps and found objects to create her own “material language.” Shulman normally makes large-scale, fiber-based sculptures, but for this exhibition she was challenged to create a smaller piece.

“I made this piece specially for the show, because how I usually work is so outside of the (exhibition’s) size requirements,” Shulman said. “It has the same elements of all the work I’m doing right now.”

She creates improvisationally, and is inspired by physicality and bodily forms. Her sculpture “Hybrid Body” is an amalgamation of the two kinds of forms she works with: “hard and soft bodies.”

“This felt like such a perfect distillation of another iconic shape that I could make by combining what I usually do,” Shulman said. “It turned out to be this new category that I can employ very specifically in future sculptures.”

With her work, she hopes to create instant visceral reactions that make viewers think about their own bodies.

“You might think of it as gross as guts, or as normal as an arm flailing and bending over,” Shulman said. “It’s super abstract, but if you look at it and think, ‘(This) makes me feel a certain way in my own body,’ that’s good for me.”

In the midst of so much uncertainty, both personally and professionally, she said attending the School of Art has been “a lifeline.”

“The bottom fell out of all of our lives,” Shulman said. “Having this program exist online has been a godsend, truly. It’s been so stabilizing. It’s been a space of abundance, rather than a space of scarcity.”

Although the 2020 Students and Emerging Artists won’t set foot on the Institution’s grounds this year, Shulman finds peace in the knowledge that their work will stand together.

“It’s so beautiful that the show goes on when so many things can’t,” Shulman said. “I touch every part of everything I make. It’s so joined with my body and my movements. It’s at Chautauqua, so therefore I am.”

Aaron Bryant, museum curator, discusses how museums can reveal the humanity behind items

Aaron Bryant, curator of photography and visual culture at the National Museum of African American History and Culture, will never forget the first time he visited a museum. In his fifth-grade art class, every student created a painting based on an illustration, and Bryant’s painting of an African market was chosen to represent his school at the Baltimore Museum of Art.

“I remember they had painted Rodin’s ‘The Thinker’ outside of the museum and, as a kid standing next to the sculpture, how imposing it was because I only came up to his knee,” Bryant said. “I’ll never forget how I was only at his knee at the time before we walked into the museum.”

Bryant has been a researcher for around 35 years, and started out doing corporate research.

“I love to research, and I’ve always loved the arts and history,” Bryant said. “And so, at a certain point, I began to ask myself would I feel I’m making a contribution if I took my skills in business analysis and research to work for a nonprofit organization.”

Prior to working at the NMAAHC, he was curator of collections and exhibitions at Morgan State University’s James E. Lewis Museum of Art in Baltimore. Bryant talked on Friday, July 17, on the CHQ Assembly Video Platform, with Vice President and Emily and Richard Smucker Chair for Education Matt Ewalt. The conversation, titled “Preserving History In Real Time,” was the last of Week Three’s theme of “Art and Democracy.” Bryant focused on the importance of bridging the gap between high culture and popular culture to make museums more inviting to everyone, as well as showing the humanity behind objects.

History happens every day, and Bryant said rapid-response collecting is a way to keep track of current events in the United States and preserve information and visuals for future generations. Bryant said these artifacts can be “anything that helps us to preserve the memory of the moment. It’s really about the object representing an experience, a human experience.”

He said with the recent protests against racial injustice, museums are collecting signs, banners, T-shirts and even face masks with messages written on them. Bryant prefers to have a direct donation in order to know and capture the story behind the item.

“You see someone with a particular sign and you just go up to them and you talk to them about the sign,” Bryant said. “And you ask them, ‘Why are (you) here?’ ‘Why is this important to you?’ You start having a conversation and then that helps to give a better context to the object itself, and creates a way to connect the objects in some sort of human experience.”

Artifacts from large events, like presidential inaugurations, were always collected for exhibits, but Bryant said rapid-response collecting began after 9/11, when American museums started to think about how they could respond to a larger range of historic events.

Ewalt asked why photographs were not sufficient, and why museums needed physical objects.

Bryant said that with large murals, people do not get a sense of scale from pictures. With a three-dimensional object, people need to be able to walk around it, Bryant said, to “get a sense of its magnitude and its presence within the physical space.”

But photography is important as well; Bryant said that images serve as evidence that an event happened.

“They can also be very creative in terms of artistic expression and emotion, giving some sort of emotion,” Bryant said. “They can have emotion tied to them, so they’re important in that way.”

He said transformative periods in history, such as the Civil Rights Movement, are part of the everyday lives of those living through them. 

“The Civil Rights Movement didn’t just come out of nowhere and actually was part of an evolution,” Bryant said. “I think about history as really representing an arc in the human progress. Every point along that arc is really connected in some way.”

Ewalt asked about Bryant’s previous work addressing gaps in representation in civil rights photography.

“When we think about civil rights photography, if you’re familiar with the names, those names will generally be men,” Bryant said.

He said these men were often taking photos for newspapers, magazines and other publications. Women photographers whose work was part of the Civil Rights Movement are not represented in the “canon” of the movement.

“I would also say, ironically, (there is an) absence of African Americans in many ways,” Bryant said. “We know of African-American photographers (who were working) at the time, but it seems that the canon of civil rights photography was really defined by white men.”

Ewalt then asked how videos from cell phones and social media can be presented in a museum alongside physical objects.

“I think many museums across the board are still trying to figure out how … we grapple with cellphone images as objects,” Bryant said, “and then in terms of how it might fit in with the object itself in many different ways when you collect these materials.”

Video is particularly important to the history of protests, as in the early ‘90s with the Los Angeles police beating of Rodney King and the LA uprising that followed.

Going off of Week Three’s theme of “Art and Democracy,” Ewalt asked why museums are essential to democracy.

Bryant said that museums are places where people feel welcome and open to engage in a civic dialogue, but this has not always been the case. 

“There has been a barrier between mass and popular culture and everyday lives versus high culture — celebrating the Princeton galleries, for example. ‘Welcome to this museum that is really a tribute to my collection of the things that I’ve collected as I traveled all over the world,’” Bryant said. 

Bryant said the National Museum of African American History and Culture selects objects that “really celebrate and elevate everyday life and everyday people.”

Ewalt’s last question was how museums traditionally presented an object as opposed to the newer way of showing the humanity behind the object.

Traditionally, Bryant said, objects were presented with information specific to the object itself, like who made it, what it is made of and its function. The newer way is presenting the story behind the object, an approach Bryant compared to the film “The Red Violin,” which he recommends to everyone.

“You learned that when it was first made, the violin maker made it for his wife, who was going to give it as a gift to their unborn child, (and that) both his wife and his child died in childbirth,” Bryant said. “He decided that he was going to get rid of the violin, because it held on to so many memories, and then it goes to the next person.”

Each owner of the violin had a different story to tell and, Bryant said, the instrument held a deep significance in each of their lives. An actor in his late 60s told Bryant something similar, that with each character he played, he learned more about humanity. If he acted well, he was able to share that knowledge with the audience and maybe become a better person himself.

“That’s my curatorial practice, that’s my approach,” Bryant said. “How do I become a better person? How do I learn about humanity and how do I help people connect to their own?”
