Alton Northup
Staff writer
When Vauhini Vara did not have the words for her sister’s death, she had artificial intelligence help her find them.
She initiated a back-and-forth exchange with the AI language model GPT-3, the predecessor to the chatbot ChatGPT, and turned the process into her essay “Ghosts” for The Believer. The nine-part essay, selected by Alexander Chee for inclusion in The Best American Essays 2022, examines her relationship with her sister and the grief that followed her death through a haunting collaboration between human and machine.
Vara, a technology journalist and author, continued the Chautauqua Lecture Series Week Six theme, “A Life of Literature,” at 10:45 a.m. Thursday in the Amphitheater. She discussed her essay, her experience using AI, and the role of AI in literature in her lecture, “If Computers Can Write, Why Should We?”
Vara first crossed paths with AI in 2017, when she was assigned to write a profile of Sam Altman, the chief executive officer of OpenAI, the company that would go on to create ChatGPT. At the time, the AI research laboratory was still operating as a wholly non-profit organization and had little public recognition and no products out of beta testing.
“At one point, I sat in on a meeting between Altman and some entrepreneurs he’d funded, and I watched as he pulled out his phone and he showed these entrepreneurs a video of a robot solving a Rubik’s Cube with one hand, and one of the entrepreneurs asks Altman when he thinks AI is going to start replacing human workers,” Vara said.
Altman equivocated at first, she said, until he brought up horses.
“He says, ‘For a while, horses found slightly different jobs and today, there are no more jobs for horses,’ ” she said, quoting Altman. “ ‘I don’t have to tell you all the difference between horses and humans – which is that humans are human – which is why we tend to be particularly worried this time around about how technology is changing the world.’ ”
Three years later, OpenAI launched GPT-3 for public use. The program is a neural network language model trained on vast amounts of internet text to predict what will most likely come next in a sequence of words.
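That prediction step is easy to see in code. Below is a minimal sketch – not Vara’s setup or OpenAI’s, since GPT-3 is reachable only through the company’s API – using the openly available GPT-2, an earlier model in the same family, via the Hugging Face transformers library; the prompt is an invented example.

```python
# Minimal sketch of next-word prediction with GPT-2, an open
# predecessor of GPT-3 in the same model family (illustrative only).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "My sister was diagnosed with"  # hypothetical prompt
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, sequence_length, vocab_size)

# The model scores every token in its vocabulary; the top score at
# the final position is its guess at what most likely comes next.
next_token_id = int(logits[0, -1].argmax())
print(tokenizer.decode(next_token_id))
```

Generating whole passages is just this step in a loop: append the predicted word to the prompt and predict again.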
Vara wrote Altman to get early access to the program, and she started experimenting with its capabilities. Initially, she found the program a helpful tool to combat writer’s block.
“What got to me is that GPT seemed to be able to unstick me,” she said. “In this context, a tool that could generate words for me seemed like a revelation.”
The ability to generate words in the absence of her own gave her an idea. She entered the prompt: “My sister was diagnosed with Ewing sarcoma when I was in my freshman year of high school and she was in her junior year.”
The program generated a response that told a story of two teenage sisters and their struggle as one undergoes cancer treatment. At the end of the response, it wrote: “She’s doing great now.”
Vara had to explain to the bot that her sister had died, and she prompted it to write again. With each prompt, she said she felt the bot got closer and closer to her feelings.
“All of this was moving, and sometimes exhilarating, but it was also frightening,” she said. “I started to understand that the essay was only partly about my grief; it was also about what it meant for me, and for all of us, to make use of a technology that promised to help us describe the world and our experiences – our deepest experiences – from our perspective.”
The essay was published in 2021, and it almost immediately went viral. Vara said she received letters from people who said it was the best expression of grief they had read, and others predicted a future where writers collaborated with AI. But amid the buzz, she had concerns over the ethics of AI.
“I started to worry that I’d become in people’s eyes some kind of evangelist for AI literature,” she said.
Researchers train AI language models on hundreds of billions of words of sample text, which the models then draw on to write the most likely response to a prompt. Often, these texts are used without the original authors’ permission, and companies rarely reveal what is in their training libraries. But Vara said the models’ responses make their biases evident.
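In simplified form, that training repeats a single exercise across the whole corpus: show the model a passage and penalize it wherever its next-word guesses miss the actual text. Here is a toy sketch of the objective, again with the open GPT-2 standing in, and one sentence where a real run would consume hundreds of billions of words.

```python
# Toy sketch of the training objective for GPT-style models
# (illustrative; real training runs over enormous text corpora).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

sample_text = "Once upon a time, she taught me to exist."
inputs = tokenizer(sample_text, return_tensors="pt")

# Passing the tokens back in as labels makes the model score its own
# next-word predictions against the actual text: the loss that
# training repeatedly nudges downward, sample after sample.
outputs = model(**inputs, labels=inputs["input_ids"])
outputs.loss.backward()  # gradients for one small learning step
print(f"loss on this sample: {outputs.loss.item():.3f}")
```

Everything the finished model can produce – including whose perspectives it over- or under-represents – comes from whatever text passed through this loop.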
“People sometimes describe AI in really romantic terms as representing all of human consciousness,” she said. “We know from research that each of the 8 billion human consciousnesses on earth aren’t equally represented in these models.”
Writing by white men in the United States and the United Kingdom is overrepresented in AI training data, Vara said, citing a study that prompted a model to answer questions about novels from those countries.
She experienced this herself while experimenting with the setting of her debut novel, The Immortal King Rao, a finalist for the 2023 Pulitzer Prize for Fiction. The novel tells the story of King Rao, a child born to a Dalit family in 1950s India, who grows up to lead a tech company and, eventually, a global AI-run government.
While planning her novel, she visited her father’s village in India, which inspired the setting, and interviewed family and neighbors about life there. When she later asked ChatGPT to set the same scene, it had no problem obliging – as long as she was willing to accept historical inaccuracies and a perspective that ignored life as a Dalit, the lowest rung of the caste system. The program had produced a stereotype of her father’s village.
“What we end up then is with a product that not only produces text that doesn’t represent a single human consciousness the way a human writer does, it also presents a warped version of the world,” she said.
AI fails to reflect the complexity of a human author, she said, because “there is no human author.” While Vara co-wrote an essay with AI, the final piece ultimately contained just three sentences written by the program. What matters most about “Ghosts,” she said, is that she gets to set the record straight.
“I’m asserting my own consciousness by writing against what GPT-3 has produced,” she said. “In the essay, GPT-3’s role diminishes over the course of the nine attempts — which is to say by the end of it, I’ve taken control of the narrative.”
The final three sentences of the essay, part of a list of what her sister taught her in life, read: “To do math. To tell stories. Once upon a time, she taught me to exist.” Vara said the sentences sounded nothing like a human, but she kept them as a joke – one the AI could never understand.
“I hope … we stay aware as readers, as a society, of what it would mean to cede ground to computers in a form that has traditionally been meant for humans to convey to other humans what it’s like to be human living in this world. Because no matter what, humans will have something AI doesn’t,” she said. “Only humans are human.”