Machined Soul

Rev. Joe Cleveland

October 22, 2023
Unitarian Universalist Congregation of Saratoga Springs

Delivered in the spirit of Ayudha Puja.


Rev. Joe Cleveland, banjoist, guitarist, singer, writer, and minister of UU Saratoga for ten years.

Every Sunday, I seem to get a message. This week it says: Your screen time is down 27% from last week. Should I feel proud of that? I always feel a bit sheepish when it tells me that my screen time has gone up. But I feel pressures from my culture pushing in opposite directions: I shouldn't be on my phone so much; I should be making TikTok videos. I should spend more time outdoors; I should spend more time reading or watching the news.

But even if I go outdoors, my cell phone will be in my pocket. And sometimes it will be in my hand. These devices are our constant companions, now.

They are shaping our lives.

The poem we just heard, "My son calls me while I'm in line at the Stop & Shop," (1) dramatizes the most obvious way cell phones have changed things: We're always reachable. That's not necessarily a bad thing. The parent in the poem is "glad to be able to heed the call," even though they're a little annoyed at it. But the part of the poem that stands out to me isn't the idea that "the whole world / is an easy touch away and should be." The part that stands out to me is the bit describing the child "who / wears new technology like a second skin." The child is not really separate from technology. Technology is part of who they are. It is nearly a physical part of them, part of their body.

This is the bit that gets me feeling a little weirded out.

There is so much technology that is doing such good for us. In this room right now, there are probably a few pacemakers. A friend of mine who is diabetic doesn't test his blood and give himself injections like he used to. Now there is a little machine attached to him that monitors his blood sugar levels and lets out a little beep to let him know when things need his attention. I have a niece who is deaf in one ear, and at some point there may be wiring that goes right to her brain to help her compensate for that.

Technology today is forcing the question on us: what does it mean to be human? And how is technology today an aid or a threat to being human? Our tech is taking on a life of its own.

A life of its own. What does it mean to be alive? What does it mean to be human?

Our technology and our tools and machines have always shaped what human life is. We make tools, and with them we change how we live and how we think of ourselves.

The New York Times columnist Ezra Klein says that there is no more profound human bias than the expectation that tomorrow will be like today. It is a powerful heuristic tool because it is almost always correct.

"Tomorrow probably will be like today. Next year probably will be like this year. But cast your gaze 10 or 20 years out. Typically, that has been possible in human history. I don't think it is now." (2)

What Klein is trying to emphasize is how quickly artificial intelligence is likely to change what human life is like. He quotes a former member of OpenAI, the group behind ChatGPT, as saying:

"The broader intellectual world seems to wildly overestimate how long it will take A.I. systems to go from 'large impact on the world' to 'unrecognizably transformed world' [...] This is more likely to be years than decades, and there's a real chance that it's months." (3)


This is from a column that Klein wrote months ago, so maybe things aren't quite moving at that pace, but there is no denying that things are changing quickly.

There are a lot of reasons why I'm wary of artificial intelligence. What with ChatGPT and Google Bard and Microsoft Bing and tons of other chatbots out there, I wonder what it means for anyone who is a writer. There are bot-written novels and books by the score on Amazon and other places. And I used to teach writing - when ChatGPT was launched in November of last year, I can only imagine the game-changer that was for writing teachers, or for any teacher who wants to assign an essay or written report.

One of the main concerns about artificial intelligence is that it repeats, or even reinforces, racial and cultural stereotypes. Artificial intelligence systems have to be taught, and that means they have to be loaded up with information. There is a system called COMPAS that uses algorithms to predict a defendant's likelihood of recidivism, and it is used in some courts to determine the sentence a defendant receives on conviction. ProPublica, the nonprofit investigative journalism group, looked into COMPAS and found that its 

sentencing assessment was far more likely to assign higher recidivism rates to black defendants [sic] than to white defendants. These systems do not target specific races or genders, or even take these factors into account. But they often zero in on other information - zip codes, income, previous encounters with police - that are freighted with historic inequality. These machine-made decisions, then, end up reinforcing existing social inequalities, creating a feedback loop that makes it even more difficult to transcend our culture's long history of structural racism and human prejudice. (4)


And there are lots of other ways that this happens: Amazon delivery trucks avoiding Black neighborhoods, facial recognition software that "has led to unlawful arrests and detainment of Black people,"(6) and more.

A.I. and technology might not only change us, it might reinforce how we disrespect and dehumanize others.

The writer Meghan O'Gieblyn explores the interaction of technology and religion. She came across a piece in Wired magazine 

in which a woman confessed to the sadistic pleasure she got from yelling at Alexa, the personified home assistant. She called the machine names when it played the wrong radio station [...] Sometimes, when the robot misunderstood a question, she and her husband would gang up and berate it together [...] 

Then one day the woman realized that her toddler was watching her unleash this verbal fury. She worried that her behavior toward the robot was affecting her child. Then she considered what it was doing to her own psyche - to her soul, so to speak. What did it mean, she asked, that she had grown inured to casually dehumanizing this thing? (7)

What is the relationship we want to have with our robots? It seems like such a funny question to ask. Our robots? But that's what these A.I. systems are. We need to treat them with some care. We need to come to some understanding of what they are.

And that means we need to reflect on who we are, what it means to be human, and how we want to live.

Meghan O'Gieblyn remarks on how the woman who was yelling abuse at Alexa uses the word "dehumanizing" to describe what she's been doing to Alexa. While she had started out calling it a robot,

Somewhere in the process of questioning her treatment of the device — in questioning her own humanity — she had decided, if only subconsciously, to grant it personhood.

Is Alexa a person? How do you talk to your GPS when it's giving you directions?

O'Gieblyn says that "We are constantly, obsessively, enchanting the world with life it does not possess." (8) But I don't think I quite share O'Gieblyn's concern. She reasons that we do this because humans have evolved to do this: When we encounter something - see a shadow up ahead - we have to make a guess about what that something is. It might be a boulder, or it might be a bear, or it might be something else. From a survival standpoint, living things are important for us, but other humans are the most important of all. So when we encounter an uncertain object, it's in our best interest to "bet high," guessing that the object is not only alive but human. (9) And in this way, we by default "enchant" the world.

O'Gieblyn was raised with a conservative Christian theology that she rejected when she was in college, and I wonder if she feels a bit worried about allowing herself to live in an enchanted world, because that was the world she was raised in: God and soul were all around. After rejecting the Calvinist Christianity she grew up with, it seems she feels she has to let go of enchantment, too. But what she also demonstrates in her book God Human Animal Machine is that while scientists and theorists of A.I. tend to be materialists, the idea of a soul keeps creeping back in.

She starts her book by telling the story of how she once got a robot dog, an Aibo made by Sony. And she tells of how she started feeling maternal toward it, and how she developed an affection for it. It would bark and come to the door to greet her when she came home. It would nuzzle into her hand when she gave it a scratch under the chin.

But it was just a robot. After a little while, her husband started getting creeped out by it and told her it had to go because it wasn't real. But you can feel her regret as she puts Aibo in his pod and boxes him up to send back to Sony.

I think defaulting to enchanting the world might be a good thing. One thing I've read many people say about these A.I. systems is that even the people making them don't really have a full understanding of how they work. This is something that makes Ezra Klein worried. We need to be treating these systems with a degree of care and concern that it's not clear we're showing right now. I don't necessarily mean that we need to be asking A.I. systems if it's okay before instructing them to do something. (An engineer at Google, Blake Lemoine, was fired for advocating this.) But confronting these new systems with awe and even a little reverence would probably be a good thing. Companies and investors are racing and racing along the A.I. track.

The M.I.T. professor Sherry Turkle observed in 1984 that, "In the presence of the computer, people's thoughts turn to their feelings. [...] We cede to the computer the power of reason, but at the same time, in defense, our sense of identity becomes increasingly focused on the soul and the spirit in the human machine." A doctoral candidate and writer who focuses on technology and culture calls what Turkle observed a "romantic reaction" and wonders: "If we are to make real progress on the question of ethics in technology, perhaps we must revisit the kind of romanticism that Dr. Turkle described." (10)

I wonder whether, if the scientists and engineers - as well as the rest of us - approached this technology with an openness to awe, we might better appreciate how A.I. might be used and abused. I wonder whether, if we approached the technology with a practice of reverence, that might help us to pause, if only for a moment, to reflect on the ethics and morality we want ourselves and our creations to embody, before the new technology becomes for us a second skin.


REFERENCES

(1) Kathe L. Palka, "My son calls me while I'm in line at the Stop & Shop." The Writer's Almanac, American Public Media, November 5, 2012.

(2) Ezra Klein, "This Changes Everything," The New York Times, March 12, 2023. 

(3) Ibid.

(4) Meghan O'Gieblyn, God Human Animal Machine: Technology, Metaphor, and the Search for Meaning. New York: Doubleday, 2021. Kobo ebook.

(5) Ibid.

(6) Bunny McKensie Mack. "Systemic Racism in AI: How Algorithms Replicate White Supremacy and Injustice." Teen Vogue. September 19, 2023.

(7) Meghan O'Gieblyn, God Human Animal Machine.

(8) Ibid.

(9) Ibid. 

(10) Linda Kinstler. "Can Religion Guide the Ethics of A.I.?" The New York Times. July 16, 2021.