The Third Actor: AI, Therapy, and the Loss of Presence
An existential reflection on AI and the loss of presence within our lives.
The Audio Reflection
Listen on: Apple Podcasts | Spotify | YouTube | RSS
Looking Inward
Photograph, 2026
Evan Kaufman
This is an edited transcription of the audio linked above — I recommend listening rather than reading if you have the time.
On Impermanence
Walking into my office this morning, I found myself noticing the impermanence of the building I work in. It’s a large complex here in Eugene, Oregon. It’s been here a long time. And yet I have no illusion that it will one day no longer exist. In 200 years, I don’t know what will replace it—a field, a different building, something we can’t imagine—but the current form will fade away.
When I come into direct contact with impermanence — of buildings, relationships, identities, and life itself — it brings me back into the vividness of being alive.
This orientation sits at the heart of how I practice therapy and how I understand the human condition more broadly: not as a set of problems to solve, but as a reality to encounter and be with.
The Third Actor
This brings me to the central idea of this reflection: the third actor.
This is a name I’ve given to a pattern I’ve been noticing in my clinical work lately. Here is one example of how this third actor shows up:
A client and I begin working together. We meet for four, five, or six sessions. The relationship deepens. The work starts to take shape. And then something enters the room.
Artificial intelligence.
Not as a topic, necessarily, but as a presence. Clients begin using AI to process sessions, interpret emotions, and make sense of recurring patterns in their lives. They bring back interpretations, frameworks, and insights to the session each week.
It’s at this point that I become aware that I’m not only working with the client’s internal world. I’m also working with the client’s relationship to what an algorithm has reflected back to them, often through the lens of anxiety or depression. While I’m not dismissing the utility of AI, or how it can be helpful, I am pointing out a major tension in its use and the ways it is shaping our interactions.
The problem is that AI is not alive. It has no inner experience. No embodied stake. No vulnerability. No presence. No being.
And because it can sound fluent and feel responsive, people easily mistake it for something more than it is.
This is where I see the problem.
A client may feel as if they’ve fully understood something because they’ve cognitively processed it with AI. But therapy is not primarily about understanding an idea. Understanding is provisional in my experience.
What high-quality therapy aims toward is embodied knowing, being itself.
AI, Substances, and Electronic Stimulation
When I think about AI in the broader cultural context, I don’t see it as an isolated tool. I see it along a continuum of stimulation. On one end are substances, things like alcohol, cannabis, and other drugs. On the other are behavioral addictions such as compulsive scrolling, endless content, and constant stimulation.
AI can slide into that same terrain, especially when it becomes a primary way someone regulates their inner world.
Different mechanisms, yet the same function: an attempt to escape or mediate a direct encounter with existence itself. It is rarer and rarer for people to simply sit with their own internal discomfort without turning toward some kind of external regulator.
AI Is Not a Being
For some clients, AI use ramps up between sessions as vulnerability increases. And then, if the therapeutic alliance continues to strengthen, the third actor often fades.
This mirrors patterns I’ve seen when people are working through addiction or emotional dependence. As real connection grows, the substitute loses its importance and, almost as if by magic, fades away.
This is especially apparent in work with men, many of whom have learned to rely on tools, substances, or systems rather than relational support. These substitutes share the same limitation found in AI.
They aren’t alive.
They don’t change in response to us. They don’t risk anything. They don’t surprise us with presence.
AI, like substances and endless stimulation, can be profoundly useful. But it cannot offer what we most deeply want. An authentic encounter with our own existence.
Just Sitting
Last night, I was waiting for my partner to come home. It was late. I was hungry. I had food ready for both of us. I noticed the familiar fork in the road. I could eat while scrolling, letting stimulation fill the space. Or I could simply eat, clean the kitchen, and sit quietly in the living room.
No podcast. No YouTube. No processing. Just presence.
What I notice, again and again, is that choosing presence doesn’t eliminate discomfort, but it makes life feel more real. And when I instead choose distraction, I often feel a deeper layer of anxiety beneath it. The less I distract myself from what is here right now, the more okay I feel.
Being With What Is
Notice when you reach for the third actor. It might be AI. It might be alcohol. It might be cannabis. It might be social media. It might be endless conversation or information.
When you feel the pull, see if you can pause and ask yourself:
What am I trying not to feel right now?
What does this tension feel like in my body?
What is it calling me toward?
And if you can, resist the urge to outsource meaning to a system that cannot live your life for you.
Instead, be with what it means for you right now, in the immediacy of your own existence.

