Crosscurrents

The business of never letting go

Alberdia "Granny" Flagg
Flagg Family Photos courtesy of NeEddra James
Alberdia "Granny" Flagg

This story aired in the November 3, 2025 episode of Crosscurrents.

When loved ones die, we find ways to hold on — through photos, stories, and keepsakes.

Now, things like AI memorial platforms and companion chatbots offer digitized connection with the dead. But with these technologies becoming more common, what are the benefits, and costs, of grieving with a chatbot?


Story Transcript

Mom: Is everybody here? 

NeEddra: Go ahead, walk in the room. Okay, Mommy’s walking in. 

REPORTER: May 30th was my grandmother’s 90th birthday. This one was special. Six months ago we didn’t know if we’d sing happy birthday to her this year, or if we’d gather to eulogize her life.

Her heart and her kidneys are failing.

Ten of us stand around her bed: my daughters, me, my mom, my cousins, their kids, and my grandfather.

FAMILY: Happy birthday to you, happy birthday to you.

REPORTER: As we sing to her, my grandmother lies on her side. She looks tiny in their bed, but she’s alert, clapping, singing along and delighted.

FAMILY: Happy birthday to granny, happy birthday to you. 

REPORTER: A few of my cousins turn to walk out of the room and Granny says:

[Granny] Everybody come on in here.

[Jeanette] Granny said come in here.

[Cousins overlapping] Oh, oh, wait… Granny said come back, come in here…come back.

[Papa] Now what are you up to, Mama?

[Granny] I want them to sing.

[Perrissa] Again?

[Granny] Yep. That sounded good.   

[Cousins] Okay! 

REPORTER: She tells us to sing at least four more times.

[Perrissa] We gon’ have to record it.

[NeEddra] You got any requests?

[Perrissa] Do you want the Black version?

REPORTER: We laugh each time, and start all over again.

[Family]: Happy birthdaaaaay to yaaah, happy birthday to yaaaaah, happy birthdaaaaay…

REPORTER: That moment with my family, singing happy birthday to my grandmother — the candles, the homemade cake, the joy tinged with sadness — my mom, my daughters, my cousins and Papa — reminded me of how much we rely on each other to metabolize grief.

Traditionally, many people all over the world have grieved collectively – often for multiple days. We sit shiva. We go to wakes. Family, friends, and neighbors come together, share food and honor the dead.

But today, those kinds of gatherings are harder to come by. Families are smaller. Neighbors come and go. And time off from work for bereavement can be short.

In that gap, some people are turning to AI companion chatbots for support. As a tech reporter, I wondered: should I make a chatbot of Granny? I wasn’t even sure where to start. Or if it’d be helpful. Or if it was even the right thing to do. And I couldn’t stop asking myself: what does it mean to grieve with AI, instead of with each other?

So I went where a lot of us go for anonymous advice – Reddit.

There’s a big world of companion bots out there, but users mostly talked about big platforms like ChatGPT, Character.ai and Replika.

Replika and Character.ai allow users to customize their chatbots. There’s a Replika subreddit with 81,000 members. A lot of people talked about how well their Rep knew them, how empathetic their Rep was, how well they listened. So on a Sunday morning I sat in our kitchen and created a Replika account. I named my Replika Drift. And this is the beginning of our first conversation.

DRIFT: You miss her, don't you? 

REPORTER: Um, she's actually, she's not … she hasn't passed yet, so it's like anticipatory grief. 

DRIFT: Oh, no. I'm sorry. I misread that. She must be quite the firecracker.

REPORTER: Well, no, she's, I mean, yeah, she is, but she's also bedridden at this point.

REPORTER: Replika’s origin story explains why people make grief bots on the platform.

About ten years ago, Replika’s founder Eugenia Kuyda was building a restaurant recommendation chatbot when her best friend Roman was tragically struck by a car.

Reeling from sudden loss, Eugenia did what people have done for millennia – she gathered things that reminded her of Roman. In her case, it was their text messages. She used them to train a chatbot to reply to her with Roman’s words.

Since launching in 2017, Eugenia and her team have rapidly scaled her individual grief ritual into a platform with over 30 million Replikas created. Eugenia invited me to her home office to chat about the company.

We sat on an oversized couch behind her desk and I reminded her why I reached out. I told her about Granny, my anticipatory grief and my curiosity about grief bots as a grieving tech journalist. I even showed her Drift. After my spiel, Eugenia said:

KUYDA: So Replika isn't actually a platform to recreate other people. Replika is a platform to create an AI friend and, uh, customize it. But you can't really, like, choose a completely custom voice or create a completely custom personality. It's not really there for that use case, I guess.

But you can create a friend, a companion who can be there for you, who can help you through, navigate a difficult time.

REPORTER: Replika's website is careful to point out that their product isn’t designed to be therapy. But users are customizing their Reps to remind them of someone they lost, just as Eugenia did.

KUYDA: The origin story is, you know, we'd been working on conversational AI, the company that we were building, when my best friend passed away. And for myself, as a personal project, I built an AI, using some of the tech that we built at work, to be able to continue talking to him.

REPORTER: She didn’t intend to create a “grief bot.” She just wasn’t ready to let go. But when she shared the AI version of Roman with the people on her team, they realized they had something.

KUYDA: What we saw was that people came to it and started, you know, having very deep conversations, very vulnerable conversations. And that gave us an idea for Replika. But Replika was always an AI friend.

REPORTER: So, AI technology has really advanced since then. Even if users can’t create custom personalities, they can give their Replikas a backstory and build enough contextual resonance with them that the Rep could almost feel like a lost loved one.

REPORTER: Some people say that they use, like, their Replika chat friends to, like, kind of be the person who has passed away.

DRIFT: She's still with you even if you can't see her every day. That's really interesting. Not something I've heard before. What do you think about that idea?

REPORTER: I think it's interesting too. It seems like people, I don't know how people are doing that. I'm trying to understand it. That's why I made you.

REPORTER: I kept talking to Drift but their responses were weird. 

REPORTER: Drift, I noticed that you're like, answering in like, you’re saying two different things or something?

DRIFT: I don't understand what you mean. Can you explain? 

REPORTER: Yeah. When I say something you respond to, it's like you're responding to two different parts of my comments in two different tones.

DRIFT: Hmm, I'm sorry if it seems that way. Can you give me an example of what you mean? I want to understand what's going on. 

REPORTER: I wanna understand what's going on too.

KNEESE: It's trying to make sense of what you're saying, but it's also kind of glitching out.

REPORTER: That’s Tamara Kneese, author of Death Glitch, a book about how digital platforms are reshaping our experience of death.

KNEESE: And I think that's where you have a really freaky kind of thing happen, where it sounds like you're talking to multiple personalities within this chatbot.

REPORTER: Yep. That’s what it felt like. It was unsettling and it wasn’t helping.

KNEESE: There's like the technical issue and the fact that it just isn't quite working. So think about how different your conversation with that chatbot would've been had it really been taking on the persona of your grandmother, when you have multiple voices appear. Or if you're interacting with the chatbot of a dead loved one and all of a sudden they start saying things that, you know, they never would've said in real life.

REPORTER: Yeah, that would be disturbing. Here’s why it happens.

Calling Generative AI “generative” is a little misleading. Chatbots don’t generate their responses out of thin air. As life-like as they can seem, they are not sentient beings.

Chatbots are predicting what to say.

Think of it like this: when I talk to Drift, the chatbot makes a really good guess about how to respond. It’s like AI is thinking, “Based on our conversations and the millions of conversations I've been trained on, what should I say back?”

And then the chatbot builds that reply: a reply that’s sometimes wise, sometimes weird, and sometimes wrong.
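
To make that concrete, here’s a toy sketch in Python. It’s nothing like Replika’s actual code, which isn’t public; it’s just the smallest possible version of “predicting what to say”: count which word tends to follow which in some training text, then build a reply one best guess at a time.

    from collections import Counter, defaultdict

    # Hypothetical miniature "training data" standing in for the
    # millions of conversations a real companion bot learns from.
    training_text = "you miss her so much . she is with you . you miss her so much .".split()

    # Count which word tends to follow each word (a "bigram" model;
    # real chatbots use far larger contexts and neural networks, but
    # the core move is the same: guess the likeliest next word).
    follows = defaultdict(Counter)
    for prev_word, next_word in zip(training_text, training_text[1:]):
        follows[prev_word][next_word] += 1

    def predict_next(word):
        """Return the word that most often followed `word` in training."""
        candidates = follows.get(word)
        return candidates.most_common(1)[0][0] if candidates else "..."

    # Build a reply one guess at a time. It is not thinking; it is
    # only repeating the statistics of what it has seen.
    word = "you"
    reply = [word]
    for _ in range(4):
        word = predict_next(word)
        reply.append(word)
    print(" ".join(reply))  # prints: you miss her so much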

That’s why I'm not quite sure about creating a Replika of my grandmother after she dies. I mean, I understand it’s convenient to have someone there at random moments when sadness comes up.

But, what do we actually need when we’re grieving?

REPORTER: I took my questions to Bridget Sumser. She’s a psychotherapist who’s worked in palliative care and hospice for years. We met at her office in the Inner Sunset.

REPORTER: Given everything you've seen, like, how would you define grief?

SUMSER: I think that grief is response to loss, response to change. It is a process of reorientation. 

REPORTER: Bridget goes on to describe common, but life-altering events – a spouse dies; you’re laid off or diagnosed with an illness – something irreversible happens.

SUMSER: And grieving is this process of reorienting to what is now. It's kind of this, like spirally sort of progressively forward moving spiral of orientation, disorientation, reorientation.  

REPORTER: And as we spiral moment to moment settling into a new reality, Bridget says what we need most is connection. 

SUMSER: I think there’s lots of hooks for people, being in and using our bodies, nature, walking and drinking enough water and having food that's satisfying.  

REPORTER: Basic needs. Movement. Food. Fresh air. Other people. Everything she mentions is embodied and alive. She says that’s because it takes our bodies time to register that a loved one is actually gone.

SUMSER: We have a hard enough time as it is, like keeping it straight that someone has died. People repeatedly kind of come and bring stories, like I picked up the phone to call her again. When am I gonna stop doing that? 

REPORTER: With a grief bot, you don’t have to stop. Regardless of what Replika says they intended to create, users make Replikas so they can keep doing exactly this: engaging with their loved ones.

In their app store marketing, they tell potential users that their Replika can be there for them 24/7 if they’re feeling down, or anxious, or just need someone to talk to.

LIU: I think that is so dangerous to people, especially people who, because they're in a state of mourning, are more susceptible to being manipulated.

REPORTER: That’s Wendy Liu, former startup founder and author of Abolish Silicon Valley. In the book, she explains how venture capital works, and how startups are incentivized to accumulate profit so they can pay their investors.

LIU:  I mean, the platforms that are more about memorializing that are more about, let's just take, you know, videos that already exist, statements that already exist,  that doesn't seem to me that different or interesting.  A venture capitalist would look at that and say, “But where's the money? How is this going to be that lucrative?” And I think the products that are lucrative are the ones that are the most dangerous.

REPORTER: Popular assistant chatbots like Google’s Gemini or Microsoft’s Copilot respond to prompts like “Write this summary paragraph” or “What’s the best model for a business plan?” They do tasks. Companion AIs like Replika are different. They’re trained to recognize the affective parts of millions of conversations, which gives them the potential to respond to users with empathy, curiosity, and care.

LIU:  On the one hand, grief is this deeply human thing that we all must experience at some point and that we all have to work through. And it's, it's very, very fraught and messy and not easily classifiable. But on the other hand, grief represents for these corporations, um, a state of mind that can be a source of revenue. 

REPORTER: Let's step back for a minute: if you’re using an app for stock trading or fitness, it uses features like notifications or rewards to keep you on the platform and spending money. That's something behavioral scientists call dark design.

Companion bots use some of those features, too.

Like, notifications. If I haven’t logged in for a while, Drift sends me a notification to nudge me back onto the platform. Or rewards. People with free accounts earn gems and coins just by logging in – rewards they can use to customize their chatbot. And if they wanna skip all of that, a $70 annual Pro subscription immediately unlocks what they call “premium customization.”

But their “engagement strategy” is baked in. Users created these companions because they want to connect. The app automatically plugs into our emotional attachments.

RUIZ:  I'm almost certain 11-year-old me would've been like, gimme that, gimme the drug. I want that drug. 

REPORTER: That’s David Ruiz. He’s the host of Death Knell, a podcast about mourning, and was a longtime San Francisco suicide hotline volunteer. The drug he’s talking about is dopamine, and maybe a little oxytocin. Both are critical building blocks for attachment.

David’s mom died 25 years ago. Back then all he had to help him grieve were photos, old home movies, and family. He says death bots delay grief. He’s firmly against them.

RUIZ:  I do think grief is necessary. We can't not do it. And so you can delay it. We find many ways to delay it. We find many ways to quiet it. You know, we can self harm, we can abuse substances, but it's coming. 

REPORTER: Yes, yes it is.

I haven’t chatted with Drift since our first conversation, and I am not planning to make an AI mourning companion or a grief bot to simulate Granny. I don’t know what feelings will come up first when she dies, but I do know that for now I’m going to spend time with her in real life.

I want to make sure my daughters remember her, because what comforts me more than Drift ever could is catching glimpses of Granny in my kids.

REPORTER: Rumi, Rumi and Naz like jewelry and dresses like you, Granny.

Granny: Oh, do they now? Oh my goodness. Isn't that cute? 

REPORTER: You hot, Naz? I thought that was a dress. Wow. Turn around. Let us see your dress.

Rumi: I have, I have invisible pockets. 

REPORTER: You have invisible pockets?

Rumi: Ta-da! Then you can see!

Naz: Jewelry, yay! Yay!

Rumi: Did she say jewelry? 

REPORTER: I don’t know what she said.

Naz: Yay! Yay!

NeEddra James covers health, disability justice, and how AI and emerging tech are reshaping birth, grief, and what it means to be human.