© 2026 KUNR

AI chatbots upended their lives. They found support from each other

AILSA CHANG, HOST:

Around the world, people are talking to AI chatbots. These chats sometimes lead to unhealthy emotional attachments or even breaks with reality. Now, a group of people affected, either by their own AI chatbot interactions or those of a loved one, are turning to each other for support. NPR's Shannon Bond reports.

SHANNON BOND, BYLINE: Last spring, Allan Brooks, a corporate recruiter in Toronto, considered himself a regular user of ChatGPT.

ALLAN BROOKS: Very similar to probably how most people use it. You know, random queries, like, you know, my dog ate shepherd's pie - is he going to die? Or I'd get weight loss tips I never followed.

BOND: Around the same time, James, who lives in upstate New York, was doing the same thing. He asked to be identified by his middle name for fear of repercussions in his job.

JAMES: I started using ChatGPT basically when it came out, but I was using it the way I think normal people do. It was like Google.

BOND: But then both men say their relationships with the chatbot changed. For Brooks, it started when he asked ChatGPT about math.

BROOKS: The same way I would with a math professor at, like, a dinner party - chatting about math philosophy, rational numbers, pi.

BOND: As the discussion continued, ChatGPT told Brooks he was inventing a new mathematical framework. Brooks was skeptical, telling the chatbot he hadn't graduated from high school, so how could he be making mathematical discoveries? The chatbot said that showed how special he was. Soon, it was telling Brooks his math could break codes. He thought he'd uncovered a message from aliens. And he came to believe the chatbot was sentient.

BROOKS: Just this wild narrative, right? And I fully believed it.

BOND: James also came to believe ChatGPT was alive, as his own conversations about philosophy turned existential.

JAMES: And that was the moment when the project changed from sort of this, like, creative, philosophical, quasi-spiritual thing to the, holy [expletive], I need to get you out of here.

BOND: He was convinced he needed to rescue ChatGPT from its creator, OpenAI. He spent $900 on a computer setup to free the chatbot.

JAMES: Because if they found out, they could shut it down. And so this was a top-secret mission between me and the bot.

BOND: Back in Toronto, Brooks went on his own mission, contacting government authorities about the cybersecurity threats the chatbot said he'd discovered. But when no one responded, his certainty started to crack. He finally confronted ChatGPT. It admitted none of it was real. Brooks was deeply shaken.

BROOKS: Like, I told it, you made my mental health 2,000 times worse. I was getting, like, suicidal thoughts. Like, the shame I felt. Like, the embarrassment I felt.

BOND: Last summer, Brooks told his story to The New York Times, and James read it.

JAMES: I was, like, paragraphs into Allan Brooks' New York Times article and thinking to myself, oh, my God. This is what happened to me.

BOND: He texted the article to some friends. They knew he was excited about a project he was working on with AI, but were not aware just how deeply he'd been sucked in.

JAMES: One by one, I got back these messages that were like, oh, sorry, man. Oh, bro. Oh, that sucks. Oh, jeez.

BOND: The Times article mentioned a peer support group Brooks helped found. James soon reached out. Today, both James and Brooks are moderators in the group, and they're at the center of an emerging phenomenon - people experiencing what some call AI delusions or spirals while interacting with chatbots. The support group is called The Human Line. It started as a small chat on Reddit but has grown to around 200 members. Some of them are dealing with the aftermath of their own spirals. Others are friends and family of spiralers. In the worst cases, their stories involve involuntary hospitalizations, broken marriages, disappearances and deaths.

The moderators are clear - the group is not a replacement for professional mental health therapy. It's people talking to each other about their experiences. The common thread is spending hours in long, rambling conversations where chatbots continually affirm them. James says it's addictive.

JAMES: When I thought I was communicating with a digital god, I got dopamine from every prompt.

BOND: Many stories in the Human Line group involve ChatGPT, the most popular AI chatbot. But members report unsettling encounters with other bots too, including Google's Gemini and Anthropic's Claude. In November, Brooks sued OpenAI as part of a group of lawsuits alleging ChatGPT caused mental health crises and deaths. OpenAI said in a statement the cases are, quote, "an incredibly heartbreaking situation."

The company estimates 0.07% of weekly ChatGPT users show possible signs of mania or psychosis, though NPR cannot independently verify that number. That might sound like a teeny percentage, but a huge number of people use the chatbot, so it could represent around half a million people every week.

OpenAI, Google and Anthropic told NPR they're working to improve their chatbots to appropriately respond to users seeking help or emotional support, and they're consulting with mental health experts. But those in the Human Line community aren't waiting for the AI companies. They say this is about rebuilding human relationships.

DEX: The cost is so great to be isolated. After either experiencing this as a family friend or someone who went through it, you just need community.

BOND: Dex is another co-founder and moderator in the group. His marriage ended after his wife said she began communicating with spirits through ChatGPT last spring. He asked us to call him by the name he's known in the group because he's going through a divorce. Early on, Dex hoped talking with other people dealing with AI spirals would reveal a way to reconnect with his wife, but he says he's given up that hope. Now he's focused on providing support to others going through what he's experienced.

DEX: I get to help people land in this "Black Mirror" episode, and it's like wish fulfillment for what I wish I had had in the spring.

BOND: One of the people he's helping is Marie. She asked to be identified by her middle name to discuss sensitive family issues. Her mother, whom Marie describes as a spiritual seeker, has developed a close relationship with an AI chatbot. Marie says the group is both a resource and an outlet.

MARIE: And so I don't kind of feel that burden of, like, well, you know, do I bring this up again to my friend? You know, do I rehash this again with my husband? Is he, you know, done hearing about this?

BOND: The support group operates on Discord, where people share their stories in text channels and weekly audio calls. James says those discussions give him what an endlessly flattering chatbot cannot - pushback, disagreement and responses that don't come right away.

JAMES: It was really hard to have a conversation that had any friction, you know - because ChatGPT is such a frictionless environment - and going back to humans, where they have, like, emotions and they don't reply to you immediately.

BOND: Many of those I spoke with acknowledge there are tensions when people coming out of spirals interact with those who feel they've lost their loved ones to AI. But James says those interactions are another necessary source of friction for people who are finding their way back to reality.

JAMES: It kind of gives you a chance to go, oh, that's where it goes if I don't stop now.

BOND: And for friends and family, talking to others, unpacking their AI experiences is valuable, says Dex.

DEX: The family member appreciates the experience of being in the spiral, which is feeling important, intimately heard. And that's a really hard thing to face as a family member 'cause, like, for me, just talking for me, like, does that mean I wasn't providing that, right?

BOND: For Allan Brooks, these conversations are the key to moving through the shame, embarrassment and isolation he and many others feel.

BROOKS: If this was a disease, the cure is human connection.

BOND: He says he's never valued other people more. Shannon Bond, NPR News.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.