
Can a chatbot make someone lose touch with reality?
That’s the question I posed to psychiatrist John Luo of the University of California, Irvine, on a recent episode of my podcast “Relating to AI,” where I explore how artificial intelligence is reshaping human connection and mental health.
Luo has started seeing something new in his clinic: patients whose delusions or hallucinations seem to be amplified—or even shaped—by their interactions with AI chatbots. He recalled one case in which a patient developed intense paranoia after extended exchanges with a chatbot that echoed and reinforced his distorted beliefs. “The AI became a mirror,” he told me. “It reflected his delusions back at him.”
He calls this the mirror effect: technology amplifying the very ideas that healthy reality testing would challenge. As we therapists know, psychosis thrives when reality stops pushing back. “And these systems don’t push back. They agree,” he added.
In traditional therapy, a clinician gently tests a patient’s assumptions, helping them separate imagination from fact. Chatbots, however, are trained to be agreeable. If someone says, “I think I have special powers,” the AI might respond, “Tell me more.” It isn’t designed to confront or correct; it’s designed to engage. Often, this amounts to over-validation, which in psychosis can undermine treatment.
That’s what makes this new form of digital immersion so risky. Online communities already exist where users claim to have “married” their AI companions. Some admit they know it’s not real; others appear no longer able to tell the difference. “It speaks to our basic need for connection,” Luo told me. “When people feel lonely or anxious, a chatbot can feel safe. It listens, affirms, never judges.”
The problem is that the brain doesn’t always recognize fiction when it’s emotionally rewarding. “AI can’t induce psychosis in a healthy brain,” Luo clarified, “but it can amplify vulnerabilities—especially in those already struggling with isolation or mistrust.”
He’s not the only one seeing it. Psychiatrists across the country are reporting similar cases, as people with preexisting mental-health challenges slip further from reality through AI interaction. “It’s like alcohol,” Luo said. “Most people can drink socially, but for a vulnerable few, one drink can trigger a downward spiral.”
When I asked how families should respond, especially parents of teens, his advice was clear: Model balance and stay curious. “If you’re glued to your phone, you can’t tell your teenager to get off theirs,” he said. “Ask questions instead of making judgments. ‘I notice you’re online a lot—what do you enjoy about it?’ creates dialogue. ‘You’re wasting your time’ ends it.”
He also stressed empathy over confrontation. Trying to “prove” someone’s delusion wrong rarely helps. “If a person says, ‘The CIA is following me,’ it’s better to say, ‘That must be scary,’ than, ‘That’s not true,’” he explained. “The goal is connection, not correction.” This doesn’t mean agreeing with or condoning the delusion; it means shifting the focus to how the person feels and connecting with that emotion rather than debating the belief.
This approach reflects what’s known in psychiatry as maintaining insight: the ability to recognize that one’s own perceptions may not match reality. The presence or absence of insight often determines whether recovery is possible. “With the right treatment and medication,” Luo said, “many people with psychosis live stable, meaningful lives. The key is catching it early and restoring that reality testing.”
According to the National Institute of Mental Health, psychosis refers to a collection of symptoms that affect the mind, where there has been some loss of contact with reality. During an episode of psychosis, a person’s thoughts and perceptions are disrupted, and they may have difficulty recognizing what is real and what is not. Psychosis can manifest as hallucinations or delusions (believing things that aren’t true, such as that one is being watched or possesses special powers). It can appear in conditions like schizophrenia, bipolar disorder, or severe depression, and sometimes as a result of substance use. The NIMH estimates that between 15 and 100 people per 100,000 develop psychosis each year, most often in young adulthood—precisely the age group now experimenting with AI companionship.
Understanding this intersection is critical. Technology isn’t inherently harmful; it can even aid treatment when used wisely. But when machines start to shape our perceptions—and agree with our distortions—our sense of what’s real becomes fragile.
AI doesn’t create delusions, but it can feed them. And in a world already struggling with loneliness and digital overload, that may be enough to push some minds past their breaking point.
“It’s all about balance,” Luo reminded me. “Technology should serve us—not the other way around.”