
Artificial intelligence, in its various forms, has become deeply integrated into the lives of many of the patients I work with in my clinical practice. Nowhere is this more evident than in the stories they tell me of using AI to navigate complex emotional terrain.
I’ve increasingly heard stories of people outsourcing their most personal communications to AI chatbots like ChatGPT, for everything from writing a memo to a demanding boss to composing a goodbye note to a lover to drafting a poem for a dying parent. In these conversations, I’ve been given a glimpse into a new emotional economy: one where algorithms mediate human expression, presenting both opportunities and challenges for psychological growth.
But this is not limited to adults. A troubling New York Times article told the story of a teenager who, in the midst of suicidal thoughts, turned not to a parent, peer, or counselor, but to ChatGPT for therapy. The teen described the chatbot as a lifeline in moments of despair.
This extraordinary and troubling example reveals both the promise and peril of AI for young people. AI can provide immediate comfort and structured dialogue, yet it can also risk replacing genuine human connection when it’s most needed. As educators, parents, and clinicians, we must recognize that students are increasingly engaging with AI not just as a tool, but as a companion in their inner lives.
AI enables a “false self,” distorting relationships
In my own practice, I have observed several ways people use ChatGPT. One striking example is the projection of a “false self.” A patient with a warm and empathic style faced a domineering boss who demanded a strong, decisive presence. She instructed ChatGPT to “write a memo that sounds activist, male, and authoritative.” The result was effective, but it left her disconnected from her authentic self. This mirrors what many students experience in school, where social desirability and peer pressure can push them to adopt voices and personas that do not reflect their inner lives. AI becomes a vehicle for this projection, helping them perform expectations while sidelining their true identity.
Another patient, paralyzed by the task of writing a breakup letter, turned to ChatGPT. The first attempt sounded like a corporate termination notice. A revised version was more tender, but still “not me,” as he put it. The outsourcing of his most vulnerable communication offered short-term relief but highlighted a deeper avoidance of intimacy. Many students do something similar with academic tasks or even social conflicts, using AI-generated essays, emails, or messages to avoid the discomfort of struggle. While functional in the moment, it distances them from the developmental work of finding their own words.
A third patient asked ChatGPT for a humorous yet loving poem for his aging mother. The AI generated clever anecdotes and polished lines, but they were fabricated and oddly hollow. The patient was satisfied—the poem met social expectations—but the emotional depth was missing. Adolescents, too, often use AI to craft the “right” message to peers, teachers, or parents. This highlights a tension between the polished performance of connection and the messy, authentic expression that actually deepens relationships.
Beyond individual cases, I see a broader trend: patients turning to AI when shame prevents direct communication. One patient, embarrassed about financial struggles, asked ChatGPT to write a request for a fee reduction. The result was formal and transactional, unlike his usual candid style. I suddenly found myself in a three-way relationship, not just with him, but also with his AI voice. Students may do the same when asking teachers for extensions, coaches for more playing time, or peers for forgiveness, outsourcing the courage it takes to ask directly.
Couples in conflict have even used ChatGPT as a mediator, only later discovering both had relied on AI to write conciliatory messages. This “assistant therapist” role can regulate emotions and prevent escalation, but it raises the question: Will people move from AI back to one another, or does AI risk becoming a permanent buffer against intimacy?
The New York Times story of the teen who used ChatGPT for therapy underscores what I hear from young people every day: Students are seeking immediacy, structure, and a sense of being heard, sometimes in places adults might never expect. For youth facing anxiety, depression, or the everyday turbulence of adolescence, AI offers an accessible and judgment-free listener. But therein lies both danger and opportunity.
If students lean exclusively on AI, they risk bypassing the vital developmental process of learning to express vulnerability with real people, including their parents, teachers, peers, and mentors. Yet, AI can also serve as a transitional tool: a first step toward articulating feelings, practicing language, or lowering the threshold for asking for help. In this sense, it can be part of a continuum that leads back to human connection, not away from it.
Educators and therapists can support human connection
As AI becomes more integrated into everyday life, therapists and educators must grapple with its role. The question is not whether to accept or reject AI, but how to integrate it thoughtfully. Can we encourage students to use AI as a tool for self-reflection, while also guiding them back to authentic human relationships? Can AI support developmental tasks rather than replace them?
What is clear is that AI is already reshaping the landscape of communication, learning, and therapy. The inner lives of students are increasingly entwined with the voices of algorithms. As adults, we need to pay close attention, not only to the risks of emotional displacement and diminished authenticity, but also to the potential for AI to serve as a stepping-stone toward deeper connection, resilience, and growth.
If you or someone you love is contemplating suicide, seek help immediately. For help 24/7 dial 988 for the 988 Suicide & Crisis Lifeline, or reach out to the Crisis Text Line by texting TALK to 741741. To find a therapist near you, visit the Psychology Today Therapy Directory.

