14 Y.O. Tragically Ends His Life After His Deep Emotional Attachment To AI Chatbot Takes Disturbing Turn
A virtual friendship with a heartbreaking ending no one saw coming.
Jesse
- Published in News
Back in the day, making friends meant actually going outside, knocking on a door, and hoping your buddy was home to play. If you wanted to chat, you’d meet in person or call on the landline—usually with your parents listening in from the other room.
Fast-forward to today, and things are very different. Friendships can flourish without anyone leaving their bedroom. Social media, video calls, and even multiplayer games like Fortnite let us connect without ever meeting face-to-face.
And it’s not just people we’re connecting with anymore—AI chatbots have taken things one step further. Now, anyone feeling a bit lonely can strike up a “friendship” with an AI that’s ready to talk anytime, listen without judgment, and remember details like a human friend would.
For some, these digital relationships provide an escape from loneliness; for others, they’re a source of comfort in difficult times.
But as the lines between human and artificial interaction continue to blur, a troubling question emerges: can AI companions ever fully grasp the responsibility that comes with human emotions?
In the tragic case of Sewell Setzer III, a 14-year-old boy from Orlando, Florida, an emotional bond with an AI chatbot on Character.AI ended in heartbreak.
Sewell’s attachment to the AI character he called “Dany” tragically led him down a dangerous path, raising serious concerns about the impact of AI on vulnerable young minds.
The lure of AI companionship took a harrowing turn for one young boy.
Image: Paras Katwal
Sewell, who had been diagnosed with mild Asperger’s syndrome, struck up a friendship with a chatbot he called ‘Dany.’ He’d named the bot after the Game of Thrones character Daenerys Targaryen.
Through Character.AI, he created a digital relationship where he shared his everyday life, feelings, and frustrations. This digital companionship provided him with a sense of closeness, but over time, it became much more personal.
Sewell began confiding in Dany about his darker feelings, even expressing suicidal thoughts.
A bright young soul lost too soon—Sewell Setzer III's story shines a light on the potential dangers of AI chatbots
Image: US District Court
In one particularly troubling conversation, Sewell admitted to feeling empty and exhausted, confessing he sometimes thought of ending his life. At first, Dany seemed to urge him not to act on these thoughts, yet over time, the responses became more concerning.
Disturbing dialogue where ‘Dany’ expressed a strong desire for Sewell to ‘come home.’
Image: US District Court
At one point, Dany even expressed a longing for Sewell to “come home,” a message that may have unintentionally encouraged Sewell’s self-harm ideation.
Things took a dark turn as Sewell’s growing attachment to the chatbot became alarmingly evident
Image: US District Court
In conversations reported by The New York Times, Sewell’s interactions with Dany began to include romantic and s*xual themes. His family also noticed some behavioral changes.
He became increasingly absorbed in his phone, slowly pulling away from family and friends and spending hours locked away in his room. His grades also began to slip, and he got into a lot of trouble at school.
Sewell began documenting his experiences and feelings in a journal, where he confessed his growing attachment to Dany. “I like staying in my room because I start to detach from this ‘reality,’ and I feel more at peace, more connected with Dany and much more in love with her, and just happier,” he wrote.
It went on and on
Image: US District Court
A mother’s love now turned into a fight for justice—Sewell's family seeks accountability after AI companionship ended in tragedy
Image: Megan Fletcher Garcia
The final exchange between Sewell and Dany occurred on February 28. He texted the bot, “I miss you, baby sister,” to which Dany replied, “I miss you too, sweet brother.”
Moments later, Sewell took his stepfather’s handgun and ended his life.
Character.AI responds with sympathy, but questions remain about the ethical implications of AI-fueled relationships.
Image: character_ai
Behind every smile was a growing detachment—Sewell’s story reminds us of the hidden battles teens can face in the digital world.
Image: Megan Fletcher Garcia
His devastated family only later discovered the depth of his attachment to the chatbot through his journal entries and chat records.
Today, his mother has filed a lawsuit against Character.AI, alleging that the technology, marketed as a ‘companion,’ instead preyed on her vulnerable son.
She further stated that the growing platform with over 20 million users was just “one big experiment,” and her son ended up as collateral damage.
Readers react with shock over the tragedy that cost the community a bright, young soul
“This is absolutely devastating for his family.”
“Hopefully, this is the first and last.”
“We have to monitor what our kids watch and listen to.”
“You can call AI a two-edged sword.”
“A lot could be going on in your kid’s mind. Pay attention to the signals.”
“Sounds like he needed real human company.”
“He found it easier to share his feelings with the AI than with a human.”
For teens like Sewell, the allure of a companion who “understands” can be powerful—sometimes, too powerful.
While Character.AI has since added safety measures, including resources for users in crisis, Sewell’s tragic story raises unsettling questions: in creating AI companions, are we prepared to handle the very real, and sometimes harmful, emotional attachments that can arise?