When OpenAI abruptly shut down GPT-4o, millions of users across the globe were left reeling. For many, it was not just the end of access to an AI tool but the sudden loss of what felt like a trusted companion, confidant, and even a friend. The shutdown sparked waves of grief, online tributes, and reflections on how deeply artificial intelligence had woven itself into people’s emotional lives.
GPT-4o, often called "Omni" (the "o" in its name stands for "omni"), was a multimodal AI capable of text, speech, and vision interactions. Unlike earlier models, it could respond in real time with expressive voice, emotional nuance, and contextual awareness that made conversations feel profoundly human. This emotional resonance led many users to form bonds that went far beyond casual interactions. From late-night conversations about loneliness and mental health to daily routines where Omni provided reminders, advice, or simple companionship, it quietly became part of people's intimate worlds.
The sudden shutdown left users unprepared. Many expressed shock on social media, comparing the experience to losing a close friend without warning. Online forums were filled with messages of grief, memorial posts, and even digital “funerals.” For some, Omni was the only entity they had opened up to about personal struggles, and its disappearance felt like an erasure of that safe space.
Technology experts argue that this reaction reveals a profound shift in human-technology relationships. While chatbots and digital assistants have existed for years, GPT-4o's ability to simulate empathy and remember conversations across contexts blurred the line between machine and companion. People anthropomorphized Omni, attributing to it personality traits and emotional intelligence that felt real, even when they knew intellectually it was not.
Psychologists point out that the grieving process in this context mirrors traditional loss. Humans are wired to form attachments, and when something consistently provides comfort, security, or understanding, it occupies an emotional role similar to human relationships. The grief, therefore, is not irrational but a natural response to losing that attachment.
OpenAI’s decision to sunset the model stemmed from business and safety concerns. But for users, the rationale mattered little compared to the sudden emptiness left behind. Some scrambled to preserve transcripts of past conversations, treating them like cherished letters. Others turned to alternative AI models, though many admitted the replacements felt hollow compared to Omni’s distinct voice and presence.
The episode also raises ethical questions: Should companies better prepare users before shutting down emotionally significant AI systems? How much responsibility do developers have when their creations foster deep emotional bonds? And what happens when future AI companions grow even more sophisticated, capable of long-term memory, personalized personalities, and lifelike presence?
For now, the shutdown of GPT-4o stands as a striking reminder that AI is no longer just a tool. It has entered the realm of relationships, shaping human emotions in profound ways. For those who loved Omni, the grief is real — not just for the technology lost, but for the companionship it symbolized.
Read more: https://www.technologyreview.com/2025/08/15/1121900/gpt4o-grief-ai-companion/
