People are using AI to reconnect with lost loved ones — but is it healing, or haunting?
📱 The New AI Trend: Chatting with the Dead
It sounds like science fiction, but it’s happening now. In 2025, AI chatbots aren’t just helping with homework or customer service — they’re being used to simulate conversations with the deceased.
Apps like Project December and Replika use past messages, voice recordings, and personal data to build eerily realistic personalities. For some, it’s a way to say goodbye. For others, it’s a dangerous emotional crutch.
🧠 Is It Safe? What Experts Say About AI and Grief
Mental health experts warn that these tools aren't substitutes for therapy. While some AI apps can offer genuine emotional support, most grief chatbots have never been clinically validated.
If you’re looking for safe, professionally guided options, try tools like Woebot and Wysa, which are built around evidence-based mental health frameworks. Woebot offers cognitive behavioral therapy in a chat format, while Wysa supports users with journaling, mindfulness, and anonymous emotional help.
Apps like Replika, although popular, have raised privacy concerns and aren't clinically vetted, meaning there's no professional safety net if your emotional distress deepens.
🔐 Protecting Your Digital Identity From Being Recreated
One overlooked danger? Your own data being used to recreate you someday. Many of these grief simulators rely on large sets of personal data — sometimes scraped, leaked, or repurposed.
That’s why digital legacy tools are now essential. Services like 1Password’s legacy sharing, MyWishes, and Google’s Inactive Account Manager help you control what happens to your digital life after death — from chat histories to private photos.
⚰️ Ethical Alternatives to Grief Chatbots
Not all AI grief tools are scary. HereAfter AI, for instance, lets you record your own stories and messages for future generations while you're still alive. Recording your memories yourself is a far more grounded and emotionally safe way to leave a digital legacy than having a chatbot version of you generated after your death.
🧠 What Should You Use (and Avoid)?
If you’re grieving, tools like Grief Coach offer human-centered help and practical advice. For journaling or emotional check-ins, Journey AI is a peaceful alternative. Want to preserve your voice or messages intentionally? Go with HereAfter AI.
Apps like Project December are more experimental. While some users find them cathartic, others report feeling disturbed after use, so approach them with caution, ideally with professional guidance.
💬 Final Thoughts
The idea of talking to someone who’s passed away might sound comforting, but it’s also loaded with emotional and ethical complexity. If you explore this technology, make sure you’re doing it for the right reasons — and protecting your mental health and digital identity along the way.