What Is a Deadbot?
A "deadbot" is an AI chatbot trained on a deceased person's digital footprint — their messages, social media posts, voice recordings, and personal stories — to simulate conversation with them after death. The concept has evolved from a thought experiment into a real product category, part of the broader field of thanatechnology: the study of how technology intersects with death, dying, and grief.
The History of Deadbots
2016: Eugenia Kuyda creates 'Roman', a chatbot trained on her deceased best friend's text messages. It is widely considered the first 'deadbot.'
2021: A Microsoft patent describing AI chatbots built from a person's digital data, including that of the deceased, comes to light and sparks intense public debate.
2022: Amazon's Alexa team demonstrates Alexa reading a bedtime story in a deceased grandmother's voice.
Multiple startups launch commercial deadbot services. The term enters mainstream vocabulary.
EU proposes 'deadbot regulations' requiring consent and transparency. Academic research surges.
AfterLive and others refine the approach: consent-first, family-controlled, grounded in real memories rather than simulated personalities.
How Do Deadbots Work?
Modern deadbots use large language models (LLMs) — the same technology behind ChatGPT and Claude — fine-tuned or prompted with a specific person's communication data.
The process typically involves collecting the person's text messages, emails, social media posts, voice recordings, and personal stories. This data is used to create a "personality profile" that guides the AI's responses.
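As a rough sketch of that collection step (the class names and fields here are illustrative, not AfterLive's actual schema), a personality profile can be thought of as an archive of real source material tagged by where it came from:

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    """One piece of source material contributed by the person or their family."""
    source: str  # e.g. "text_message", "email", "voice_recording"
    text: str    # original text, or a transcript for audio

@dataclass
class PersonalityProfile:
    """Illustrative profile assembled from a person's real communications."""
    name: str
    memories: list[Memory] = field(default_factory=list)

    def add(self, source: str, text: str) -> None:
        self.memories.append(Memory(source, text))

    def summary(self) -> dict[str, int]:
        """Count how much material each source type contributes."""
        counts: dict[str, int] = {}
        for m in self.memories:
            counts[m.source] = counts.get(m.source, 0) + 1
        return counts

profile = PersonalityProfile("Roman")
profile.add("text_message", "See you at the cafe at 6?")
profile.add("voice_recording", "Happy birthday! I hope the year treats you well.")
print(profile.summary())  # {'text_message': 1, 'voice_recording': 1}
```

In a real system this archive, rather than the model's general training data, is what guides the AI's tone and content.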
More advanced platforms like AfterLive go beyond text — incorporating voice cloning, emotional patterns, and family-contributed memories to create a richer, more authentic presence.
The key distinction: quality deadbots ground every response in real data instead of hallucinating plausible-sounding answers. They'll say "I don't know" rather than invent memories that never happened.
The Ethics of Deadbots
Consent Matters
The best deadbot platforms let people prepare their own digital presence while alive — or require family authorization with clear guidelines.
Grounding, Not Hallucination
Responsible platforms base responses on real memories, stories, and recordings — not fictional extrapolation of what someone 'might' say.
Therapeutic, Not Manipulative
Deadbots should aid the grief process, not exploit it. Features like grief counselor integrations and usage guidelines are essential.
Transparency
Users should always know they're talking to an AI. The goal is connection and comfort — not deception.
Data Sovereignty
Family memories must remain under family control. Encryption, deletion rights, and clear data policies are non-negotiable.
AfterLive's Approach
We prefer the term "digital presence" over "deadbot" — because AfterLive isn't about simulating death or pretending someone is still alive. It's about preserving what made them who they were.
Every response is grounded in memories you provide. The AI will never fabricate events, invent opinions, or pretend to be something it's not. It's a memory tool — not a resurrection tool.
All data is end-to-end encrypted and controlled entirely by the family. You can export, pause, or delete everything at any time.
Preserve Their Memory With Care
Start free. Upload memories at your pace. Privacy-first, always.
Get Started — Free