A single mistake affected 531 patients—not because people didn’t care, but because the system made it easy to fail silently.
Listen:
Check out all episodes on the My Favorite Mistake main page.
This episode marks the first installment of a new occasional series we’re adding to the podcast: “Mistake of the Week.” Think of it as Season 2 of My Favorite Mistake, running in parallel with our interview series. Season 1 continues next week with a new guest and another great conversation.
In this inaugural “Mistake of the Week,” Mark looks at a story out of MaineHealth, where 531 very-much-alive patients received condolence letters stating they were dead — complete with estate instructions. A software glitch triggered the mass mailing, but the real story is about systems, safeguards, trust, and the human response to error.
Mark unpacks what happened, what “fully resolved” might really mean, and how organizations can turn public embarrassment into meaningful learning.
If you received this episode through your podcast app and not a séance, you’re doing fine.
Subscribe, Follow, Support, Rate, and Review!
Please follow, rate, and review via Apple Podcasts, Podchaser, or your favorite app—that helps others find this content, and you'll be sure to get future episodes as they are released.
Don't miss an episode! You can sign up to receive new episodes via email.
This podcast is part of the Lean Communicators network.

Other Ways to Subscribe or Follow — Apps & Email
Transcript:
Hi, welcome to My Favorite Mistake. I'm Mark Graban, and this is your mistake of the week.
This story comes from Maine, where more than 500 living patients received letters saying they were—well, not living. MaineHealth, the state's largest health system, accidentally mailed condolence letters to 531 patients who were very much alive. The letters were meant for families of the deceased, complete with instructions for how to settle their loved one's estate.
One woman from Sanford told reporters, “Why would I say I was dead? It was really shocking and upsetting.” It's understandable. It's one thing to get a medical bill. It's another to get a message saying, “Your time's up.”
According to the health system, the mistake happened on October 20th, when a computer program that generates estate vendor letters glitched. The patients' medical records were fine; nobody was actually marked deceased, and no care was disrupted. MaineHealth apologized and said the issue has been, quote, “fully resolved.”
Now, that probably means the software error is fixed, the mailing list cleaned up, and an apology letter sent. But in healthcare, or in any setting, maybe “fully resolved” should mean more than that. It should mean understanding how the process broke down. Not just patching the software, but asking the deeper “why” behind the error.
Because when mistakes like this happen, it's rarely one person pressing the wrong key. It's about how systems are designed—or not designed—to prevent the wrong key from doing harm. Automation often doesn't remove human error; it relocates it. When it fails, the consequences spread faster and further, and trust—the most fragile part of any health system—takes the hit.
Errors like this also put frontline staff in tough positions. Nurses, clerical teams, and call center employees become the human face of the mistake. Apologizing, clarifying, and helping people regain confidence in their care—that's emotional labor most people never see.
So what's the takeaway? If we respond to mistakes by blaming individuals, we guarantee they'll happen again. If we respond with curiosity, we make the system better. That's true whether the mistake is sending 500 condolence letters or something far more serious. And if you receive this message through your podcast app and not a séance, you are doing fine.
For My Favorite Mistake, I'm Mark Graban. Remember: mistakes don't define us; the way we respond to them does.