BOSTON -- This one barely qualifies as a "whoopsie." I'd never have noted the little mishap if it hadn't happened to my own aunt or if it hadn't come in the wake of a report that between 44,000 and 98,000 deaths occur from medical accidents every year.
There in the clinic was Mrs. Alexander, a k a "the vertigo." She was one curtain away from Mrs. Fernandez, a k a "the diabetes." But it was 15 minutes into the conversation before the very pleasant young doctor discovered that Mrs. A wasn't Mrs. F.

Nothing happened, mind you. Mrs. A didn't get a shot of insulin. Mrs. F didn't get a CAT scan. But it was one of those reminders of the meaning behind the words that grace the title of the Institute of Medicine report: "To Err Is Human."
Finally, after years of neglect, we are out to reduce err pollution. First came the Institute report. Then, on Tuesday, came the presidential announcement that any private health plan selling to the feds would need safety initiatives.
We are being asked to look at the disastrous effects of medical mistakes. We all know the chilling examples from the newspaper. The doctor who amputated the wrong leg or removed the healthy kidney. The prescription written or filled for the wrong medicine, the wrong dosage.
Until now, the focus of "patients' rights" has been on the right to sue, the right to assess the blame of individual doctors when things go wrong. Now the focus is changing from evil to error, from flawed doctors to flawed systems. As the godfather of error-prevention efforts, Harvard's Dr. Lucian Leape, likes to put it, what happens when good people do bad things?
More people die each year from medical mistakes than from auto accidents or breast cancer or AIDS. Indeed, in the most popular analogy, fatal medical accidents are the equivalent of three jumbo jets going down every two days.
But Dr. Donald Berwick, who served on the institute committee, says, "The great safety systems of the world do not depend on exhorting people to be more vigilant or asking them to try harder not to screw up or scaring them with what happens if they do screw up."
The safety plans that surround aviation or nuclear plants or hazardous work, says Berwick, "build dikes around human fallibility." Aviation has developed systems to reduce the risk of mistakes and make it easier to identify them. And it has imposed rules about openness in dealing with crashes and close calls.
Compare that to medicine. There's no black box in the operating room. Doctors have a code of silence that makes even the police "blue code" relatively garrulous.
In medicine, Leape says, "there's a standard of perfection that's unrealistic but makes it very difficult for doctors and nurses to deal with errors when they occur."
The Boston Globe