Why We Don’t Do Autopsies Anymore
I’m in the morgue, not my favorite place. Heads loll, limp and ghoulish, as their bodies are hefted on and off the long chrome dissection tables. Big slabs of human beef, darkening by the minute, glisten under hot white lights. Right now, I’m looking at Ms. Dubois, an elderly Alzheimer’s patient with a giant heart clot we had seen on her echocardiogram a week earlier in the Emergency Department. Now, here, her heart has been cut out of her chest and put on a small examining table. My team and I gather around with the pathologist. The rest of Ms. Dubois lies next to us, her body respectfully covered with sheets, her face only a little more vacant than it was yesterday before she died. It wasn’t her dementia that killed her. Alzheimer’s disease empties you out, leaving behind a husk where the person used to be, but it doesn’t kill you. Something else — a heart attack, a stroke, pneumonia — does that. We’ve come down to the morgue to see what got Ms. Dubois.
With a twist of his wrist, the pathologist flips open Ms. Dubois’s disembodied heart, filleted along a plane that best shows her mitral valve. One of the doctors gasps when she sees Ms. Dubois’s artificial heart valve. The rest of us, including myself, are dumbstruck.
I haven’t seen a valve like Ms. Dubois’s since I was a medical student, observing open-heart surgery for the first time — it looks like a miniature metal birdcage with a Ping-Pong ball inside. But it’s not the antiquity of her heart valve that shocks us at the autopsy table. Until she wound up in the dementia unit of a local nursing home a few years ago, Ms. Dubois’s old birdcage valve had served her well for decades.
“Jeez,” one doctor says now, peering intently into Ms. Dubois’s heart. “The whole thing came off!”
“Hmph,” says another, unable to avert his eyes from the sight of it. “I didn’t know that could happen.”
I didn’t know it could happen either. The thrombus itself, a round red glob of congealed blood about two inches in diameter, is impressive enough, huge as heart clots go. But we’d known how big it was right from day one, and we’d also known why it was there: by mistake, her blood-thinner medication had been under-dosed for the past few weeks at her nursing home, putting her at risk for a stroke. Part of the clot could break off from her heart and travel through her circulation into her brain. On top of her advanced dementia, a stroke was the last thing Ms. Dubois needed. Treatment with an intravenous blood thinner, as we had done these past several days, almost always helps to dissolve the clot and prevent such complications. No one had anticipated that the whole clot could break off. But there it is, the entire blood clot, dislodged from the wall of Ms. Dubois’s heart, stuck in the struts of her old metal valve. It fills the old birdcage completely, obstructing any output of blood from her heart.
No wonder she couldn’t be resuscitated after her cardiac arrest. There’s no way to treat this; prevention is the only hope. That’s why we’d needed to see Ms. Dubois’s autopsy. Had she died without an autopsy, the doctors on my team probably wouldn’t remember her a few months from now, just another old lady whose time had come. But they’ll remember her now. They’ll remember that this is what can happen when mistakes are made, when meticulous care is not taken to manage risky medicines like blood thinners. I don’t know whether the nursing home will get sued for its error, but I make a mental note to call our hospital’s legal counsel after I call Ms. Dubois’s son to inform him about the autopsy findings.
Almost 100 years ago, Sir William Osler, the most renowned and revered physician since Hippocrates, actually requested his own autopsy. Osler told family and friends that, given his lifelong interest in the case, he wished he could attend his own autopsy to see it for himself. Typically Oslerian, this choice of words was as precise as it was good-natured: The word autopsy literally means “to see for oneself.” Osler died at his home in England after a protracted battle with bacterial pneumonia in 1919, nine years before the discovery of penicillin. (In his renowned pre-antibiotic-era textbook, The Principles and Practice of Medicine, Osler had dubbed pneumonia “the captain of the men of death.”) Osler wanted his own physician, Dr. A.G. Gibson, to know why he had died, in the hope that this knowledge might help other patients in the future. So, as Osler famously had done more than a thousand times for his own patients, Osler’s doctor performed Osler’s autopsy. In the kitchen of the Osler family home.
In Osler’s time, autopsies were bellwethers and benchmarks. A high autopsy rate strengthened a hospital’s reputation; it indicated that the medical staff wanted to learn as much as possible about their sickest patients — the ones who died — in an effort to improve doctors’ diagnostic capabilities, perhaps gain scientific insights, and avoid error in the future. Since 1761, many hospitals had operated busy “autopsy theaters” where plaques on the wall read Hic est locus ubi mors gaudet succurrere vitae (“This is the place where death rejoices to come to the aid of life”). Through the first half of the 20th century, about 50 percent of all patients who died in U.S. hospitals had autopsies; most major teaching hospitals exceeded that rate. In fact, in many teaching hospitals as recently as the 1970s, interns and residents openly competed to achieve the highest autopsy rate among their patients who had died. At Morbidity and Mortality (M&M) Conferences around the world, premortem diagnoses and treatments were compared with the indisputable gold standard, the patient’s post-mortem examination findings. In the days before CT scans and MRI machines, this was how doctors, young and old, looked inside their patients and “saw for themselves” why their patients had died. And, in some cases, how they might have been saved.
Osler’s own autopsy revealed no surprises. It showed that his doctors had done all they could have done, given the (primitive) state of medical science at the time. In this regard, Osler was luckier than most. In his era, autopsies frequently revealed misdiagnoses and missed opportunities to save the patient.
Remarkably, despite spectacular advances in medical care, such misdiagnoses remain commonplace today. In 1983, researchers at Brigham and Women’s Hospital in Boston — by all accounts, then and now one of the best hospitals in the world — asked whether autopsies were still worth doing. Conventional wisdom thought not. Medicine’s diagnostic armamentarium had grown dramatically since Osler’s time. Powerful new imaging technologies — ultrasound, nuclear scanning, computed tomography (CT), angiography — had transformed the practice of medicine, allowing doctors to peer inside living patients more clearly than ever before. Because autopsies are labor intensive, cost money (today, about $2,000), and sometimes make patients’ loved ones uncomfortable, regulatory agencies eliminated minimum mandatory autopsy rates as a criterion for accreditation of U.S. hospitals. The Brigham hospital researchers weren’t sure this change was a good idea and designed a study to examine whether modern diagnostic technologies had made misdiagnosis a thing of the past. They reviewed post-mortem examinations performed at the Brigham in 1960, 1970, and 1980 to compare the “yield” of autopsies in those decades.
These researchers found that autopsies in 1960 had revealed a major missed diagnosis in 22 percent of cases — more than 1 in every 5 patients. Of these, about one-third (8 percent) showed that a correct diagnosis premortem could have led to the patient’s cure or improved survival. The remaining two-thirds (14 percent) revealed diagnostic errors that contributed to the patient’s death but probably could not have been treated successfully (in 1960). In other words, 1 of every 12 patients dying at one of the world’s best hospitals could have been saved had their doctors made the correct diagnosis. In addition, another 1 in 7 patients who died had diseases that, unrecognized by their doctors, contributed to their death. Little wonder that doctors at the Brigham in 1960, like those in Osler’s time, tried hard to obtain permission for autopsy whenever a patient died. This was how they learned. Doctors learned from their mistakes.
But what shocked many in the medical community was the finding that the rate of missed diagnoses documented by autopsy at the Brigham hospital hadn’t decreased at all 20 years later! In 1970 and in 1980, the rate of major missed diagnoses was 23 percent and 21 percent, respectively, no different from rates in 1960. This lack of improvement did not mean that the Brigham doctors were failing to learn from their autopsies. Clearly they were learning, because most of the fatal diagnoses missed in 1960 — blood clots in the lungs, bacterial infections such as pneumonia or meningitis, various cancers — were missed much less frequently in 1980. But, during those 20 years, medical progress had created new diagnostic challenges. For example, previously rare infections had become increasingly prevalent as complications of new treatments (immunosuppressive drugs) and new diseases (AIDS). As a result, autopsies in 1980 revealed significant changes in the specific type, but not the overall rate, of major missed diagnoses. The Brigham researchers concluded that, despite medical progress — indeed because of medical progress — the “autopsy remains a vital component in the assurance of good medical care.”
And yet today, three decades later, the autopsy rate in U.S. hospitals is less than 5 percent. Many hospitals perform none at all. In 2004, a new generation of researchers found that fatal diagnostic errors have declined somewhat in the past 40 years but estimated that, if autopsies were performed on 100 percent of patients who die in U.S. hospitals today, the rate of major missed diagnoses would range from a low of 8.4 percent (1 in 12 deaths) to a high of 24.4 percent (1 in 4 deaths). Even if one accepts the lower estimate in this range (8.4 percent), it means that more than 70,000 people die in U.S. hospitals every year with major missed diagnoses; about 30,000 of these patients would leave the hospital alive if their diagnosis were not missed. More chilling, these potentially preventable deaths are not included in the Institute of Medicine’s sobering estimate that up to 98,000 patients die annually in U.S. hospitals due to medical error.
The implications are grave, and not just because autopsies are an indispensable quality improvement tool in hospitals. Autopsies establish the cause of death, thus ensuring the accuracy of national vital statistics. Today, in the absence of autopsies, it is estimated that at least one-third of all death certificates are incorrect. Autopsies also keep medical educators honest, showing medical students and physicians-in-training the final truth about their patients who die. Autopsies reassure family members of the deceased, protect against false medico-legal liability claims, evaluate the effectiveness of new treatments, improve our understanding of the natural history of disease, and identify new or emerging diseases. Research about Alzheimer’s and other brain diseases, for example, depends on autopsies. (How else can one study cells deep in the brain, inaccessible during life?) Societal responses to public health threats, whether new diseases (such as HIV and SARS) or bioterrorism attacks (such as anthrax outbreaks), depend critically on autopsy findings, too.
Experts at the Mayo Clinic have concluded that “a wide range of medical, legal, social, and economic causes” are responsible for the decline of nonforensic autopsies and proposed no fewer than 46 interventions to reverse this trend. But much of this problem, like other ills afflicting U.S. healthcare today, boils down to three things: money, public misinformation, and doctors’ conflicts of interest.
First, follow the money. Payment for autopsies was built into Medicare’s reimbursement to hospitals decades ago because Medicare beneficiaries account for 75 percent of all deaths in the United States. But that payment is bundled into hospitals’ overall reimbursement rather than earmarked for autopsies, so hospitals receive the money whether they perform autopsies or not. Perversely, then, hospitals can increase their profits by not spending those resources on autopsies. Pathologists also can make more money by not doing autopsies, devoting their time instead to more lucrative services for the living.
Second, the public doesn’t care, because the public doesn’t understand the importance of autopsies. Many people refuse to consent to autopsies, mistakenly believing that the post-mortem examination disfigures the body or delays funeral arrangements. When doctors take the time to explain that neither is true, autopsy rates tend to rise.
Finally, many doctors are conflicted themselves about the risks and benefits of autopsies. The risks to the doctor may seem obvious: If an autopsy shows that the doctor missed an important diagnosis, this would seem to increase the likelihood of medical malpractice complaints. In fact, lawsuits are less likely when deceased patients undergo autopsies. And, despite all evidence to the contrary, many doctors continue to believe that the accuracy of modern diagnostic testing is so great that it renders post-mortem diagnosis largely superfluous. In the majority of cases, they’re right, since about 80 percent of autopsies confirm the accuracy of doctors’ premortem diagnoses. But is this as good as we can do? Certainly not. We can and must do better.
Niels Bohr, the legendary Nobel laureate in physics, defined an expert as one “who has made every imaginable mistake in a very narrow field.” Ultimately, this is the most important benefit of autopsies: to improve doctors’ diagnostic expertise by letting them “see for themselves” their diagnostic mistakes. The quickening disappearance of the medical autopsy today poses a critical, unanswerable question: How will doctors achieve greater diagnostic expertise — how will we learn, and improve — if we don’t know what we’re missing?
Osler is turning over in his grave.
This article is featured in the May/June 2017 issue of The Saturday Evening Post.
Dr. Brendan Reilly is a former executive vice chair of medicine at New York-Presbyterian Hospital/Weill Cornell Medical Center. A widely published clinical researcher and educator, Reilly has served as the chair of medicine and physician-in-chief at Chicago’s Cook County Hospital, which, during his 13-year tenure, was the inspiration (and setting) for the hit NBC television series ER.
From One Doctor by Brendan Reilly, M.D. Copyright © 2013 by Brendan Reilly, M.D. Reprinted by permission of Atria Books, a division of Simon & Schuster, Inc.