Shortening Medical Education
In a recent article in JAMA, Emanuel and Fuchs argue that medical education (including undergraduate education, medical school, residency, and fellowship training) can be reduced by up to 30% to save money and get new physicians practicing sooner. They arrive at this figure by proposing that the 14 years, on average, currently spent in training (4 years of college, 4 years of medical school, 3–4 years of residency, and 3–4 years of fellowship) could be trimmed to 10 without diminishing the skills or knowledge of the trainee. I would argue that the paper does not justify the assertions it makes, but more importantly, I would question whether this is even a worthwhile pursuit.
The authors first suggest that the way in which medicine is practiced has changed, and this is undoubtedly true. Few, if any, individuals possess the capacity to master the skills and knowledge needed to care for every clinical problem, effectively teach and mentor trainees, perform valuable research, and improve the quality and practices of the healthcare environment. It takes a multidisciplinary team to provide everything necessary in patient care and medical education. I agree with this absolutely. What I don’t understand is how this premise leads to the conclusion that we should therefore shorten medical education.
Emanuel and Fuchs point to the many combined programs in the US (and abroad) that compress undergraduate work and medical school into a total of 6 years rather than 8. I am not familiar with data showing whether these students fare any better or worse than “8-year” students, and I will take the authors’ word that the limited available data do not show a discrepancy. I looked at some of these programs when I was applying to college, and frankly I would never have considered doing one. I think it’s a shame when students major in “Premed”; there is no such thing. For anyone embarking on a medical career, college offers many opportunities that will never come around again, and I would encourage prospective physicians not to cut themselves out of even a single day. But that is my opinion, and I agree with the authors that these programs should continue to exist for those who want them.
Next, they suggest that there are data showing medical school can be cut from 4 years to 3, yet they fail to present data backing the assertion. They point to three programs that trim certain requirements (amounting to 6 months per program), but the overall length of each program remains 4 years. I would agree that we don’t know which exact requirements are absolutely necessary for all students, or for any given student. Still, trimming 6 months of basic science work at one program and 6 months of required clinical rotations at another does not add up to evidence that an entire year can be trimmed from a student’s education. Furthermore, the example programs they use include Duke and Harvard. And let’s be honest: to suggest that because the average Harvard medical student can do something, all medical students across the nation can do the same thing is a completely asinine proposition. Just because some students complete their PhD coursework at the age of 16 is not evidence that anyone could accomplish that feat. The authors then note that three programs have done just this, but they offer no data on how successful graduates of these programs have been.
Moving on to residency: there is a great deal of diversity among residency programs in terms of length, flexibility in rotations, and how much “fluff” may be included. I would not doubt that most programs could strip out a “cush” rotation or two. But it is an unfounded leap to go from the premise that some rotations in a residency are less valuable than others to the conclusion that the overall length of residency can be reduced without any untoward effects. One concept in how people become experts in a field is “time on the ice”: the idea that a certain amount of expertise is developed simply by being around the environment (the ice being a reference to hockey, IIRC). In the case of medicine, it takes a certain amount of time and exposure to the hospital, the clinics, patients, staff, and other physicians simply to absorb what it means to “be” a physician. Similar concepts were described in Malcolm Gladwell’s book Outliers, whose “10,000-hour rule” suggests that it takes a certain amount of time and practice to become good at something.
Residency training has undergone drastic cuts in hours over the last few years, and will probably continue to do so. Without getting into a big argument over whether this is good or bad, suffice it to say that residents graduating from a 3-year program today will have seen dramatically fewer cases and have dramatically less experience than residents graduating 10 years ago. We have yet to see what the long term effects of these cuts will be, but it certainly casts into doubt the premise that residencies under these work hour restrictions can be cut by 25–33% with no ill effects, as the authors propose:
The third year of internal medicine or pediatric residencies or the research year in surgical specialties could be eliminated without compromising the clinical quality of trainees.
Anyone who has spent even a modicum of time around residents should be able to recognize the difference between the average resident at the end of their second year and the average resident at the end of their third year. It’s ridiculous to propose that this year is not necessary. I am certain that some residents would do just fine if they cut their residency short, particularly if they move on to a fellowship program where they remain under relatively close supervision. But that is not the proposal the authors make.