During the last few months, the media has been overflowing with stories about the struggling American economy and soaring consumer (and national) debt. One story that has been marginalized in the wake of the current fiscal crisis is the rising cost of medical education and the danger it poses to the future of the healthcare industry.
I’m sure all of us remember that warm spring morning years ago when we got our acceptance letters to medical school. Soon thereafter, the glamour faded and sticker shock set in as tuition statements started rolling in. Well, for future medical students, the pain is only going to get worse. The NEJM reported this week that nearly a quarter of medical students graduating in 2008 carried more than $200,000 in debt, a drastic increase from the roughly 10% of medical students with that much debt just four short years ago.
The underlying problem is that grants and scholarships have not been able to keep pace with the skyrocketing cost of obtaining a medical degree. Universities feeling the pinch from shrinking endowments have had to scale back on their own financial aid programs and increase tuition to stay in the black. Furthermore, governmental aid through loan subsidization has been steadily declining, an unfortunate situation as the current recession deepens. Unsubsidized loans have become the backbone of most financial aid packages today.
One can argue that concerns over student debt are purely academic because physicians will eventually be able to pay back their loans, but there are several problems with that logic. Already, there is a dearth of primary care physicians. One way medical schools have tried to address this shortage is by increasing the number of graduates per year, both by expanding class sizes and by creating several new schools of medicine. However, new graduates, faced with hundreds of thousands of dollars of debt, have begun to eschew primary care specialties for fields with more earning potential. What does this mean for the future of primary care? Fewer doctors, more patients, and more frustration.
Second, Dr. Steinbrook writes in his NEJM article that medical students are essentially becoming economically and socially homogenized. Prospective students from lower income families may eventually find the cost of a medical education too prohibitive to even dream of becoming a doctor. Already, half of the students currently enrolled in American medical schools come from families in the top 20% of income. If the price tag for educating a doctor continues to spiral upward without concurrent aid for those in need, the medical profession runs the risk of becoming a plutocracy.
While we are talking about primary care and preventative medicine, it seems that screening colonoscopies may not be as effective as we once believed. A Canadian study published in the Annals of Internal Medicine looking at the utility of colorectal cancer screening made headlines recently. The authors conducted the study because real-world data seemed to indicate that the long-held belief that screening colonoscopies could reduce colorectal cancer (CRC) by as much as 90% (based on clinical trial data) was in fact wrong. In their case-control study of more than 60,000 patients, they showed that colonoscopies prevented only about 60% of CRC, and almost entirely through the detection of left-sided lesions. Because 60% is still a substantial reduction, making colonoscopy a very effective tool in preventing colon cancer, the American Cancer Society does not plan to change its screening guidelines in light of these new results. More troubling in the Canadian study, however, was the large number of right-sided lesions that were missed.
As mentioned above, much of the reduction in colorectal cancer mortality was due to finding and removing left-sided lesions. The odds ratio for CRC death from a right-sided lesion after complete screening colonoscopy was 0.99, whereas it was 0.33 for left-sided lesions and 0.63 overall. The researchers noted several factors that may have contributed to the lack of effectiveness in finding right colonic lesions, namely poor bowel prep, inexperience of the endoscopists (many of the physicians performing colonoscopies in the Canadian study were family practitioners and general internists, not gastroenterologists), and the inherently different appearance and biological features (including faster growth rates) of right-sided lesions. Unfortunately, better technology in the form of CT colonography does not appear to be a solution to this problem, as the “virtual colonoscopy” is just as capable of missing right-sided lesions as conventional optical colonoscopy. So in summary, we should still be advising patients on the benefits of colonoscopy, but perhaps we should no longer be as confident that our patients will be totally protected.
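For readers who want a refresher on how a figure like the 0.33 or 0.99 above is derived, here is a minimal sketch of the odds-ratio calculation behind a case-control design. The counts below are hypothetical, invented purely for illustration; they are not taken from the Canadian study, and the function name is my own.

```python
# Illustrative only: computing an odds ratio from a case-control 2x2 table.
# In a study like the one discussed, "cases" are CRC deaths, "controls" are
# matched patients without CRC death, and "exposure" is a complete colonoscopy.

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds of exposure among cases divided by odds of exposure among controls.
    A value near 1.0 suggests no association; well below 1.0 suggests the
    exposure is associated with protection."""
    return (exposed_cases / unexposed_cases) / (exposed_controls / unexposed_controls)

# Hypothetical counts: 33 of 1,000 cases vs. 100 of 1,000 controls had
# undergone a complete colonoscopy.
print(round(odds_ratio(33, 967, 100, 900), 2))  # prints 0.31
```

Viewed this way, the study's numbers tell the story plainly: an odds ratio of 0.33 for left-sided lesions means patients who died of left-sided CRC were far less likely to have had a complete colonoscopy, while the 0.99 for right-sided lesions means colonoscopy made essentially no difference there.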
Speaking of screening colonoscopies, the American Cancer Society just reported on the wide discrepancy in deaths from colorectal cancer between blacks and whites. Although the incidence of colorectal cancer has been steadily dropping (from 66.3 per 100,000 in 1985 to 46.4 per 100,000 in 2005), the death rate from CRC is 22% higher among blacks than whites. Much of the blame is placed on differences in screening rates: the ACS reports that 45.8% of whites over 50 years of age are screened, compared with only 36.9% of blacks. Hopefully, with these new findings, we can persuade more people to undergo the appropriate cancer screening.
To wrap up, let’s shine some light on a few of those medical myths running rampant through the public, myths that we ourselves are partially at fault for perpetuating. We’ve all been told of the seemingly limitless possibilities we could achieve if we only realized our true mental powers. After all, humans only use 10% of their brains, right? False. Although grounded in no hard science, this myth has been around for more than a century, torturing underachievers the world over. Advances in neuroscience and functional neuroimaging have repeatedly shown that we use much, much more than a measly 10%.
How about telling patients to drink 8 glasses of water per day? Water: it’s good for you, it sounds healthy. As one who doesn’t relish the thought of drinking half a gallon of water every day and hunting for the men’s room more often than the guys in the Flomax® commercial, I’m happy to hear that this is also untrue. We usually get enough fluid from the water, juice, food, and even coffee that we consume on a daily basis.
One more myth to debunk before the year is out: shaving your hair makes it grow back faster, darker, and coarser. Now, I’ve known this was false since I was a teenager, when the first signs of adult hair began to sprout on my upper lip and chin. All the shaving in the world didn’t make that moustache and beard grow in any faster. Surprisingly, a clinical trial examining this question as far back as 1928 confirmed that shaving had no effect on the rate of hair growth.
So what did we learn this week from the world of science? Debt: bad; colonoscopies: good, but not as great as we thought; and shaving is not a solution to male pattern baldness. Time for me and the 90% of my dormant brain to settle in for a long winter’s nap.
Reviewed by Joshua Remick MD, Clinical Correlations Section Editor